

Gesture patents

      

This page is updated frequently with new Gesture-related patent applications.

Electronic programming guide with expanding cells for video preview
This document describes methodologies for an electronic programming guide with expanding cells for video preview. These techniques and apparatuses enable improved navigation for video and channel previewing based on gestures performed on a mobile device acting as a remote control to a remote display device.
Google Inc.


Monitoring
A method comprising: recognizing a first two-handed gesture and a second two-handed gesture in a monitored scene space to at least partially define a shape and position of a computer-implemented virtual boundary in a corresponding monitoring space, wherein the first two-handed gesture in the monitored scene space specifies a first two points in the monitoring space and the second two-handed gesture in the monitored scene space specifies a second two points in the monitoring space; causing implementation of the computer-implemented virtual boundary in the monitoring space corresponding to the monitored scene space, wherein a shape and position of the computer-implemented virtual boundary is at least partially defined by the first two points in the monitoring space and the second two points in the monitoring space; and processing received data to generate a response event when there is a change in a portion of the monitored scene space relative to the computer-implemented virtual boundary in the corresponding monitoring space.
Nokia Technologies Oy
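The boundary logic described in this abstract can be sketched in a few lines. Treating the four gesture-specified points as corners of an axis-aligned boundary, and a tracked point crossing it as the "change" that triggers a response event, are illustrative assumptions; the claim only says the points "at least partially define" the shape and position.

```python
def boundary_from_gestures(first_two, second_two):
    """Sketch: the four points from the two two-handed gestures define an
    axis-aligned virtual boundary (one simple reading of the claim)."""
    xs = [p[0] for p in first_two + second_two]
    ys = [p[1] for p in first_two + second_two]
    return (min(xs), min(ys), max(xs), max(ys))

def response_event(boundary, prev_point, curr_point):
    """Emit an event when a tracked point crosses into or out of the boundary."""
    x0, y0, x1, y1 = boundary
    inside = lambda p: x0 <= p[0] <= x1 and y0 <= p[1] <= y1
    was, now = inside(prev_point), inside(curr_point)
    if was != now:
        return "entered" if now else "exited"
    return None
```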


Contemporaneous gesture and keyboard for different levels of entry authentication
A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation


Keyboard entry as an abbreviation to a contemporaneous gesture authentication
A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation


Contextual contemporaneous gesture and keyboard entry authentication
A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation


Contemporaneous facial gesture and keyboard entry authentication
A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation


Recognition of free-form gestures from orientation tracking of a handheld or wearable device
A user performs a gesture with a hand-held or wearable device capable of sensing its own orientation. Orientation data, in the form of a sequence of rotation vectors, is collected throughout the duration of the gesture.
Intel Corporation


Methods and devices for providing optimal viewing displays
Systems and methods for presenting a user interface in a first mode and a second mode based on detection of a touch gesture are described herein. In some embodiments, a first user interface may be presented on an electronic device's display.
Amazon Technologies, Inc.


Radar-based gesture-recognition through a wearable device
This document describes techniques and devices for radar-based gesture-recognition through a wearable device. The techniques enable an easy-to-use input interface through this wearable radar device, in contrast to small or difficult-to-use input interfaces common to wearable computing devices.
Google LLC


System and method for a blended reality user interface and gesture control system
A blended reality user interface and gesture control system includes one or more sensors, a head-mounted display, and a blending engine. The blending engine is configured to receive live reality and virtual reality feeds, track movement of a user using the sensors, detect a command based on the tracked movement, blend the live and virtual reality feeds into a blended view based on the detected command, and display the blended view on the head-mounted display.

Motion and gesture input from a wearable device

This disclosure relates to detecting hand gesture input using an electronic device, such as a wearable device strapped to a wrist. The device can have multiple photodiodes, each sensing light at a different position on a surface of the device that faces skin of a user.
Apple Inc.

Method of instant sharing invoked from wearable devices

Techniques are disclosed herein for establishing a file transfer connection via wearable devices (e.g., head-mounted wearable devices). A first wearable device generates a gesture-based connection request to connect with a second wearable device.
International Business Machines Corporation

Analysis of user interface interactions within a virtual reality environment

The disclosure describes systems and methods of analyzing interactions with a user interface for an application, where the user interface is implemented at least partly within a virtual reality environment. Certain embodiments provide for receiving interactions that include gestures, spatial contexts, and application contexts, and receiving results from the application, such as application behavior or error conditions.
Adobe Systems Incorporated

System and method for gesture detection for a remote device

A method for operating a mobile device includes detecting a gesture by the mobile device. Detecting the gesture includes receiving a reflected millimeter wave signal by the mobile device, generating a first message in accordance with the detected gesture, and transmitting the first message from the mobile device to an external remote device.
Infineon Technologies AG

Method and device for controlling unmanned aerial vehicle with gesture

A method and device for controlling an unmanned aerial vehicle with a gesture are provided. A camera is arranged in the unmanned aerial vehicle, and the method includes: detecting a gesture in an image by using a gesture detection framework; judging whether the gesture is a predetermined gesture for controlling the unmanned aerial vehicle; acquiring a motion trajectory of the gesture in a case that it is determined that the gesture is the predetermined gesture for controlling the unmanned aerial vehicle; and controlling, based on the motion trajectory of the gesture, the unmanned aerial vehicle to perform a control operation corresponding to the motion trajectory of the gesture, where a correspondence between a motion trajectory of a gesture and a control operation is predetermined.
Beijing Zero Zero Infinity Technology Co., Ltd
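The predetermined trajectory-to-operation correspondence this abstract mentions can be sketched as a small lookup table. The reduction of a trajectory to a coarse direction and the operation names below are illustrative assumptions, not details from the application.

```python
def trajectory_direction(trajectory):
    """Reduce a gesture's motion trajectory (a list of (x, y) points)
    to a coarse dominant direction."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

# Hypothetical correspondence between trajectories and control operations.
OPERATIONS = {"up": "ascend", "down": "descend", "left": "yaw_left", "right": "yaw_right"}

def control_operation(trajectory):
    """Map a recognized gesture trajectory to its predetermined operation."""
    return OPERATIONS[trajectory_direction(trajectory)]
```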

Method and system for using gestures to control a measurement device

A method and system are provided for controlling a measurement device remotely through gestures performed by a user. The method includes providing a relationship between a command and a gesture.
Faro Technologies, Inc.

Electronic apparatus, processing system, and processing program

Provided is an electronic apparatus that performs an appropriate process according to a gesture of a subject person, the electronic apparatus including: a first input unit that inputs a detection result of a biosensor detecting a change in biological information of a subject person; a second input unit that inputs a recognition result of a recognition device recognizing an action of the subject person; and a processing unit that performs a process according to the action of the subject person based on input results of the first and second input units.
Nikon Corporation

Home device controller with touch control grooves

A home device controller can include one or more touch grooves, a touch slider operatively coupled to each of the one or more touch grooves, and one or more processors. The processor(s) can receive signals from each touch sensor, each signal corresponding to a touch gesture performed by a user interacting with the one or more touch grooves.
Brilliant Home Technology, Inc.

Method and process for determining whether an individual suffers a fall requiring assistance

A method for monitoring an individual in a dwelling so as to know when such individual falls or indicates the need of assistance. A plurality of 3D motion and sound sensors are located in the dwelling and provide data to a computerized monitoring system.
Cerner Innovation, Inc.

Display method and apparatus

The present disclosure discloses a display method and apparatus. A specific implementation of the method comprises: acquiring an experimental environment image; presenting, on the experimental instrument image using an augmented reality approach, a status image indicating an experimental status of an experimental instrument when performing the target experiment; in response to detecting a gesture operation of a user, determining experiment effect information of an experiment operation based on a corresponding relation between the gesture operation and the experiment operation of the target experiment, wherein the experiment effect information comprises an experiment effect image; and displaying, using the augmented reality approach, the experiment effect image in the experimental environment image having presented the status image.
Baidu Online Network Technology (Beijing) Co., Ltd.

Image segmentation with touch interaction

In one embodiment, a method includes detecting one or more objects in an image, generating at least one mask for each of the detected objects, wherein each of the masks is defined by a perimeter, classifying the detected objects, receiving gesture input in relation to the image, determining whether one or more locations associated with the gesture input correlate with any of the masks, and providing feedback regarding the image in response to the gesture input. Each of the masks may include data identifying the corresponding detected object, and the perimeter of each mask may correspond to a perimeter of the corresponding detected object.
Facebook, Inc.

Hand gesture recognition for virtual reality and augmented reality devices

A system for hand gesture recognition includes a display, camera, memory, and processor. When the processor is to execute instructions, the processor is to estimate one or more motion vectors of an object using a pair of consecutive frames and estimate an average motion vector of the object.
Intel Corporation

Systems and methods to present reactions to media content in a virtual environment

Systems, methods, and non-transitory computer readable media are configured to receive a recording of an expression of a content provider in response to a digital environment. The expression can be based on at least one of gestures, body movement, speech, and sounds of the content provider.
Facebook, Inc.

Integrated wearable security and authentication apparatus and use

Embodiments shown provide a wearable device capable of acquiring images associated with an unknown person's facial features in response to the user's gesture commands, using an integrated wearable security system. The system enables a user to discreetly capture images of an individual or environment using an integrated camera, video, and audio component.

Measuring somatic response to stimulus utilizing a mobile computing device

A mobile computing device for measuring somatic response of a user to stimulus includes motion sensors, a volatile memory, and a processor for: executing a baseline calibration process including receiving first and second supervised data from the user, and first and second sensor data from the motion sensors, while the user performs a triple whip gesture, calculating signal strength of the first and second sensor data using a k-means clustering algorithm, and executing a classification process including reading third unsupervised data from the user and third sensor data from the motion sensors while the user performs the triple whip gesture.
Sensie, LLC
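One plausible reading of the k-means step in this abstract is clustering the sensor readings into low- and high-activity groups and scoring signal strength by the separation of the cluster centers. The tiny 1-D k-means below is a hedged sketch of that reading, not the patented calculation.

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Minimal 1-D k-means: iteratively assign each value to the nearest
    center, then move each center to the mean of its cluster."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # Keep a center in place if its cluster came up empty.
        centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
    return sorted(centers)

def signal_strength(sensor_values):
    """Illustrative score: gap between the low- and high-activity centers."""
    lo, hi = kmeans_1d(sensor_values, k=2)
    return hi - lo
```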

Head worn wireless computer having high-resolution display suitable for use as a mobile internet device

A handheld wireless display device, having at least SVGA-type resolution, includes a wireless interface, such as Bluetooth™, WiFi™, WiMAX™, cellular or satellite, to allow the device to utilize a number of different hosts, such as a cell phone, personal computer, or media player. The display may be monocular or binocular.
Kopin Corporation

Virtual keyboard

One or more computing devices, systems, and/or methods for facilitating user input are provided. For example, a keyboard display trigger event for a computing device may be identified (e.g., a user interacting with a text input interface).
Yahoo!, Inc.

Electronic device and control method

The electronic device (1) includes a proximity sensor (18), a touch panel display (display 14) configured to display an icon that serves to enable a predetermined mode and gesture detection via the proximity sensor (18) and a controller (11) configured to, when a touch on the icon is detected, start the predetermined mode, start gesture detection via the proximity sensor (18) and change a characteristic of the icon. The controller (11) may change the color or the shape of the icon.
Kyocera Corporation

Writing gesture notification method and electronic system using the same

A writing gesture notification method is provided. The method is applied to an electronic device of an electronic system that includes the electronic device and an input device, and comprises: obtaining input information from the input device; obtaining first contact information and orientation information of the input device; sensing second contact information via the electronic device; comparing the input information, the first contact information, the orientation information, and the second contact information with preset information to generate a comparison result; and providing a notification via the electronic device according to the comparison result.
Asustek Computer Inc.

Electronic device and control method therefor

An electronic device including a housing having a grip portion for gripping the electronic device to input content; a microphone included in the housing; a memory included in the housing; a sensor located at one side of the housing and configured to sense movement of the electronic device corresponding to the input content; and a controller configured to operate in a first mode in which a sound acquired through the microphone and the content input by the electronic device are stored in the memory, operate in a second mode in which the sound stored in the memory is reproduced, and operate in a third mode in which at least a portion of the stored sound is editable according to a gesture acquired through the sensor.
LG Electronics Inc.

Multi-task machine learning for predicted touch interpretations

The present disclosure provides systems and methods that leverage machine learning to predict multiple touch interpretations. In particular, the systems and methods of the present disclosure can include and use a machine-learned touch interpretation prediction model that has been trained to receive touch sensor data indicative of one or more locations of one or more user input objects relative to a touch sensor at one or more times and, in response to receipt of the touch sensor data, provide one or more predicted touch interpretation outputs.
Google Inc.

System and method for gesture control

A system includes a processing system; an input device integrated within and coupled with the processing system; a sensor arrangement integrated with the processing system and configured to monitor an area above the input device; and a controller coupled with the sensor arrangement to detect predefined input actions, wherein the controller is coupled with the processing system and the predefined input actions are combined with inputs from the input device.
Microchip Technology Incorporated

Virtual touchpads for wearable and portable devices

Systems and methods for using portions of the surface around a wearable smart device as virtual touchpads for data and command entry are disclosed. For wearable smart devices worn on the wrist like smartwatches, the skin area over the back of the hand as well as the surface area over the arm adjacent to the device may serve as virtual touchpads.
Innoventions, Inc.

Sensorized spherical input and output device, systems, and methods

Described herein are embodiments of electronic sensorized spherical input and output devices, systems, and methods for capturing gestural input from a user's physical interactions with a spherical device. In one embodiment, the spherical input and output device includes a number of sensors along the surface area of the sphere in a configuration conforming to a user's fingers and hands.

Gesture control of gaming applications

Methods, systems, and computer program products calibrate a gaming application. An image of a player is received.
AT&T Intellectual Property I, L.P.

Electronic device, computer-readable non-transitory recording medium, and control method

An electronic device comprises: a proximity sensor; and a controller configured to determine a direction of a gesture in accordance with an output from the proximity sensor and a state of the electronic device.
Kyocera Corporation

Controller for finger gesture recognition and method for recognizing finger gesture

A controller for finger gesture recognition, including a gripping body, a manipulating component and a sensing component, is provided. The gripping body includes a head portion and a gripping portion which is opposite to the head portion and includes a plurality of finger contact areas.
HTC Corporation

Method and device for enabling virtual reality interaction with gesture control

The present invention is a method for enabling virtual reality interaction with gesture control, comprising the following steps: displaying a photographed second image in response to a gesture detecting signal; recognizing an actual gesture action in the second image and converting the same into a graphic pointer; and displaying a first image and displaying the graphic pointer at a position corresponding to the first image for interaction. A user may conduct control directly with a gesture to enable virtual reality, thereby solving the problem where the user cannot see an actual scenario as well as increasing the ease of use for virtual reality.
ArcSoft (Hangzhou) Multimedia Technology Co., Ltd.

Input method, touch device using the input method, gesture detecting device, computer-readable recording medium, and computer program product

An input method for being loaded into a processor to execute the following steps: triggering an input device to perform a gesture input process, comprising: recording a triggered site of a gesture as a sampling point once every unit of time, and recording a turn variation of an interval link between one said sampling point and a next said sampling point; comparing the turn variation to a variation threshold, and when the turn variation is greater than the variation threshold, marking the relevant sampling points as a featured sampling point; chronologically linking plural said featured sampling points into featured line segments, recording proportions of lengths of the featured line segments, and forming a graphic code according to the proportions; and comparing the graphic code to codes contained in a lookup table, so as to perform a predefined function represented by the graphic code.
Idesyn Semiconductor Corp.
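The sampling-and-turn-variation pipeline in this abstract can be sketched compactly: mark sampling points where the path turns sharply, link them into featured segments, and form a code from the segment-length proportions. The 30° threshold and the decimal quantization of proportions below are illustrative assumptions.

```python
import math

def encode_gesture(points, variation_threshold=math.radians(30)):
    """Sketch of the abstract's pipeline: points whose turn variation exceeds
    the threshold become featured sampling points, and the proportions of the
    featured line segments form a simple lookup-table code."""
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    # Mark featured sampling points where the turn variation is large.
    featured = [points[0]]
    for i in range(1, len(points) - 1):
        turn = abs(angle(points[i], points[i + 1]) - angle(points[i - 1], points[i]))
        turn = min(turn, 2 * math.pi - turn)  # wrap to [0, pi]
        if turn > variation_threshold:
            featured.append(points[i])
    featured.append(points[-1])

    # Chronologically link featured points and record segment-length proportions.
    lengths = [math.dist(featured[i], featured[i + 1]) for i in range(len(featured) - 1)]
    total = sum(lengths) or 1.0
    # Quantize proportions into digits to form the "graphic code".
    return tuple(round(10 * l / total) for l in lengths)
```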

Operation input apparatus and operation input method

A position and posture acquisition unit acquires information on a posture of a user wearing a wearable display apparatus. An operation input unit accepts an operation input through a gesture by the user.
Sony Interactive Entertainment Inc.

Motion-gesture detection equipment, treadmill equipment and motion-gesture detection method

A motion-gesture detection equipment includes a display screen, an image-acquiring device and a processor. The image-acquiring device acquires, in real time, motion-gesture information of an exercising person.
BOE Technology Group Co., Ltd.

Automatic detection of panoramic gestures

Aspects of the disclosure relate to capturing panoramic images using a computing device. For example, the computing device may record a set of video frames, and tracking features, each including one or more features that appear in two or more video frames of the set, may be determined.
Google LLC

Voice activated modular controller

A modular controller may be mounted in an opening, such as a standard single wide or double wide electrical junction box, in a wall or other surface. The modular controller may include a power module and a front module.
Amazon Technologies, Inc.

Structural modeling using depth sensors

Techniques are presented for constructing a digital representation of a physical environment. In some embodiments, a method includes obtaining image data indicative of the physical environment; receiving gesture input data from a user corresponding to at least one location in the physical environment, based on the obtained image data; detecting at least one discontinuity in the physical environment near the at least one location corresponding to the received gesture input data; and generating a digital surface corresponding to a surface in the physical environment, based on the received gesture input data and the at least one discontinuity.
Qualcomm Incorporated

System and methods for content presentation selection

A system and methods for content presentation selection. One method includes displaying, on a display of a portable device, a plurality of tiles.
Motorola Solutions, Inc.

Methods and graphical user interfaces for editing on a multifunction device with a touch screen display

In some embodiments, a device displays content on a touch screen display and detects input by finger gestures. In response to the finger gestures, the device selects content, visually distinguishes the selected content, and/or updates the selected content based on detected input.
Apple Inc.

Gesture recognition devices and methods

Devices and related methods are disclosed herein that generally involve detecting and interpreting gestures made by a user to generate user input information for use by a digital data processing system. In one embodiment, a device includes first and second sensors that observe a workspace in which user gestures are performed.

Input determination method

Methods and systems for determining intent in voice and gesture interfaces are described. An example method includes determining that a gaze direction is in a direction of a gaze target, and determining whether a predetermined time period has elapsed while the gaze direction is in the direction of the gaze target.
Google LLC
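The elapsed-time check this abstract describes, inferring intent only when the gaze stays on the target for a predetermined period, can be sketched as a tiny stateful detector. Passing timestamps in explicitly (rather than reading a clock) and the 0.5-second dwell value in the usage below are illustrative assumptions.

```python
def make_dwell_detector(dwell_seconds=0.6):
    """Return an update(on_target, now) function that reports True only once
    the gaze direction has stayed on the gaze target for dwell_seconds."""
    state = {"since": None}

    def update(on_target, now):
        if not on_target:
            state["since"] = None   # gaze left the target: reset the timer
            return False
        if state["since"] is None:
            state["since"] = now    # gaze just arrived on the target
        return now - state["since"] >= dwell_seconds

    return update
```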

Smart playable device and charging systems and methods

This disclosure is generally directed to a smart playable device and charging systems and methods. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs

An electronic device with a display and a fingerprint sensor displays a fingerprint enrollment interface and detects, on the fingerprint sensor, a plurality of finger gestures performed with a finger. The device collects fingerprint information from the plurality of finger gestures performed with the finger.
Apple Inc.

Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs

An electronic device with a display and a fingerprint sensor displays a fingerprint enrollment interface and detects, on the fingerprint sensor, a plurality of finger gestures performed with a finger. The device collects fingerprint information from the plurality of finger gestures performed with the finger.
Apple Inc.

Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs

An electronic device with a display and a fingerprint sensor displays a fingerprint enrollment interface and detects, on the fingerprint sensor, a plurality of finger gestures performed with a finger. The device collects fingerprint information from the plurality of finger gestures performed with the finger.
Apple Inc.

Method and electronic device for providing multi-level security

A method and electronic device for providing multi-level security is provided. The method includes detecting a type of a gesture performed by a user on a portion of the electronic device, identifying the user based on fingerprint data of the user, wherein the fingerprint data of the user is obtained while the gesture is being performed, determining at least one activity to be performed based on the type of the gesture and the user, and controlling the at least one activity to be automatically performed in the electronic device.
Samsung Electronics Co., Ltd.

Distributed networking of configurable load controllers

A touch-control device is described, comprising a first load controller connectable to control a first endpoint electrically coupled to the load controller; a touch-input surface associated with the first load controller; a network interface communicatively coupled with a network interface of a second touch-control device, wherein the second touch-control device includes a second load controller connectable to control a second endpoint electrically coupled to the second load controller; and a processor configured to generate a first gesture signal representative of a first gesture at the touch-input surface, select the second endpoint as a target device, the selecting based at least in part on the first gesture, and control the target device based, at least in part, on the first gesture signal.
Ube, Inc. d/b/a Plum

Method and device for gesture control and interaction based on touch-sensitive surface to display

The present disclosure discloses a gesture control and interaction technology based on “touch-sensitive surface to display”. A novel interaction method is proposed by combining the touch-sensitive surface technology with display technology.
Beijing Luckey Technology Co., Ltd.

Vehicle and control method thereof

A vehicle includes a display for providing a character input interface, a touch input apparatus for sensing a plurality of sub gestures sequentially through a touch portion, and a controller for controlling the character input interface to display, when a difference between a direction of a first sub gesture of the plurality of sensed sub gestures and a direction of a second sub gesture sensed within a first delay time period starting at a time at which the first sub gesture ends is greater than or equal to a reference direction difference, a character corresponding to a gesture including the first sub gesture and the second sub gesture.
Kia Motors Corporation

Method and apparatus for using gestures across multiple devices

Method and apparatus for implementing gestures across user interface display apparatuses, including detecting and saving, at a first user interface display apparatus, an initial user input; determining whether the initial user input is within a predetermined proximity to a boundary with a second user interface display apparatus; detecting and saving additional user input continuing from the initial user input; when the initial user input is within the predetermined proximity, incorporating additional information from a transition message received within a predetermined time period from the second user interface display apparatus to the saved user input, the predetermined time period corresponding to a message time between the first and second user interface display apparatuses from a time of the initial user input; and implementing the saved user input on one or more of the first and second user interface display apparatuses.
Nureva Inc.

Apparatus, method and recording medium for controlling user interface using input image

A method of controlling a user interface using an input image is provided. The method includes storing operation executing information of each of one or more gesture forms according to each of a plurality of functions, detecting a gesture form from the input image, and identifying the operation executing information mapped on the detected gesture form to execute an operation according to a function which is currently operated.
Samsung Electronics Co., Ltd.

Method and apparatus for detecting gesture in user-based spatial coordinate system

Disclosed are a method and an apparatus for accurately detecting a gesture from a user's motion using a user-based spatial coordinate system. A method for detecting a gesture in a user-based spatial coordinate system comprises the steps of: setting a user-based spatial coordinate system using a first body coordinate corresponding to a first body part of the user as a starting point; analyzing the motion vector state of a second body coordinate corresponding to a second body part of the user in the user-based spatial coordinate system over time; and detecting the user's gesture on the basis of a change in the motion vector state.
VTouch Co., Ltd.
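A minimal sketch of the user-based coordinate idea: build an orthonormal frame anchored at a first body coordinate (the shoulder here, as an assumed example), express the second body coordinate (e.g. the hand) in that frame, and classify its motion vector over time. The axis conventions and the movement threshold are assumptions, not details from the application.

```python
import numpy as np

def to_user_frame(origin, forward, up, point):
    """Express a tracked point in a user-based coordinate system whose
    starting point is a first body coordinate (e.g. the shoulder)."""
    forward = forward / np.linalg.norm(forward)
    up = up / np.linalg.norm(up)
    right = np.cross(up, forward)            # completes a right-handed frame
    basis = np.stack([right, up, forward])   # rows: user-frame axes
    return basis @ (np.asarray(point, dtype=float) - origin)

def classify_motion(prev, curr, threshold=0.05):
    """Report the dominant direction of the second body coordinate's motion
    vector in the user frame, or 'hold' when it barely moves."""
    v = curr - prev
    if np.linalg.norm(v) < threshold:
        return "hold"
    axis = int(np.argmax(np.abs(v)))
    return ("+" if v[axis] > 0 else "-") + "xyz"[axis]
```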

System and method for sharing data-content between applications

A method for sharing data-content between applications on a display of a computing device. The method includes providing an application to be currently displayed on the display of the computing device; selecting a data-content from the currently displayed application upon recognizing a pre-determined activation gesture; presenting at least one user-interface-icon associated with at least one application other than the currently displayed application, on the display of the computing device; and sharing the selected data-content from the currently displayed application to an application associated with a user-interface-icon selected from the at least one user-interface-icon.
Fastfingers Oy

Method for operating an operator control device and operator control device for a motor vehicle

An operating gesture of a user and at least one spatial position in which the operating gesture is performed are sensed without contact by a sensing apparatus of an operating device of a motor vehicle. Then a function of the motor vehicle is controlled according to the operating gesture if it was sensed that the at least one spatial position lies within a predetermined interaction space.
Audi Ag

Augmented reality user interface

A technique for interacting with a computing device includes operating an AR (augmented reality) headset as a UI (user interface) component of the computing device. The technique includes pairing the AR headset with the computing device to establish a communication pathway between the two.
Getgo, Inc.

Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes

Systems and methods for detecting, representing, and interpreting three-space input are described. Embodiments of the system, in the context of an SOE, process low-level data from a plurality of sources of spatial tracking data, analyze these semantically uncorrelated spatiotemporal data, and generate high-level gestural events according to dynamically configurable implicit and explicit gesture descriptions.
Oblong Industries, Inc.

Device and method for bidirectional communication between a vehicle and a passerby

A device and method for a vehicle to provide a bidirectional communication between the vehicle and at least one passerby. The device comprises a sensor for detecting a passerby near the vehicle and an array arranged on the vehicle and formed by a plurality of loudspeakers and a plurality of microphones for providing an acoustical beamforming focused in a predetermined region around the position of a detected passerby for the sending of acoustical signals to the passerby and for the receiving of acoustical information from the passerby.
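The "acoustical beamforming focused in a predetermined region" described above is conventionally done with delay-and-sum beamforming; the following is an illustrative sketch of that standard technique (not code from the patent): each array element is delayed so that sound from the target position arrives in phase.

```python
import math

def steering_delays(element_positions, target, c=343.0):
    """Delay-and-sum steering: compute the per-element delay (seconds) that
    aligns signals arriving from `target` at speed of sound `c` (m/s).
    Elements farther from the target get smaller delays."""
    dists = [math.dist(p, target) for p in element_positions]
    ref = max(dists)                       # reference: the farthest element
    return [(ref - d) / c for d in dists]  # seconds of delay per element
```

Applying these delays before summing microphone channels focuses reception on the passerby's position; negating the geometry focuses loudspeaker output the same way.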

System and method for controller-free user drone interaction

System and method for controlling an aerial system, without physical interaction with a separate remote device, based on sensed user expressions. User expressions may include thought, voice, facial expressions, and/or gestures.
Hangzhou Zero Zero Technology Co., Ltd.

Gestural control of an industrial robot

A robot system is configured to identify gestures performed by an end-user proximate to a work piece. The robot system then determines a set of modifications to be made to the work piece based on the gestures.
Autodesk, Inc.

User notification of powered system activation during non-contact human activation

The present disclosure relates to a non-contact power closure member system for operating a liftgate of a vehicle. The non-contact power closure member system includes at least one sensor for sensing an object or motion when a key fob is located within a predetermined distance of the vehicle.
Magna Closures Inc.

Vehicle and controlling the same

A touch input device for a vehicle is disclosed. The touch input device is installed next to the driver seat, and configured to receive the driver's touch input.
Kia Motors Corporation

Input device for electronic device and vehicle including the same

An input device for an electronic device includes a gesture input device enabling a user to input a gesture by touching and physically rotating it; a rotation sensor sensing the physical rotation of the gesture input device; and a controller controlling the gesture input device. The gesture input device has a concave, downwardly inclined shape, and includes a curved section in a center portion of the gesture input device and an inclined section around the curved section; the inclined section includes a touch sensor configured to sense a user's touch input, and at least one rib raised or lowered according to a control signal from the controller.
Kia Motors Corporation

Human-computer-interaction through scene space monitoring

A method including: causing computer implementation of at least one virtual boundary in a monitoring space corresponding to a monitored scene space; processing sensor data to generate a response event when there is, relative to the at least one virtual boundary, a change in a portion of the monitored scene space; recognizing a first hand gesture relating to a first hand of a user in the monitored scene space and, in response to recognizing the first hand gesture in relation to a portion of the virtual boundary, enabling user location of that portion of the computer-implemented virtual boundary; and recognizing a second hand gesture relating to a second hand of the user in the monitored scene space and, in response, performing a command relating to the user-located portion of the computer-implemented virtual boundary.
Nokia Technologies Oy

Method and apparatus for identifying input features for later recognition

Disclosed are methods and apparatuses to recognize actors during normal system operation. The method includes defining actor input such as hand gestures, executing and detecting input, and identifying salient features of the actor therein.
Atheer, Inc.

System and method for interactive 3D surgical planning and modelling of surgical implants

A method and system for interactive 3D surgical planning are provided. The method and system provide 3D visualisation and manipulation of at least one anatomical feature in response to intuitive user inputs, including gesture inputs.
Conceptualiz Inc.

Smart multi-touch layout control for mobile devices

Embodiments for manipulating an object, such as an image, are described. For example, a content application renders the object and supports multiple manipulation modes.
Adobe Systems Incorporated

Mobile terminal and method for controlling the same

Disclosed are a mobile terminal and a method for controlling the same. The mobile terminal includes a touch screen configured to display a first page; and a controller configured to generate a panel region having a predetermined transparency on a preset region of the touch screen, if a preset touch gesture is applied to the touch screen while the first page is being displayed, and configured to display, on the panel region, at least one first function icon corresponding to information displayed on the first page.
Lg Electronics Inc.

Methods and systems for quick reply operations

A method and system for performing quick reply operations are disclosed. The method for quick reply operations includes displaying a text box; determining a display area of the text box; and receiving a user instruction.
Tencent Technology (shenzhen) Company Limited

Projection-type video display device

In a projection-type video display device that displays an image by projecting it onto a projection surface such as a desk, a display screen reflecting a position and a gesture operation of a user is displayed in a predetermined area. An illumination unit and a camera are disposed in the projection-type video display device installed on the desk surface; a gesture using the user's fingers is detected, and, for example, a menu screen and an operation guide are displayed according to the gesture.
Hitachi Maxell, Ltd.

Gesture recognition method, apparatus and device, computer program product therefor

Hand gestures, such as hand or finger hovering, in the proximity space of a sensing panel are detected from x-node and y-node sensing signals indicative of the presence of a hand feature at corresponding row and column locations of the sensing panel. Hovering is detected by locating the maxima for a plurality of frames over a time window for sets of x-node and y-node sensing signals, and recognizing a hovering gesture if the locations of the maxima vary over the plurality of frames for one of the sets of sensing signals but not for the other.
Stmicroelectronics S.r.l.
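The maxima-tracking rule above is concrete enough to sketch: find the per-frame peak position in each axis's node profile, then flag a hover swipe when the peak drifts along one axis but stays put along the other. The `min_shift` threshold and the return labels are assumptions for illustration.

```python
def detect_hover_swipe(x_frames, y_frames, min_shift=2):
    """x_frames / y_frames: lists of per-frame signal profiles (one value per
    sensing node). A hovering swipe is flagged when the location of the
    maximum drifts for one axis's profiles but not the other's."""
    def peak_positions(frames):
        # index of the strongest node in each frame
        return [max(range(len(f)), key=f.__getitem__) for f in frames]

    x_peaks, y_peaks = peak_positions(x_frames), peak_positions(y_frames)
    x_drift = max(x_peaks) - min(x_peaks)
    y_drift = max(y_peaks) - min(y_peaks)
    if x_drift >= min_shift and y_drift < min_shift:
        return "horizontal-hover"
    if y_drift >= min_shift and x_drift < min_shift:
        return "vertical-hover"
    return None
```

A stationary hand (no drift on either axis) yields `None`, matching the abstract's requirement that only one set of signals varies.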

Transportation means, user interface and method for assisting a user during interaction with a user interface

The invention relates to a transportation means, a user interface and a method for assisting a user during interaction with a user interface (1). The method comprises the steps: detecting a crossing motion of an input means (2) of the user in relation to a border of a detection region for detecting gestures freely executed in the area, and in response thereto, displaying this crossing motion by means of a light strip (8) in an edge region of a display device (4) of the user interface (1)..
Volkswagen Aktiengesellschaft

Technologies for micro-motion-based input gesture control of wearable computing devices

Technologies for detecting micro-motion based input gestures include a wrist-wearable computing device that includes sensors from which values for micro-motion states can be determined. Each micro-motion state is indicative of a motion-related characteristic of the wrist-wearable computing device that is used to determine whether a sequence of detected gesture steps matches an input gesture model associated with an input gesture.
Intel Corporation
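The abstract above describes matching a sequence of detected gesture steps against an input gesture model. One simple reading — an assumption, since the patent does not specify the matching rule — is ordered-subsequence matching: the model's steps must appear in order among the detected micro-motion states, with other states allowed in between.

```python
def matches_model(detected_steps, model_steps):
    """Fire the input gesture when `model_steps` occurs, in order, as a
    subsequence of `detected_steps` (interleaved states are ignored).
    Uses the iterator-consumption idiom: `x in it` advances the iterator."""
    it = iter(detected_steps)
    return all(step in it for step in model_steps)
```

A real wrist-wearable pipeline would first quantize accelerometer/gyro readings into such discrete micro-motion states before this comparison.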

Gesture recognition system and gesture recognition method using the same

The present disclosure discloses a gesture recognition system. The gesture recognition system includes: a processor configured to send a pulse signal and recognize a corresponding hand gesture command according to an algorithm; an ultrasonic generator configured to receive the pulse signal sent by the processor and then emit an ultrasonic wave; and a microphone including a signal pick-up unit configured to receive the ultrasonic wave reflected by a target object and generate a reflected signal, and a signal processing unit configured to compare the reflected signal and the pulse signal so as to generate time data and frequency-shift data, and then transmit the same to the processor.
Aac Acoustic Technologies (shenzhen) Co., Ltd.
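The "time data and frequency shift data" above map onto standard acoustics: round-trip time of flight gives hand distance, and the Doppler shift of the reflected tone gives radial hand velocity. The helper below applies textbook formulas (one-way distance d = c·t/2; two-way Doppler v ≈ c·Δf/(2f)) as an illustration, not the patent's actual algorithm.

```python
def echo_features(t_emit, t_receive, f_emit, f_receive, c=343.0):
    """Convert ultrasonic echo timing and frequency shift into distance (m)
    and radial velocity (m/s). `c` is the speed of sound in air.
    Positive velocity means the target is approaching."""
    tof = t_receive - t_emit
    distance = c * tof / 2.0                 # round trip -> one-way distance
    doppler = f_receive - f_emit
    velocity = c * doppler / (2.0 * f_emit)  # two-way Doppler approximation
    return distance, velocity
```

Tracking these two quantities over successive pulses yields the motion trace from which a gesture classifier can work.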

Systems and methods for compliance illusions with haptics

Systems and methods for compliance illusions with haptics are disclosed. One illustrative system described herein includes a user interface device including: a sensor configured to detect a gesture; a haptic output device configured to output haptic effects; and a processor coupled to the sensor and the haptic output device, the processor configured to: receive a sensor signal from the sensor; determine a user interaction in mixed reality; determine a haptic effect based in part on the sensor signal and the user interaction; and transmit a haptic signal associated with the haptic effect to the haptic output device.
Immersion Corporation

Electronic device with adjustable reflective display

An electronic device may have a display. Input-output circuitry in the electronic device may be used to gather input from a viewer of the display.
Apple Inc.

Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device

A method for scanning and obtaining three-dimensional (3D) coordinates is provided. The method includes providing a 3D measuring device having a projector, a first camera and a second camera.
Faro Technologies, Inc.

Methods and systems for controlling vehicle body motion and occupant experience

In one embodiment, one or more suspension systems of a vehicle may be used to mitigate motion sickness by limiting motion in one or more frequency ranges. In another embodiment, an active suspension may be integrated with an autonomous vehicle architecture.
Clearmotion, Inc.

Apparatus operation device, apparatus operation method, and electronic apparatus system

In an endoscope system according to an embodiment of the present invention, it is not necessary to touch an actual keyboard, mouse, or button to operate an endoscope apparatus, and thus it is not necessary to shift an instrument from one hand to the other hand or peel off/put on gloves in a medical setting. Accordingly, a necessary operation can be easily performed by using a line of sight and a gesture.
Fujifilm Corporation

Lighting device including solid state emitters with adjustable control

Lighting devices and methods utilize multiple independently controllable groups of solid state light emitters of different dominant wavelengths, with operation of the emitter groups being automatically adjusted by processor(s) to provide desired illumination. Operation of the emitter groups may be further affected by sensors and/or user input commands (e.g., sound patterns, gesture patterns, or signal transmission).
Cree, Inc.

Long-hold video surfing

This document describes methodologies for long-hold video surfing. These techniques and apparatuses enable improved navigation for video and channel previewing based on long-hold gestures performed on a mobile device acting as a remote control to a remote display device.
Google Inc.

Realtime recording of gestures and/or voice to modify animations

Techniques of compressing a number of frames of a presentation generated in a virtual environment per time period. Along these lines, the animations in each chapter of a presentation are expressed in a number of frames.
Google Llc

Media effect application

Exemplary embodiments relate to the application of media effects, such as visual overlays, sound effects, etc., to a video conversation.
Facebook, Inc.

Gesture-based access control in virtual environments

Techniques of access control in VR environments involve defining a series of gestures that users attending a private meeting within a virtual environment carry out to be allowed into the private meeting. Along these lines, when a user sets up a meeting to take place within a virtual environment, the user may define a series of gestures (e.g., swipes, circles, etc.) that may serve as an effective “secret handshake” that gains admittance to the private meeting.
Google Llc

Generating virtual notation surfaces with gestures in an augmented and/or virtual reality environment

In an augmented reality and/or a virtual reality system, virtual annotation surfaces, or virtual sheets, or virtual whiteboards, may be materialized in response to a detected gesture. A user may annotate, adjust, store, review and revise the virtual annotation surfaces, and collaborate with other users, while in the current virtual environment, and/or within another virtual environment, and/or outside of the virtual environment.
Google Inc.

System, method and apparatus of eye tracking or gaze detection applications including facilitating action on or interaction with a simulated object

Techniques are disclosed for facilitating action by a user on a simulated object in an augmented reality environment. In some embodiments, a method includes detecting a gesture of the user in a real environment via a sensor of the device, wherein the gesture includes movement of an eyeball or eye focal point of one or more eyes of the user.
Augmented Reality Holdings, Llc

Object creation using body gestures

An intuitive interface may allow users of a computing device (e.g., children, etc.) to create imaginary three-dimensional (3D) objects of any shape using body gestures performed by the users as a primary or only input. A user may make motions while in front of an imaging device that senses movement of the user.
Microsoft Technology Licensing, Llc

Augmented reality dynamic authentication for electronic transactions

A system for authorizing an electronic transaction in an augmented reality environment comprises an augmented reality user device and an authentication server. The augmented reality user device includes a display that overlays virtual objects onto a field of view of the user.
Bank Of America Corporation

Virtual reality dynamic authentication

A system for performing authorization of a user in a virtual reality environment includes a virtual reality user device. The virtual reality user device includes a display configured to display a virtual environment.
Bank Of America Corporation

Personal authentication method and apparatus based on recognition of fingertip gesture and identification of fake pattern

Disclosed herein are a method and apparatus for authenticating a user based on a fingertip gesture. The authentication apparatus may display a pattern generated based on geometric information about a hand geometry or size of a user, and may recognize a fingertip gesture via interaction with the user with respect to the pattern.
Electronics And Telecommunications Research Institute

Touchscreen with three-handed gestures system and method

A user interface verification device and a method of use are presented for recognizing a three-handed gesture on a touchscreen of the device. The gesture is recognized by detecting a plurality of contact points in at least two disparate touching zones, and simultaneously detecting additional contact points in a third, disparate touching zone.
Bby Solutions, Inc.

Gesture recognition cloud command platform, system, method, and apparatus

Systems and methods described herein are for transmitting a command to a remote system. A processing system determines the identity of the user based on the unique identifier and the biometric information.

Processing of gesture-based user interactions using volumetric zones

Systems and methods for processing gesture-based user interactions within an interactive display area are provided. The display of one or more virtual objects and user interactions with the one or more virtual objects may be further provided.
Intellectual Ventures Holding 81 Llc

Information privacy in virtual reality

Systems and methods are described that include generating a virtual reality experience in a virtual reality environment, detecting a first gesture from a first user accessing the virtual reality environment, the first gesture being configured as a command to initiate a privacy mode with a second user accessing the virtual reality environment, and generating a prompt for display to the second user, the prompt corresponding to the command. In response to detecting a second gesture from the second user that is determined to substantially match the first gesture, the privacy mode is initiated between the first user and the second user in the virtual reality environment, and communications occurring in the virtual environment are shared from the first user to the second user and from the second user to the first user while being modified for users other than the first user and the second user.
Google Inc.
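The abstract above hinges on deciding whether the second user's gesture "substantially matches" the first. A toy version of such a test — the resampling scheme and tolerance are assumptions, not the patent's method — resamples both gesture paths to a common length and requires a small mean point-wise distance:

```python
def gestures_match(g1, g2, tol=0.1, n=16):
    """Return True if two gesture paths (lists of (x, y) points) are close
    after resampling both to `n` points by linear interpolation."""
    def resample(path):
        out = []
        for i in range(n):
            t = i * (len(path) - 1) / (n - 1)
            lo = int(t)
            hi = min(lo + 1, len(path) - 1)
            a = t - lo
            out.append(tuple((1 - a) * p + a * q
                             for p, q in zip(path[lo], path[hi])))
        return out

    r1, r2 = resample(g1), resample(g2)
    mean = sum(sum((p - q) ** 2 for p, q in zip(u, v)) ** 0.5
               for u, v in zip(r1, r2)) / n
    return mean <= tol
```

Production systems would typically normalize for scale and position and use dynamic time warping rather than index-aligned distance.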

Systems and methods for tracking motion and gesture of heads and eyes

Apparatus, systems and methods configured for tracking head and eye movement are disclosed. In one example, an apparatus comprises an imaging sensor, a spatial orientation sensor and a computing device comprising a processor and a memory communicatively coupled with the processor.
Vuelosophy Inc.

Techniques for gesture-based initiation of inter-device wireless connections

Techniques for gesture-based device connections are described. For example, a method may comprise receiving video data corresponding to motion of a first computing device, receiving sensor data corresponding to motion of the first computing device, comparing, by a processor, the video data and the sensor data to one or more gesture models, and initiating establishment of a wireless connection between the first computing device and a second computing device if the video data and sensor data correspond to gesture models for the same gesture.
Intel Corporation
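The decision rule described above — both the camera-observed motion and the device's own sensor motion must match the same gesture model before a connection is initiated — can be sketched as follows. The cosine-similarity scoring and threshold are illustrative stand-ins; a real matcher would use DTW or HMM gesture models.

```python
def should_pair(video_trace, sensor_trace, gesture_models, min_score=0.8):
    """Return (pair?, gesture_name). Pairing is initiated only when the
    video-derived trace and the sensor-derived trace both best-match the
    SAME gesture model with sufficient similarity."""
    def score(trace, model):
        # normalized inner product as a toy similarity measure
        num = sum(a * b for a, b in zip(trace, model))
        den = (sum(a * a for a in trace) * sum(b * b for b in model)) ** 0.5
        return num / den if den else 0.0

    def best(trace):
        return max(((name, score(trace, m))
                    for name, m in gesture_models.items()),
                   key=lambda kv: kv[1])

    (v_name, v_s), (s_name, s_s) = best(video_trace), best(sensor_trace)
    return v_name == s_name and min(v_s, s_s) >= min_score, v_name
```

Requiring agreement between two independent sensing modalities is what makes the gesture a reasonable pairing credential.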

Concurrent detection of absolute distance and relative movement for sensing action gestures

A gesture detection system uses two radar tones to concurrently detect absolute distance and relative movement of a target object. A radar-based detection device alternates transmitting a first radar tone and a second radar tone via a radar-emitting device, and then captures a first return signal and a second return signal that are generated by the first radar tone and second radar tone reflecting off the target object.
Google Inc.
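As an analogy to the two-tone scheme above (offered as standard radar math, not the patent's actual algorithm): in two-frequency CW ranging, the phase difference between the echoes of two closely spaced tones is proportional to absolute distance, R = c·Δφ / (4π·Δf), while the Doppler shift of either tone gives relative movement.

```python
import math

def two_tone_range(phi1, phi2, f1, f2, c=3e8):
    """Absolute range from the echo phase difference of two CW tones.
    phi1/phi2: received phases (radians) for tones at f1/f2 (Hz)."""
    dphi = (phi2 - phi1) % (2 * math.pi)   # wrap into [0, 2*pi)
    df = f2 - f1
    return c * dphi / (4 * math.pi * df)
```

The unambiguous range of this method is c / (2·Δf), which is why the tone spacing is chosen to cover the gesture interaction volume.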

Calibration of depth-based interfaces with disparate fields of view

Various of the disclosed embodiments provide human-computer interfaces (HCI) that incorporate depth sensors at multiple positions and orientations. The depth sensors may be used in conjunction with a display screen to permit users to interact dynamically with the system, e.g., via gestures.
Youspace, Inc.

Hands-free gestures for account authentication

Systems and methods are provided for authenticating an account via a hands-free gesture, such as a tap, pattern of taps, or other physical gesture not requiring a user to hold a computing device. A user can initiate a transaction to purchase an item by interacting with a first computing device (e.g., electronic kiosk, point-of-sale terminal, automated checkout device, etc.).
Ebay Inc.

Conversational analytics

A computer-implemented method includes determining a meeting has initialized between a first user and a second user, wherein vocal and video recordings are produced for at least the first user. The method receives the vocal and video recordings for the first user.
International Business Machines Corporation

Continuous score-coded pitch correction

Vocal musical performances may be captured and continuously pitch-corrected at a mobile device for mixing and rendering with backing tracks in ways that create compelling user experiences. In some cases, the vocal performances of individual users are captured in the context of a karaoke-style presentation of lyrics in correspondence with audible renderings of a backing track.
Smule, Inc.

Method and system for hand washing compliance

A method and system for user hand washing compliance is described. An exemplary embodiment of the method may comprise the steps of: receiving a request from a user to enter a hand-washing-compliance-area, wherein the request is received by an entry-sensor; the entry-sensor may cause initiation of a hand-washing-cycle log entry; releasing water from a washer for pre-rinsing hands of the user; releasing soap from a soap-dispenser for soaping the hands of the user; releasing water from the washer to rinse the soaped hands of the user; and providing a means-to-dry-hands from a hand-dryer for drying the hands of the user.

Clustering photographs for display on a digital picture frame

A digital picture frame including a camera integrated with the frame, and a network connection module allowing direct contact with, and upload of photos from, electronic devices or the photo collections of community members. The integrated camera is used to automatically determine an identity of a frame viewer and can capture gesture-based feedback.
Pushd, Inc.

Screen zoom feature for augmented reality applications

A computing device has a CPU and a digital camera configured to capture a digital image of a portion of a machine and send the digital image to the CPU for processing. An interactive display of the computing device renders the digital image.
Caterpillar Inc.

Augmented reality headset and digital wallet

An augmented reality system includes an augmented reality user device and a digital wallet. The digital wallet includes items associated with a user.
Bank Of America Corporation

User terminal device for providing translation service, and method for controlling same

Provided is a user terminal device for providing a translation service. The user terminal device comprises: a communication unit for performing communication with an external device; a display for displaying messages transmitted and received by communication with the external device; a sensing unit for sensing a gesture for the user terminal device; and a processor for, if a preset gesture is sensed, providing a translation service for at least one message from among the displayed messages.
Samsung Electronics Co., Ltd.

Cognitive contextualization of emergency management system communications

Software that contextualizes communications during an emergency event by performing the following steps: (i) receiving an input communication written, spoken, or communicated via gestures by a first user, wherein the input communication includes natural language-based input information relating to an emergency event; (ii) determining an output communication to be sent to a second user, wherein the output communication is based, at least in part, on the input communication, and wherein the output communication includes natural language-based output information relating to the emergency event; (iii) determining a cognitive state of the first user and a cognitive state of the second user; and/or (iv) modifying the output communication based, at least in part, on the cognitive state of the first user, wherein modifying the output communication includes modifying natural language content of the output information.
International Business Machines Corporation

Method for adjusting photographing focal length of mobile terminal by using touchpad, and mobile terminal

A method is provided for adjusting a photographing focal length of a mobile terminal, including: when a mobile terminal enters a camera mode and displays a shooting preview screen on a display, receiving a focal length adjustment start instruction when a touchpad is touched; executing the focal length adjustment start instruction to display a focal length indication bar on the display, so as to prompt the focal length adjustment; obtaining gesture information generated by an operation on the touchpad after the focal length indication bar is displayed; then determining a focal length adjustment instruction using the gesture information; and adjusting a focal length using the focal length adjustment instruction, and displaying, on the shooting preview screen, an image after the focal length is adjusted.
Huawei Technologies Co., Ltd.
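The gesture-to-focal-length step above amounts to mapping touchpad travel onto a zoom factor clamped to the camera's range. A minimal sketch, with all constants (step size, zoom limits) as illustrative assumptions:

```python
def adjust_zoom(current, swipe_dy, step_per_px=0.01, lo=1.0, hi=10.0):
    """Nudge the zoom factor by `swipe_dy` pixels of vertical touchpad
    travel, clamped to the device's supported zoom range [lo, hi]."""
    return max(lo, min(hi, current + swipe_dy * step_per_px))
```

The focal length indication bar would simply render `(zoom - lo) / (hi - lo)` as its fill fraction.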

Selective rejection of touch contacts in an edge region of a touch surface

The selective rejection of touch contacts in an edge region of a touch sensor panel is disclosed. In addition, by providing certain exceptions to the rejection of edge contacts, the functionality of the touch sensor panel can be maximized.
Apple Inc.

Systems, articles and methods for wearable electronic devices employing contact sensors

Wearable electronic devices that employ one or more contact sensors (e.g., capacitive sensors and/or biometric sensors) are described. Contact sensors include electromyography sensors and/or capacitive touch sensors.
Thalmic Labs Inc.

Information processing method, terminal, and computer storage medium

The present disclosure provides an information processing method, terminal, and computer storage medium. The method includes: rendering one or more virtual resource objects in a GUI, at least one virtual resource object being a user character object that performs a virtual operation according to an input user command; detecting a skill operation trigger gesture on at least one skill object deployed in a skill operation area; detecting movement of the skill object according to a skill release trigger gesture, to determine a release location of the skill object; detecting a distance between the release location and the user character object; adjusting, when the distance is greater than a skill release distance, the user character object to move towards the release location; and performing a skill release operation on the skill object based on the release location when detecting a skill operation release gesture on the skill object.
Tencent Technology (shenzhen) Company Limited
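The release rule described above — move the character toward the release location only when it lies beyond the skill's range — reduces to a small geometric check. A sketch under the assumption of 2D coordinates and straight-line movement:

```python
def resolve_skill_release(char_pos, release_pos, skill_range):
    """If `release_pos` is within `skill_range` of the character, release
    immediately; otherwise move the character along the line toward the
    release point just far enough to bring it into range."""
    dx = release_pos[0] - char_pos[0]
    dy = release_pos[1] - char_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= skill_range:
        return char_pos, "release"
    t = (dist - skill_range) / dist       # fraction of the path to cover
    new_pos = (char_pos[0] + dx * t, char_pos[1] + dy * t)
    return new_pos, "move-then-release"
```

After the move, the release point sits exactly at range, so the skill fires as soon as the character arrives.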

Digital picture frame with improved display of community photographs

A digital picture frame including a camera integrated with the frame, and a network connection module allowing the frame for direct contact and upload of photos from electronic devices or from photo collections of community members. The integrated camera is used to automatically determine an identity of a frame viewer and can capture gesture-based feedback.
Pushd, Inc.

Generating communicative behaviors for anthropomorphic virtual agents based on user's affect

Systems and methods for automatically generating at least one of facial expressions, body gestures, vocal expressions, or verbal expressions for a virtual agent based on emotion, mood and/or personality of a user and/or the virtual agent are provided. Systems and methods for determining a user's emotion, mood and/or personality are also provided.
Ipsoft Incorporated

Method and device for controlling 3D character using user's facial expressions and hand gestures

Facial expressions and whole-body gestures of a 3D character are provided based on facial expressions of a user and gestures of a hand puppet perceived using a depth camera.
Korea Institute Of Science And Technology

Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs

An electronic device with a display and a fingerprint sensor displays a fingerprint enrollment interface and detects, on the fingerprint sensor, a plurality of finger gestures performed with a finger. The device collects fingerprint information from the plurality of finger gestures performed with the finger.
Apple Inc.

Depth sensing camera glasses with gesture interface

Glasses wearable by a user and responsive to gestural commands and method for operating the glasses to control one or more devices. In a preferred form, the glasses include a plurality of spaced, depth cameras that react to a user's hand command based, at least in part, on the distance of the hand from the depth camera.

User interface device for industrial vehicle

A processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a touch screen display that receives touch gesture commands from a vehicle operator, memory storing executable instructions, and a processor in communication with the memory.
Crown Equipment Corporation

User interface device for industrial vehicle

A processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a touch screen display that receives touch gesture commands from a vehicle operator, memory storing executable instructions, and a processor in communication with the memory.
Crown Equipment Corporation

User interface device for industrial vehicle

A processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a touch screen display that receives touch gesture commands from a vehicle operator, memory storing executable instructions, and a processor in communication with the memory.
Crown Equipment Corporation

User interface device for industrial vehicle

A processing device comprising a graphical user interface in an industrial vehicle is provided. The processing device comprises a touch screen display that receives touch gesture commands from a vehicle operator, memory storing executable instructions, and a processor in communication with the memory.
Crown Equipment Corporation

Body posture detection system, suit and method

A body posture detection system includes an inertial measurement unit, at least two ultrasonic transceivers and a processor. The inertial measurement unit is configured to retrieve an orientation vector of a first portion of a human body.
Htc Corporation

Mediated reality

A method comprising: enabling viewing by a user of a virtual scene of a virtual space; mapping a three-dimensional gesture of the user to a corresponding three-dimensional gesture in the virtual space; and, in response to determining that the corresponding three-dimensional gesture in the virtual space is a first predetermined gesture or gestures in relation to a first portion of the virtual space, analyzing the virtual space over time to detect an event in the virtual space relating to the first portion of the virtual space.
Nokia Technologies Oy

Method and system to predict vehicle traffic behavior for autonomous vehicles to make driving decisions

Responsive to sensor data received from one or more sensors of an autonomous vehicle, one or more predicted trajectories are generated, with each of the predicted trajectories having an associated probability. One or more driving scenarios that trigger gesture recognition are identified.
Baidu Usa Llc

Games controller

The invention provides a controller, and an actuator for mounting to a controller, enabling a user access to a plurality (two or more) of control functions from a single actuator. In particular, the present disclosure provides an actuator which in normal use is hidden from view, for example, but not limited to, by being mounted to the rear of a controller.
Ironburg Inventions Ltd.

Transcutaneous electrical nerve stimulator with user gesture detector and electrode-skin contact detector, with transient motion detector for increasing the accuracy of the same

Control means are provided for controlling the output of the stimulation means in response to the determined user gesture, electrode-skin contact integrity, and transient motion.

Touch free operation of ablator workstation by use of depth sensors

An inventive system and method for touch free operation of an ablation workstation is presented. The system can comprise a depth sensor for detecting a movement, motion software to receive the detected movement from the depth sensor, deduce a gesture based on the detected movement, and filter the gesture to accept an applicable gesture, and client software to receive the applicable gesture at a client computer in an ablation workstation for performing a task in accordance with client logic based on the applicable gesture.
Biosense Webster (Israel), Ltd.

Socially enabled, body worn communication device and use

According to another embodiment, each bracelet includes an accelerometer and a microphone and associated control circuitry so that should two users decide to exchange contact information, each will create a common gesture (such as a “high-five” slap) and simultaneously shout out a predetermined selected audible word, phrase or sound (such as: “alright” or “great”). Once confirmed, the communication circuit of each bracelet will transmit and receive contact information.

Arrangement for, and method of, associating an identifier of a mobile device with a location of the mobile device

A mobile device is moved and operated by a user in a venue, and transmits a device identifier (id) that identifies the mobile device. A camera system is deployed in the venue, and images the user in the venue.
Symbol Technologies, LLC

Apparatus and method for controlling presentation of content using a multi-media table

Media content presentation systems and methods are operable to control content presentation on a touch-sensitive display of a multi-media table that is communicatively coupled to a media device that is operable to access the media content. An exemplary embodiment receives, at the multi-media table from the media device, a media content event; presents the media content event on a portion of the touch-sensitive display of the multi-media table; detects a gesture-based touch movement made by a user on the touch-sensitive display; determines an intended user command based on the detected gesture-based touch movement; generates a media device command when the intended user command is configured to control presentation of the media content event; and communicates the generated media device command from the multi-media table to the media device, wherein the media device controls the media content event in accordance with the received media device command.
EchoStar Technologies L.L.C.

Remote document execution and network transfer using augmented reality display devices

An augmented reality user device includes a display, a physical identification verification engine, a gesture confirmation engine, and an interface. The display overlays a virtual file document onto a tangible object.
Bank Of America Corporation

Mobile device gesture and proximity communication

A transaction service and associated mobile application are described that enables users to communicate with mobile devices in a local area. Communications may cause the transfer of money from the owner of a first device to the owner of a second device.
Honey Inc.

Authentication screen

Techniques are disclosed relating to authenticating a user via a lock screen. In one embodiment, a computing device presents a two-dimensional matrix of elements on a display of the computing device and detects a continuous gesture performed by a user on the display over the two-dimensional matrix of elements.
CA, Inc.

A computerized system including rules for a rendering system accessible to non-literate users via a touch screen

Computerized system operative to perform selectable system-actions responsive to user input, the system being accessible to non-literate users via a touch screen defining touch screen locations respectively corresponding to the selectable system-actions, the touch screen being operative to detect and distinguish between first and second gestures, the system comprising a processor-controlled touch-triggered actor which, responsive to at least each first gesture applied by end-user to an individual location within the touch screen, from among plural touch screen locations, performs individual action/s which corresponds to the individual location; and a processor-controlled touch-triggered oral presenter which, responsive to at least each second gesture applied by end-user to an individual location within the touch screen, from among the plural touch screen locations, presents an oral characterization of individual action/s which corresponds to said individual location.
Googale (2009) Ltd.

Graphical user interface for calibrating a surround sound system

A method and a system for calibrating a surround sound system are disclosed. The calibration system can provide a graphical user interface for display comprising a visual representation of the room hosting a multichannel surround sound system.
DTS, Inc.

Gesture-based selection and manipulation method

A method for deleting at least one content item in a list of content items is presented. In an embodiment, the method first displays the list of content items along a first direction on a display of a computing device.

Detection of cleaning gestures

Embodiments of the disclosure provide an electronic device with a cleaning gestures feature. The electronic device includes a touchscreen configured to display information and receive touch inputs, a non-transitory computer-readable medium having processor-executable instructions stored thereon, and a processor configured to execute the processor-executable instructions to: (a) detect a cleaning gesture received at the touchscreen, the cleaning gesture comprising one or more touch inputs on the touchscreen; (b) in response to receiving the cleaning gesture, enter a cleaning mode, the cleaning mode comprising disabling one or more functions and/or gestures of the touchscreen; and (c) conditionally terminate the cleaning mode..
Rauland-Borg Corporation
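The (a)-(c) flow in the abstract above can be sketched as a small state machine. Everything here is an illustrative assumption, not taken from the patent: the class name, the "three or more simultaneous contacts" heuristic for the cleaning gesture, and the idle-timeout termination condition.

```python
class CleaningModeController:
    """Hypothetical sketch of a touchscreen cleaning mode."""

    def __init__(self, timeout_events=5):
        self.cleaning = False            # True while in cleaning mode
        self.timeout_events = timeout_events
        self._idle = 0                   # consecutive touch-free polls

    def is_cleaning_gesture(self, touches):
        # Assumption: a "cleaning" gesture is a broad multi-touch wipe,
        # detected here simply as three or more simultaneous contacts.
        return len(touches) >= 3

    def handle_touches(self, touches):
        """Return True if the touches should be passed on to the UI."""
        if not self.cleaning:
            if self.is_cleaning_gesture(touches):
                self.cleaning = True     # (b) enter cleaning mode
                self._idle = 0
                return False
            return True                  # normal operation
        # (b) while cleaning, swallow all touch input
        if touches:
            self._idle = 0
        else:
            self._idle += 1
            if self._idle >= self.timeout_events:
                self.cleaning = False    # (c) conditional termination
        return False
```

A caller would poll `handle_touches` each input cycle and only forward events when it returns `True`.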

Circular user interface

A circular user interface (UI) that permits a user to select options and input subjective data (e.g., values and arrays) with a circular gesture-based interaction style in an easy, efficient and fast way. The UI is useful on circular devices that provide a display of, e.g., health and behavior measurements; it uses the circular shape of the device to guide the user and shows guiding animated elements on the display.
Koninklijke Philips N.V.

System and method for provisioning a user interface for sharing

Methods and systems for tracking and sharing user data may include presenting a first portion of content to a first user, detecting the first user and second user in proximity to an information handling system, displaying a user interface element associated with the content on the information handling system, sharing the user interface element with the second user using the information handling system, and presenting a second portion of content on the second information handling system to the second user. The content may be associated with a user interface element.
Dell Products L.P.

System and method for provisioning a user interface for scaling and tracking

Methods and systems for interfacing with users may include displaying content associated with a plurality of user interface elements, tracking a view of the user on at least a portion of the content to determine an interest of the user, detecting the user in proximity to an information handling system, displaying the portion of content corresponding to the interest of the user on the information handling system, sharing the portion of content associated with the plurality of user interface elements on a display, and scaling the size of the portion of content on the display to match the size of the portion of content displayed on the information handling system. The sharing may be based on a gesture from the user.
Dell Products L.P.

Touch and non-contact gesture based screen switching method and terminal

A touch and gesture input-based control method for a mobile terminal or handheld display is provided for facilitating the screen switching operation for an object in response to a gesture input subsequent to a touch input. The method includes detecting a touch input; selecting at least one object corresponding to the touch input; detecting a gesture input; and performing switching corresponding to the gesture input in a state where the object is held at a position of the touch input.
Samsung Electronics Co., Ltd.
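The hold-then-gesture interaction described above can be illustrated with a minimal controller sketch. All names and the "screen index" interpretation of switching are assumptions for illustration; the abstract only specifies that the object stays held at the touch position while a subsequent gesture performs the switch.

```python
class TouchGestureController:
    """Hypothetical sketch: touch selects and holds an object,
    a later gesture switches screens while the hold persists."""

    def __init__(self):
        self.held_object = None
        self.screen = 0

    def on_touch(self, obj):
        # Select the object under the touch and hold it in place.
        self.held_object = obj

    def on_gesture(self, direction):
        # Gestures only act while an object is held by a touch.
        if self.held_object is None:
            return None
        self.screen += 1 if direction == "right" else -1
        return self.screen               # object remains held

    def on_release(self):
        # Lifting the finger drops the object on the current screen.
        obj, self.held_object = self.held_object, None
        return obj
```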

Electronic device and method for controlling the electronic device

An electronic device includes a touch interface and a processor configured to identify a current touch displacement while a touch gesture of a user is received through the touch interface, to identify an action corresponding to a current state of the electronic device and the identified current touch displacement, and to perform a task according to the identified action.
Samsung Electronics Co., Ltd.

Methods and systems for defining gestures for a user interface

To develop a user-interface prototype, a gesture patch is selected, a gesture is specified for the gesture patch, a first layer patch corresponding to a first layer of the user-interface prototype is selected, and an output of the gesture patch is coupled to an input of the first layer patch. The patches are graphical elements.

Device and method for recognizing hand gestures using time-of-flight sensing

An electronic device includes at least one laser source configured to direct laser radiation toward a user's hand. Laser detectors are configured to receive reflected laser radiation from the user's hand.
STMicroelectronics SA

Aircraft having gesture-based control for an onboard passenger service unit

An aircraft is provided that includes a passenger service unit for a passenger seated in a seat in its cabin. The aircraft includes a camera configured to acquire an image of the passenger, and a control module configured to at least receive the image of the passenger.
The Boeing Company

Haptic feedback for steering system controls

A system for haptic feedback for steering system controls includes a touch sensor input detection module and an actuator haptic response driver module. The touch sensor input detection module acquires a touch sensor input from one or more touch sensors of a steering system and identifies a touch gesture type of the touch sensor input.
Steering Solutions Ip Holding Corporation

Selection of an object in an augmented or virtual reality environment

A method of selection of an object in an environment including a plurality of real and/or virtual objects is described. The environment is displayed to a user through a display device. The method includes an assignment of a gesture path to each object of the plurality of objects, and the gesture path includes a series of gestures to be performed by the user to select the object.
Thomson Licensing

Embodied dialog and embodied speech authoring tools for use with an expressive social robot

A social robot provides more believable, spontaneous, and understandable expressive communication via embodied communication capabilities by which a robot can express one or more of: paralinguistic audio expressions, sound effects or audio/vocal filters, expressive synthetic speech or pre-recorded speech, body movements and expressive gestures, body postures, lighting effects, aromas, and on-screen content, such as graphics, animations, photos, videos. These are coordinated with produced speech to enhance the expressiveness of the communication and non-verbal communication apart from speech communication.
Jibo, Inc.

Photoelectric proximity sensor for gesture-based control of an aerosol delivery device

An aerosol delivery device is provided that includes at least one housing, a heating element, a photoelectric proximity sensor and a control component. The at least one housing encloses a reservoir configured to retain an aerosol precursor composition.
RAI Strategic Holdings, Inc.

Highlight-based movie navigation, editing and sharing

Methods and apparatuses for highlight-based movie navigation, editing and sharing are described. In one embodiment, the method for processing media comprises: playing back a movie on a display of a media device; performing gesture recognition to recognize one or more gestures made with respect to the display; and navigating through the media on a per-highlight basis in response to recognizing the one or more gestures.

Intelligent virtual assistant system and method

An intelligent virtual assistant is provided for delivering respectively customizable interactive audio/video content to each of a plurality of computing devices during a networked communication session. Input is received from at least one device, and it is determined, via information provided in or with the input, that the input is at least one of speech input, facial input, gesture input, and textual input.
Touchcast LLC

Continuous gesture recognition for gaming systems

A method for controlling a wagering gaming apparatus includes displaying a game on a display screen, receiving, from a sensor device, a plurality of location data points corresponding to a plurality of locations of an anatomical feature of the player in three-dimensional space as the anatomical feature of the player moves in the three-dimensional space, analyzing a first group of the location data points to identify a first input command, the first group of location data points comprising sequential location data points, causing a first action to be taken in the game, the first action being determined based on the first input command, and analyzing a second group of the location data points to identify a second input command, the second group of location data points comprising sequential location data points. The first group of location data points and the second group of location data points at least partially overlap..
IGT
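The abstract above requires that the first and second groups of sequential location data points "at least partially overlap." A sliding-window split makes that concrete; the window size and step below are illustrative assumptions, not values from the patent.

```python
def overlapping_groups(points, size=4, step=2):
    """Split a stream of location data points into sequential windows
    of `size` points advancing by `step`. With step < size, consecutive
    groups share points, giving the partial overlap the claim requires."""
    return [points[i:i + size]
            for i in range(0, max(1, len(points) - size + 1), step)]
```

Each window would then be fed to a classifier that maps it to an input command, with the overlap letting one gesture flow continuously into the next.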

Handwriting-based predictive population of partial virtual keyboards

A “stroke untangler” composes handwritten messages from handwritten strokes representing overlapping letters or partial letter segments that are drawn on a touchscreen device or touch-sensitive surface. These overlapping strokes are automatically untangled and then segmented and combined into one or more letters, words, or phrases.
Microsoft Technology Licensing, Llc

Gesture identification with natural images

A method for gesture identification with natural images includes generating a series of variant images by using each two or more successive ones of the natural images, extracting an image feature from each of the variant images, and comparing the varying pattern of the image feature with a gesture definition to identify a gesture. The method is inherently insensitive to indistinctness of images, and supports the motion estimation in axes x, y, and z without requiring the detected object to maintain a fixed gesture.
PixArt Imaging Inc.
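The pipeline above (variant images from successive frames, a feature per variant image, pattern matching against a gesture definition) can be sketched as follows. The details are assumptions for illustration only: frame differencing as the "variant image," the centroid column of changed pixels as the feature, and a monotonically increasing centroid as the definition of a rightward swipe.

```python
def variant(img_a, img_b, thresh=10):
    """Variant image sketch: per-pixel absolute difference, binarised."""
    return [[abs(a - b) > thresh for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def feature(var_img):
    """Feature sketch: centroid column of changed pixels, or None."""
    cols = [x for row in var_img for x, on in enumerate(row) if on]
    return sum(cols) / len(cols) if cols else None

def identify(frames):
    """Compare the feature's varying pattern with a gesture definition."""
    feats = [feature(variant(a, b)) for a, b in zip(frames, frames[1:])]
    feats = [f for f in feats if f is not None]
    if len(feats) >= 2 and all(b > a for a, b in zip(feats, feats[1:])):
        return "swipe_right"
    return "unknown"
```

Because only the trend of the feature is matched, a blurred or indistinct blob moving rightward still classifies the same way, which mirrors the insensitivity claim in the abstract.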

Visual tracking method and robot based on monocular gesture recognition

The present disclosure discloses a visual tracking method based on gesture recognition and a robot thereof. By recognizing a feature gesture, an accurate offset angle between a robot and a tracking target is obtained in real time, accurate tracking is facilitated, and the tracking is more natural.
Nanjing Avatarmind Robot Technology Co., Ltd.

System and method for enhanced command input

A portable electronic device having an input device for receiving a gesture based input from a user is used to control a navigation operation of an appliance. The portable electronic device receives via the input device the gesture based input and uses one or more parameters stored in a memory of the portable electronic device and one or more characteristics associated with the gesture based input to cause the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the appliance.
Universal Electronics Inc.

Omnidirectional gesture detection

An omnidirectional electronic device is disclosed. The electronic device can perform operations associated with a combination of inputs that can, in some cases, be recognized irrespective of the position or orientation in which they are applied to the electronic device.
Apple Inc.

Enhanced interaction touch system

A touch-sensitive apparatus includes a touch sensing part which operates a touch sensor arrangement to provide signal data representing a touch within a touch-sensing region on a front surface of a light transmissive panel; the touch-sensitive apparatus further includes a computer vision system part which operates a camera system to image a scene located externally of the touch-sensitive apparatus, and operates a computer vision controller to detect, based on image data generated by the camera system, at least one object within the scene. The touch-sensitive apparatus enables user interaction by touch control, gesture control and hover control.
FlatFrog Laboratories AB

Movement detection apparatus for detecting a hand movement

The invention relates to movement detection apparatus for detecting a hand movement like a hand gesture which may be used for controlling a computer or another device. A light emitting device emits light into tissue at the wrist of a person and a light detection device detects light, which has travelled through the tissue, at the wrist and generates a light detection signal based on the detected light, wherein a hand movement determination unit determines the hand movement based on the light detection signal.
Koninklijke Philips N.V.

Gesture operation method based on depth values and system thereof

A gesture operation method based on depth values and the system thereof are disclosed. A stereoscopic-image camera module acquires a first stereoscopic image.
Metal Industries Research & Development Centre

Gesture-based control and usage of video relay systems

A system comprising a web-enabled video camera, at least one microphone, a remote server, and a command processing unit, which allow users to communicate remotely with video and audio. The system further is controlled by either gestures from a user, or a remote control, and can have gestures customized by the user for specific inputs.
Purple Communications, Inc.

System and method for controlling a user experience

System and methods for controlling a user experience are described. In an aspect, an interface can comprise an interface device for rendering content to a user, a sensor having a gesture zone associated therewith configured to detect a dexterous gesture of a user within the gesture zone and generate a sensor signal representing the dexterous gesture.
Comcast Cable Communications, LLC

Detecting user focus on hinged multi-screen device

A mobile computing device is provided that includes a processor, an accelerometer, two or more display devices, and a housing containing the processor, the accelerometer, and the two or more display devices. The device is configured to determine a current user focus indicating that a first display device of a pair of the display devices is being viewed by the user, and that a second display device of the pair is not being viewed by the user; detect a signature gesture input based on accelerometer data indicating that the mobile computing device has been rotated more than a threshold degree; determine that the current user focus has changed from the first display device to the second display device based on at least detecting the signature gesture input; and perform a predetermined action based on the current user focus.
Microsoft Technology Licensing, LLC
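One way to read the "rotated more than a threshold degree" signature gesture above is as the angle between accelerometer gravity vectors sampled before and after the motion. The sketch below assumes that interpretation; the function names, the 90-degree threshold, and the two-display focus model are illustrative assumptions, not details from the patent.

```python
import math

ROTATION_THRESHOLD_DEG = 90.0  # assumed flip threshold

def rotation_degrees(g_before, g_after):
    """Angle between two gravity vectors from the accelerometer."""
    dot = sum(a * b for a, b in zip(g_before, g_after))
    na = math.sqrt(sum(a * a for a in g_before))
    nb = math.sqrt(sum(b * b for b in g_after))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def update_focus(current_focus, g_before, g_after):
    """Flip focus to the other display when the device has been
    rotated beyond the threshold (the 'signature gesture')."""
    if rotation_degrees(g_before, g_after) > ROTATION_THRESHOLD_DEG:
        return "second" if current_focus == "first" else "first"
    return current_focus
```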

Unmanned aerial vehicle and method for photographing a subject using the same

An unmanned aerial vehicle is provided, which includes an aerial vehicle body; a camera mounted on the body; a sensor module installed in the body to sense surrounding environment information; a radio communication module installed in the body to perform radio communication with another communication device; at least one processor installed in the body and electrically connected to the camera, the sensor module, and the radio communication module; and a memory electrically connected to the processor, wherein the memory, during flying of the unmanned aerial vehicle, stores instructions to cause the processor to recognize a user's throwing gesture using the unmanned aerial vehicle, to determine a user direction based on a first motion vector generated by the throwing gesture, to predict a camera direction in a standstill location that is a target point of the unmanned aerial vehicle based on the throwing gesture, and to control a photographing direction of the camera.
Samsung Electronics Co., Ltd.

Headset display device, unmanned aerial vehicle, flight system and method for controlling an unmanned aerial vehicle

A headset display device, UAV, flight system and method for controlling a UAV are provided. The device includes: a collecting module configured to collect gesture image information; a processing module configured to analytically process the collected gesture image information, translate it into a control instruction for controlling the UAV, and send the control instruction to the UAV; a display module configured to receive a flight image and/or flight parameters returned by the UAV and display the flight image and/or the flight parameters on a display interface; and an optical assistant module configured to perform a left-right split screen on the flight image, in such a manner that the left and right eyes of the gesture operator perceive a fused stereoscopic image while respectively watching the images displayed on the left and right screens at the same time.
Shanghai Hang Seng Electronic Technology Co., Ltd

Interactions between one or more mobile devices and a vr/ar headset

A system, a machine-readable storage medium storing instructions, and a computer-implemented method are described herein for a virtual reality engine that generates first object data based on a first physical gesture applied to a first object presented in a mobile device view displayed at a first mobile device. The virtual reality (“VR”) engine generates second object data based on a second physical gesture applied to a second object presented in a second mobile device view displayed at a second mobile device.
Zynga Inc.

Gesture recognition system using depth perceptive sensors

Acquired three-dimensional positional information is used to identify user created gesture(s), which gesture(s) are classified to determine appropriate input(s) to an associated electronic device or devices. Preferably, at at least one instance of a time interval, the posture of a portion of a user is recognized, based on at least one factor such as shape, position, orientation, or velocity.
Longwood Gardens, Inc.


3D projector for mobile phone

A 3D projector for a mobile phone, comprising an inverted-L shaped frame, a signal receiver disposed inside the inverted-L shaped frame for receiving a mobile phone application program, a main pyramidal prism disposed directly on top of a display of the mobile phone, and an auxiliary pyramidal prism disposed directly on top of a front camera of the mobile phone. With the described 3D projector, signals can be received through the inverted-L shaped frame, achieving upward or downward movement or rotation of the main pyramidal prism.

Computer-implemented method for controlling a remote device with a local device

A computer-implemented method is presented for controlling a remote device with a local device which may have a smaller screen than the remote device. At least a part of a content of a screen of the remote device is displayed on a screen of the local device with a magnification m.
TeamViewer GmbH

Non-leading computer aided detection of features of interest in imagery

An illustrative embodiment of a computer-implemented process for non-leading computer aided detection of features of interest in a dataset, designates a particular formation using a computer recognizable gesture to identify a gestured location in an analyzed view of the dataset in response to a user identifying the particular formation in the analyzed view. The dataset is generated by a computer and representative of a portion of an object characterized by the dataset.
International Business Machines Corporation

Augmented reality therapeutic movement display and gesture analyzer

Systems and methods for displaying augmented reality clinical movements may use an augmented reality device to display aspects of a clinical movement. The systems and methods may use a motion capture device to capture the clinical movement.

Search of nas data through association of errors

A computer-perceptible search input, whether typed, spoken, based upon machine vision, detection and/or interpretation of gestures, for example, may be received by a computing device from a single user. The received input by the single user may be matched with one or more stored digital items based upon prior inputs by the single user that previously led the single user to access the digital item(s).
Western Digital Technologies, Inc.

Information processing method, terminal, and storage medium

Information processing method, terminal, and storage medium are provided. The method includes: performing rendering in a graphical user interface, to obtain at least one virtual resource object; performing rendering, at one of a pre-set location and a wheel rendering location in the graphical user interface, to obtain a skill-release supplementary control object, when detecting a skill-release trigger gesture on at least one skill object located in at least one skill operation area in the graphical user interface, the skill-release supplementary control object comprising a skill-release control halo object and a virtual joystick object located within a radiation range of the skill-release control halo object; controlling, when detecting a drag operation on the virtual joystick object, a skill release location of the skill object to be correspondingly adjusted in the graphical user interface; and performing a skill release operation on the skill object when detecting a release operation of the drag operation.
Tencent Technology (Shenzhen) Company Limited

Second touch zoom control

A second touch zoom solution allows maintaining selection control, and movement of a selection point, during zoom operations not possible through traditional pinch-zoom. A first finger touch to a touch screen establishes a hot spot, selection, or one-finger gesture.
Onshape Inc.

Freehand table manipulation

Recognition of freehand input enables gestures and objects to be recognized as tables and actions taken in relation to tables. For example, drawing a rectangle intersected by horizontal and vertical lines will create a table object that functions as a table within a productivity application, but may inherit visual cues from the strokes used to draw it.
Microsoft Technology Licensing, LLC

Gesture based smart download

An aspect includes detecting a user gesture at a sender device, the user gesture indicating a direction relative to the sender device. One or more candidate receiver devices in the indicated direction and in a line-of-sight of the sender device are located.
International Business Machines Corporation

Display device and display control method

A display device includes a display section, a gesture acceptance section, and a display control section. The display section has a touch panel function.
Kyocera Document Solutions Inc.

Electrical device for hand gesture detection

(4) initiate action(s) to the controlled unit. The action(s) are associated with selected hand gesture(s) based on the estimation.

Interactive system and glasses with gesture recognition function

Glasses with gesture recognition function include a glasses frame and a gesture recognition system. The gesture recognition system is disposed on the glasses frame and configured to detect hand gestures in front of the glasses thereby generating a control command.
PixArt Imaging Inc.

Systems, devices, and methods for gesture identification

Systems, devices, and methods adapt established concepts from natural language processing for use in gesture identification algorithms. A gesture identification system includes sensors, a processor, and a non-transitory processor-readable memory that stores data and/or instructions for performing gesture identification.
Thalmic Labs Inc.

Gesture identification

Various methods and systems are provided to allow a user to perform finger gesture motions, such as typing, swiping, tapping, or other finger motions, to provide device input, such as typing, clicking, or selecting data in a webpage, application, operating system, or toolbar, that would normally be performed with a keyboard, mouse, stylus, microphone, touchscreen, or another input device.
PayPal, Inc.

Gesture recognition with sensors

A sensor for motion or gesture sensing may be configured to emit radio frequency signals such as for pulsed range gated sensing. The sensor may include a radio frequency transmitter configured to emit the pulses and a receiver configured to receive reflected ones of the emitted radio frequency signals.
ResMed Sensor Technologies Limited

Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker

A three-dimensional (3D) coordinate measurement system includes: a retroreflector; a laser tracker including: a first light source; a second light source; at least one camera proximate the second light source; and a processor responsive to executable instructions which when executed by the processor is operable to determine, in a first instance, that a follow-operator gesture has been given by an operator and in response turn the laser tracker to follow movement of the operator.
Faro Technologies, Inc.

Actuation of a locking element for a vehicle using an ultrasound sensor

A method to actuate a locking element for a vehicle includes detecting a movement of an object using an optical sensor, activating an ultrasound sensor in accordance with the detected movement, detecting a gesture using the ultrasound sensor, and actuating the locking element in accordance with the detected gesture.
Volkswagen Aktiengesellschaft


Personality-based chatbot and methods including non-text input

The methods, apparatus, and systems described herein assist a user with a request. The methods in part receive input from a user that includes a voice input, a gesture input, a text input, biometric information, or a combination thereof; retrieve or determine a personality type of the user based on the input; determine a distress level or an engagement level of the user; determine a set of outputs responsive to the received input; rank the outputs in the set based on the retrieved or determined personality type and the determined distress level or engagement level; deliver a ranked output to the input in a modality based on the retrieved or determined personality type and a type of device configured to deliver the ranked output to the user, wherein the device comprises a navigation system, a car, a robot, or a combination thereof; and weigh the ranked output for future interactions.
Mattersight Corporation

Method and device for identifying a gesture

The present disclosure relates to a method and a device for identifying a gesture. The method includes: determining a depth of each pixel in each of a plurality of images to be processed, in which the plurality of images to be processed are separately collected by the plurality of cameras, and the depth is configured to at least partially represent a distance between an actual object point corresponding to each pixel and the mobile apparatus; determining a target region of each of the images to be processed according to the depth; and determining a gesture of a target user according to image information of the target regions.
Beijing Xiaomi Mobile Software Co., Ltd.

Fingerprint sensor having rotation gesture functionality

The present disclosure relates to a fingerprint sensor having a capture surface for capturing characteristic features of the surface of a finger of an operator and an associated analyzing unit, where the analyzing unit and the fingerprint sensor are designed to capture a movement of the characteristic features of the finger across the capture surface; and the analyzing unit is furthermore designed to detect a rotation movement of the finger and to associate a parameter of the rotation movement to a change of a control parameter, as long as the axis of rotation defined by the rotation movement intersects the capture surface.
Preh Gmbh

Systems and methods for selecting a symbol input by a user

In one embodiment, a method includes providing for display a first set of touch-screen keys corresponding to a first set of symbols; providing for display, at least partially underneath the first set of touch-screen keys, a second set of touch-screen keys corresponding to a second set of symbols; detecting a touch gesture by a user over the first and second sets of keys intending to input a first symbol; determining, based on an amount of lapsed time between the detected touch gesture and a previous touch gesture, a context associated with the detected touch gesture; and selecting, based at least in part on the context, a symbol in the first set of symbols or a symbol in the second set of symbols as the first symbol that the user intended to input.
Facebook, Inc.
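The timing rule this abstract describes reduces to a small decision function. Everything below is an assumption for illustration: the 300 ms cutoff, the function name, and the rule that a quick follow-up tap stays on the top key layer while a slower, deliberate tap reaches the layer underneath.

```python
QUICK_TAP_S = 0.3  # assumed cutoff; the abstract leaves the threshold unspecified

def select_symbol(primary, secondary, tap_time, previous_tap_time):
    """Resolve a tap that lands over two stacked key layers.

    A short gap since the previous tap suggests continued typing on the
    top (primary) layer; a longer pause suggests a deliberate reach for
    the secondary layer shown underneath.
    """
    elapsed = tap_time - previous_tap_time
    return primary if elapsed < QUICK_TAP_S else secondary
```

For example, `select_symbol('a', '@', 10.1, 10.0)` resolves to `'a'` (rapid typing), while `select_symbol('a', '@', 11.0, 10.0)` resolves to `'@'`.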

Electronic system, touch sensitive processing apparatus and method thereof for switching to normal operation mode upon receiving touch gesture in power saving mode

The present invention provides a touch sensitive processing method for switching an electronic system into a normal operation mode upon receiving a touch gesture in a power saving mode. The electronic system includes a host and a power supply module.
Egalax_empia Technology Inc.

System and method to perform a numerical input using a continuous swipe gesture

There is provided a gesture-based gui (system, method, etc.) to facilitate input of numerical data using a continuous swipe gesture. A gesture-based i/o device displays a gui presenting a gross number and a gross number control to initially define a specific number for further defining with specificity.
The Toronto-dominion Bank

Method and apparatus for controlling display of video content

A controller for controlling the display of secondary digital content displayed in an overlay above a primary video stream. The controller includes a touch interface device, a processor, and a memory storing non-transitory instructions.
Seespace Ltd.

Virtual input device system

The described technology is directed towards virtual input devices that take application program-directed input from automation and/or remote devices, such as over a network, instead of via actual user input via a physical device, for example. This allows an automation framework to insert input into an application program, such as for automated testing without modifying any of the application's other components.
Home Box Office, Inc.

Terminal and method for controlling the same based on spatial interaction

A terminal and method for controlling the terminal using spatial gestures are provided. The terminal includes a sensing unit which detects a user gesture moving an object in a certain direction in proximity to the terminal, a control unit which determines at least one of movement direction, movement speed, and movement distance of the user gesture and performs a control operation associated with a currently running application according to the determined at least one of movement direction, movement speed, and movement distance of the user gesture, and a display unit which displays an execution screen of the application under the control of the control unit.
Samsung Electronics Co., Ltd.

Operating environment with gestural control and multiple client devices, displays, and users

Embodiments described herein includes a system comprising a processor coupled to display devices, sensors, remote client devices, and computer applications. The computer applications orchestrate content of the remote client devices simultaneously across the display devices and the remote client devices, and allow simultaneous control of the display devices.
Oblong Industries, Inc.

Electronic device comprising electromagnetic interference sensor

An electronic device includes an electromagnetic interference (emi) sensor that senses emi patterns from outside the electronic device and is disposed at a location touchable by a body of a user of the electronic device, a memory that stores one or more databases associated with at least a part of the emi patterns, gestures of the user, and functions executable by the electronic device, and at least one processor electrically connected with the emi sensor and the memory. The at least one processor is configured to sense an emi pattern, which is changed according to a gesture among the gestures of the user, from among the emi patterns from the body of the user using the emi sensor, and to execute a function, which corresponds to the sensed emi pattern based on at least a part of the one or more databases, from among the functions.
Samsung Electronics Co., Ltd.

Applications, systems, and methods for facilitating emotional gesture-based communications

The disclosure of the present application provides applications, systems, and methods for communicating emotions in conjunction with text-based, electronic communications between at least a first user and a second user. More particularly, applications and methods are provided that enable a user to incorporate emotional cues into text-based electronic communications through the use of an electronic gesture board configured to receive touch, motion, and/or gesture input, assign an emotional value to such input, and transmit the assigned emotional value to a second user such that the second user receives a visual depiction associated with the touch, motion, and/or gesture input such as one or more gesture icons, face icons, body icons, sign/symbols, or vibrations or other haptic feedback corresponding with the input emotional value.

Gesture-controlled virtual reality systems and methods of controlling the same

Gesture-controlled virtual reality systems and methods of controlling the same are disclosed herein. An example apparatus includes at least two of an on-body sensor, an off-body sensor, and an rf local triangulation system to detect at least one of a position or a movement of a body part of a user relative to a virtual instrument.
Intel Corporation

Contextual pressure sensing haptic responses

A method of generating haptic effects includes detecting an input of pressure applied to a device using a gesture, determining a level associated with the gesture based on the pressure input, determining a selection of an item at the level based on the gesture and a context associated with the item at the level, and generating a contextual haptic effect comprising a haptic parameter based on the context of the item at the level.
Immersion Corporation
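As a rough illustration of mapping pressure input to a selection level, the band thresholds, units, and names below are invented; the abstract does not say how many levels exist or where the cutoffs lie.

```python
# Assumed pressure bands (newtons -> menu depth); a real device would
# calibrate these per hardware and per user.
PRESSURE_BANDS = [(0.5, 1), (2.0, 2), (4.0, 3)]

def level_for_pressure(force_n):
    """Return the deepest level whose pressure threshold the input reaches."""
    level = 0  # no band reached: treat as a plain touch
    for threshold, depth in PRESSURE_BANDS:
        if force_n >= threshold:
            level = depth
    return level
```

A light touch of 0.1 N selects no level, 1.0 N reaches level 1, and 5.0 N pushes through to level 3; the contextual haptic effect would then be chosen from the item at that level.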

Multifunction buckle for a wearable device

The present disclosure relates to the field of electronic engineering. The present disclosure envisages a multifunction buckle that integrates multiple health monitoring devices.
Cu Wellness, Inc.

Multifunction modular strap for a wearable device

The present disclosure relates to the field of electronic engineering. The present disclosure envisages a multifunction modular strap that integrates multiple health monitoring devices.
Cu Wellness, Inc.

Detection of computerized bots and automated cyber-attack modules

Devices, systems, and methods of detecting whether an electronic device or computerized device or computer, is being controlled by a legitimate human user, or by an automated cyber-attack unit or malware or automatic script. The system monitors interactions performed via one or more input units of the electronic device.
Biocatch Ltd.

Frictionless access control system with ranging camera

An access control and user tracking system includes ranging cameras installed in thresholds of access points for generating three dimensional models of users passing through the access points in order to detect unauthorized individuals or hand gestures of the users, which can indicate unsafe conditions or that lights and/or equipment should be turned on. The ranging cameras can be point range finding measurement sensor scanning cameras, structured light cameras, or time of flight cameras and can be installed along the top or in the corners of the thresholds.
Sensormatic Electronics, Llc

Synchronization system comprising display device and wearable device, and controlling method

Disclosed are a synchronization system comprising a display device and a wearable device, and a controlling method. The synchronization system comprises: a display device for receiving a preset touch gesture, converting a displayed background image to a preset image, displaying the preset image, and transmitting a synchronization signal to a wearable device; and the wearable device for, when receiving the synchronization signal from the display device, displaying an image which is identical to the preset image of the display device, and synchronizing data with the display device.
Lg Electronics Inc.

Generating and displaying supplemental information and user interactions on interface tiles of a user interface

Technologies for displaying supplemental interface tiles on a user interface of a computing device include determining supplemental information and/or available user interactions associated with a user interface tile displayed on the user interface. A supplemental interface tile is displayed in association with the user interface tile in response to a user selecting the user interface tile.
Intel Corporation

Method and apparatus for applying free space input for surface constrained control

A free space input standard is instantiated on a processor. Free space input is sensed and communicated to the processor.
Atheer, Inc.

Portable multi-touch input device

A portable input device is described. The portable input device can wirelessly send control signals to an external circuit.
Apple Inc.

Multi-phase touch-sensing electronic device

A touch-sensing electronic device includes a housing having first, second and third touch-sensing surfaces; a substrate extensively disposed under the first, second and third touch-sensing surfaces; sensing electrodes formed on the same substrate, and having capacitance changes in response to touch operations or gestures respectively performed on or over the first, second and third touch-sensing surfaces, wherein the sensing electrodes are grouped into three sensing electrode arrays corresponding to the first, second and third touch-sensing surfaces, respectively; and a controller for generating respective control signals corresponding to the touch operations performed on or over the first, second and third touch-sensing surfaces. At least two of the three sensing electrode arrays have different configurations for performing different sensing operations.
Touchplus Information Corp.

System and method for detecting hand gesture

Present disclosure relates to a system for detecting hand gesture and a method thereof. The system comprises a hand-held controller and a computing application.
Htc Corporation

Touchless user interface navigation using gestures

An example method includes displaying, by a display (104) of a wearable device (100), a content card (114b); receiving, by the wearable device, motion data generated by a motion sensor (102) of the wearable device that represents motion of a forearm of a user of the wearable device; responsive to determining, based on the motion data, that the user has performed a movement that includes a supination of the forearm followed by a pronation of the forearm at an acceleration that is less than an acceleration of the supination, displaying, by the display, a next content card (114c); and responsive to determining, based on the motion data, that the user has performed a movement that includes a supination of the forearm followed by a pronation of the forearm at an acceleration that is greater than an acceleration of the supination, displaying, by the display, a previous content card (114a).
Google Llc
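The comparison the abstract describes — whether the pronation is gentler or harder than the preceding supination — reduces to a tiny decision rule. The function and action names below are illustrative only, not taken from the patent.

```python
def flick_action(supination_accel, pronation_accel):
    """Classify a supinate-then-pronate wrist movement.

    Per the abstract: a pronation slower than the supination advances to
    the next card; a faster pronation returns to the previous card.
    """
    if pronation_accel < supination_accel:
        return "next_card"
    if pronation_accel > supination_accel:
        return "previous_card"
    return "no_action"  # equal accelerations: assumed here to be ignored
```

So a brisk outward roll followed by a relaxed return (`flick_action(5.0, 3.0)`) pages forward, while a relaxed roll snapped back (`flick_action(3.0, 5.0)`) pages back.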

Three-dimensional graphical user interface for informational input in virtual reality environment

Hand displacement data is received from sensing hardware and analyzed using a three-dimensional (3d) gesture recognition algorithm. The received hand displacement data is recognized as representing a 3d gesture.
Alibaba Group Holding Limited

Interactive media system and method

Interactive computing systems and methods are provided which enable simple and effective interaction with a user device, which increases interest and improves user experience. The interactive system comprises a user device including a motion sensor, for receiving motion-based gestures through motion of the user device; and a controller, coupled to the motion sensor, configured to control one or more aspects of the system according to the motion-based gestures.
Tiltsta Pty Ltd

System, method, and apparatus for man-machine interaction

A man-machine interaction system, method, and apparatus, the man-machine interaction system includes a wearable device and a display device. The wearable device includes an image acquiring module, a memory, a processor, an image projecting module, and an information transmission interface.
Boe Technology Group Co., Ltd.

Control of machines through detection of gestures by optical and muscle sensors

A material handler system including a plurality of components and material handler equipment, and methods for utilizing the same, is disclosed. A first component includes gesture command recognition and enhancement for controlling material handler equipment.
Deere & Company

Wearable wireless hmi device

A wearable gesture control interface apparatus is used to control a controllable device based on gestures provided by a user. The wearable gesture control interface apparatus includes (i) sensors configured to detect user orientation and movement and generate corresponding sensor data and (ii) a microcontroller configured to: sample the sensor data from the sensors, determine whether the sensor data from one of the sensors meets transmission criteria; and if the sensor data meets the transmission criteria, transmit control data corresponding to all of the sensors to the controllable device.
Protocode Inc.

Motion communication system and method

This document presents an apparatus and method for creating and displaying a three dimensional cgi character performing signs and gestures that form a motion communication capability. The 3d cgi character is created as a digital construct that is displayed on a silhouette that has an outline of the character created.

Integrated lighting system and network

An integrated lighting system and integrated lighting network including integrated lighting systems are communicatively coupled to one another, for example, via various wireless transceivers. The systems and networks can collect data of a passing object (e.g., person, animal, automotive vehicle).
Rf Digital Corporation

Mobile terminal

Disclosed herein are a mobile terminal and a method for controlling a mobile terminal. The mobile terminal includes a wireless communication unit configured to be connected to an external device for communication, a sensing unit configured to sense a motion of the mobile terminal, a touch screen, and a controller configured to receive information related to a first item selected in a screen of an application being executed in the external device when a motion of the mobile terminal according to a first gesture of a user who has worn the mobile terminal is sensed, to display a first screen on which the information related to the first item is displayed according to a time sequence on the touch screen, and to transmit a control signal which enables a screen of an application corresponding to a second item selected in the first screen to be displayed to the external device when a motion of the mobile terminal according to a second gesture of the user is sensed.
Lg Electronics Inc.

Verifying identity based on facial dynamics

A computer-implemented technique is described for verifying the identity of a user using two components of face analysis. In a first part, the technique determines whether captured face information matches a previously stored structural face signature pertaining to the user.
Microsoft Technology Licensing, Llc

Gesture control method, apparatus, terminal device, and storage medium

A gesture control method, a gesture control apparatus and a terminal device to enrich interaction manners of the terminal device, where the method includes detecting a touch action performed on a touchscreen of a terminal device, obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action is in contact with the touchscreen, determining that the touch action is a joint touch action when the contact area is larger than a preset area and the z-axis acceleration is greater than a preset acceleration, identifying a gesture type corresponding to the joint touch action, and calling a preset function of the terminal device according to the gesture type.
Huawei Technologies Co., Ltd.
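The joint-touch test here is simply a conjunction of two thresholds. The numeric presets below are placeholders, since the abstract only says "preset area" and "preset acceleration".

```python
PRESET_AREA_MM2 = 120.0   # assumed preset contact area (knuckles touch wide)
PRESET_ACCEL_MS2 = 9.0    # assumed preset z-axis impact acceleration

def is_joint_touch(contact_area_mm2, z_accel_ms2):
    """A touch counts as a knuckle ("joint") touch only when both the
    contact area and the z-axis acceleration exceed their presets."""
    return contact_area_mm2 > PRESET_AREA_MM2 and z_accel_ms2 > PRESET_ACCEL_MS2
```

Requiring both signals is the point of the claim: a broad but soft press (palm resting) or a sharp but small tap (fingertip) each fails one test, so only a knuckle rap triggers the joint-gesture path.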

Creating tables using gestures

A method comprising displaying, on a touchscreen, a digital electronic document; receiving first input from the touchscreen and determining that the first input comprises a rectangle gesture; receiving second input from the touchscreen and determining that the second input comprises a subdivision gesture that indicates dividing the rectangle; determining that the first input and the second input have been received within a time threshold; in response to determining that the first input and second input have been received within the time threshold, automatically generating a table that comprises a plurality of cells; automatically placing the table in the document at a location that is based on the first input and updating the document that is displayed on the touchscreen to visually show the table; wherein the method is performed by one or more computing devices.
Atlassian Pty Ltd
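The pairing logic — accept the subdivision gesture only if it follows the rectangle gesture within a time threshold, then emit a cell grid — might look like this sketch. The two-second threshold and the tuple cell representation are assumptions; the patent states only that a threshold exists.

```python
def build_table(rect_t, subdiv_t, rows, cols, threshold_s=2.0):
    """Pair a rectangle gesture (at rect_t) with a subdivision gesture
    (at subdiv_t) and, if they arrive within the threshold, generate the
    table's grid of cells as (row, col) coordinates."""
    if not (0 <= subdiv_t - rect_t <= threshold_s):
        return None  # gestures too far apart: no table is created
    return [[(r, c) for c in range(cols)] for r in range(rows)]
```

For instance, a subdivision 1.5 s after the rectangle yields a grid, while one 3 s later yields `None` and the strokes are treated as unrelated ink.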

Image manipulation

A method includes displaying an image on a first area of a touch-sensitive electronic display and receiving touch input on a second area of the display, comprising the first area. A gesture type is detected from the touch input by detecting a larger component of motion of the touch input along one of first and second axes of the display than along the other of the axes.
Apical Ltd

Expandable application representation

Expandable application representation techniques are described. The techniques may include support of an expandable tile that may function as an intermediary within a root level (e.g., start menu or screen) of a file system.
Microsoft Technology Licensing, Llc

Actuation lock for a touch sensitive input device

Touch sensitive mechanical keyboards and methods of configuring the depressibility of one or more keys of a keyboard are provided. A touch sensitive mechanical keyboard can accept touch events performed on the surface of the keys.
Apple Inc.

Automated e-tran application

Techniques for text entry using gestures are disclosed. As disclosed, a camera may capture a frame and the face of the user can be detected therein.
Microsoft Technology Licensing, Llc

Gesture experiences in multi-user environments

Systems, apparatuses and methods may leverage technology that recognizes a set of one or more hands in one or more frames of a video signal during a first gesture control interaction between the set of one or more hands and an electronic device. Moreover, one or more additional body parts may be detected in the frame(s), wherein the additional body part(s) are excluded from the gesture control interaction.
Intel Corporation

Interaction mode selection based on detected distance between user and machine interface

An embodiment of an interaction mode selection apparatus may include a distance estimator to estimate a distance between a user and a part of a machine interface, and an interaction selector communicatively coupled to the distance estimator to select one or more active interaction modes from two or more available interaction modes based on the estimated distance. The distance estimator may include a depth sensor, a three-dimensional camera, a two-dimensional camera, an array of cameras, an array of microphones, an array of wireless access points, a beacon sensor, a proximity sensor, and/or an ultrasonic sensor.
Intel Corporation

User interaction paradigms for a flying digital assistant

Methods and systems are described for new paradigms for user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant or fda) using a portable multifunction device (pmd) such as a smart phone. In some embodiments, a user may control image capture from an fda by adjusting the position and orientation of a pmd.
Skydio, Inc.

Touch-gesture control for side-looking sonar systems

Techniques are disclosed for systems and methods to provide touch screen side-scan sonar adjustment for mobile structures. A side-scan sonar adjustment system includes a user interface with a touch screen display and a logic device configured to communicate with the user interface and a side-scan sonar system.
Flir Belgium Bvba

Vehicle window with gesture control

A slider window assembly for a vehicle includes a frame portion, at least one fixed window panel that is fixed relative to the frame portion, and a movable window panel that is movable along upper and lower rails of the frame portion between a closed position and an opened position. A gesture sensing device is operable to sense a gesture of a user in the vehicle and to determine if the sensed gesture is indicative of an open window command or a close window command.
Magna Mirrors Of America, Inc.

Avatar creation and editing

The present disclosure generally relates to creating and editing user avatars. In some examples, guidance is provided to a user while capturing image data for use in generating a user-specific avatar.
Apple Inc.

Systems and methods for gesture-based control of equipment in video communication

Systems, methods, and non-transitory computer readable media are configured to obtain video data from a camera used in a video conferencing system. A user interface displaying the video data can be provided on a screen, wherein the screen is capable of receiving touch input.
Facebook, Inc.

Launching applications from a lock screen of a mobile computing device via user-defined symbols

A first task and a second task executable by a mobile computing device are associated with a first predefined symbol and a second predefined symbol, respectively. The first task and the second task are different types of tasks executable by the mobile computing device after the mobile computing device has been unlocked.

Gesture detection to pair two wearable devices and perform an action between them and a wearable device, a method and a system using heat as a means for communication

The present disclosure relates to devices and methods for initiating execution of actions and for communicating information to a user, and more particularly, to initiating execution of predefined actions in wearable devices and communication devices based on gestures made with the wearable devices and/or heat applied to a surface of the wearable devices. According to an aspect, the method relates to, in the first wearable device, detecting a first gesture, predefined in the first wearable device, of the first wearable device, broadcasting a first signal comprising information associated with the first gesture, receiving, from a second wearable device, a second signal comprising information associated with a second gesture, and initiating execution of a first action, predefined in the first wearable device, based on the first signal and the second signal.
Sony Corporation

Camera gesture clock in

A system for keeping track of an employee's attendance is described that allows the employee to be creative which also helps morale. An employee sets up an employee account and provides identification.
Wal-mart Stores, Inc.

Multi-modal user authentication

Various systems and methods for providing a mechanism for multi-modal user authentication are described herein. An authentication system for multi-modal user authentication includes a memory including image data captured by a camera array, the image data including a hand of a user; and an image processor to: determine a hand geometry of the hand based on the image data; determine a palm print of the hand based on the image data; determine a gesture performed by the hand based on the image data; and determine a bio-behavioral movement sequence performed by the hand based on the image data; and an authentication module to construct a user biometric template using the hand geometry, palm print, gesture, and bio-behavioral movement sequence.

Biometric authentication based on gait pattern or writing motion with an inertial measurement unit

The present invention relates to the use of an inertial measurement unit (imu) to record the acceleration trajectory of a person's gait or pen-less handwriting motion or any predesignated gestures, and to convert the data to a unique biometric pattern. The pattern is unique for each case and can be used for biometric security authentication.
Hong Kong Baptist University

View switching method, touch screen, and client device

A view switching method includes: displaying a switching flag on a current view in response to a first particular gesture of a user on the current view, where the first particular gesture passes through a particular area of the current view, and switching from the current view to a target view associated with the switching flag, in response to ending of the first particular gesture or to a second particular gesture of the user on the current view. Smooth switching to a specified view can reduce errors caused by mis-operations of users, and provide users with a fast, convenient, and graceful browsing experience.
Guangzhou Ucweb Computer Technology Co., Ltd.

Device, method, and graphical user interface for force-sensitive gestures on the back of a device

An electronic device has a front side including a touch-sensitive display, a back side that does not include a display, and one or more sensors to detect contact intensities on the front and back sides, and displays, on the touch-sensitive display, a user interface including objects. While displaying the user interface, the device detects an input on a side of the electronic device.
Apple Inc.

Method and device for sharing content

Methods and devices are provided for sharing content in the mobile internet technology field. In the method, the electronic device displays a content page including a plurality of content elements on the touch screen.
Beijing Xiaomi Mobile Software Co., Ltd.

Motion-assisted visual language for human computer interfaces

Embodiments of the invention recognize human visual gestures, as captured by image and video sensors, to develop a visual language for a variety of human computer interfaces. One embodiment of the invention provides a computer-implemented method for recognizing a visual gesture portrayed by a part of human body such as a human hand, face or body.
Fastvdo Llc

Eyeglasses-type wearable device and method using the same

An eyeglasses-type wearable device of an embodiment can handle various data inputs. The device includes right and left eye frames corresponding to positions of right and left eyes and nose pads corresponding to a position of a nose.
Kabushiki Kaisha Toshiba

Performing operations based on gestures

Gesture-based interaction includes displaying a first image, the first image comprising one or more of a virtual reality image, an augmented reality image, and a mixed reality image, obtaining a first gesture, obtaining a first operation based at least in part on the first gesture and a service scenario corresponding to the first image, the service scenario being a context in which the first gesture is input, and operating according to the first operation.
Alibaba Group Holding Limited

Gesture interface

A user interface apparatus, computer program, computer readable medium, and method for selecting a selectable object on a display screen is presented. The display screen displays one or more selectable objects.
Rakuten, Inc.

Coordinate system for gesture control

Embodiments of a system and method for gesture controlled output are generally described. A method may include receiving sensor input information from a wearable device, determining, using the sensor input information, a gravity vector or a magnetic field, determining a change in horizontal angle, rotational angle, or vertical angle based on the sensor input information, the gravity vector, or the magnetic field, and determining a gesture based on the change in the horizontal angle, the rotational angle, or the vertical angle.

Gesture detection

Apparatuses, methods, systems, and program products are disclosed for interrupting a device. A method includes detecting an auditory cue associated with a predefined gesture based on input received from one or more sensors.
Lenovo (singapore) Pte. Ltd.

3d hand gesture image recognition method and system thereof

A 3d hand gesture recognition system includes a light field capturing unit, a calculation unit and an output unit. The light field capturing unit is provided to capture a hand gesture action to obtain a 3d hand gesture image.
National Kaohsiung University Of Applied Sciences

Portable communication device for transmitting touch-generated messages

The invention relates to a portable communication device for transmitting touch-generated messages to at least one addressee, by means of touch gestures carried out by a user on a touch-sensitive panel of the device. This device seeks to cover certain communication demands that are not satisfied by smartphones and their accessory devices; in particular, for those persons who are not in a position to access or properly use existing products, whether permanently or momentarily.

Method and system for gesture-based interactions

Gesture based interaction is presented, including determining, based on an application scenario, a virtual object associated with a gesture under the application scenario, the gesture being performed by a user and detected by a virtual reality (vr) system, outputting the virtual object to be displayed, and in response to the gesture, subjecting the virtual object to an operation associated with the gesture.
Alibaba Group Holding Limited

Systems and methods for relative representation of spatial objects and disambiguation in an interface

Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system is provided that receives an input from a user of a mobile machine which indicates or describes an object in the world.
Apple Inc.

Apparatus and methods for managing blood pressure vital sign content in an electronic anesthesia record

A graphical user interface and methods for managing blood pressure vital sign content in an electronic anesthesia record on a multi-function gesture-sensitive device via gesture-based means. The graphical user interface and methods provide an improved user interface and a set of functions unique to gesture-sensitive interfaces.

Communication device

A communication device includes a structure, a controller, and sensors that detect the relative position of an object around the structure. The structure has a face unit that is one of the units of the structure.
Toyota Jidosha Kabushiki Kaisha

Display system having world and user sensors

A mixed reality system that includes a head-mounted display (HMD) that provides 3D virtual views of a user's environment augmented with virtual content. The HMD may include sensors that collect information about the user's environment (e.g., video, depth information, lighting information, etc.), and sensors that collect information about the user (e.g., the user's expressions, eye movement, hand gestures, etc.).
Apple Inc.

Traffic direction gesture recognition

Traffic direction gesture recognition may be implemented for a vehicle in response to traffic diversion signals in the vehicle's vicinity. Sensors implemented as part of a vehicle may collect data about pedestrians and other obstacles in the vicinity of the vehicle or along the vehicle's route of travel.
Apple Inc.

Method and system of hand segmentation and overlay using depth data

In a minimally invasive surgical system, a plurality of video images is acquired. Each image includes a hand pose image.
Intuitive Surgical Operations, Inc.

Instrument and method for operating an instrument

An instrument is described which comprises an input for receiving a signal, a data processing unit for analyzing said received signal and providing data to be displayed, and a touch enabled display screen for displaying said data to be displayed and receiving commands directed to said data processing unit. Said commands comprise commands that determine how said data is displayed on the touch enabled display screen and commands that determine operations that are performed by said instrument and/or said data processing unit.
Rohde & Schwarz GmbH & Co. KG

Display device and display control method

A display device includes an object detecting section, a display section, a gesture acceptance section, and a display control section. The object detecting section detects an object contained in display target data.
Kyocera Document Solutions Inc.

System and method to perform an allocation using a continuous two direction swipe gesture

A system, method and computer readable medium provide a gesture-based graphical user interface to determine allocation information to instruct an allocation. A gesture-based i/o device displays a graphical user interface having: an amount region configured to define an allocation amount; a plurality of source regions each configured to define an allocation source; and a plurality of destination regions each configured to define an allocation destination.
The Toronto-dominion Bank

Three-dimensional virtualization

Three-dimensional virtualization may include receiving captured images of an entity and/or a scene, and/or capturing the images of the entity and/or the scene. The images may be connected in a predetermined sequence to generate a virtual environment.
Accenture Global Services Limited

3d document editing system

A 3D document editing system and graphical user interface (GUI) that includes a virtual reality and/or augmented reality device and an input device (e.g., keyboard) that implements sensing technology for detecting gestures by a user. Using the system, portions of a document can be placed at or moved to various z-depths in a 3D virtual space provided by the VR device to provide 3D effects in the document.
Apple Inc.

Orthogonal signaling touch user, hand and object discrimination systems and methods

A system and method for distinguishing between sources of simultaneous touch events on a touch sensitive device are disclosed. The touch sensitive device includes row conductors and column conductors, the path of each of the row conductors crossing the path of each of the column conductors.
Tactual Labs Co.

Gesture detection device for detecting hovering and click

There is provided a gesture detection device including two linear image sensor arrays and a processing unit. The processing unit is configured to compare sizes of pointer images in the image frames captured by the two linear image sensor arrays in the same period or different periods so as to identify a click event.
Pixart Imaging Inc.

Gesture detection, list navigation, and item selection using a crown and sensors

The present disclosure generally relates to methods and apparatuses for detecting gestures on a reduced-size electronic device at locations off of the display, such as gestures on the housing of the device or on a rotatable input mechanism (e.g., a digital crown) of the device, and responding to the gestures by, for example, navigating lists of items and selecting items from the list; translating the display of an electronic document; or sending audio control data to an external audio device.
Apple Inc.

Augmented-reality-based interactive authoring-service-providing system

The present invention includes: a wearable device including a head-mounted display (HMD); an augmented reality service providing terminal paired with the wearable device and configured to reproduce content corresponding to a scenario-based preset flow via a GUI interface, overlay corresponding objects in a three-dimensional (3D) space being viewed from the wearable device when an interrupt occurs in an object formed in the content to thereby generate an augmented reality image, convert a state of each of the overlaid objects according to a user's gesture, and convert location regions of the objects based on motion information sensed by a motion sensor; and a pointing device including a magnetic sensor and configured to select or activate an object output from the augmented reality service providing terminal.
Korea Advanced Institute of Science and Technology

Systems and methods for recording custom gesture commands

Methods and apparatuses are disclosed for recording a custom gesture command. In an aspect, at least one camera of a user device captures the custom gesture command, wherein the custom gesture command comprises a physical gesture performed by a user of the user device, a user interface of the user device receives user selection of one or more operations of the user device, and the user device maps the custom gesture command to the one or more operations of the user device.
Qualcomm Incorporated

System and method for adapting a display on an electronic device

A technique is provided for adapting a display on an electronic device. The technique includes displaying a first user interface comprising a first plurality of graphical prompts to a user, wherein the first plurality of graphical prompts are highlighted sequentially, detecting one or more pre-defined eye-blinking gestures of a user to select a highlighted graphical prompt of the first user interface, and adapting the first user interface, to display a second user interface in response to detecting the one or more pre-defined eye-blinking gestures of the user.
Wipro Limited

Gesture based control of autonomous vehicles

A triggering condition for initiating an interaction session with an occupant of a vehicle is detected. A display is populated with representations of one or more options for operations associated with the vehicle.
Apple Inc.

Vehicle operating system using motion capture

Vehicle operating systems for operating a vehicle having a driving seat for a vehicle driver and at least one passenger seat for passengers are described. The vehicle operating system may include one or more camera devices for capturing images of hand actions of the driver or of a passenger, and a storage device for storing operating signals corresponding to hand actions.
Thunder Power New Energy Vehicle Development Company Limited

Workout monitor interface

The present disclosure relates to systems and processes for monitoring a workout and for generating improved interfaces for the same. One example user interface detects when a workout of a particular type is started and begins generating activity data related to workout metrics associated with the type of workout selected.
Apple Inc.

Wireless directional sharing based on antenna sectors

Apparatuses and methods for sharing data between wireless devices based on one or more user gestures are disclosed. In some implementations, a wireless device may include a number of antenna elements configured to beamform signals in a plurality of transmit directions, with each of the transmit directions corresponding to one of a number of antenna sectors.
Qualcomm Incorporated

Virtual push-to-talk button

A method and apparatus for providing a virtual push-to-talk (PTT) button is provided herein. During operation, augmented reality and object recognition circuitry detects the user's fingers and a free surface near the fingers by analyzing images captured by a camera.
Motorola Solutions, Inc.

System and method for real-time transfer and presentation of multiple internet of things (IoT) device information on an electronic device based on casting and slinging gesture command

A novel intermediary set-top box called a “cast-sling box” (CSB) uniquely incorporates multimedia and/or IoT data casting, slinging, transcoding, referring (i.e., referral mode), rendering, and recording capabilities for seamless interoperability of various electronic devices in a multiple device environment.
DVDO, Inc.

Integrated cast and sling system and its operation in an interoperable multiple display device environment

An integrated cast and sling system capable of intermediating seamless multimedia data and playback transfers across multiple display devices is disclosed. The integrated cast and sling system contains a cast-sling box (csb) connected to a conventional cable or satellite set-top box and a plurality of multimedia signal sources.
DVDO, Inc.

Method and apparatus for creating a motion effect for an image

A method includes maintaining images and associated video streams; detecting a swipe gesture of a user on a touch sensitive display, wherein the swipe gesture comprises direction and speed information; and selecting an image based on the direction information of the swipe gesture. The method includes adjusting a playback of an associated video stream based on the direction and the speed information of the swipe gesture; providing the associated video stream on the display during the swipe gesture, for creating a motion effect relating to the image; and providing the image on the display after the swipe gesture..
Nokia Technologies Oy

Device for multi-angle photographing in eyeglasses and eyeglasses including the device

A device for multi-angle photographing in eyeglasses and eyeglasses including the device are disclosed. The device includes a camera mounted on the eyeglasses and configured to capture an image, at least two motors configured to drive the camera to move, and at least two mobile platforms, each of which is provided with a plurality of limiting stoppers.
Beijing Boe Optoelectronics Technology Co., Ltd.

Neural network-based inferential advertising system and method

A neural network-based inferential advertising system and method delivers interactive advertisements to advertisement recipients in accordance with user-selected restrictions that inform the behavioral information upon which advertising preference inferences are made, as well as advertising preference inferences that are made based upon interpretations made from content by a computer-implemented neural network. The behavioral information upon which advertising inferences are made may include bodily movements of advertisement recipients that are performed without physical interaction with a device.
Manyworlds, Inc.

Determining a pointing vector for gestures performed before a depth camera

A pointing vector is determined for a gesture that is performed before a depth camera. One example includes receiving a first and a second image of a pointing gesture in a depth camera, the depth camera having a first and a second image sensor, applying erosion and dilation to the first image using a 2D convolution filter to isolate the gesture from other objects, finding the imaged gesture in the filtered first image of the camera, finding a pointing tip of the imaged gesture, determining a position of the pointing tip of the imaged gesture using the second image, and determining a pointing vector using the determined position of the pointing tip.
Intel Corporation
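
One way to read the final steps above (locating the tip, then determining its position with the second sensor) is as depth back-projection followed by a difference of 3D points. The sketch below assumes a pinhole camera model with known intrinsics and a known finger-base point; these names and parameters are illustrative, not taken from the patent:

```python
import math

def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at the given depth (metres) into camera space,
    using pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

def pointing_vector(base_px, tip_px, base_depth, tip_depth, intrinsics):
    """Unit 3-D vector from the finger base to the pointing tip."""
    bx, by, bz = pixel_to_point(*base_px, base_depth, *intrinsics)
    tx, ty, tz = pixel_to_point(*tip_px, tip_depth, *intrinsics)
    d = (tx - bx, ty - by, tz - bz)
    n = math.sqrt(sum(c * c for c in d))
    return tuple(c / n for c in d)
```

For example, a tip directly in front of the base but 10 cm closer to the camera yields the vector (0, 0, -1), i.e., pointing straight at the camera.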

Identity authentication method and apparatus

Embodiments of the present disclosure disclose an identity authentication method performed at a computing device, the method including: obtaining a sequence of finger gestures on a touchpad of the computing device from a user, wherein each finger gesture has an associated pressure type on the touchpad; generating a corresponding character string according to the sequence of finger gestures; comparing the character string with a verification code of a user account associated with an application program; in accordance with a determination that the character string matches the verification code, granting the user access to the user account associated with the application program; and in accordance with a determination that the character string does not match the verification code, denying the user access to the user account associated with the application program.
Tencent Technology (Shenzhen) Company Limited
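
The gesture-to-string-to-verification flow can be illustrated with a minimal sketch. The (finger, pressure) → character mapping below is entirely hypothetical; the patent does not specify one:

```python
# hypothetical enrolment table: (finger, pressure type) -> character
PRESSURE_CHARS = {
    ("index", "light"): "a", ("index", "firm"): "A",
    ("middle", "light"): "b", ("middle", "firm"): "B",
}

def gestures_to_string(gestures):
    """Translate a sequence of (finger, pressure) gestures into a string."""
    return "".join(PRESSURE_CHARS[g] for g in gestures)

def authenticate(gestures, verification_code):
    """Grant access only when the derived string matches the stored code."""
    return gestures_to_string(gestures) == verification_code
```

A production implementation would compare against a salted hash of the verification code rather than the plaintext string.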

Display system shared among computers

The display system is shared among a number of computers. The display system includes a display device, at least a gesture recognition device, a number of connection ports, a circuit board, a processing unit, and a switch unit.
Evga Corporation

Alternative hypothesis error correction for gesture typing

In one example, a method may include outputting, by a computing device and for display, a graphical keyboard comprising a plurality of keys, and receiving an indication of a gesture. The method may include determining an alignment score that is based at least in part on a word prefix and an alignment point traversed by the gesture.
Google LLC
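
A simple geometric reading of an alignment score is the average distance between the gesture's alignment points and the key centres of a candidate word prefix. The key coordinates below are made-up placeholders; real gesture-typing decoders also weight this spatial term with a language-model probability:

```python
import math

# hypothetical key centres (x, y) for a handful of QWERTY keys
KEY_CENTERS = {"h": (5.5, 1.0), "e": (2.0, 0.0), "l": (8.5, 1.0), "o": (8.0, 0.0)}

def alignment_score(prefix, points):
    """Mean Euclidean distance between each gesture alignment point and the
    centre of the corresponding key in the word prefix (lower is better)."""
    assert len(prefix) == len(points)
    total = 0.0
    for ch, (x, y) in zip(prefix, points):
        kx, ky = KEY_CENTERS[ch]
        total += math.hypot(x - kx, y - ky)
    return total / len(prefix)
```

A perfectly traced prefix scores 0; alternative hypotheses with higher scores can then be reconsidered during error correction.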

Device, method and computer program product for creating viewable content on an interactive display

A device, method and computer program product for creating viewable content on an interactive display is provided. The method includes providing a user interface on a device for creating viewable content from a collection comprising at least one multimedia content.
Microsoft Technology Licensing, LLC

Browsing and selecting content items based on user gestures

The present disclosure is directed toward systems and methods that provide users with efficient and effective user experiences when browsing, selecting, or inspecting content items. More specifically, systems and methods described herein provide users the ability to easily and effectively select multiple content items via a single touch gesture (e.g., swipe gesture).
Dropbox, Inc.

Device, method, and graphical user interface for managing concurrently open software applications

An electronic device displays a first application view at a first size. The first application view corresponds to a first application in a plurality of concurrently open applications.
Apple Inc.

Gesture recognition and control based on finger differentiation

An embodiment of a computer implemented method of performing a processing action includes detecting an input from a user via an input device of a processing device, the input including a touch by at least one finger of a plurality of fingers of the user, estimating a gesture performed by the at least one finger based on the touch, measuring at least part of a fingerprint of the at least one finger, and identifying the at least one finger used to apply the input by the user based on stored fingerprint data that differentiates between individual fingers of the user. The method also includes identifying an action to be performed based on the estimated gesture and based on the identified at least one finger, and performing the action by the processing device.
International Business Machines Corporation
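
The final step, resolving an action from both the gesture and the identified finger, amounts to a two-key lookup. The fingerprint features, finger labels, and actions below are hypothetical stand-ins for the stored enrolment data the abstract describes:

```python
# hypothetical enrolment: fingerprint feature -> finger label
ENROLLED = {"whorl-7f3": "index", "loop-a21": "thumb"}

# hypothetical action table keyed on (gesture, finger)
ACTIONS = {
    ("tap", "index"): "select",
    ("tap", "thumb"): "back",
    ("swipe", "index"): "scroll",
}

def action_for(gesture, fingerprint_feature):
    """Resolve the action from both the estimated gesture and the finger
    identified from stored fingerprint data; reject unknown fingers."""
    finger = ENROLLED.get(fingerprint_feature)
    if finger is None:
        return "reject"
    return ACTIONS.get((gesture, finger), "ignore")
```

The same tap thus triggers different actions depending on which finger performed it, which is the point of the finger-differentiation approach.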

Gesture-based multimedia casting and slinging command method and system in an interoperable multiple display device environment

Novel gesture-based multimedia casting and slinging command methods and systems that are configured to accommodate seamless multimedia data and playback transfers across various electronic devices in a multiple display device environment are disclosed. In one instance, a “casting” of an audio/video (AV), graphical, or photographic multimedia content from a touch-screen electronic device to a targeted device (e.g.
DVDO, Inc.

Methods and apparatus to detect vibration inducing hand gestures

Methods and apparatus to detect vibration inducing hand movements are disclosed. An example apparatus includes a sensor to detect a vibration in a hand of a user wearing the apparatus.
Intel Corporation

Measurement of facial muscle emg potentials for predictive analysis using a smart wearable system and method

A system includes at least one wearable device having a housing, at least one sensor disposed within the housing, at least one output device disposed within the housing, and at least one processor operatively connected to the sensors and output devices, wherein one or more sensors are configured to detect electrical activity from a user's facial muscles and to transmit a data signal concerning the electrical activity of the user's facial muscles to one or more of the processors. A method of controlling a wearable device includes determining facial muscular electrical data of a facial gesture made by a user, interpreting the facial muscular electrical data to determine a user response, and performing an action based on the user response.
Bragi GmbH

Information processing method, terminal, and computer storage medium

The present disclosure discloses a method for rendering a graphical user interface of an online game system on a display of a terminal, including: rendering, in the graphical user interface, at least one virtual resource object; detecting whether a configuration option in the game system is enabled, entering a first control mode in accordance with a determination that the configuration option is enabled; rendering, in a skill control display area in the graphical user interface, at least one skill object corresponding to multiple character objects; detecting, from a user of the terminal, a skill release operation gesture on the at least one skill object in the skill control display area; and performing a skill release operation, on a currently specified target object, of a skill to be triggered and supported by at least one skill object corresponding to one or more of the multiple character objects.
Tencent Technology (Shenzhen) Company Limited

Reducing erroneous detection of input command gestures

Embodiments of the present invention provide for a system and method for reducing erroneous detection of input command gestures. The method is performed by the system and includes storing a reference time value of when a presence of a human body part is detected by a first sensor and storing a further time value of when a presence of a human body part is detected by a second sensor.
Jaguar Land Rover Limited
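
The two stored time values can gate gesture acceptance with a plausibility window: the second sensor must fire after the first, neither too quickly nor too slowly. The window bounds below are invented for illustration; the patent does not state specific values:

```python
def is_intentional_gesture(t_first, t_second, min_interval=0.05, max_interval=0.5):
    """Accept a command gesture only when the second sensor fires after the
    first within a plausible window: too fast suggests a single accidental
    sweep past both sensors, too slow suggests two unrelated events."""
    dt = t_second - t_first
    return min_interval <= dt <= max_interval
```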

Arm-detecting overhead sensor for inventory system

Inventory systems may include one or more sensors capable of detecting spatial positioning of inventory holders and an arm of a worker interacting with the inventory holder. Data can be received from a sensor, a gesture of the arm can be determined from the data, and a bin location or other information can be determined based on the gesture.
Amazon Technologies, Inc.

Video monitoring system

A monitoring system includes cameras adapted to capture images and depth data of the images. A computer device processes the image signals and depth data from the cameras according to various software modules that monitor one or more of the following: (a) compliance with patient care protocols; (b) patient activity; (c) equipment usage; (d) the location and/or usage of assets; (e) patient visitation metrics; (f) data from other sensors that is integrated with the image and depth data; (g) gestures by the patient or caregivers that are used as signals or for controls of equipment, and other items.
Stryker Corporation

Portable electronic device performing similar operations for different gestures

A portable electronic device with a touch-sensitive display is disclosed. One aspect of the invention involves a computer-implemented method in which the portable electronic device: displays an application on the touch-sensitive display; and when the application is in a predefined mode, performs a predefined operation in response to each gesture of a set of multiple distinct gestures on the touch-sensitive display.
Apple Inc.

Social networking application for real-time selection and sorting of photo and video content

Systems and methods of the present disclosure include a virtual wall having a plurality of media, also referred to as entries or media items. The system lays out the entries horizontally and vertically in the virtual wall.
Piqpiq, Inc.

Processing capacitive touch gestures implemented on an electronic device

Content on a display user interface of an electronic device, such as a wearable electronic device, can be manipulated using capacitive touch sensors that may be seamlessly integrated into the housing or strap of the electronic device. The capacitive touch sensors can advantageously replace mechanical buttons and other mechanical user interface components, such as a crown, to provide industrial design opportunities not possible with the inclusion of mechanical buttons and mechanical interface components.
Apple Inc.

Gesture language for a device with multiple touch surfaces

A gesture language for a device with multiple touch surfaces is described. Generally, a series of new touch input models is described that includes touch input interactions on two disjoint touch-sensitive surfaces.
Microsoft Technology Licensing, LLC

Gesture detection

A supplemental surface area allows gesture recognition on outer surfaces of mobile devices. Inputs may be made without visual observance of display devices.
AT&T Intellectual Property I, L.P.

Method and circuit for switching a wristwatch from a first power mode to a second power mode

An electronic wristwatch operable in two power modes. The wristwatch has an inertial sensor for detecting a gesture on a cover glass of the wristwatch.
Slyde Watch SA

Display system operable in a non-contact manner

A display system operable in a non-contact manner includes a display, a wireless communication module, a plurality of user-wearable devices, and a control device. Each of the user-wearable devices is attachable to or around a user's hand, associated with a unique identifier, and is configured to detect a hand gesture made by the user's hand and wirelessly transmit data corresponding to the detected hand gesture along with the unique identifier.
Toshiba Tec Kabushiki Kaisha

Haptic effect handshake unlocking

A system that unlocks itself or another device or electronic media enters an unlocked mode by playing a predetermined haptic effect and in response receiving a gesture based interaction input from a user. The system compares the interaction input to a stored predefined interaction input and transitions to the unlocked mode if the interaction input substantially matches the stored predefined interaction input.
Immersion Corporation

User notification of powered system activation during non-contact human activation

A user-activated, non-contact power closure member system and method of operating a closure member of a vehicle are provided. The system includes at least one sensor for sensing an object or motion.
Magna Closures Inc.

Method and system for providing interactivity based on sensor measurements

There is provided a system for providing interactivity to a guest of an experiential venue, based on sensor measurement of the guest. The system comprises a sensor configured to sense a guest variable of the guest, where the sensor may be a biometric sensor, a facial recognition sensor, a voice stress analysis sensor, a gesture recognition sensor, a motion tracking sensor, or an eye tracking sensor, and may sense heart rate or another guest variable.
Disney Enterprises, Inc.

Systems and methods for prosthetic wrist rotation

Features for a prosthetic wrist and associated methods of rotation are described. The prosthetic wrist attaches to a prosthetic hand.
Touch Bionics Limited

Smart cup, and method and system therefor for performing command control via gesture

A method for a smart cup for performing command control via a gesture, comprising: a tilt angle sensor (121) detects an included angle θ between the smart cup and a vertical direction and an included angle α between the smart cup and a preset horizontal reference line in real time, and sends data about the included angles to a control unit (122); the control unit (122) calculates changes in the angles according to the data about the included angles, and recognizes a corresponding gesture according to the changes in the angles; and the control unit (122) matches and executes a corresponding command according to the gesture.
Bowhead Technology (Shanghai) Co., Ltd.
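
The angle-change-to-command pipeline can be sketched as follows, where θ is the angle to vertical and α the angle to the horizontal reference line as in the abstract. The gesture names, command names, and thresholds are invented for illustration:

```python
# hypothetical gesture -> command table
GESTURE_COMMANDS = {
    "tilt_forward": "pour_mode",
    "shake": "mix_mode",
    "upright": "idle",
}

def recognize(theta_samples, alpha_samples):
    """Classify a gesture from successive tilt readings."""
    d_theta = theta_samples[-1] - theta_samples[0]
    # count large swings of the horizontal angle as evidence of shaking
    swings = sum(
        1 for a, b in zip(alpha_samples, alpha_samples[1:]) if abs(b - a) > 20
    )
    if swings >= 3:
        return "shake"
    if d_theta > 30:
        return "tilt_forward"
    return "upright"

def command_for(theta_samples, alpha_samples):
    """Match the recognized gesture to its command, per the claimed flow."""
    return GESTURE_COMMANDS[recognize(theta_samples, alpha_samples)]
```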

Wearable gesture control device & method

Novel tools and techniques are provided for implementing internet of things (“IoT”) functionality. In some embodiments, a wearable control device (“WCD”) might receive first user input comprising one or more of touch, gesture, and/or voice input from the user.
CenturyLink Intellectual Property LLC

Personalization of experiences with digital assistants in communal settings through voice and query processing

In non-limiting examples of the present disclosure, systems, methods and devices for providing personalized experiences to a computing device based on user input such as voice, text and gesture input are provided. Acoustic patterns associated with voice input, speech patterns, language patterns and natural language processing may be used to identify a specific user providing input from a plurality of users, identify user background characteristics and traits for the specific user, and topically categorize user input in a tiered hierarchical index.
Microsoft Technology Licensing, LLC

Methods and vehicles for capturing emotion of a human driver and moderating vehicle response

Methods and systems for determining an emotion of a human driver of a vehicle and using the emotion to generate a vehicle response are provided. One example method includes capturing, by a camera of the vehicle, a face of the human driver.
Emerging Automotive, Llc

Method and system of augmented-reality simulations

In one exemplary embodiment, a method includes the step of obtaining a digital image of an object with a digital camera. The object is identified.

Dynamic determination of human gestures based on context

A system comprises a processor configured to execute instructions to receive an indication of an occurrence of a human gesture and to perform an analysis of the indication of the occurrence of the human gesture to determine contextual criteria having a relationship to the occurrence of the human gesture. The processor may determine a meaning of the human gesture based at least in part on the contextual criteria and a plurality of possible intended meanings for the human gesture.
International Business Machines Corporation

Gesture matching mechanism

A mechanism is described to facilitate gesture matching according to one embodiment. A method of embodiments, as described herein, includes selecting a gesture from a database during an authentication phase, translating the selected gesture into an animated avatar, displaying the avatar, prompting a user to perform the selected gesture, capturing a real-time image of the user and comparing the gesture performed by the user in the captured image to the selected gesture to determine whether there is a match.
Intel Corporation

Application processing based on gesture input

Non-limiting examples of the present disclosure describe gesture input processing. As an example, a gesture input may be a continuous gesture input that is received through a soft keyboard application.
Microsoft Technology Licensing, LLC

Detecting and interpreting real-world and security gestures on touch and hover sensitive devices

“Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like.
Apple Inc.

Enhanced 3d interfacing for remote devices

Operating a computerized system includes presenting user interface elements on a display screen. A first gesture made in three-dimensional space by a distal portion of an upper extremity of a user is detected while a segment of the distal portion rests on a surface.
Apple Inc.

Configuring three dimensional dataset for management by graphical user interface

An approach is provided that selects three attributes that correspond to objects included in a dataset, where each of the three attributes is assigned to a different coordinate value (x, y, and z coordinates). The approach creates a simulated three-dimensional (3D) scene of the objects on a display screen by using the x, y, and z coordinate values corresponding to the attributes of each of the objects.
International Business Machines Corporation

Graphical user interface for managing three dimensional dataset

An approach is provided that displays, on a two dimensional (2D) screen, a gyroscopic graphical user interface (GUI). The gyroscopic GUI provides three dimensional (3D) control of a simulated 3D scene displayed on the 2D screen.
International Business Machines Corporation

3D touch enabled gestures

Touch events are detected on a touch sensing surface coupled to a capacitive sense array and one or more force electrodes. During a plurality of scan cycles, a plurality of capacitive sense signals and one or more force signals are obtained from capacitive sense electrodes of the capacitive sense array and the one or more force electrodes, respectively, and applied to determine a temporal sequence of touches on the touch sensing surface.
Parade Technologies, Ltd.
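
A loose sketch of how per-cycle capacitive and force readings might be folded into a temporal sequence of touches, as the abstract describes. The thresholds, signal layout, and record shape are assumptions for illustration, not Parade's actual pipeline:

```python
# Each scan cycle yields one capacitive reading per sense electrode plus a
# force reading; a touch record is emitted for each cycle in which any
# electrode crosses the (assumed) capacitance threshold.

CAP_THRESH = 5
FORCE_THRESH = 2

def temporal_touch_sequence(cap_cycles, force_cycles):
    sequence = []
    for cycle, (caps, force) in enumerate(zip(cap_cycles, force_cycles)):
        touched = [i for i, c in enumerate(caps) if c >= CAP_THRESH]
        if touched:
            # (cycle index, electrodes touched, whether force was applied)
            sequence.append((cycle, touched, force >= FORCE_THRESH))
    return sequence

caps = [[0, 0, 0], [0, 7, 0], [0, 8, 1], [0, 0, 0]]
force = [0, 1, 3, 0]
print(temporal_touch_sequence(caps, force))
# -> [(1, [1], False), (2, [1], True)]
```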

Touch display device

A touch display device is provided. A touch display panel includes a source multiplexer that switches the transfer of source signals to data lines.
LG Display Co., Ltd.

Information processing device

An in-vehicle device includes a gesture detection unit for recognizing a user's hand within a predetermined range; an output information control unit controlling information output to a display; and an in-vehicle device control unit receiving input from a control unit equipped in a vehicle to control the in-vehicle device. When the gesture detection unit detects a user's hand at a predetermined position, the output information control unit triggers the display to display candidates of an operation executed by the in-vehicle device control unit by associating the candidates with the user's hand motions.
Clarion Co., Ltd.

Controlling navigation of a visual aid during a presentation

Methods, systems, and computer program products for controlling navigation of a visual aid during a presentation are provided. Aspects include obtaining a presenter profile that includes associations between gestures of a presenter and desired actions for the visual aid, and receiving indications of one or more movements of the presenter during the presentation.
International Business Machines Corporation

Interaction and management of devices using gaze detection

User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user.
Microsoft Technology Licensing, LLC

Vehicle parking control

Methods and systems are provided for performing an automated parking operation of a vehicle. The methods and systems determine a position of a wireless driver device located outside of the vehicle.
GM Global Technology Operations LLC

Gesture-activated remote control

A gesture-based control for a television is provided that runs in the background of a computing device remote from the television, where the control is activated by a gesture. Advantageously, the user need not interrupt any task in order to control the television.
Google Inc.

Method and system for dynamically interactive visually validated mobile ticketing

Systems and methods for interaction-based validation of electronic tickets. In some embodiments, the system renders a first visually illustrative scene on an interactive display screen of a mobile device, the first visually illustrative scene responsive to a pre-determined gesture performed at a predetermined location on the interactive display screen.
Moovel North America, LLC

Augmented reality display device with deep learning sensors

A head-mounted augmented reality (AR) device can include a hardware processor programmed to receive different types of sensor data from a plurality of sensors (e.g., an inertial measurement unit, an outward-facing camera, a depth sensing camera, an eye imaging camera, or a microphone) and to determine an event of a plurality of events using the different types of sensor data and a hydra neural network (e.g., face recognition, visual search, gesture identification, semantic segmentation, object detection, lighting detection, simultaneous localization and mapping, or relocalization).
Magic Leap, Inc.

Gesture masking in a video feed

Various systems and methods for processing video data, including gesture masking in a video feed, are provided herein. The system can include a camera system interface to receive video data from a camera system; a gesture detection unit to determine a gesture within the video data, the gesture being performed by a user; a permission module to determine a masking permission associated with the gesture; and a video processor.
Intel Corporation

Gesture based captcha test

One embodiment provides a method, including: providing, using a processor, a user challenge over a network, wherein the user challenge is associated with a predetermined gesture to be performed by a user; obtaining, using a processor, user image data; determining, using the user image data, that a user has performed the predetermined gesture; and thereafter providing the user access to information. Other aspects are described and claimed.
Lenovo (Singapore) Pte. Ltd.

Mobile terminal

A mobile terminal including a wireless communication processor configured to provide wireless communication; a touch screen; and a controller configured to display an area of an omnidirectional image on the touch screen, display a guideline on the touch screen for guiding a movement of the omnidirectional image on the touch screen, in response to a scrolling gesture on the touch screen having a first direction corresponding to a direction of the guideline, move the display area of the omnidirectional image in the first direction, and in response to the scrolling gesture on the touch screen having a second direction different than the direction of the guideline, move the display area of the omnidirectional image along the guideline in the first direction instead of the second direction.
LG Electronics Inc.

Proxy gesture recognizer

An electronic device displays one or more views. A first view includes a plurality of gesture recognizers.
Apple Inc.

Gesture recognition device, gesture recognition method, and information processing device

Provided are a gesture recognition device, a gesture recognition method, and an information processing device for making it possible to quickly recognize a gesture of a user. The gesture recognition device includes a motion information generator that generates body part motion information by performing detection and tracking of the body part; a prediction processor that makes a first comparison of the generated body part motion information with previously stored pre-gesture motion model information and generates a prediction result regarding a pre-gesture motion on the basis of the first comparison; and a recognition processor that makes a second comparison of the generated body part motion information with previously stored gesture model information and generates a result of recognition of the gesture represented by a motion of the detected body part on the basis of the prediction result and the second comparison.
Mitsubishi Electric Corporation
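
The two comparisons described above — a pre-gesture prediction followed by full gesture matching — might look like this in outline. The toy similarity measure, model data, and thresholds are all assumptions, not Mitsubishi's implementation:

```python
# Stage 1 compares tracked motion against stored pre-gesture motion models
# to predict which gestures are likely starting; stage 2 compares against
# full gesture models, narrowed to the predicted candidates so a gesture
# can be recognized quickly.

def match_score(motion, model):
    """Toy similarity: fraction of positions where the samples agree."""
    hits = sum(1 for a, b in zip(motion, model) if a == b)
    return hits / max(len(model), 1)

def recognize(motion, pre_models, gesture_models,
              pre_thresh=0.5, final_thresh=0.6):
    # First comparison: pre-gesture motion -> prediction result
    candidates = [g for g, m in pre_models.items()
                  if match_score(motion, m) >= pre_thresh]
    # Second comparison: full models, restricted by the prediction
    best, best_score = None, 0.0
    for g in (candidates or gesture_models):
        score = match_score(motion, gesture_models[g])
        if score > best_score:
            best, best_score = g, score
    return best if best_score >= final_thresh else None

pre = {"swipe": [1, 1, 0, 0], "tap": [0, 0, 1, 1]}
full = {"swipe": [1, 1, 1, 0], "tap": [0, 0, 0, 1]}
print(recognize([1, 1, 1, 0], pre, full))  # -> swipe
```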

System and method for distant gesture-based control using a network of sensors across the building

A gesture-based interaction system for communication with an equipment-based system includes a sensor device and a signal processing unit. The sensor device is configured to capture at least one scene of a user to monitor for at least one gesture of a plurality of possible gestures, conducted by the user, and output a captured signal.
Otis Elevator Company

System and method for distant gesture-based control using a network of sensors across the building

A gesture and location recognition system and method are provided. The system includes a sensor device that captures a data signal of a user and detects a gesture input from the user from the data signal, wherein a user location can be calculated based on a sensor location of the sensor device in a building and the collected data signal of the user, a signal processing device that generates a control signal based on the gesture input and the user location, and in-building equipment that receives the control signal from the signal processing device and is controlled based on the control signal.
Otis Elevator Company

Device manipulation using hover

An apparatus may be manipulated using non-touch or hover techniques. Hover techniques may be associated with zooming, virtual feedback, authentication, and other operations.
Microsoft Technology Licensing, LLC

System and method for communicating inputs and outputs via a wearable apparatus

A system is disclosed comprising one or more of a wearable controller, a wearable apparatus such as a “smart” glove, a mobile device or mobile computer, and operating software, preferably in wireless communication with each other. In one embodiment, these components allow a user to create inputs through their gestures, movements and contacts with other surfaces, which facilitates the ability of the user to perform, edit, remix and produce musical compositions, or enhance a virtual/augmented reality or gaming environment, for example.

Motor control system based upon movements inherent to self-propulsion

The systems and methods described herein provide hands-free motor control mechanisms based on the natural and inherent movements associated with an activity of interest, and can be combined with gesture communication based upon defined movements by the participant.
Medici Technologies, LLC

Surgical microscope with gesture control and method for gesture control of a surgical microscope

The present invention relates to a surgical microscope (1) with a field of view (9) and comprising an optical imaging system (3) which images an inspection area (11) which is at least partially located in the field of view (9), and to a method for a gesture control of a surgical microscope (1) having an optical imaging system (3). Surgical microscopes (1) of the art have the disadvantage that an adjustment of the microscope parameters, for instance the working distance, field of view (9) or illumination mode, requires the surgeon to put down his or her surgical tools, look up from the microscope's eyepiece (5) and perform the adjustment by operating the microscope handles.
Leica Instruments (Singapore) Pte. Ltd.

Gesture-based control and usage of video relay service communications

A method and system are disclosed for enabling members of the deaf, hard-of-hearing, or speech-impaired (D-HOH-SI) community to start and control communications conducted through a video relay service (VRS) without the need for a remote-control device. A combination of standard-ASL and non-ASL hand commands are interpreted using “hand recognition” (similar to “face recognition”) to control the operation of a VRS system through multiple operating modes.
Purple Communications, Inc.

Control system and control processing method and apparatus

The complex operation and low control efficiency in controlling home devices, such as lights, televisions, and curtains, is reduced with a control system that senses the presence and any actions, such as hand gestures or speech, of a user in a predetermined space. In addition, the control system identifies a device to be controlled, and the command to be transmitted to the device in response to a sensed action..
Alibaba Group Holding Limited

Smart electronic device

A smart electronic device is provided for a multi-user environment to meet more convenient life requirements of a plurality of users. The smart electronic device has a camera, a microphone, a processing circuit, a network interface and a projector.
Xiamen Eco Lighting Co. Ltd.

Command processing using multimodal signal analysis

A first set of signals corresponding to a first signal modality (such as the direction of a gaze) during a time interval is collected from an individual. A second set of signals corresponding to a different signal modality (such as hand-pointing gestures made by the individual) is also collected.
Apple Inc.

Contemporaneous gesture and keyboard entry authentication

A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation

Touch keyboard using language and spatial models

A computing device outputs for display at a presence-sensitive display, a graphical keyboard comprising a plurality of keys, receives an indication of at least one gesture to select a group of keys of the plurality of keys, and determines at least one characteristic associated with the at least one gesture to select the group of keys of the plurality of keys. The computing device modifies a spatial model based at least in part on the at least one characteristic and determines a candidate word based at least in part on data provided by the spatial model and a language model, wherein the spatial model provides data based at least in part on the indication of the at least one gesture and wherein the language model provides data based at least in part on a lexicon.
Google LLC
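
The spatial-model-plus-language-model ranking the abstract describes is, in spirit, a noisy-channel decode: score each candidate word by how well it explains the taps and how likely the word is. The probabilities below are invented toy numbers, not the patented models:

```python
import math

# Combine a spatial model, P(taps | word) — how close the taps were to each
# key of the word — with a language model, P(word), to rank candidates.
# Both tables here are made-up illustrative values.

spatial_prob = {"ran": 0.5, "tan": 0.3, "fan": 0.2}   # P(taps | word)
language_prob = {"ran": 0.6, "tan": 0.1, "fan": 0.3}  # P(word)

def best_candidate(spatial, language):
    # score = log P(taps|word) + log P(word); the highest score wins
    return max(spatial, key=lambda w: math.log(spatial[w]) + math.log(language[w]))

print(best_candidate(spatial_prob, language_prob))  # -> ran
```

Modifying the spatial model based on observed gesture characteristics, as the abstract notes, would amount to updating `spatial_prob` per user over time.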

Electronic device and method for controlling the electronic device

The present disclosure proposes an electronic device and a method for controlling the electronic device. The electronic device includes a display and a processor configured to detect a first gesture in a predefined area of the display while a first window is currently displayed in full screen on the display and, upon detection of the first gesture, instruct the display to display a gallery of previously opened windows; the processor is further configured to respond when a second gesture is subsequently detected in the predefined area of the display.
Huawei Technologies Co., Ltd.

System and method for 3D position and gesture sensing of a human hand

A three dimensional touch sensing system having a touch surface configured to detect a touch input located above the touch surface is disclosed. The system includes a plurality of capacitive touch sensing electrodes disposed on the touch surface, each electrode having a baseline capacitance and a touch capacitance based on the touch input.
The Trustees Of Princeton University

Radar-based gesture sensing and data transmission

This document describes techniques and devices for radar-based gesture sensing and data transmission. The techniques enable, through a radar system, seamless and intuitive control of, and data transmission between, computing devices.
Google LLC

Method and apparatus for selecting between multiple gesture recognition systems

A method and apparatus for selecting between multiple gesture recognition systems includes an electronic device determining a context of operation for the electronic device that affects a gesture recognition function performed by the electronic device. The electronic device also selects, based on the context of operation, one of a plurality of gesture recognition systems in the electronic device as an active gesture recognition system for receiving gesturing input to perform the gesture recognition function, wherein the plurality of gesture recognition systems comprises an image-based gesture recognition system and a non-image-based gesture recognition system.
Google Technology Holdings LLC
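
A minimal sketch of the context-driven selection between an image-based and a non-image-based recognizer. The contexts and the selection rule (prefer the non-image system in low light or on low battery) are assumptions for illustration — the abstract does not specify them:

```python
# Pick the active gesture recognition system from the operating context.
# "low_light" and "battery_pct" are hypothetical context keys.

def select_recognizer(context):
    if context.get("low_light") or context.get("battery_pct", 100) < 15:
        return "non_image_based"   # e.g. motion-sensor pipeline
    return "image_based"           # e.g. camera pipeline

print(select_recognizer({"low_light": True}))   # -> non_image_based
print(select_recognizer({"battery_pct": 80}))   # -> image_based
```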

Radar-based gestural interface

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for providing a gestural interface in a vehicle. In one aspect, movement data corresponding to a gesture of a driver of a vehicle is received from a radar receiver arranged to detect movement in the interior of the vehicle.
Google Inc.

Information display device and information display method

An information display device includes a display control unit; a gesture detection unit; a gesture identification unit that identifies an operator's gesture based on gesture information and outputs a signal based on a result of the identification; a distance estimation unit that estimates a distance between the operator and a display unit; and an identification function setting unit that sets the gestures identifiable by the gesture identification unit so that the number of identifiable gestures when the estimated distance exceeds a first set distance is smaller than the number of identifiable gestures when the estimated distance is less than or equal to the first set distance.
Mitsubishi Electric Corporation
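
The distance rule above — fewer identifiable gestures once the operator is beyond a set distance — reduces to a threshold on the estimated distance. The gesture sets and the 2 m threshold below are illustrative assumptions:

```python
# Beyond the first set distance the identifiable gesture set shrinks to a
# smaller (presumably coarser) subset. Names and distances are invented.

NEAR_GESTURES = {"swipe", "point", "pinch", "rotate", "wave"}
FAR_GESTURES = {"swipe", "wave"}        # subset usable at a distance
FIRST_SET_DISTANCE_M = 2.0

def identifiable_gestures(estimated_distance_m):
    if estimated_distance_m > FIRST_SET_DISTANCE_M:
        return FAR_GESTURES
    return NEAR_GESTURES

print(len(identifiable_gestures(1.0)))  # near the display -> 5
print(len(identifiable_gestures(3.0)))  # far away -> 2
```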

Apparatus and method to navigate media content using repetitive 3D gestures

In a method for detecting a repetitive three-dimensional gesture by a computing device, a three-dimensional gesture sensor detects a plurality of positions corresponding to a finger movement. The computing device determines whether the plurality of positions contain a gesture cycle by: comparing at least two non-adjacent positions in the plurality of positions; and upon determining that the at least two non-adjacent positions match, determining that the plurality of positions contain the gesture cycle.
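
The cycle test described above — declare a gesture cycle when two non-adjacent sampled positions match — can be sketched as a pairwise comparison over the position buffer. The tolerance and 2-D positions are assumptions; the patent speaks of three-dimensional sensing:

```python
# A cycle is declared when two non-adjacent sampled finger positions match
# within a tolerance (tolerance value is an assumption).

def positions_match(p, q, tol=0.1):
    return all(abs(a - b) <= tol for a, b in zip(p, q))

def contains_gesture_cycle(positions):
    # compare every pair of non-adjacent positions
    for i in range(len(positions)):
        for j in range(i + 2, len(positions)):
            if positions_match(positions[i], positions[j]):
                return True
    return False

loop = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
print(contains_gesture_cycle(loop))      # closed loop -> True
print(contains_gesture_cycle(loop[:3]))  # no repeated position -> False
```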

Driving support device, driving support system, and driving support method

In a driving support device, an image output unit outputs an image including a vehicle object representing a vehicle and a peripheral situation of the vehicle, to a display unit. An operation signal input unit receives a gesture operation by a user that involves moving of the vehicle object in the image displayed on the display unit.
Panasonic Intellectual Property Management Co., Ltd.

Physical gesture input configuration for interactive software and video games

Technologies are described for configuring user input using physical gestures. For example, a user can be prompted to perform a physical gesture to assign to a software application command (e.g., an action within a video game or a command in another type of application).
Microsoft Technology Licensing, LLC

Smart watch for indicating emergency events

A portable device including a gesture recognizer module for automatically detecting a specific sequence of gestures is described. The portable device may be used to detect a health, safety, or security related event.

Method for transmitting image and electronic device thereof

A method for transmitting an image and an electronic device thereof are provided. An image transmission method of an electronic device includes displaying a message transmission/reception history with at least one other electronic device, sensing a selection of a camera execution menu, displaying a preview screen of a camera within a screen in which the message transmission/reception history is displayed, detecting a touch on the displayed preview screen, if the displayed preview screen is touched, capturing an image of a subject, detecting a gesture for the captured image, and, if the gesture for the captured image is detected, transmitting the captured image to the at least one other electronic device according to the detected gesture.
Samsung Electronics Co., Ltd.

Method for adding contact information, and user equipment

The present disclosure provides a method for adding contact information, and user equipment. The method includes: receiving gesture information input by a user on a communication interface of an instant messaging application; recognizing contact information in communication information according to the gesture information; and adding the contact information to an address book of the user equipment.
Huawei Technologies Co., Ltd.

Method and apparatus for password management

Systems, methods, and a security management apparatus for password management, including the determination of the identity of a service requesting a security token for access to the service. The security management apparatus generates personal identification data based on a personal identification input such as a touch selection or gesture, in order to access a service on a secured device.
Huami Inc.

Motion and gesture-based mobile advertising activation

The presentation of advertisements to a user on a mobile communications device is disclosed. A first external input corresponding to a triggering of an advertisement delivery is received on a first input modality.
Adtile Technologies Inc.

Device and method for operating a device

A method for operating a device, wherein a graphical user interface is generated and displayed on a display area. The user interface has at least one operating object assigned to an application program for controlling the device.
Volkswagen Aktiengesellschaft

User interface input handheld and mobile devices

Methods, systems, and computer readable media for receiving user input. According to one example method for receiving user input, the method includes identifying gestures from directional movements of a user's fingers on a touch sensor, mapping the gestures to alphanumeric characters, and outputting alphanumeric characters, including defining a set of the gestures as different sequences of at least two of: a contact event, a no contact event, a hold event, a finger movement in a first direction, a finger movement in a second direction.
The Trustees Of The University Of Pennsylvania
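
Defining each character as a short sequence of primitive touch events, as the abstract describes, amounts to a lookup keyed on event sequences. The table below is invented for illustration; the actual gesture-to-character assignments are not given in the abstract:

```python
# Hypothetical mapping from primitive-event sequences (contact, no-contact,
# hold, movement in direction 1 or 2) to alphanumeric characters.

GESTURE_TABLE = {
    ("contact", "dir1", "no_contact"): "a",
    ("contact", "dir2", "no_contact"): "b",
    ("contact", "hold", "no_contact"): "c",
}

def decode(events):
    """Map a recognized event sequence to its character, or None."""
    return GESTURE_TABLE.get(tuple(events))

print(decode(["contact", "dir1", "no_contact"]))  # -> a
print(decode(["contact", "hold", "no_contact"]))  # -> c
```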

Interface scanning for disabled users

Systems and processes for scanning a user interface are disclosed. One process can include scanning multiple elements within a user interface by highlighting the elements.
Apple Inc.

Touch event model for web pages

One or more touch input signals can be obtained from a touch sensitive device. A touch event model can be used to determine touch and/or gesture events based on the touch input signals.
Apple Inc.

Method for determining display orientation and electronic apparatus using the same and computer readable recording medium

A method for determining display orientation is provided. The method includes the following steps.
HTC Corporation

User-defined virtual interaction space and manipulation of virtual cameras in the interaction space

The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures.
Leap Motion, Inc.

Route navigation method and system, terminal, and server

A route navigation method includes: at a first terminal in communication with a second terminal via a navigation server: obtaining a starting point and a destination that are set by a user; displaying, on a navigation interface, the starting point and the destination; drawing, on the navigation interface, route information from the starting point to the destination according to a swipe gesture of the user on the navigation interface; and sending a route forwarding instruction to the navigation server, wherein sending the route forwarding instruction includes: sending the route information to the navigation server, wherein the navigation server is configured to determine a navigation path from the starting point to the destination according to the route information, and sending a request for the navigation server to forward the navigation path to the second terminal to prompt another user to arrive at the destination from the starting point in accordance with the navigation path.
Tencent Technology (Shenzhen) Company Limited

Audio-based device control

Some disclosed systems may include a microphone system having two or more microphones, an interface system and a control system. In some examples, the control system may be capable of receiving, via the interface system, audio data from two or more microphones of the microphone system, of determining a gesture location based, at least in part, on the audio data, and of controlling one or more settings of the system based on the gesture location.
Qualcomm Incorporated

Mobile terminal and control method therefor

The present invention provides a mobile terminal and a control method using an intuitive gesture as an input to perform a function. Specifically, the present invention provides a mobile terminal including a sensing unit for sensing movement of the mobile terminal, a wireless communication unit for transmitting/receiving a radio signal to/from an external terminal and a control unit for sensing a distance from the mobile terminal to the external terminal based on the strength of the received radio signal, and performing a specific function when the sensed movement corresponds to specific movement and the sensed distance is within a specific range..
LG Electronics Inc.
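
Sensing distance from received signal strength, then gating a function on both the sensed movement and that distance, is commonly done with the log-distance path-loss model. The constants, the movement label, and the 1 m range below are assumptions, not LG's parameters:

```python
# Distance from RSSI via the log-distance path-loss model: tx_power_dbm is
# the assumed RSSI at 1 m, n the path-loss exponent. The specific movement
# label "shake_toward" is a made-up example.

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def should_trigger(sensed_movement, rssi_dbm, max_range_m=1.0,
                   specific_movement="shake_toward"):
    """Perform the function only when movement matches AND range is close."""
    return (sensed_movement == specific_movement
            and estimate_distance_m(rssi_dbm) <= max_range_m)

print(estimate_distance_m(-59))             # at the reference RSSI -> 1.0
print(should_trigger("shake_toward", -59))  # -> True
print(should_trigger("shake_toward", -80))  # too far -> False
```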

Electronic device and control method

An electronic device includes a non-contact detection sensor, a display, and a controller. The display displays a first window and a second window.
Kyocera Corporation

Method for device interaction and identification

A method is provided for device interaction and identification by detecting similar or synchronous movements of two or more electronic devices, using movement data from the involved electronic devices to let them interact with each other and to detect when the movement or motion of the involved devices corresponds to certain multi-device gestures or activities.
16Lab Inc.

Gesture controlled calculator

A gesture controlled calculator has a touch screen controlled by a microprocessor. The touch screen receives a multiplication problem input by a user through a virtual keyboard.

System for identifying and using multiple display devices

Data, particularly display data, is sent to a particular peripheral device (particularly a display device) from a computer device, such as a mobile device. The method involves determining an identifier of each peripheral device and receiving a user identification identifying a particular peripheral device.
DisplayLink (UK) Limited

Systems and methods for shared broadcasting

Systems, methods, and non-transitory computer-readable media can provide an interface that includes a first region and a second region, wherein a live content stream being broadcasted is presented in the first region, and wherein information corresponding to users viewing the live content stream is presented in the second region. A determination is made that a first user operating the computing device has performed one or more touch screen gestures with respect to at least one user identifier in the second region, the user identifier corresponding to a second user.
Facebook, Inc.

Systems, devices, and methods for dynamically providing user interface controls at a touch-sensitive secondary display

Disclosed herein are systems, devices, and methods for dynamically updating a touch-sensitive secondary display. An example method includes receiving a request to open an application and, in response, (i) displaying, on a primary display, a plurality of user interface (UI) objects associated with the application, the plurality including a first UI object displayed with associated content and other UI objects displayed without associated content; and (ii) displaying, on the touch-sensitive secondary display, a set of affordances representing the plurality of UI objects.
Apple Inc.

Multi-touch uses, gestures, and implementation

A tablet PC having an interactive display, which is touchscreen enabled, may be enhanced to provide a user with superior usability and efficiency. A touchscreen device may be configured to receive multiple concurrent touchscreen contacts.
Microsoft Technology Licensing, LLC

Method and wearable device for providing a virtual input interface

Provided is a wearable device including: an image sensor configured to sense a gesture image of a user setting a user input region; and a display configured to provide a virtual input interface corresponding to the set user input region.
Samsung Electronics Co., Ltd.

Systems and methods for content-aware selection

Systems and methods detect simple user gestures to enable selection of portions of segmented content, such as text, displayed on a display. Gestures may include finger (such as thumb) flicks or swipes as well as flicks of the handheld device itself.
Fuji Xerox Co., Ltd.

Electronic device, control method, and non-transitory computer-readable recording medium

An electronic device includes a non-contact detection sensor and a controller that executes processing related to a timer in response to a gesture detected by the non-contact detection sensor.
Kyocera Corporation

Electronic device

An electronic device includes a controller that executes processing in response to a gesture. The controller starts the processing in response to a gesture in accordance with the physical state of the electronic device.
Kyocera Corporation

Human machine interface with haptic response based on phased array lidar

A device and a method for a human machine interface (HMI). The HMI device includes a dome having a hemispherical shape; a base attached to the dome, forming an inverted cup-like structure with a hollow interior; a chip-scale lidar attached on the base and positioned to scan for a motion of an object external to the dome; at least one haptic device attached to the base and connected to the dome; and an HMI controller configured to send and receive signals from the chip-scale lidar, detect and recognize a gesture based on the signal from the chip-scale lidar, and activate or deactivate the at least one haptic device.
Toyota Motor Engineering & Manufacturing North America, Inc.

Information processing method, terminal, and computer storage medium

An information processing method, comprising: displaying, in a game user interface, a first game scene; displaying a skill selection object in the game user interface, the skill selection object includes a plurality of skill slots, and each slot includes a respective skill of a plurality of skills, wherein a total number of skill slots in the skill selection object is smaller than a total number of skills in the plurality of skills; while displaying the skill selection object, detecting a swipe gesture across a predefined region; and in response: ceasing to display at least one of the skills currently displayed in the skill selection object, and replacing the at least one skill with at least another skill among the plurality of skills that was not displayed in the skill selection object when the swipe gesture was detected.
Tencent Technology (Shenzhen) Company Limited
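
The swipe-driven slot replacement above can be modeled as rotating a larger skill list under a fixed window of slots, so previously hidden skills replace displayed ones. The skill names, slot count, and rotation step are invented:

```python
from collections import deque

# Five skills, three visible slots; a swipe rotates the list so that skills
# not currently displayed come into view, replacing displayed ones.

skills = deque(["fireball", "heal", "shield", "dash", "stun"])
NUM_SLOTS = 3

def visible_slots():
    return list(skills)[:NUM_SLOTS]

def on_swipe():
    skills.rotate(-NUM_SLOTS)  # bring previously hidden skills into the slots

print(visible_slots())  # -> ['fireball', 'heal', 'shield']
on_swipe()
print(visible_slots())  # -> ['dash', 'stun', 'fireball']
```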

Gestural control of visual projectors

Gestures may be performed to control a visual projector. When a device or human hand is placed into a projection field of the visual projector, the visual projector responds to gestures.
AT&T Intellectual Property I, L.P.

Cyber reality device including gaming based on a plurality of musical programs

A system enabling the performance of sensory stimulating content including music and video using gaming in a cyber reality environment, such as using a virtual reality headset. This disclosure includes a system and method through which a performer can virtually trigger and control a presentation of pre-packaged sensory stimulating content including musical programs through gaming.
Beamz Interactive, Inc.

Systems and methods for advertising on virtual keyboards

Methods and systems are disclosed for interacting with advertisements on a virtual keyboard. An advertisement appears in a position that is proximate to a virtual key of the virtual keyboard.
Oversignal, LLC

Method and system for gesture-based confirmation of electronic transactions

A method for electronically transmitting data based on a physical gesture includes: storing at least one gesture pair, wherein each gesture pair includes at least a physical gesture and an associated data conveyance, the physical gesture being stored as one or more data points telegraphing three-dimensional movement; capturing a plurality of movement data points based on movement of one or more motion capturing devices; identifying a specific gesture pair where at least one of the captured plurality of movement data points corresponds to the included physical gesture; establishing a communication channel with an external computing device; and transmitting the associated data conveyance included in the identified specific gesture pair to the external computing device using the established communication channel.
Mastercard International Incorporated
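The gesture-pair idea above — stored (gesture, payload) pairs, where captured 3D movement points are matched against a stored gesture to select the payload to transmit — can be sketched as simple template matching. The matching rule (mean point-wise distance), the threshold, and all names here are illustrative assumptions, not the patented method.

```python
import math

# Sketch: match a captured sequence of 3D points against stored gesture
# templates; the payload of the best match under a threshold is the
# "data conveyance" that would then be transmitted.

GESTURE_PAIRS = [
    ([(0, 0, 0), (1, 0, 0), (2, 0, 0)], "confirm_payment"),  # swipe right
    ([(0, 0, 0), (0, 1, 0), (0, 2, 0)], "cancel_payment"),   # swipe up
]

def mean_distance(a, b):
    """Mean Euclidean distance between two equal-length point sequences."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def match_gesture(captured, pairs=GESTURE_PAIRS, threshold=0.5):
    """Return the data conveyance of the closest stored gesture, if any."""
    best_gesture, best_payload = min(
        pairs, key=lambda pair: mean_distance(captured, pair[0]))
    if mean_distance(captured, best_gesture) <= threshold:
        return best_payload
    return None
```

A noisy rightward swipe such as `[(0, 0, 0), (1.1, 0, 0), (2, 0, 0)]` resolves to `confirm_payment`, while movement far from every template returns `None`.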

Systems and methods for displaying an image capturing mode and a content viewing mode

Embodiments are provided for displaying an image capturing mode and a content viewing mode. In some embodiments, one or more live images may be received from an image capturing component on a mobile device.
Dropbox, Inc.

Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same

The present disclosure describes projecting a structured light pattern onto a surface and detecting and responding to interactions with it. The techniques described here can, in some cases, facilitate recognizing that an object such as a user's hand is adjacent to the plane of a projection surface and can distinguish the object from the projection surface itself.
Heptagon Micro Optics Pte. Ltd.

Information processing method, terminal, and computer storage medium

An information processing method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object, performing rendering to obtain a skill-release supplementary control object, having a skill-release control halo object and a virtual joystick object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.
Tencent Technology (Shenzhen) Company Limited
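The release logic in this abstract — check whether the virtual joystick is within its threshold range, then auto-select a target inside the skill's releasable range — can be sketched as below. The abstract does not specify the "first preset policy", so nearest-first is an assumption here, as are all names.

```python
import math

# Sketch of the skill-release target selection: if the joystick stays within
# its threshold range, pick the nearest character inside the skill's
# releasable range; otherwise no auto-selection occurs.

def pick_target(joystick_offset, threshold, caster_pos, characters, skill_range):
    if math.hypot(*joystick_offset) > threshold:
        return None  # joystick out of threshold range: no auto-selection
    in_range = [c for c in characters
                if math.dist(caster_pos, c["pos"]) <= skill_range]
    if not in_range:
        return None  # no character within the skill releasable range
    return min(in_range, key=lambda c: math.dist(caster_pos, c["pos"]))["name"]
```

With the joystick near center, the nearest in-range character is chosen; a large joystick excursion suppresses auto-selection entirely.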

Gesture based interface system and method

A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures.
Eyesight Mobile Technologies Ltd.

Automated learning and gesture based interest processing

A system, method and program product for processing user interests. A system is provided that includes: a gesture management system that receives gesture data from a collection device for an inputted interest of a user; a pattern detection system that receives and analyzes behavior data associated with the inputted interest; an interest affinity scoring system that calculates an affinity score for the inputted interest based on the gesture data and an analysis of the behavior data; a dynamic classification system that assigns a dynamically generated tag to the inputted interest based on an inputted context associated with the inputted interest; and a user interest database that stores structured interest information for the user, including a unique record for the inputted interest that includes the affinity score and dynamically generated tag.
International Business Machines Corporation
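An affinity score that blends a gesture signal with behavior data, as the system above describes, could look like the sketch below. The particular behavior signals (view count, dwell time), the normalization caps, and the weights are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of an interest affinity score: a weighted blend of a 0-1
# gesture signal and normalized behavior data.

def affinity_score(gesture_strength, view_count, dwell_seconds,
                   w_gesture=0.5, w_views=0.3, w_dwell=0.2):
    """Combine gesture data with analyzed behavior data into one score."""
    views_norm = min(view_count / 100.0, 1.0)     # cap at 100 views
    dwell_norm = min(dwell_seconds / 300.0, 1.0)  # cap at 5 minutes
    return (w_gesture * gesture_strength
            + w_views * views_norm
            + w_dwell * dwell_norm)
```

The weights sum to 1, so the score stays in [0, 1] and can be stored directly in the interest record alongside the dynamically generated tag.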

Smart playable flying disc and methods

This disclosure is directed to a smart playable flying disc and methods for determining associated motion parameters. The playable device may capture and transmit sensor data including motion data to a computing device.
Play Impossible Corporation

Motorized shoe with gesture control

An article of footwear includes a motorized tensioning system, sensors, and a gesture control system. Based on information received from one or more sensors, the gesture control system may detect a prompting gesture and enter an armed mode for receiving further instructions.
Nike, Inc.

Smart lighting system and method

A smart lighting system for a vehicle is described. The smart lighting system includes at least one signal generator, and a lighting stem assembly.

Electronic device with gesture actuation of companion devices, and corresponding systems and methods

An electronic device includes a biometric sensor, such as a fingerprint sensor, to identify biometric input. One or more processors are then operable to identify at least one paired device and at least one companion device operating within a wireless communication radius.
Motorola Mobility LLC

Capturing smart playable device and gestures

This disclosure is generally directed to capturing aspects of a smart playable device. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Information processing device

An information processing device includes a communication interface that is communicable with an image forming device, a camera interface configured to receive image data generated by a camera, and a controller. The controller is configured to, based on the image data received through the camera interface, determine whether or not a predetermined gesture is performed by a user during a predetermined period of time after controlling the communication interface to transmit a print command to the image forming device.
Kabushiki Kaisha Toshiba

Smart playable device and charging systems and methods

This disclosure is generally directed to a smart playable device and charging systems and methods. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Combining gesture and voice user interfaces

A system includes a microphone providing input to a voice user interface (VUI), a motion sensor providing input to a gesture-based user interface (GBI), an audio output device, and a processor in communication with the VUI, the GBI, and the audio output device. The processor detects a predetermined gesture input to the GBI, and in response to the detection, decreases the volume of audio being output by the audio output device and activates the VUI to listen for a command.
Bose Corporation
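The gesture-then-listen flow above — a predetermined gesture ducks the audio output and arms the voice interface — can be sketched in a few lines. The class name, the trigger gesture name, and the ducking ratio are assumptions for illustration.

```python
# Minimal sketch: on a predetermined gesture, duck playback volume and arm
# the voice user interface (VUI) to listen for a command.

class GestureVoiceBridge:
    def __init__(self, volume=80, trigger="raise_hand"):
        self.volume = volume
        self.trigger = trigger
        self.vui_listening = False

    def on_gesture(self, gesture):
        if gesture == self.trigger:
            self.volume = self.volume // 4  # duck the playback volume
            self.vui_listening = True       # arm the VUI for a command
```

Any other gesture leaves both the volume and the VUI state untouched, matching the "predetermined gesture" framing in the abstract.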

Synthetic data generation of time series data

A method of generating synthetic data from time series data, such as from handwritten characters, words, sentences, mathematics, and sketches that are drawn with a stylus on an interactive display or with a finger on a touch device. This computationally efficient method is able to generate realistic variations of a given sample.
University Of Central Florida Research Foundation, Inc.

Electronic device with gesture actuation of companion devices, and corresponding systems and methods

An electronic device includes a biometric sensor, such as a fingerprint sensor, that identifies biometric input received at the biometric sensor. One or more processors operable with the biometric sensor identify one or more companion devices operating within a wireless communication radius of the electronic device.
Motorola Mobility LLC

Incremental multi-word recognition

In one example, a computing device includes at least one processor that is operatively coupled to a presence-sensitive display and a gesture module operable by the at least one processor. The gesture module may be operable by the at least one processor to output, for display at the presence-sensitive display, a graphical keyboard comprising a plurality of keys and receive an indication of a continuous gesture detected at the presence-sensitive display, the continuous gesture to select a group of keys of the plurality of keys.
Google Inc.
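A first step toward the continuous-gesture key selection described above is mapping the gesture's sample points to the graphical-keyboard keys they pass near, collapsing consecutive repeats. The toy key layout, hit radius, and function names are illustrative assumptions, not the patented recognizer.

```python
import math

# Sketch: hit-test each sample point of a continuous gesture against key
# centers on a graphical keyboard, keeping the keys in traversal order.

KEY_CENTERS = {"h": (60, 40), "e": (25, 10), "l": (85, 20), "o": (90, 10)}

def keys_along_gesture(points, radius=10):
    """Return the keys whose centers lie near the gesture path, in order."""
    hit = []
    for p in points:
        for key, center in KEY_CENTERS.items():
            if math.dist(p, center) <= radius and (not hit or hit[-1] != key):
                hit.append(key)
    return hit
```

A real incremental recognizer would feed this key sequence to a language model to propose multi-word candidates as the gesture progresses; this sketch covers only the geometric front end.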

Pressure-based gesture typing for a graphical keyboard

A computing device is described that outputs, for display, a graphical keyboard comprising a plurality of keys. The computing device receives an indication of a first gesture selecting a first sequence of one or more keys from the plurality of keys, and an indication of a second gesture selecting a second sequence of one or more keys from the plurality of keys.
Google Inc.

Interactive content for digital books

A graphical user interface (GUI) is presented that allows a user to view and interact with content embedded in a digital book, such as text, image galleries, multimedia presentations, video, HTML, animated and static diagrams, charts, tables, visual dictionaries, review questions, three-dimensional (3D) animation and any other known media content, and various touch gestures can be used by the user to move through images and multimedia presentations, play video, answer review questions, manipulate three-dimensional objects, and interact with HTML.
Apple Inc.

System and method for touch/gesture-based device control

A system and method for document processing includes a three-dimensional touch interface, a processor and associated memory. The processor generates thumbnail image data from received electronic document data and document format data corresponding to the electronic document data and the thumbnail image data.
Toshiba Tec Kabushiki Kaisha

Automated door

An automated door-opening device is disclosed that includes a first sensor disposed on the outside of the door. The first sensor is adapted to recognize a predetermined pattern of a gesture made by a patron.

Gesture based input system in a vehicle with haptic feedback

A vehicle haptic feedback system includes a haptic actuator, a detection device, and a controller. The haptic actuator is configured to provide haptic feedback to a vehicle driver.
Immersion Corporation

Smart playable device, gestures, and user interfaces

This disclosure is generally directed to a smart playable device and systems and methods of interacting with the playable device. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Motor-activated multi-functional wrist orthotic

A multifunctional wrist orthotic comprising an electromyography (EMG) sensor having at least two electrodes for attachment to a wrist of a user, an inertial measurement unit (IMU), a microcontroller unit (e.g., an Arduino® Mini) connected to the IMU, and a power supply unit. The microcontroller unit is configured to perform two-tiered gesture recognition, with the first tier comprising a fine gesture sensed by the EMG sensor and the second tier comprising a gross gesture sensed by the IMU sensor.
Purdue Research Foundation
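The two-tiered recognition described above — a fine gesture from the EMG electrodes combined with a gross gesture from the IMU — can be sketched as a lookup over gesture pairs. The gesture names and the action table are illustrative assumptions, not from the patent.

```python
# Sketch of two-tiered gesture recognition: tier 1 is a fine gesture sensed
# by EMG, tier 2 is a gross gesture sensed by the IMU; the pair selects an
# orthotic action.

ACTIONS = {
    ("arm_raise", "pinch"): "grip_assist_on",
    ("arm_raise", "spread"): "grip_assist_off",
    ("arm_lower", "pinch"): "wrist_lock",
}

def resolve_action(imu_gesture, emg_gesture):
    """Map a (gross IMU gesture, fine EMG gesture) pair to an action."""
    return ACTIONS.get((imu_gesture, emg_gesture))
```

Splitting recognition across two sensors this way lets the coarse IMU signal gate which fine EMG gestures are even considered, reducing false triggers.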

Control system for navigating a principal dimension of a data space

Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector.
Klemm+Sohn GmbH & Co. KG

Lighting commanding method and an asymmetrical gesture decoding device to command a lighting apparatus

A gesture-based lighting source output command method for controlling a lighting apparatus, together with an asymmetrical, non-facing gesture decoding device for commanding a lighting source by decoding translational gesture motions of a heat-emitting object, e.g., a human hand, at distances up to 1.5 meters, in geometrical planes that may not be coplanar with the plane of the gesture decoding device. The device comprises a casing with an embedded electronic controller and two dual PIR sensors, geometrically arranged in a special manner, rotated 2α degrees relative to one another and fitted with specially designed Fresnel lenses, their beam axes forming an angle of 2γ degrees with each other and an angle β with the vertical.
CWJ Power Electronics Inc.

Methods and systems to perform at least one action according to a user's gesture and identity

The present invention discloses methods and systems for performing at least one action at a system according to a user's gesture information. The required steps comprise: capturing the user's gesture information with a mobile apparatus, wherein the apparatus comprises an antenna, a processor, a storage medium, and at least one accelerometer having at least 3 axes; comparing the gesture information against one or more predefined gesture information at the mobile apparatus; and, when the gesture information matches a predefined gesture information, selecting at the mobile apparatus a first identity based on the predefined gesture information and sending encrypted information to a system through a reader, wherein the encrypted information comprises the predefined gesture information, the first identity, a timestamp, and a device identity.
Pismo Labs Technology Ltd
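The flow above — match captured gesture information to a predefined gesture, select the identity bound to it, and assemble the payload (gesture, identity, timestamp, device identity) before encryption — can be sketched as follows. The gesture names, the identity mapping, and the field names are assumptions; encryption and the reader transport are omitted.

```python
import time

# Sketch: a matched predefined gesture selects an identity, and the payload
# that would subsequently be encrypted and sent through a reader is built
# from the gesture, identity, timestamp, and device identity.

PREDEFINED = {"shake": "work_identity", "circle": "personal_identity"}

def build_payload(gesture_name, device_id):
    identity = PREDEFINED.get(gesture_name)
    if identity is None:
        return None  # no predefined gesture matched: nothing to send
    return {
        "gesture": gesture_name,
        "identity": identity,
        "timestamp": int(time.time()),
        "device": device_id,
    }
```

Binding distinct identities to distinct gestures lets one device present different credentials (e.g., work vs. personal) depending on how it is moved.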

Fire alarm inspection application and user interface

A system and method for facilitating inspection of fire alarm systems includes a graphical user interface rendered on a touchscreen display of a mobile computing device receiving selections of inspection results. The graphical user interface includes a testing pane, which indicates devices that are currently being tested, and a selection pane, which indicates devices yet to be tested.
Tyco Fire & Security GmbH

Enhancing video chatting

A method for a computing device to enhance video chatting includes receiving a live video stream, processing a frame in the live video stream in real-time, and transmitting the frame to another computing device. Processing the frame in real-time includes detecting a face, an upper torso, or a gesture in the frame, and applying a visual effect to the frame.
ArcSoft Inc.

Biometric, behavioral-metric, knowledge-metric, and electronic-metric directed authentication and transaction method and system

A system to authenticate an entity and/or select details relative to an action or a financial account using biometric, behavior-metric, electronic-metric and/or knowledge-metric inputs. These inputs may comprise gestures, facial expressions, body movements, voice prints, sound excerpts, etc.
NXT-ID, Inc.

Floating soft trigger for touch displays on electronic device

A portable electronic device (100) having a touch screen (112) with a floating soft trigger icon (175) for enabling various functions of the electronic device (100), such as bar code reading, capturing RFID data, capturing video and images, calling applications, and/or placing phone calls. The floating trigger icon (175) is displayed on the touch screen (112) to enable easy identification and access of the trigger icon (175).
Datalogic USA, Inc.

Systems and methods for adaptive gesture recognition

Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors.
Sling Media Inc.
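The adaptive fallback described above — attempt recognition with current parameters, then perform additional processing with relaxed parameters when nothing matches — can be sketched as a two-pass classifier. The swipe classifier and both length thresholds are illustrative assumptions.

```python
# Sketch of adaptive gesture recognition: strict parameters first, then a
# relaxed second pass when the strict pass finds no gesture.

def recognize(dx, dy, min_len):
    """Classify a movement delta as a swipe if it is long enough."""
    if abs(dx) >= min_len and abs(dx) > abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    if abs(dy) >= min_len:
        return "swipe_down" if dy > 0 else "swipe_up"
    return None

def adaptive_recognize(dx, dy, strict_len=50, relaxed_len=20):
    """Try strict recognition; fall back to relaxed parameters."""
    return recognize(dx, dy, strict_len) or recognize(dx, dy, relaxed_len)
```

A 30-pixel horizontal movement fails the strict pass but is recovered by the relaxed pass, which is exactly the behavior the abstract motivates: not rejecting a user's intended gesture just because it was small or slow.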

Mechanism to create pattern gesture transmissions to create device-sourcing emergency information

A system may comprise a registration device configured to register patterns for users; a recording device configured to record a received pattern, as an electronic pattern, wherein the recording device recognizes the received pattern as one of the registered patterns; a receiving device configured to observe human movement patterns with a camera, transform the observed human movement patterns to an electronic signal, and receive the recognized registered pattern from the recording device by a first wireless transmission; a forwarding device configured to transmit the electronic signal, and the received recognized registered pattern to an alert service by a second wireless transmission; and an alert service, configured to receive the electronic signal and the received recognized registered pattern from the forwarding device and configured to transmit the electronic signal and the received recognized registered pattern to a second electronic device by a third wireless transmission.
International Business Machines Corporation

Dynamic user interactions for display control

The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures.
Leap Motion, Inc.

Gesture-based user interface

A computer-implemented method for enabling gesture-based interactions between a computer program and a user is disclosed. According to certain embodiments, the method may include initiating the computer program.
Capital One Services, LLC

User interface device, vehicle including the same, and controlling the vehicle

A user interface device includes: an output device having an output region predefined around an output unit; an acquisition unit acquiring information about a user's gesture performed around the output region; and a controller determining an area of a shielded region which is shielded by the user's gesture in the output region based on the acquired information and controlling output of the output device.
Hyundai Motor Company

Multimodal haptic effects

Embodiments generate haptic effects in response to a user input (e.g., pressure based or other gesture). Embodiments receive a first input range corresponding to user input and receive a haptic profile corresponding to the first input range.
Immersion Corporation
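The range-to-profile mapping in the embodiments above — a pressure input falls into an input range, and the haptic profile registered for that range drives the effect — can be sketched as a simple lookup. The ranges and effect parameters are illustrative assumptions.

```python
# Sketch: each input range carries a haptic profile; a pressure reading is
# resolved to the profile whose range contains it.

HAPTIC_PROFILES = [
    ((0.0, 0.3), {"effect": "tick", "intensity": 0.2}),
    ((0.3, 0.7), {"effect": "buzz", "intensity": 0.5}),
    ((0.7, 1.0), {"effect": "rumble", "intensity": 1.0}),
]

def profile_for_pressure(pressure):
    """Return the haptic profile whose input range contains the pressure."""
    for (lo, hi), profile in HAPTIC_PROFILES:
        if lo <= pressure < hi or (hi == 1.0 and pressure == 1.0):
            return profile
    return None
```

Half-open ranges (with a closed top at 1.0) keep the mapping unambiguous at the boundaries, so every in-range pressure resolves to exactly one profile.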

Method for smart home control based on smart watches

A method for controlling a smart home using a smart watch is disclosed. The method includes: detecting whether the smart watch has entered a sensing range of the smart home; detecting, after the smart watch has entered the sensing range of the smart home, whether the smart watch has established a wireless connection with the smart home; turning on, after the smart watch has established the wireless connection with the smart home, a smart-home-control function of the smart watch; and while controlling the smart home using the smart-home-control function, recognizing hand gestures of the user using the smart watch and controlling the smart home through the wireless connection to switch the current working state of the smart home based on the recognized hand gestures of the user.
Huizhou TCL Mobile Communication Co., Ltd.

Location-based experience with interactive merchandise

A system for providing an interactive experience for multiple game-players. The experience is provided in a light-controlled setting where the game-players may wear or hold one or more of various toys (e.g., gloves).
Disney Enterprises, Inc.

Wearable device and controlling method thereof, and system for controlling smart home

A wearable device, a controlling method thereof, and a system for controlling a smart home. The wearable device comprises: a hand gesture identifying module for identifying a current hand gesture action of the wearer from characteristic data of the wearer collected by a sensor; an appliance configuring module for establishing and saving, according to acquired information of each household electrical appliance, correspondence relations between controlling commands of each household electrical appliance and corresponding hand gesture actions, and, upon receiving a hand gesture action, looking up the household electrical appliance and the controlling command that match the hand gesture action by using the correspondence relations and generating a controlling message according to the matched information; and a wirelessly connecting module for receiving the controlling message and wirelessly sending it to a smart home controlling server.
Goertek Inc.

Method and system for assisting a driver in driving a vehicle and vehicle on which such system is mounted

The invention relates to a method and a system for assisting a driver in driving a vehicle, as well as a vehicle with such a system mounted thereon. Information on the environment of a vehicle is obtained by sensing the environment.
Honda Research Institute Europe GmbH

Handheld electronic devices with remote control functionality and gesture recognition

Handheld electronic devices are provided that have remote control functionality and gesture recognition features. The handheld electronic device may have remote control functionality in addition to cellular telephone, music player, or handheld computer functionality.
Apple Inc.

Gesture-based database actions

Computing systems and related methods are provided for performing database actions responsive to input gestures made by a user. One exemplary method involves a server identifying a gesture on a graphical user interface display on a client device, identifying a subset of displayed content on the graphical user interface display corresponding to the gesture, determining a database action based on at least one of characteristics associated with the gesture and the gestured content, performing the database action with respect to an object in a database, and updating the gestured content on the graphical user interface display to reflect performance of the database action.
Salesforce.com, Inc.

Method and apparatus for identifying input features for later recognition

Disclosed are method and apparatus to recognize actors during normal system operation. The method includes defining actor input such as hand gestures, executing and detecting input, and identifying salient features of the actor therein.
Atheer, Inc.

System and method for continuous multimodal speech and gesture interaction

Disclosed herein are systems, methods, and non-transitory computer-readable storage media for processing multimodal input. A system configured to practice the method continuously monitors an audio stream associated with a gesture input stream, and detects a speech event in the audio stream.
Nuance Communications, Inc.

Electronic device and method for providing visual notification of a received communication

A method disclosed herein includes displaying information associated with a first application in a display area of an electronic device, detecting a change in direction of a continuous gesture across at least a portion of the display area between a first direction and a second direction different than the first direction, the continuous gesture associated with a request to display information of a second application, the first application being different than the second application, in response to detecting the change in direction of the continuous gesture, reducing display of the first application to a first portion of the display area and presenting a preview of information associated with the second application in a second portion of the display area in which the first application was presented prior to detection of the continuous gesture, based on a first characteristic of the continuous gesture, discontinuing providing the first information and displaying the second application in the display area, and based on a second characteristic of the continuous gesture, discontinuing providing the second information and displaying the first application in the display area.
BlackBerry Limited

Method and apparatus for ego-centric 3D human computer interface

In the method, a processor generates a three dimensional interface with at least one virtual object, defines a stimulus of the interface, and defines a response to the stimulus. The stimulus is an approach to the virtual object with a finger or other end-effector to within a threshold of the virtual object.
Atheer, Inc.

User interface driven movement of data

A to-do list management system surfaces a user interface with a user input mechanism that displays a user actuatable element for each item on a user's to-do list. It senses a flick gesture and automatically moves a to-do list item to the user's agenda.
Microsoft Technology Licensing, LLC

Gesture detection and compact representation thereof

Techniques are described that may be implemented with an electronic device to detect a gesture within a field of view of a sensor and generate a compact data representation of the detected gesture. In implementations, a sensor is configured to detect a gesture and provide a signal in response thereto.
Maxim Integrated Products, Inc.

Occluded gesture recognition

This document describes techniques and devices for occluded gesture recognition. Through use of the techniques and devices described herein, users may control their devices even when a user's gesture is occluded by some material between the user's hands and the device itself.
Google Inc.

Robot control using gestures

A method and a device for operating a robot are provided. According to an example of the method, information of a first gesture is acquired from a group of gestures of an operator, each gesture from the group of gestures corresponding to an operation instruction from a group of operation instructions.
Beijing Airlango Technology Co., Ltd.

Digital multimedia platform for converting video objects to gamified multimedia objects

Embodiments provide an interactive digital multimedia platform for converting existing video objects to gamified multimedia objects and a method for the same. An editor-user of the platform may modify an existing video object while the content of the video object is playing.
Abrakadabra Reklam Ve Yayincilik Limited Sirketi

Method for locking target in game scenario and terminal

A method is performed at a terminal for locking a target in a game application, the method comprising: obtaining input gesture information from a user on an operation interface of a game application; recognizing the gesture information to obtain a switching instruction of the game application corresponding to the gesture information; selecting an object category of virtual characters according to the switching instruction; locking a target virtual character in the object category according to a preset rule and performing an operation on the target virtual character; and after performing the operation, automatically repeating the locking and the performing operations on a next target virtual character in the object category until finishing the last virtual character in the object category.
Tencent Technology (Shenzhen) Company Limited