|| List of recent gesture-related patents
| System and method for controlling data items displayed on a user interface|
The present application discloses a method for controlling data items displayed on a touch-sensitive display of a computing device. The method includes, at the computing device, displaying a plurality of data items on a graphical user interface of the touch display, and detecting a sequence of finger gestures on the touch display.
Tencent Technology (Shenzhen) Company Limited
| Method and apparatus for diagonal scrolling in a user interface|
A method and apparatus provide diagonal scrolling in a user interface of a computing device as part of a two-step selection process. The method includes presenting a main list and a sub-list substantially orthogonal to the main list, wherein the sub-list comprises elements associated with each element in the main list; receiving user input from a user to progress through the main list and/or the sub-list; and scrolling through the main list and/or the sub-list simultaneously responsive to the user input, wherein the main list and the sub-list are simultaneously scrolled responsive to a diagonal scrolling gesture as the user input.
Motorola Solutions, Inc.
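The simultaneous scrolling described above can be illustrated by decomposing a single drag delta into main-list and sub-list components. This is a minimal sketch assuming a vertically scrolling main list, horizontally scrolling sub-lists, and a small dead-zone threshold; none of these specifics come from the patent:

```python
def scroll_from_drag(dx: float, dy: float, threshold: float = 5.0):
    """Decompose a drag delta into (main_scroll, sub_scroll).

    Assumed geometry: the main list scrolls vertically, each sub-list
    scrolls horizontally, so a diagonal drag advances both at once.
    """
    main_scroll = dy if abs(dy) > threshold else 0.0   # vertical component
    sub_scroll = dx if abs(dx) > threshold else 0.0    # horizontal component
    return main_scroll, sub_scroll
```

A diagonal drag whose horizontal and vertical components both exceed the dead zone therefore scrolls both lists in one gesture, matching the abstract's simultaneous-scroll behavior.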
| Device, method, and graphical user interface for determining whether to scroll or select content|
An electronic device with a display, a touch-sensitive surface and one or more intensity sensors displays content. While a focus selector is over the content, the device detects a gesture on the touch-sensitive surface, the gesture including a first contact on the touch-sensitive surface and movement of the first contact across the touch-sensitive surface that corresponds to movement of the focus selector on the display.
| Method for adjusting input-method keyboard and mobile terminal thereof|
A method for adjusting an input-method keyboard includes: recording the sliding trajectories of a user's two fingers, wherein the trajectories include two starting contact points and two ending contact points of the two-finger sliding gesture; calculating an adjustment ratio according to the sliding trajectories; obtaining the current state of the input-method keyboard, wherein the state is one of a maximum state, an intermediate state, and a minimum state; and adjusting the size and/or layout of the current input-method keyboard according to the adjustment ratio and the current state of the input-method keyboard. The mobile terminal for adjusting an input-method keyboard includes a recording module, a calculation module, an acquisition module and an adjustment module.
Shenzhen Shi Ji Guang Su Information Technology Co., Ltd.
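One plausible reading of the "adjustment ratio" is the ratio of the fingers' final separation to their initial separation, with the ratio driving transitions among the three keyboard states. The ratio definition, the state-change thresholds, and the function names below are assumptions for illustration:

```python
import math

def adjustment_ratio(start_a, start_b, end_a, end_b):
    """Assumed ratio: final finger spread divided by initial finger spread."""
    return math.dist(end_a, end_b) / math.dist(start_a, start_b)

STATES = ["minimum", "intermediate", "maximum"]

def next_state(current: str, ratio: float) -> str:
    """Pinch-out (ratio well above 1) grows the keyboard one state;
    pinch-in shrinks it. Thresholds 1.2/0.8 are invented."""
    i = STATES.index(current)
    if ratio > 1.2:
        i = min(i + 1, len(STATES) - 1)
    elif ratio < 0.8:
        i = max(i - 1, 0)
    return STATES[i]
```

For example, fingers that start 10 units apart and end 20 units apart give a ratio of 2.0, promoting an intermediate keyboard to the maximum state.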
| Graphical user interface for searching and displaying medical codes in an electronic anesthesia record|
An apparatus for improved searching and selecting of medical coding information for an electronic anesthesia record on a multi-function gesture-sensitive device.
| Conference system and associated signalling method|
The conference system (KS) can comprise at least one mobile terminal with a respective signaling apparatus which contains a sensor which detects motion by the signaling apparatus and outputs a corresponding motion signal. In addition, it can contain a motion signal processing apparatus which associates the motion signal from a gesture in nonverbal interhuman communication with a motion signal pattern and, as a result of an association having been made, produces an output signal with a corresponding piece of information about the type of detected gesture in the nonverbal interhuman communication.
Unify GmbH & Co. KG
| Image processor with static pose recognition module utilizing segmented region of interest|
An image processing system comprises an image processor having image processing circuitry and an associated memory. The image processor is configured to implement a gesture recognition system comprising a static pose recognition module.
| Electronic eyeglasses and method of manufacture thereof|
A system and methods for recognizing certain eye or eyelid gestures, such as the opening or closing of an eyelid or movement of the pupil, as signals to trigger certain predesigned desired events. An embodiment comprises electronic glasses placed in front of the eye to recognize certain eye or eyelid gestures as signals to control an electronic device such as a chair for users with special needs, TVs, a car system, or video games.
| Interactive controls for operating devices and systems|
An electric device (e.g., module, interactive controller/switch) comprising a gesture sensor can use the gesture sensor to determine (e.g., detect, recognize, identify, etc.) a gesture performed by a user. If the electric device recognizes the gesture as corresponding to a gestural command to control or operate another device and/or system (e.g., such as a light), then the electric device can instruct the other device/system to function or operate in accordance with the gestural command (e.g., turn on or off).
| Display device and method for driving the same|
A display device and a method for driving the same are discussed. The display device includes a display panel including a common electrode commonly connected to pixels, a display driving circuit for applying a data voltage to the pixels during a vertical active time, and a sensor driving circuit which applies a gesture sensing driving signal to the common electrode during a vertical blank time and senses a gesture input.
LG Display Co., Ltd.
Portable multi-touch input device
A portable input device is described. The portable input device can wirelessly send control signals to an external circuit.
Control method and control apparatus of electronic device, and electronic device
A control method and a control apparatus of an electronic device, and the electronic device, wherein the control apparatus includes: a detection unit configured to detect a user's contact with the electronic device and acquire a detection result; a recognition unit configured to recognize a gesture of the user holding the electronic device according to the detection result; and a first control unit configured to generate a first control signal according to the gesture and a first correspondence between the gesture and the first control signal, the first control signal controlling the electronic device to perform a function corresponding to the first control signal. According to the control method, the control apparatus, and the electronic device, a gesture of the user holding the electronic device is recognized, and a control signal for the electronic device is generated according to the gesture, so that the user conveniently controls the electronic device.
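The "first correspondence between the gesture and the first control signal" suggests a simple lookup table mapping recognized holding gestures to control signals. A minimal sketch, with invented gesture names and signal identifiers:

```python
# Hypothetical correspondence table; both the gesture labels and the
# signal identifiers are invented for illustration.
FIRST_CORRESPONDENCE = {
    "one_hand_grip": "SIGNAL_PORTRAIT_UI",
    "two_hand_grip": "SIGNAL_LANDSCAPE_UI",
    "edge_squeeze": "SIGNAL_LAUNCH_CAMERA",
}

def first_control_signal(gesture: str):
    """Return the control signal for a recognized holding gesture,
    or None when no correspondence is defined."""
    return FIRST_CORRESPONDENCE.get(gesture)
```

The device would then execute the function bound to the returned signal, e.g. launching the camera when an edge squeeze is recognized.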
Inputting mode switching method and system utilizing the same
An inputting mode switching method and system are provided. The inputting mode switching method is applied to an input device.
Inventec (Pudong) Technology Corporation
Input devices and methods
Devices and methods for providing an interface to a computing device are disclosed herein. The disclosed embodiments allow a user to utilize a first computing device, such as a smartphone or other mobile computing device, as a mouse-like peripheral input device for an associated second computing device, such as a tablet computing device.
Apparatus and method for recognizing spatial gestures
The present invention relates to an apparatus for recognizing a gesture in a space. In accordance with an embodiment, a spatial gesture recognition apparatus includes a pattern formation unit for radiating light onto a surface of an object required to input a gesture in a virtual air bounce, and forming a predetermined pattern on the surface of the object, an image acquisition unit for acquiring a motion image of the object, and a processing unit for recognizing a gesture input by the object based on the pattern formed on the surface of the object using the acquired image.
Center Of Human-centered Interaction For Coexistence
Electronically simulating or interfacing a backward-compatible human input device by means of a gesture recognition system
A method and apparatus in which human gestures are interpreted, by software running on a host computer, into screen coordinates and low-level commands (keyboard presses, clicks, double-clicks, drag-and-drop, wheel scroll, etc.) which are sent to a hardware peripheral instead of to a software-based application programming interface. The hardware corrects and polishes the screen coordinates and low-level commands and translates the data by emulating, simulating, or manipulating the protocol of an actual human input device (HID), such as a standard keyboard, mouse, joystick, or touchpad. This actual HID-compliant device, or simuloid, is embedded into the invention proper and is in turn connected back into the host computer, where it is recognized by native drivers as a standard HID device so that it may interact with common end-user programs in the usual manner, but be controlled by means of human gestures.
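The embedded HID stage could, for example, emit standard boot-protocol mouse reports (a 3-byte packet: button bitmap plus signed 8-bit X/Y deltas in two's complement). The gesture-to-report mapping below is invented for illustration; only the report layout follows the HID boot protocol:

```python
def mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a 3-byte HID boot-protocol mouse report: a 3-bit button
    bitmap followed by signed 8-bit X and Y deltas (two's complement)."""
    return bytes([buttons & 0x07, dx & 0xFF, dy & 0xFF])

def gesture_to_report(gesture: str) -> bytes:
    """Translate a high-level gesture into one report (mapping invented)."""
    if gesture == "click":
        return mouse_report(0x01, 0, 0)    # left button pressed
    if gesture == "swipe_right":
        return mouse_report(0x00, 10, 0)   # move 10 counts to the right
    return mouse_report(0x00, 0, 0)        # idle report
```

A real implementation would stream such reports over USB or Bluetooth so the host's native mouse driver consumes them unmodified.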
Hand pose recognition using boosted look up tables
Detection and classification of human poses and gestures using a discriminative ferns ensemble classifier is provided. Sample image data in one or more channels includes a human image.
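A fern in this setting is a small set of binary pixel- or feature-comparison tests whose outcomes form a bit-string index into a per-fern lookup table of class scores; the ensemble sums the scores across ferns and picks the arg-max class. A toy sketch with hand-made (untrained) tables:

```python
def fern_index(x, tests):
    """tests: list of (i, j) index pairs; each bit is 1 when x[i] > x[j]."""
    idx = 0
    for i, j in tests:
        idx = (idx << 1) | (1 if x[i] > x[j] else 0)
    return idx

def classify(x, ferns):
    """ferns: list of (tests, table) where table[index] is a list of
    per-class scores. Sums scores over all ferns, returns arg-max class."""
    n_classes = len(ferns[0][1][0])
    scores = [0.0] * n_classes
    for tests, table in ferns:
        for c, s in enumerate(table[fern_index(x, tests)]):
            scores[c] += s
    return scores.index(max(scores))
```

Training would fill the tables discriminatively from labeled samples; the lookup at test time is what makes ferns fast.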
Head tracking based gesture control techniques for head mounted displays
A head gesture-based recognition system in headset computers (HSC) is disclosed. Notification dialogue boxes can be acknowledged by head nodding or ticking movement in the user interface.
Text selection using HMD head-tracker and voice-command
A joint head tracker and voice command in a headset computer enables hands-free user text selection. The method and system enables an end-user to select sections of text without requiring use of a mouse cursor control input device.
Methods, controllers and computer program products for accessibility to computing devices
Methods of providing user accessibility to an electronic device are provided. Methods include receiving a physical input via at least one user input device in a user interface, generating, in the user interface, a sensor output signal responsive to receiving the physical input from the user, and interpreting the sensor output signal as a gesture input signal that is received by the electronic device.
Georgia Tech Research Corporation
Head-mounted integrated interface
A head mounted integrated interface (HMII) is presented that may include a wearable head-mounted display unit supporting two compact high resolution screens for outputting a right eye and left eye image in support of stereoscopic viewing, wireless communication circuits, three-dimensional positioning and motion sensors, and a processing system which is capable of independent software processing and/or processing streamed output from a remote server. The HMII may also include a graphics processing unit capable of also functioning as a general parallel processing system and cameras positioned to track hand gestures.
Synchronous communication system and method
A method and computing system for providing, using one or more computing devices, a synchronous communication session for a plurality of users of a social network. A first video stream of a first user of the plurality of users is rendered within a primary viewing field associated with the synchronous communication session.
Gestures for manipulating tables, charts, and graphs
Gestures are described for manipulating tables, charts and graphs. For tables, a swipe gesture is described that deletes a column from a table when the gesture is detected on a column of the table.
Device, method, and graphical user interface for displaying user interface objects corresponding to an application
An electronic device with a touch-sensitive surface and a display, that includes one or more sensors to detect intensity of contacts with the touch-sensitive surface, displays a plurality of application icons, where the plurality of application icons include a respective application icon corresponding to a respective application. While a focus selector is over the respective application icon, the device detects a gesture that includes a contact on the touch-sensitive surface; and in response to detecting the gesture: when the contact had a maximum intensity during the gesture that was below a respective intensity threshold, the device displays an application window of the respective application; and when the contact reached an intensity during the gesture that was above the respective intensity threshold, the device displays a plurality of user interface objects that correspond to the respective application.
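The intensity-dependent branching reduces to a threshold test on the contact's maximum intensity during the gesture. A minimal sketch; the threshold value and the returned action labels are placeholders, not taken from the patent:

```python
def respond_to_gesture(max_intensity: float, threshold: float = 0.5) -> str:
    """A press whose maximum intensity stays below the threshold opens
    the application window; a deeper press reveals the related
    user-interface objects. Threshold 0.5 is illustrative."""
    if max_intensity < threshold:
        return "display_application_window"
    return "display_related_ui_objects"
```

The same dispatch pattern generalizes to multiple thresholds for devices that distinguish light, firm, and deep presses.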
Transferring information among devices using sensors
Data provided on a first computing device is represented by a graphical object displayed on a screen. A user can initiate an “attach event” with a gesture to enable the graphical object to be associated and/or virtually attached to the user and/or a user's hand/fingers.
Method and system of searching note items
The present invention relates to mobile terminals. Disclosed are a method and system of searching note items.
Memory facilitation using directed acyclic graphs
Memory facilitation using directed acyclic graphs is described, for example, where a plurality of directed acyclic graphs are trained for gesture recognition from human skeletal data, or to estimate human body joint positions from depth images for gesture detection. In various examples directed acyclic graphs are grown during training using a training objective which takes into account both connection patterns between nodes and split function parameter values.
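A decision DAG (sometimes called a decision jungle) differs from a decision tree in that several parent nodes may share a child, which is what reduces memory. Evaluation is still a walk from the root to a leaf. The node structure and gesture labels below are illustrative, not trained:

```python
# Minimal decision-DAG evaluation. Node 1 and node 2 share child 4,
# which a tree could not express without duplication.
DAG = {
    0: ("split", 0, 0.5, 1, 2),   # (kind, feature, threshold, left, right)
    1: ("split", 1, 0.5, 3, 4),
    2: ("split", 1, 0.7, 4, 5),   # shares child 4 with node 1
    3: ("leaf", "wave"),
    4: ("leaf", "point"),
    5: ("leaf", "swipe"),
}

def classify(x, dag=DAG, root=0):
    """Walk split nodes (go left when x[feature] <= threshold) to a leaf."""
    node = dag[root]
    while node[0] == "split":
        _, feat, thresh, left, right = node
        node = dag[left if x[feat] <= thresh else right]
    return node[1]
```

Training, as the abstract notes, jointly optimizes the split parameters and the pattern of shared connections between node layers.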
Unified content representation
Example apparatus and methods facilitate providing an incremental future-proof license to a master stream of content. The master stream may be related to different instances of content (e.g., different versions) for which there is a unified content representation.
Graphical generation and retrieval of medical records
Systems and method are provided for generating a record of a medical procedure. A system includes a processor and a non-transitory computer readable medium.
A gaming machine with more gaming excitement is provided. The gaming machine includes: a cabinet; an upper image display panel which is provided on the cabinet and is a display displaying an effect image regarding a game; a lamp body which is three-dimensionally formed and is a formed object provided on the cabinet to protrude toward the front surface as compared to at least the lower end of the upper image display panel; a sensor configured to detect the player's gesture with respect to the lamp body; and a controller used to start the game, wherein the controller detects the player's gesture by the sensor at a timing corresponding to the state of the game and displays an effect image corresponding to the detected player's gesture on the upper image display panel.
Safety monitoring system for human-machine symbiosis and method using the same
A safety monitoring system for human-machine symbiosis is provided, including a spatial image capturing unit, an image recognition unit, a human-robot-interaction safety monitoring unit, and a process monitoring unit. The spatial image capturing unit, disposed in a working area, acquires at least two skeleton images.
Vehicle recognizing user gesture and method for controlling the same
A vehicle is provided that is capable of preventing malfunction or inappropriate operation of the vehicle due to a passenger error by distinguishing a gesture of a driver from that of the passenger when a gesture of a user is recognized, and a method for controlling the same is provided. The vehicle includes an image capturing unit mounted inside the vehicle and configured to capture a gesture image of a gesture area including a gesture of a driver or a passenger.
Gesture recognition device and method for controlling the same
A gesture recognition device configured to detect a gesture from an acquired image and generate a command issued to a control target instrument according to the gesture, the gesture recognition device comprising: an image acquisition unit configured to acquire an image; a gesture acquisition unit configured to detect a target region performing a gesture from the acquired image, and acquire the gesture based on motion or a shape of the detected target region; a face detection unit configured to detect a face comprised in the acquired image; a correlation unit configured to correlate the detected target region with the detected face using a human body model representing a shape of a human body; a personal identification unit configured to identify a user corresponding to the detected face; and a command generation unit configured to generate a command issued to the control target instrument based on the identified user and the acquired gesture.
Methods, systems, and products sense contactless gestures. A capacitive sensor measures capacitance during performance of a gesture.
System for gaze interaction
A control module for generating gesture based commands during user interaction with an information presentation area is provided. The control module is configured to acquire user input from a touchpad and gaze data signals from a gaze tracking module; and determine at least one user generated gesture based control command based on a user removing contact of a finger of the user with the touchpad; determine a gaze point area on the information presentation area including the user's gaze point based on at least the gaze data signals; and execute at least one user action manipulating a view presented on the graphical information presentation area based on the determined gaze point area and at least one user generated gesture based control command, wherein the user action is executed at said determined gaze point area.
Dynamic hover sensitivity and gesture adaptation in a dual display system
A dual display information handling system includes a processor and a housing. The housing includes a display operable to detect a touch device hovering above the display.
Audio-visual interaction with user devices
A user device is enabled by an audio-visual assistant for audio-visual interaction with a user. The audio-visual assistant enables the user device to track the user's eyes and face to determine objects on the screen that the user is currently observing.
Gesture detecting device, gesture recognition device
An object of the present invention is to make it possible for smart devices such as smartphones to provide new operability to users. A gesture detecting device includes a motion detecting unit and a processor.
Directional touch unlocking for electronic devices
A system and machine-implemented method for matching input gestures on a touch interface to a security pattern to allow user access to an electronic device or account. The security pattern may correspond to a combination of linear and non-linear input gestures relating to the directional changes of the input gestures.
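One way to match "directional changes" is to quantize a stroke's successive deltas into direction tokens and compare the collapsed token string against the stored pattern. A sketch under that assumed scheme:

```python
def to_directions(points):
    """Quantize successive deltas of a stroke into U/D/L/R tokens,
    collapsing repeats, so only directional changes remain.
    Screen coordinates assumed: y grows downward."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        d = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else ("D" if dy > 0 else "U")
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return "".join(dirs)

def unlocks(points, pattern: str) -> bool:
    """True when the stroke's direction sequence matches the stored pattern."""
    return to_directions(points) == pattern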
Blacklisting of frequently used gesture passwords
A method of maintaining a blacklist for gesture-based passwords is provided. A data store of vectors corresponding to gestures is maintained on a blacklist server.
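A blacklist lookup over gesture vectors can be sketched as a nearest-neighbor test: reject a candidate password whose vector lies within some distance of any blacklisted vector. The vectorization itself (resampling, normalization) is assumed to happen upstream, and the tolerance value is arbitrary:

```python
import math

def too_close(candidate, blacklist, tol=0.1):
    """Return True when the candidate gesture vector lies within `tol`
    (Euclidean distance) of any vector stored on the blacklist server."""
    return any(math.dist(candidate, v) <= tol for v in blacklist)
```

A registration flow would call this before accepting a new gesture password and ask the user to pick a different gesture on a hit.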
Second-screen tv bridge
A set top box can provide coordinated graphical user interfaces on a plurality of devices. The set top box can output first media content to a primary display (a television) and can output second media content to a second-screen device (a tablet).
Cisco Technology, Inc.
System to facilitate and streamline communication and information-flow in health-care
Processes and systems for facilitating communications in a health care environment are provided. In one example, a process includes receiving a trigger from a wearable computer device to communicate with a medical application interface.
Sidra Medical And Research Center
Method, device and computer system for performing operations on objects in an object list
The present application discloses methods, devices and computer systems for performing operations on objects in an object list. After detecting a swipe gesture on the touch screen, a computer system, e.g.
Tencent Technology (Shenzhen) Company Limited
Gesture-based controls via bone conduction
Concepts and technologies are disclosed herein for utilizing bone conduction to detect gestures. According to one aspect, a device can generate a signal and send the signal to a sensor network that is connected to a user.
AT&T Intellectual Property I, L.P.
Touch screen control for adjusting a numerical value
A method of operating a data processing system having a touch enabled display screen to alter the value of a specified variable in the data processing system is disclosed. A value control is provided on the display screen to alter the variable.
Keysight Technologies, Inc.
In one embodiment, a method includes receiving a touch input within a particular region of a display area of a computing device. The display area presents a user interface (UI) including a number of views organized in a hierarchy.
Multi-language input method and multi-language input apparatus using the same
A multi-language input method is provided. The method includes sensing a touch input for a letter entry, sensing a touch gesture consecutive to the touch input, and displaying a letter corresponding to the touch input and a symbol corresponding to the touch gesture.
Samsung Electronics Co., Ltd.
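The letter-plus-consecutive-gesture composition reads naturally as a two-key lookup. The pairings below are invented; the abstract does not give the actual letter/gesture/symbol tables:

```python
# Hypothetical pairings of a base letter and a consecutive flick gesture.
ACCENTS = {
    ("a", "flick_up"): "á",
    ("a", "flick_down"): "à",
    ("n", "flick_up"): "ñ",
}

def compose(letter, gesture=None):
    """Return the symbol for (letter, gesture) when mapped; otherwise
    fall back to displaying the plain letter."""
    return ACCENTS.get((letter, gesture), letter)
```

Touching "a" and flicking upward would thus display "á", while a touch with no consecutive gesture displays the bare letter.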
Remote control of a desktop application via a mobile device
One embodiment of the present invention provides a system for using a mobile device to remotely control a desktop application that was configured for use with a pointing device. During operation, the system receives at a mobile device, from a user, a connection request to connect to a desktop application executing on a remote device.
Multitasking experiences with interactive picture-in-picture
A user interface (“UI”) includes a personalized home screen that can be brought up at any time from any experience provided by applications, games, movies, television, and other content that is available on a computing platform such as a multimedia console using a single button press on a controller, using a “home” gesture, or using a “home” voice command. The home screen features a number of dynamically maintained visual objects called tiles that represent the experiences available on the console.
Electronic device and search and display method using the same
A search and display method of an electronic device using handwriting is provided. The search and display method includes recognizing the handwriting, determining whether the recognized handwriting is a gesture or text, recognizing the gesture if it is determined that the recognized handwriting is the gesture, and registering gesture information about the gesture and function information about a function corresponding to the gesture information based on the recognized gesture.
Samsung Electronics Co., Ltd.
Image processing apparatus and program
An information processing system that acquires image data corresponding to a target object that is a target for gesture recognition captured by an imaging device; determines whether a distance between the target object and the imaging device is inadequate for recognition of a gesture made by the target object; and outputs a notification when the determining determines that the distance between the target object and the imaging device is inadequate for recognition of a gesture made by the target object.
Systems, articles and methods for wearable electronic devices employing contact sensors
Wearable electronic devices that employ one or more contact sensors (e.g., capacitive sensors and/or biometric sensors) are described. Contact sensors include electromyography sensors and/or capacitive touch sensors.
Thalmic Labs Inc.
Shutter release using secondary camera
Capturing a target image includes activating a first image sensor for capturing the target image. A sequence of images is captured with a second image sensor while the first image sensor remains activated.
OmniVision Technologies, Inc.
A supplemental surface area allows gesture recognition on outer surfaces of mobile devices. Inputs may be made without visual observance of display devices.
AT&T Intellectual Property I, L.P.
Steering wheel user interface
A steering wheel that identifies gestures performed on its surface, including a circular gripping element including a thumb-receiving notch disposed along its circumference, an array of light-based proximity sensors, mounted in the gripping element, that projects light beams through the notch radially outward from the gripping element, and detects light beams reflected back into the gripping element by a moving object at or near the notch, and a processor, coupled with the proximity sensor array, for determining polar angles along the circumference of the gripping element occupied by the object, responsive to light beams projected by the proximity sensor array and reflected back by the object being detected by the proximity sensor array.
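Determining "polar angles along the circumference" can be sketched by mapping the indices of sensors that detected a reflection onto an evenly divided ring. The sensor count and the placement of sensor 0 at 0 degrees are assumptions:

```python
def polar_angles(active_sensors, n_sensors=32):
    """Convert indices of proximity sensors that detected a reflection
    into polar angles (degrees) around the gripping element, assuming
    evenly spaced sensors with sensor 0 at 0 degrees."""
    return [360.0 * i / n_sensors for i in active_sensors]
```

Tracking how these angles change over time is what would let the processor distinguish, say, a clockwise thumb slide from a tap.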
Method, apparatus, and device for touch screen verification
Various embodiments provide methods, apparatus, and devices for touch screen verification (or device verification). In an exemplary method, trajectories of at least two discrete touch gestures inputted by a user can be recorded by an electronic device and compared with trajectories in a preset sequence of trajectories.
Tencent Technology (Shenzhen) Company Limited
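Comparing recorded trajectories "with trajectories in a preset sequence" can be sketched as a point-by-point match within a pixel tolerance, gesture by gesture. The matching scheme and the tolerance value are assumptions:

```python
def matches(recorded, preset, tol=20.0):
    """Verify a sequence of gesture trajectories against a preset
    sequence. Each trajectory is a list of (x, y) points; every point
    must lie within `tol` pixels of its preset counterpart on both axes."""
    if len(recorded) != len(preset):
        return False
    for traj_r, traj_p in zip(recorded, preset):
        if len(traj_r) != len(traj_p):
            return False
        for (xr, yr), (xp, yp) in zip(traj_r, traj_p):
            if abs(xr - xp) > tol or abs(yr - yp) > tol:
                return False
    return True
```

A production scheme would normally resample each trajectory to a fixed point count first so that drawing speed does not affect the comparison.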
Gesture disambiguation using orientation information
Embodiments are disclosed that relate to controlling a computing device based upon gesture input. In one embodiment, orientation information of the human subject is received, wherein the orientation information includes information regarding an orientation of a first body part and an orientation of a second body part.
Digital device and control method thereof
Disclosed are a digital device and a control method thereof. The digital device comprises: a communication unit configured to transmit/receive a signal with an external device; a gesture sensor unit configured to sense a gesture with respect to the digital device; and a processor configured to control the communication unit and the gesture sensor unit, wherein the processor is further configured to: provide a first mode corresponding to a first event when occurrence of the first event is detected; transmit a first signal, commanding provision of a second mode corresponding to a second event, to the external device when occurrence of the second event is detected during provision of the first mode; and switch from the first mode to the second mode when detecting a first gesture with respect to the digital device after transmission of the first signal.
LG Electronics Inc.
Gesture detection system, gesture detection apparatus, and mobile communication terminal
A gesture detection system having a gesture detection apparatus to detect a gesture of a user, and a mobile communication terminal that can communicate with the gesture detection apparatus, includes a storage unit to store first gesture data defining the gesture of the user, and audio or visual data associated with the first gesture data; an obtainment unit to obtain second gesture data representing the gesture of the user; a transmission unit to transmit the second gesture data to the mobile communication terminal; a determination unit to determine whether the gesture defined by the first gesture data is the same as the gesture represented by the second gesture data; a selection unit to select the audio or visual data associated with the first gesture data depending on the determination result; and an output unit to output the audio or visual data.. .
Digital device and control method thereof
Disclosed are a digital device and a control method thereof. The digital device includes a communication unit to transmit/receive a signal with an external device; a gesture sensor unit to sense a gesture with respect to the digital device; and a processor to control the communication unit and the gesture sensor unit, and to provide a first mode corresponding to a first event when occurrence of the first event is detected; transmit a first signal, commanding provision of a second mode corresponding to a second event, to the external device when occurrence of the second event is detected during provision of the first mode; and switch from the first mode to the second mode when detecting a first gesture with respect to the digital device after transmission of the first signal.
LG Electronics Inc.
Two hand natural user input
Embodiments are disclosed which relate to two hand natural user input. For example, one disclosed embodiment provides a method comprising receiving first hand tracking data regarding a first hand of a user and second hand tracking data regarding a second hand of the user from a sensor system.
Method for controlling terminal device by using headset wire and the terminal device thereof
The present disclosure provides a method for controlling a terminal device by using a headset wire connected to the terminal device. The method includes: recognizing a user's gesture based on a current detected in a specific region of the headset wire; acquiring a control instruction corresponding to the user's gesture; and executing the control instruction.