

Gesture patents

      

This page is updated frequently with new Gesture-related patent applications.




Cloud-based custom metric/timer definitions and real-time analytics of mobile applications
A method for real-time capture of analytics from real users of a native mobile application (app) includes storing a custom metric/timer definition for a native mobile application (app) in a configuration file on a server. The custom metric/timer definition includes one or more identifiers of an element or object of the native mobile app selected by touch gesture input via a user interface on a mobile device running the native mobile app in a special mode.
Soasta, Inc.


System and method for gesture-based management
A system includes a first mobile device configured to initiate communication with at least one other mobile device. The first mobile device includes a status indicator configured to provide a persistent visual indication to a user of the status of a mute function of the first user device during the active communication.
Intel Corporation


Identity authentication method and apparatus, terminal and server
A method, an apparatus, a terminal, and a server for identity authentication are disclosed. The method includes: receiving dynamic face authentication prompt information sent by a server during identity authentication of a user; obtaining gesture recognition information of the dynamic face authentication prompt information by recognizing a facial gesture presented by the user; and sending the gesture recognition information to the server to enable the server to confirm that the identity authentication is successful for the user in response to verifying that the gesture recognition information is consistent with the dynamic face authentication prompt information.
Alibaba Group Holding Limited


Motion and gesture-based mobile advertising activation
The presentation of advertisements to a user on a mobile communications device is disclosed. A first external input corresponding to a triggering of an advertisement delivery is received on a first input modality.
Adtile Technologies Inc.


Crowd gesture recognition
Various systems and methods for implementing crowd gesture recognition are described herein. A system for implementing crowd gesture recognition includes an accelerometer; a gyrometer; a gesture detection circuit to: detect an air gesture performed by a user of the system based on data from the accelerometer and gyrometer; and parameterize an intensity of the air gesture; a processor subsystem to determine a transmission frequency band and a transmission strength based on the air gesture and the intensity of the air gesture; and a transducer to transmit a signal on the transmission frequency band with the transmission strength..
Intel Corporation
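To make the flow in the abstract above concrete, the following is a minimal Python sketch of mapping a detected air gesture and its intensity to a transmission band and strength; the gesture labels, band plan, and clamping range are invented for illustration and are not taken from the application.

from dataclasses import dataclass

@dataclass
class AirGesture:
    name: str        # e.g. "wave", "pump" -- hypothetical labels
    intensity: float # 0.0 .. 1.0, derived from accelerometer/gyrometer magnitude

# Hypothetical band plan: each gesture class gets its own frequency band.
BAND_PLAN_HZ = {"wave": 19_000, "pump": 20_000, "circle": 21_000}

def select_transmission(gesture: AirGesture,
                        min_strength: float = 0.1,
                        max_strength: float = 1.0) -> tuple:
    """Return (frequency_hz, strength) for the transducer."""
    band = BAND_PLAN_HZ.get(gesture.name, 19_000)
    # Clamp the parameterized intensity into the allowed output range.
    strength = min(max_strength, max(min_strength, gesture.intensity))
    return band, strength

if __name__ == "__main__":
    band, strength = select_transmission(AirGesture("wave", 0.7))
    print(f"transmit at {band} Hz with relative strength {strength:.2f}")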


Pattern password with variable hint pattern
A device unlock pattern (“pattern password”) is static in that the same pattern is entered each time to unlock a device. Due to this repetition, a pattern password may be discovered by an application that captures touchscreen gestures, by inspection of fingerprints or smudges on a screen, or simply by an onlooker that views the pattern password being entered.
CA, Inc.


Customizable gestures for mobile devices
Users are enabled to define and modify mappings between (1) gestures and (2) actions performed by one or more computing devices in response to a device detecting performance of a gesture. A generalized gesture-to-action mapping framework allows users to intuitively define and modify such mappings.
Yahoo! Inc.


Cascaded touch to wake for split architecture
Aspects of the disclosure are related to a touch controller for use in a device having a processor and a touch sensor panel, the touch controller being coupled to the processor and the touch sensor panel, and further comprising: an analog front-end (AFE), wherein the AFE is configured to generate raw touch image data based on electrical signals generated by the touch sensor panel in response to one or more detected touches thereto; a coarse processing element configured to, in response to the processor being set to a sleep mode, coarsely process the raw touch image data to generate sparse data; and an embedded memory configured to store at least the sparse data, wherein the touch controller is configured to transmit a signal to the processor to wake the processor up and transmit the stored sparse data and new touch image data to the processor, wherein the processor performs gesture recognition based on the sparse data and the new touch image data.
Qualcomm Incorporated
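The cascade described above (coarse processing while the host sleeps, then waking it with buffered sparse data) can be sketched roughly as follows; the threshold, data layout, and return values are assumptions for illustration, not Qualcomm's design.

from typing import List, Tuple

SparsePoint = Tuple[int, int, int]  # (row, col, signal)

def coarse_process(raw_image: List[List[int]], threshold: int = 30) -> List[SparsePoint]:
    """Reduce a raw capacitance image to the cells above a noise threshold."""
    return [(r, c, v)
            for r, row in enumerate(raw_image)
            for c, v in enumerate(row)
            if v >= threshold]

class TouchController:
    def __init__(self):
        self.stored_sparse: List[List[SparsePoint]] = []  # embedded-memory stand-in

    def on_frame(self, raw_image, host_asleep: bool):
        if not host_asleep:
            return ("forward_full_frame", raw_image)
        sparse = coarse_process(raw_image)
        if sparse:                       # some touch activity while the host sleeps
            self.stored_sparse.append(sparse)
            # Wake the host and hand over buffered sparse data plus the new frame;
            # the host then runs full gesture recognition.
            return ("wake_host", self.stored_sparse, raw_image)
        return ("stay_asleep",)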


Method for interaction with terminal and electronic apparatus for the same
The present application discloses a method for interaction with a terminal and an electronic apparatus for the same. The method includes: determining whether a downward acceleration of a gesture is greater than a default threshold value when the gesture is detected while an interface is displayed, wherein the displayed interface comprises a reply information and recognition result interface, a reply information full-screen interface, or a reply information full-screen extension interface shown after recording of speech in a speech recognition interface is detected to have finished; determining an operation type corresponding to the gesture according to whether the downward acceleration of the gesture is greater than the default threshold value; and executing an interaction corresponding to the determined operation type.
Le Shi Zhi Xin Electronic Technology (Tianjin) Limited
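The threshold test that selects the operation type can be pictured with a few lines of Python; the threshold value and operation names are illustrative only, not taken from the application.

DOWNWARD_ACCEL_THRESHOLD = 12.0   # m/s^2, hypothetical default value

def classify_gesture(downward_acceleration: float) -> str:
    """Map the gesture to an operation type based on the acceleration test."""
    if downward_acceleration > DOWNWARD_ACCEL_THRESHOLD:
        return "dismiss_reply_interface"     # e.g. a sharp downward flick
    return "scroll_reply_interface"          # e.g. a gentle downward motion

print(classify_gesture(15.2))  # -> dismiss_reply_interface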


Centering gesture to enhance pinch-to-zoom gesture on touchscreens
A computing device detects movement of two contact positions on a touchscreen of the computing device as a pinch-to-zoom gesture. While detecting the movement of the two contact positions on the touchscreen, the computing device detects a third stationary contact position on the touchscreen of the computing device, as a centering gesture related to the pinch-to-zoom gesture.
Lenovo Enterprise Solutions (Singapore) Pte. Ltd.
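A rough sketch of how the centering gesture could combine with pinch-to-zoom: the zoom factor comes from the change in distance between the two moving contacts, while the third, stationary contact supplies the anchor point. The coordinates and example values below are hypothetical.

import math

def pinch_zoom_with_center(p1_start, p1_end, p2_start, p2_end, center):
    """Return (zoom_factor, center). Points are (x, y) tuples in screen pixels."""
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    zoom = d_end / d_start if d_start else 1.0
    return zoom, center   # view scales by `zoom` about `center`, not the pinch midpoint

zoom, anchor = pinch_zoom_with_center((100, 100), (80, 80), (200, 200), (220, 220), (300, 150))
print(zoom, anchor)  # ~1.4x zoom anchored at the stationary third contact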



Systems and methods for identifying dominant hands for users based on usage patterns

Systems, methods, and non-transitory computer-readable media can detect a set of swiping touch gestures from a user. The set of swiping touch gestures can be analyzed to determine at least one respective movement property for each swiping touch gesture in the set of swiping touch gestures.
Facebook, Inc.


Input device

A control section causes a screen of a display section to display a plurality of numeric keys, detects a touch gesture on an individual numeric key through a touch panel to allow input of the number corresponding to the numeric key subjected to the touch gesture, and detects an action on each of several hard keys to accept an instruction to perform a function corresponding to the hard key subjected to the action. When a predetermined action is performed on one or more of the hard keys, the control section assigns numbers to the respective hard keys and detects an action on an individual hard key to allow input of the number corresponding to the hard key subjected to the action.
Kyocera Document Solutions Inc.


Primary device that interfaces with a secondary device based on gesture commands

An incoming call from a remote device can be received by a primary device. The primary device can determine a numerical count of detected user gestures.
Google Technology Holdings, LLC


System for hand gesture detection

A system for hand gesture detection is provided, comprising: a wrist wear adapted to be worn about a wrist of a user of the system and including a set of skin electrodes adapted to face the wrist; an impedance measurement circuit adapted to measure at least a first impedance in a first portion of the wrist and a second impedance in a second portion of the wrist which second portion is circumferentially displaced in relation to said first portion, wherein the first impedance is measured via a first electrode group including four skin electrodes of said set of skin electrodes and the second impedance is measured via a second electrode group including four skin electrodes of said set of skin electrodes, and a processing circuit adapted to detect a hand gesture of the user based on the first and the second impedance measured by the impedance measurement circuit.. .
Stichting IMEC Nederland
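One way to picture the final classification step is a nearest-template lookup over the two impedance channels; the template values and gesture names below are invented for illustration and say nothing about the actual signal processing in the application.

# A toy nearest-template classifier over two wrist-impedance channels.
REFERENCE_GESTURES = {
    "fist":      (410.0, 395.0),   # (impedance_1, impedance_2) in ohms, hypothetical
    "open_hand": (455.0, 440.0),
    "pinch":     (430.0, 470.0),
}

def classify(z1: float, z2: float) -> str:
    """Return the reference gesture whose impedance pair is closest to (z1, z2)."""
    def distance(template):
        t1, t2 = template
        return (z1 - t1) ** 2 + (z2 - t2) ** 2
    return min(REFERENCE_GESTURES, key=lambda name: distance(REFERENCE_GESTURES[name]))

print(classify(428.0, 468.0))  # -> "pinch"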


Dynamic effects processing and communications for wearable devices

Processing techniques and device configurations for performing and controlling output effects at a plurality of wearable devices are generally described herein. In an example, a processing technique may include receiving, at a computing device, an indication of a triggering gesture that occurs at a first wearable device, determining an output effect corresponding to the indication of the triggering gesture, and in response to determining the output effect, transmitting commands to computing devices that are respectively associated with a plurality of wearable devices, the commands causing the plurality of wearable devices to generate the output effect at the plurality of wearable devices.


Two-step gesture recognition for fine-grain control of wearable applications

The present disclosure provides methods, devices, systems, and computer program products for providing fine-grain gesture-based control of wearable applications. Methods are provided for multi-step gesture-based control of wearable applications, with an initial, easy-to-recognize gesture being used to place the device in a state in which subtle gestures can be identified that control navigation and interactivity on the device that rely on the user being able to view the device.
SAP SE


Hand skeleton comparison and selection for hand and gesture recognition with a computing interface

Hand skeletons are compared to a hand image and selected. The hand skeletons are used for hand and gesture recognition with a computing interface.
Intel Corporation


Free-form drawing and health applications

Various systems and methods for implementing free-form drawing for health applications are described herein. A system for implementing a health application includes a user interface module to receive, at a user device, a plurality of parameters including a free-form gesture path, the free-form gesture path representing an air gesture performed by a user of the user device; and a control module to adjust a fitness routine of the user based on the plurality of parameters..


Gesture management system

For storing gesture definitions and evaluating expressions that reference the gesture definitions, an expression evaluation engine evaluates the expressions to determine whether movements of a user satisfy the expressions. The expression evaluation engine receives expressions in user or application requests, or the expression evaluation engine may automatically evaluate the expressions when a gesture recognition system receives updated information about tracked body parts of the user.
Palantir Technologies, Inc.


Method of operating a control system and control system therefore

A method of operating a control system for controlling a device, the control system comprising a motion capture equipment, and a controller for providing control signals for controlling one or more device functions of the device, the method comprising the steps of: capturing, by the motion capture equipment, motion picture images of a space and providing the motion picture images to the controller; analyzing, by the controller, the motion picture images for detecting user input from a user in the space, and detecting by the controller a gesture performed by the user; and providing, by the controller in response to said detecting of the gesture, a control signal to the device for controlling a selected device functions of said one or more device functions; wherein said analyzing is performed by the controller by monitoring one or more gesture zones in said motion picture images, each gesture zone being associated with one respective device function of said plurality of device functions, and wherein for providing said control signal the controller determines the gesture zone wherein the gesture is detected for establishing the selected device function to control.. .
Koninklijke Philips N.V.
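The zone-to-function mapping described above can be sketched as a simple lookup from a detected gesture position to the device function owned by that zone; the zone rectangles and function names are placeholders, not part of the application.

# Hypothetical zones and functions; shows the zone lookup, not Philips' algorithm.
ZONES = {
    "lamp_toggle": (0, 0, 200, 200),      # (x_min, y_min, x_max, y_max) in image pixels
    "volume_up":   (200, 0, 400, 200),
}

def zone_for_gesture(x: int, y: int):
    """Return the device function whose zone contains the detected gesture position."""
    for function, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return function
    return None

def on_gesture_detected(x, y, send_control_signal):
    function = zone_for_gesture(x, y)
    if function is not None:
        send_control_signal(function)   # controller drives the selected device function

on_gesture_detected(120, 90, print)  # -> lamp_toggle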


Systems and methods for controlling an unmanned aerial vehicle

Systems and methods for controlling an unmanned aerial vehicle recognize and interpret gestures by a user. The gestures are interpreted to adjust the operation of the unmanned aerial vehicle, a sensor carried by the unmanned aerial vehicle, or both..
GoPro, Inc.


Drone delivery of coffee based on a cognitive state of an individual

Coffee or another drink, for example a caffeine-containing drink, is delivered to individuals who would like the drink, or who have a predetermined cognitive state, using an unmanned aerial vehicle (UAV)/drone. The drink is connected to the UAV, and the UAV flies to an area including people and uses sensors to scan the people for an individual who has gestured that they would like the drink, or whom an electronic analysis of sensor data indicates to be in a predetermined cognitive state.
International Business Machines Corporation


Systems and methods of an adaptive interface to improve user experience within a vehicle

The present disclosure relates to a computer-readable device that causes a processor to perform operations for interpreting a user request, such as tactile inputs, gestures, and speech, to a vehicle system. The operations include receiving an input data package comprising user communications.
GM Global Technology Operations LLC

Controlled lamp device

A controlled lamp device (1), comprising: a lamp housing (2) with a light exit opening (3), a sensor unit (4) for detecting a contactless manual intervention (15, 16-1, 16-2, 17) of an operator of the lamp device in an intervention region (5, 5-1, 5-2), and an evaluation and control device (6) for evaluating the intervention of the operator detected by the sensor unit and for influencing a control parameter for the operation of the lamp device depending on a result of the evaluation, wherein the sensor unit (4) is provided in and/or on the light housing (2) laterally adjacent to the light exit opening (3), and the evaluation and control device (6) is designed such that it only influences the control parameter if the evaluation carried out by the evaluation and control device (6) shows that a predefined path in an intervention region has been covered in a gesture-like manner during the intervention of the operator.. .
Steinel Gmbh

Motion-based authentication for a gesture-based device

A motion-based authentication method is operative in a mobile computing device having a display interface and that includes an accelerometer. Normally, the device software includes a locking mechanism that automatically locks the display interface after a configurable timeout.
LogMeIn, Inc.

Data transmission controlling device and data transmission controlling method for a mobile device

An embodiment of the present disclosure relates to the technical field of data transfer between a mobile device and a television, and discloses a data transmission controlling device for a mobile device and a data transmission controlling method for a mobile device. The data transmission controlling device for a mobile device includes: a detector, configured to detect at least one of a moving track of the mobile device, a sliding gesture on a screen of the mobile device, and a key instruction of the mobile device; a first transceiver; and a processor, configured to control the first transceiver to send a uniform resource locator of a video currently being played on the mobile device and a timestamp of the current playback, when at least one of a particular moving track, a particular sliding gesture, and a particular key instruction is detected.
Le Shi Internet Information & Technology Corp., Beijing

Method and electronic apparatus for adjusting viewing angle of smart television playing panorama videos

Disclosed are a method and an electronic apparatus for adjusting a viewing angle of a smart television playing panorama videos, wherein the method is applied to a terminal apparatus and includes: displaying a touch control region for adjusting the viewing angle when opening of an application program for controlling the smart television is detected; detecting a hand gesture input on the touch control region to determine a viewing angle adjustment parameter corresponding to the detected gesture input; and sending the viewing angle adjustment parameter to the smart television by communication with the smart television, so as to adjust the viewing angle. The disclosure uses a smart television with Bluetooth and Wi-Fi communication functions to connect to the Internet, and adjusts a panoramic play parameter of the smart television via a terminal apparatus communicating with the smart television.
Le Shi Internet Information Technology Corp. Beijing

Method, system and device for navigating in a virtual reality environment

A method, a system, and a device for navigating in a virtual reality scene using body-part gestures and postures are provided herein. The method may include: projecting a synthetic 3D scene into both eyes of a user, via a near-eye display, so as to provide a virtual reality view to the user; identifying at least one gesture or posture carried out by at least one body part of said user; measuring at least one metric of a vector associated with the detected gesture or posture; applying a movement or action of said user in the virtual reality environment, based on the measured metrics; and modifying the virtual reality view so as to reflect the movement or action of said user in the virtual reality environment.
Facebook, Inc.

Systems and methods for implementing retail processes based on machine-readable images and user gestures

Systems and methods for implementing retail processes based on machine-readable images and user gestures are disclosed. According to an aspect, a method includes capturing one or more images including a machine-readable image and a user gesture.
Toshiba Global Commerce Solutions Holdings Corporation

Information processing apparatus, control method, and program

There is provided an information processing apparatus including circuitry configured to initiate display of a virtual object, based on a gesture operation, starting from a point of origin and moving towards a target point; and continue to display the virtual object in display motion after the gesture operation, wherein a path of travel of the virtual object or a display characteristic of the virtual object is determined based on a positional relationship between the virtual object and another object that is a real object located in proximity to the path of travel of the virtual object.. .
Sony Corporation

Use of accelerometer input to change operating state of convertible computing device

A convertible computing device has an accelerometer to detect tapping gestures on the device, and a mode sensor to determine whether the device is in a laptop mode or a tablet mode. The device includes a first physical human input sensor to change an operating state of the device between an off state and a non-off state and a second physical human input sensor to change an operating state of the computing device.
Google Inc.

Systems and methods for position-based haptic effects

One illustrative system disclosed herein includes a sensor configured to detect a gesture and transmit an associated sensor signal. The gesture includes a first position at a distance from a surface and a second position contacting the surface.
Immersion Corporation

Methods and apparatus using gestures to share private windows in shared virtual environments

Methods and apparatus using gestures to share private windows in shared virtual environments are disclosed herein. An example method includes detecting a gesture of a user in a virtual environment associated with a private window in the virtual environment, the private window associated with the user, determining whether the gesture represents a signal to share the private window with another, and, when the gesture represents a signal to share the private window, changing the status of the private window to a shared window..
Google Inc.

Smart watch and gesture input method for the smart watch

The present disclosure provides a smart watch and a gesture input method for the smart watch. The method starts to acquire gesture data upon receiving gestures of the user, collects the gesture data over a continuous time period, and finds the closest matching text for the obtained gesture data from a prestored correspondence between gesture data and text; this matching text is the final output text.
Huizhou TCL Mobile Communication Co., Ltd.

Click response processing method, electronic device and system for motion sensing control

Embodiments of the present invention disclose a click response processing method for motion sensing control, including: S101: when a push gesture instruction for target content is received, acquiring a transfer point corresponding to the push gesture instruction; S102: triggering a down event corresponding to the push gesture instruction according to the transfer point, and determining and saving information of the transfer point; S103: when a pull gesture instruction is received, directly invoking the information of the transfer point, and triggering an up event corresponding to the pull gesture instruction based on the information of the transfer point; and S104: completing a click event for the target content and outputting a result. Embodiments of the present invention further disclose an electronic device and a motion sensing control system.
Le Shi Zhi Xin Electronic Technology (Tianjin) Limited
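A minimal sketch of the S101-S104 push/pull flow described above: the push gesture fixes a transfer point and injects a down event there, and the later pull gesture reuses the saved point for the up event so the two together complete a click. The event names and the dispatcher are stand-ins, not the patented implementation.

class MotionClickHandler:
    def __init__(self, dispatch):
        self.dispatch = dispatch          # e.g. injects events into a UI toolkit
        self.transfer_point = None

    def on_push_gesture(self, x, y):
        # S101/S102: resolve the transfer point and trigger the "down" event there.
        self.transfer_point = (x, y)
        self.dispatch("down", x, y)

    def on_pull_gesture(self):
        # S103/S104: reuse the saved transfer point so the "up" event lands on the
        # same target, which completes a click on that target.
        if self.transfer_point is None:
            return
        x, y = self.transfer_point
        self.dispatch("up", x, y)
        self.dispatch("click", x, y)
        self.transfer_point = None

handler = MotionClickHandler(lambda kind, x, y: print(kind, x, y))
handler.on_push_gesture(640, 360)
handler.on_pull_gesture()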

Method and device for controlling operation components based on somatosensory

Disclosed are a method and an electronic device for controlling an operation component based on somatosensory sensing, comprising: detecting gesture control information for the operation component; analyzing an operation event triggered by the gesture control information, wherein the operation event includes a down event, a move event and an up event, and setting the move event as an invalid event when the move event is generated between the down event and the up event; and determining that the down event and the up event form a click event so as to finish controlling the operation component. The present disclosure avoids responding to other events formed by the move event, accurately completes somatosensory control of the operation component, and improves the success rate of triggering corresponding operations by the gesture control information.
Le Shi Zhi Xin Electronic Technology (Tianjin) Limited

Intelligent gesture based word sentence augmentation and systems for the implementation thereof

Disclosed herein is a system comprising a user interface comprising an edit region in operative communication with a processor; where the processor is in operative communication with one or more modules; where the processor is operative to receive from the user interface a selection of words in the form of a sentence; use a grammar test to determine if the sentence is grammatically accurate; to parse the sentence and offer a user a choice of words to improve an accuracy of the sentence; where the choice of words is based upon a weighted probability of several possible words that can improve the accuracy of the sentence; and permitting the user to install his/her word choice in the sentence by performing an action involving one or more of swiping, tilting, steering or tapping of the user interface.. .
International Business Machines Corporation

Ultrasonic noise based sonar

The invention relates to a device with a microphone, a speaker or transducer, and processing means to process audio signals from the microphone and for the transducer. Electronic devices, and especially mobile devices, offer several user interfaces, of which the touch screen has revolutionized the market in the past few years.
Sound Solutions International Co., Ltd.

Hands-free rear vehicle access system and improvements thereto

A vehicle access system includes: an infrared detector assembly for detecting an object within a sensing region of the infrared detector assembly; at least one controller operatively connected to the infrared detector assembly, the at least one controller operative (i) to determine from inputs from the infrared detector assembly if a detected object exhibits a predefined gesture and, if the detected object exhibits a predefined gesture, (ii) to direct the execution of one or more pre-defined vehicle commands; and a plurality of lights operatively connected to the at least one controller, the plurality of lights selectively illuminable to produce visible light in one or more colors, wherein one or more of the plurality lights (i) are selectively illuminated by the at least one controller to visibly indicate the detected presence of an object within the sensing region by the infrared detector assembly, and (ii) are selectively illuminated by the at least one controller to visibly indicate that the detected object exhibits a predefined gesture.. .
ADAC Plastic, Inc.

In-vehicle component control user interface

A personal device may include a display and a processor. The processor of the personal device may be programmed to send, to the display, a vehicle interior map overlaid with indications of in-vehicle components, create a group of the in-vehicle components responsive to receipt of a swipe gesture to the display selecting a subset of the indications, receive input from the display of a location on the map, and aim outputs of the in-vehicle components of the group based on the location..
Ford Global Technologies, Llc

Wand gesture

Some embodiments include a remote for gesture recognition for an external light system. In some embodiments, the remote may include an acceleration sensor; a wireless transceiver; memory; and a processor communicatively coupled with the acceleration sensor, the wireless transceiver, and the memory.
Mojo Labs Inc

Operator control apparatus, electronic domestic appliance, mobile operator control unit and system

An operator control apparatus for an electronic device, in particular an electronic domestic appliance, includes an operator control unit detection device for detecting a mobile operator control unit in the vicinity of the operator control apparatus. The detection device has a communication interface for a radio connection with the mobile operator control unit, and a gesture detection device for detecting gestures performed by the mobile operator control unit.
Diehl AKO Stiftung & Co. KG

Method and apparatus for gesture recognition

A computer-implemented method and an apparatus for improving gesture recognition are described. The method comprises providing a reference model defined by a joint structure, receiving at least one image of a user, and mapping the reference model to the at least one image of the user, thereby connecting the user to the reference model for recognition of a set of gestures predefined for the reference model, when the gestures are performed by the user..
Calay Venture S.á R.l.

System and method for improved gesture recognition using neural networks

According to various embodiments, a method for gesture recognition using a neural network is provided. The method comprises a training mode and an inference mode.
Pilot AI Labs, Inc.

System and method for improved virtual reality user interaction utilizing deep-learning

According to various embodiments, a method for gesture recognition using a neural network is provided. The method comprises a training mode and an inference mode.
Pilot AI Labs, Inc.

System and method for motion gesture access to an application and limited resources of an information handling system

An information handling system includes a processor that determines a first orientation from orientation sensors and a sensor hub for detecting a motion gesture. The processor is further activated from a sleep state by the motion gesture and the information handling system includes a limited, ad-hoc access system that permits ad-hoc access to limited user pre-set or context-based system resources in response to the sudden motion gesture..
Dell Products, LP

Systems and user interfaces for dynamic interaction with two-and three-dimensional medical image data using hand gestures

Embodiments of the present disclosure relate to systems and techniques for accessing data stores of medical images and displaying the medical images in substantially real-time to provide information in an interactive user interface. Systems are disclosed that may advantageously provide highly efficient, intuitive, and rapid dynamic interaction with two- and three-dimensional medical image data using hand gestures.
D.R. Systems, Inc.

Gesture multi-function on a physical keyboard

A computer keyboard includes position sensors such as capacitive sensors on each of the keys for monitoring positions of fingertips on the keys. A processor receives an indication of contact of a finger on a surface of a key, including an identity of the key and an indication of a position of the contact on the surface.
AT&T Intellectual Property I, L.P.

Size adjustable soft activation trigger for touch displays on electronic device

A portable electronic device has a touch screen for displaying a soft activation trigger. The soft trigger may be selected via application of a control gesture on the touch screen to configure the electronic device and enable various functions of the electronic device, such as bar code reading and capturing RFID data.
Datalogic USA, Inc.

Media file processing method and terminal

The present invention provides a method, including: displaying text information, where the text information is associated with the media file; receiving a first gesture and displaying time information, where the time information is associated with a part that is selected by using the first gesture and of the text information; and receiving a second gesture and acquiring a segment that is confirmed by using the second gesture and of the media file. By using the displayed text information, the displayed time information, and the acquired segment that is confirmed by using the second gesture and of the media file, a terminal does not need to install other processing software to implement processing of a gesture..
Huawei Technologies Co., Ltd.

Electronic device and method for receiving and displaying user gesture inputs in accordance with one of multiple operation types

An electronic device is disclosed. The electronic device may include a touch display unit that receives and displays user gesture inputs.
Lenovo (Beijing) Limited

Dual-mode touch sensing method and stylus and touch panel for the same

A dual-mode touch sensing method adapted for a stylus and a touch panel comprising n first signal lines and m second signal lines. The method comprises: sequentially controlling the n first signal lines to emit n corresponding pulse signals in n gesture periods in a scanning period, receiving m gesture feedback signals corresponding to the pulse signals via the m second signal lines in each among the n gesture periods, selectively generating a gesture signal based on the gesture feedback signals, determining a stylus period other than the n gesture periods in the scanning period by the stylus, generating a stylus signal in the stylus period by the stylus, and receiving the stylus signal and generating a stylus touching signal accordingly by the touch panel..
Silicon Integrated Systems Corp.
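The time-division idea above (n gesture periods, each driven by one first signal line, plus a separate stylus period within the same scanning period) can be sketched as a slot schedule; the slot durations are arbitrary example values, not taken from the application.

def build_scan_schedule(n_first_lines: int, slot_us: int = 100, stylus_slot_us: int = 200):
    """Return a list of (start_us, duration_us, label) slots for one scanning period."""
    schedule = []
    t = 0
    for line in range(n_first_lines):
        # In gesture slot `line`, first signal line `line` emits a pulse and all
        # m second signal lines are sampled for gesture feedback.
        schedule.append((t, slot_us, f"gesture_line_{line}"))
        t += slot_us
    # The stylus transmits only in the remaining slot, so it never collides with
    # the panel's own gesture pulses.
    schedule.append((t, stylus_slot_us, "stylus"))
    return schedule

for start, dur, label in build_scan_schedule(4):
    print(f"{start:4d}us +{dur}us  {label}")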

Mobile terminal and method for controlling the same

Disclosed are a mobile terminal capable of controlling a head mounted display (HMD) and a controlling method thereof. The mobile terminal includes: a sensor; a communication unit capable of communicating with a head mounted display configured to display a virtual reality screen; and a controller capable of: controlling the head mounted display to display the virtual reality screen including first content; and when a preset gesture input corresponding to a user's head or pupil movement is detected via the sensor, controlling the head mounted display to display locked second content on the virtual reality screen while a worn state of the head mounted display is maintained.
LG Electronics Inc.

Electronic device, control method, and storage medium

A control method according to one aspect of the present disclosure is a control method for controlling an electronic device. The control method includes the steps of performing a notification on an event that occurs, detecting a response operation to the notification, outputting, when the response operation is a single-touch gesture, information on the event in a first method, and outputting, when the response operation is a multi-touch gesture, the information on the event in a second method.
Kyocera Corporation

Method and device for controlling operation component based on gesture

The present disclosure relates to methods and apparatus for controlling an operation component based on a gesture, as well as computer programs, computer readable media, and devices regarding same. An illustrative method may include detecting first position coordinates of an icon corresponding to motion sensing in a current interface; when spatial gesture information of a human body gesture for triggering an operation component in the current interface is detected, analyzing the spatial gesture information to determine the operation component corresponding to the spatial gesture information and second position coordinates of the operation component in the current interface; when the first position coordinates have an intersection with the second position coordinates, setting the icon corresponding to motion sensing to an overlay state.
Le Shi Zhi Xin Electronic Technology (Tianjin) Limited

User-input apparatus, method and program for user-input

User-input apparatus for head gesture based control comprising: a camera for recording a head of a user; a database for defining a plurality of gestures of the head of the user and for relating at least some of the defined gestures to a corresponding user command; a detector for detecting one of the gestures defined in the database in the recording of the camera; a controller for identifying a user command in the database related to the detected gesture and for giving out the user command related to the detected gesture; wherein one of the defined gestures of the head is a switching gesture and one of the user commands is a switching command related to the switching gesture, wherein, when the switching command is given out from the controller, the user-input apparatus switches between a use mode for the head gesture based control and a settings mode.. .

Athletic training, data collection, dynamic, and personified sporting method, apparatus, system, and computer program product

According to one exemplary embodiment, a computer-implemented personified sporting apparatus, system, method and/or computer program product may provide electronically and programmably controlled personification attributes of a sport, as well as one or more sensors to identify sensor data relating to interactions with the sport. Certain embodiments include robotic and/or animatronically moveable appendages, and/or emotions and gestures, via audio- and visual-enhanced features.

Television user interface

A user interface for a television display includes a remote control with a touch pad. The user controls movement of an indicator by means of gestures and/or other actions performed on the touch pad.
Sky CP Limited

Platform for enabling remote services

A platform receives service requests from a principal. The service request includes a service location that is remote from the principal.
Uzoom, Inc.

Dynamic transaction card protected by gesture and voice recognition

Disclosed is a dynamic transaction card that includes a transaction card having a number of layers, each of which may be interconnected to one another. For example, a dynamic transaction card may include an outer layer, a potting layer, a sensor layer that may be utilized to activate the dynamic transaction card by authenticating the card user as authorized to use the card through user authentication input recognition, which may be gesture and voice recognition processing, a display layer (including, for example, LEDs, a dot matrix display, and the like), a microcontroller storing firmware, Java applets, Java applet integration, and the like, an EMV™ chip, an energy storage component, one or more antennas (e.g., Bluetooth™ antenna, NFC antenna, and the like), a power management component, a flexible printed circuit board (PCB), a chassis, and/or a card backing layer.
Capital One Services, LLC

Body relationship estimation method and apparatus

A body relationship estimation method and apparatus are disclosed. The method includes obtaining a target picture, calculating a first body relationship feature of two persons according to at least one of first location information of a body part of each person of the two persons in the target picture or second location information of body parts of the two persons, where the first location information is obtained by performing single-person gesture estimation on each person, and the second location information is obtained by performing two-person joint gesture estimation on the two persons when the first location information indicates that the body parts of the two persons overlap, and determining a body relationship between the two persons according to the first body relationship feature..
Huawei Technologies Co., Ltd.

Systems and methods for virtually weighted user input elements for performing critical actions

In an example implementation of the disclosed technology, a method includes receiving an indication of a gesture of an input object moving, at a rate of movement, from a first location of a presence-sensitive input device toward a second location of the presence-sensitive input device. The method also includes, responsive to determining that the rate of movement does not exceed a predetermined rate of movement, outputting, for display, a visual indicator moving from a first location of a display toward a second location of the display.
Google Inc.

Method and processing new message associated with application

The present invention discloses a method applied to a portable electronic device including a display and multiple application programs, where the display includes a touch-sensitive surface and a display screen. The method includes: displaying a first application interface element in a first area of the display screen, where the first application interface element corresponds to a first application program; displaying a second application interface element in a second area of the display screen, where the second application interface element indicates that a new message corresponding to the first application program is generated, and the second area and the first area at least partially overlap; detecting a first gesture; and displaying the second application interface element in a third area of the display screen in response to the first gesture, where the third area and the first area do not overlap.
Huawei Technologies Co., Ltd.

System and method for note taking with gestures

A system, method and computer program product for use in editing digital documents with handwriting input to a computing device are provided. The computing device is connected to an input device in the form of an input surface.
MyScript

Method and system of gesture recognition in touch display device

Disclosed are a method and system of gesture recognition in a touch display device, which can predict gesture inputs that a user may make before the user's touch input is complete, and enable a display unit to display all possible similar gesture inputs so as to provide instruction (or navigation guidance) for the user. Thus, when using a large-sized touch display device, the user does not have to perform touch operations across the whole screen of the display device, because the system can recognize the similar gesture inputs in advance, which makes it easier for the user to operate the touch display device, thereby providing a better user experience.
Shenzhen China Star Optoelectronics Technology Co., Ltd.

User interface for point of sale device

Devices and techniques are disclosed for identifying user selection actions on a POS device and, based on the action, changing the default color of a numpad to a different default color associated with the quick key for that action, while at the same time changing the default number, key, and symbol arrangements on the numpad to the default number, key, and symbol arrangements of the quick key action. In another aspect, the quick key actions include at least one of a cash value discount selection key and a percent cash value selection key.
Intale Inc

Visual language for human computer interfaces

Embodiments of the invention recognize human visual gestures, as captured by image and video sensors, to develop a visual language for a variety of human computer interfaces. One embodiment provides a method for recognizing a hand gesture positioned by a user hand.
FastVDO LLC

Video display device and operating method thereof

Provided is an operating method of a video display device. The method includes: obtaining an image of a user of the video display device; generating a plurality of gesture areas for the user from the obtained image; recognizing a position of a gesture object of the user from the plurality of generated gesture areas; recognizing a gesture of the user from the obtained image; and performing a control operation corresponding to the position of the recognized gesture object and the recognized gesture.
LG Electronics Inc.

Method of giving a movement instruction to an object in a virtual space, and program therefor

To give various movement instructions by body gestures to an object in a virtual space, a method includes detecting movements of controllers held by both hands of a user. The method further includes determining a first movement instruction based on the movements of the controllers.
Colopl, Inc.

Secured and noise-suppressed multidirectional gesture recognition

The subject matter disclosed herein relates to detecting unidirectional or multidirectional movement(s) or gesture(s) made by moving object(s). Aspects of the disclosure pertain to a system and method for mining real-time deviation in illuminance of light reflected off moving object(s) to detect movement(s) or gesture(s) made by the moving object(s).

Method and processing gestures

A system that incorporates the subject disclosure may include, for example, a processor to perform operations including sensing a gesture performed by an object in a vicinity of a sensor, associating the gesture with the at least one gaming action responsive to determining that the gesture is a new gesture not previously associated with at least one gaming action of a plurality of gaming actions that control presentations produced by a gaming application, sensing by way of the sensor a subsequent instance of the gesture by the object or a different object, obtaining the at least one gaming action associated with the gesture responsive to detecting the subsequent instance of the gesture, and providing the at least one gaming action to the gaming application. Other embodiments are disclosed..
SteelSeries ApS

Automated identity assessment method and system

A method, system and software for assessing an entity (15) at a first user terminal (13) connected to a data network (10). A control system (11) is used to receive an access request (101) from the entity (15) or an assessing user (16) at a second user terminal (14).

Method of performing multi-modal dialogue between a humanoid robot and user, computer program product and humanoid robot for implementing said method

A method of performing dialogue between a humanoid robot and user comprises: i) acquiring input signals from respective sensors, at least one being a sound sensor and another being a motion or image sensor; ii) interpreting the signals to recognize events generated by the user, including: the utterance of a word or sentence, an intonation of voice, a gesture, a body posture, a facial expression; iii) determining a response of the humanoid robot, comprising an event such as: the utterance of a word or sentence, an intonation of voice, a gesture, a body posture, a facial expression; iv) generating, an event by the humanoid robot; wherein step iii) comprises determining the response from events jointly generated by the user and recognized at step ii), of which at least one is not words uttered by the user. A computer program product and humanoid robot for carrying out the method is provided..

Cross-platform data visualizations using common descriptions

The present invention extends to methods, systems, and computer program products for cross-platform data visualizations using common descriptions. Embodiments of the invention provide mechanisms for simplifying software development and enhanced code reliability.

Electronic device for performing payment and operating the same

An electronic device, according to an example embodiment of the present disclosure, may include: a communication module that includes an antenna; one or more sensors; and a processor, wherein the processor is configured to: obtain a gesture input with respect to the electronic device using the one or more sensors; perform a function related to payment if the gesture input satisfies a specified condition; and suppress the execution of the function if the gesture input does not satisfy the specified condition.. .

Memory facilitation using directed acyclic graphs

Memory facilitation using directed acyclic graphs is described, for example, where a plurality of directed acyclic graphs are trained for gesture recognition from human skeletal data, or to estimate human body joint positions from depth images for gesture detection. In various examples directed acyclic graphs are grown during training using a training objective which takes into account both connection patterns between nodes and split function parameter values.

3D IR illumination for iris authentication

A system and method for iris authentication in an electronic device employ a presence detection sensor to detect when an object such as a user is close to the device. Thereafter, an array of gesture recognition IR (infrared) LEDs (light-emitting diodes) is activated, and their reflections are used to determine the distance and location of the user with respect to the electronic device.

Braille data entry using continuous contact virtual keyboard

A first touch gesture is sensed at a subset of a set of six braille dot touch points at a virtual braille keyboard. The first touch gesture corresponds to a braille character.

Method for changing user-originating information through interaction between mobile device and information display device

A method for automatically changing information originating from at least either of a first user and a second user by using a mobile device includes steps of: (a) a first device searching second devices as a target to perform interaction and then selecting a specific second device among the searched second device to change the user-originating information; (b) the first device transmitting to, or receiving from, the specific second device data related to the interaction, if a touch gesture is detected in the first device; and (c) at least either of the first and the second devices allowing a server to update the information originating from at least either of the first user and the second user by referring to the transmitted data related to the specific interaction.. .

Device, method, and graphical user interface for performing a gesture in an area to replace a display of application launch icons in the area with a display of information customized to a user while maintaining a display of application launch icons in a different area

A computing device with a touch screen display displays a first set of a first plurality of icons in a first area of the touch screen display, the first plurality of icons including a plurality of sets of icons that are separately displayed in the first area of the touch screen display, the first plurality of icons including application launch icons; displays a second plurality of icons in a second area on the touch screen display, the second plurality of icons including application launch icons, wherein the second area is different from the first area; detects a finger swipe gesture in the first area; and, in response, replaces display of the first set of the first plurality of icons with display of a second set of the first plurality of icons in the first area on the touch screen display, while maintaining the display of the second plurality of icons in the second area on the touch screen display.. .

Gesture recognition and control based on finger differentiation

An embodiment of a computer implemented method of performing a processing action includes detecting an input from a user via an input device of a processing device, the input including a touch by at least one finger of a plurality of fingers of the user, estimating a gesture performed by the at least one finger based on the touch, measuring at least part of a fingerprint of the at least one finger, and identifying the at least one finger used to apply the input by the user based on stored fingerprint data that differentiates between individual fingers of the user. The method also includes identifying an action to be performed based on the estimated gesture and based on the identified at least one finger, and performing the action by the processing device..

Recognizing gestures and updating display by coordinator

A non-transitory computer-readable storage medium may comprise instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause a computing device to implement at least a user interface module and a coordinator module.

Processing for distinguishing pen gestures and dynamic self-calibration of pen-based computing systems

Devices, methods, and computer-readable media process to distinguish user input device gestures, such as gestures input via a pen in a pen-based computing system, e.g., to quickly and reliably distinguish between electronic ink entry, single taps, double taps, press-and-hold actions, dragging operations, and the like. The devices, methods, and computer-readable media process quickly and reliably distinguishes between input device gestures by utilizing a gesture profile that includes a preferential input type, e.g.

Methods and devices for detecting intended touch action

Methods and devices are disclosed for determining that a touch gesture is validly detected. In one embodiment, a method for detecting the touch gesture comprises: generating by a first sensing unit a first sensing parameter upon a touch action on the touch input interface; determining by the first sensing unit a first touch gesture corresponding to the first sensing parameter; acquiring by the first sensing unit a second touch gesture corresponding to a second sensing parameter generated by a second sensing unit; and determining that the touch gesture has been detected and is valid by confirming that the first touch gesture and the second touch gesture are the same; wherein the first and second sensing units are interconnected and are respectively one and the other of a touch control processor for the touch input interface and a motion sensor.
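The validation rule above reduces to requiring agreement between the gesture label derived by the touch control processor and the one derived by the motion sensor; a minimal sketch follows, with the two classifiers left as placeholder callables rather than the actual sensing units.

def validate_touch_gesture(touch_gesture: str, motion_gesture: str) -> bool:
    """A gesture counts as validly detected only when both sensing units agree."""
    return touch_gesture == motion_gesture

def on_touch_event(touch_params, motion_params, classify_touch, classify_motion):
    first = classify_touch(touch_params)     # from the touch control processor
    second = classify_motion(motion_params)  # from the motion sensor
    if validate_touch_gesture(first, second):
        return first      # accept, e.g. "swipe_left"
    return None           # reject as unintended (e.g. the device was jostled in a pocket)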

Gesture recognition method, apparatus and wearable device

A method and an apparatus for gesture recognition and a wearable device for gesture recognition are described. A method of gesture recognition involves using a processor to detect a motion artifact from an output signal of a biosignal sensor and generating a control signal to control a function of a target device that corresponds to a reference signal pattern in response to a signal pattern of the detected motion artifact corresponding to the reference signal pattern..

Client device motion control via a video feed

An approach is described for enabling motion control of a client device, such as a mobile device, via a video feed transmitted from one or more video capture devices. An associated method may include establishing, via a communications network, a communication session between a client device and one or more video capture devices.

Determination of hand dimensions for hand and gesture recognition with a computing interface

Hand dimensions are determined for hand and gesture recognition with a computing interface. An input sequence of frames is received from a camera.

Methods and systems for enabling gesture control for a vehicle feature

A vehicle gesture system includes a processor connected to a transceiver and programmed to detect a wireless device associated with a vehicle feature settings interface for a first vehicle feature. The processor is further programmed to detect the wireless device based on received user input at the vehicle feature settings interface of the first vehicle feature.

Gesture recognition and control based on finger differentiation

An embodiment of a computer implemented method of performing a processing action includes detecting an input from a user via an input device of a processing device, the input including a touch by at least one finger of a plurality of fingers of the user, estimating a gesture performed by the at least one finger based on the touch, measuring at least part of a fingerprint of the at least one finger, and identifying the at least one finger used to apply the input by the user based on stored fingerprint data that differentiates between individual fingers of the user. The method also includes identifying an action to be performed based on the estimated gesture and based on the identified at least one finger, and performing the action by the processing device..

Method of providing tactile feedback and apparatus

A method includes detecting a query gesture and actuating, in response to the query gesture, an actuator to provide tactile feedback including information associated with the query gesture. The query gesture may be detected on a touch-sensitive display of a portable electronic device..

Wearable content navigation device

Techniques, systems, and methods are disclosed to implement a wearable content navigation device configured to interact with one or more media devices through gestures. A mobile device may receive motion data indicating movement of a wearable device.

Apparatus for detecting electromagnetic field change in response to gesture

Embodiments of the present disclosure provide techniques and configurations for an apparatus for detection of a change of electromagnetic field in response to a gesture, to identify the gesture that caused the field change. In one instance, the apparatus may include a first conducting component having first features for the disposal on or around a portion of a user's body, to generate an electromagnetic field in response to a receipt of a source signal.

Touchless gesture recognition for elevator service

A method for provisioning elevator service includes sensing, by a gesture interface comprising a sensor, a region in proximity to the gesture interface to obtain data; determining, by the gesture interface, that a first pattern in the data corresponds to a first gesture; and initiating a request for elevator service in response to determining that the first pattern corresponds to the first gesture.

Assigning gesture dictionaries

Techniques for assigning a gesture dictionary in a gesture-based system to a user comprise capturing data representative of a user in a physical space. In a gesture-based system, gestures may control aspects of a computing environment or application, where the gestures may be derived from a user's position or movement in a physical space.

Leader and follower management system for wearable devices

A wearable computing device identifies a gesture made by the user of the wearable computing device. The wearable computing device sends an offer to a second wearable computing device, where the offer is to follow the user.
International Business Machines Corporation

System, device, and method of detecting user identity based on motor-control loop model

Device, system, and method of detecting identity of a user based on motor-control loop model. A method includes: during a first session of a user who utilizes a pointing device for interacting with a computerized service, monitoring the pointing device dynamics and gestures of the user; based on the monitored dynamics and gestures, estimating parameters that characterize a sensorimotor control loop model of the user; storing in a database a record indicating that the user is associated with the parameters that characterize the sensorimotor control loop model of the user.
Biocatch Ltd.

Gesture-based object measurement method and apparatus

In the field of man-machine interaction technologies, a gesture-based object measurement method and apparatus are provided to improve measurement efficiency. According to this method, after image information is collected, contour information of a to-be-measured object is automatically extracted and partitioned off, and a measurement parameter value such as a length, an area, or a volume is calculated on this basis.
Huawei Technologies Co., Ltd.

Gesture recognition method and virtual reality display output device

Disclosed are a gesture recognition method for a virtual reality display output device and a virtual reality display output electronic device. The recognition method includes: acquiring first and second videos from first and second cameras respectively; separating, from the first and second videos respectively, first and second plane gestures associated with first and second plane information of first and second hand graphs; converting the first plane information and the second plane information into spatial information using binocular imaging, and generating a spatial gesture including the spatial information; acquiring an execution instruction corresponding to the spatial gesture; and executing the execution instruction.
Le Shi Zhi Xin Electronic Technology (tianjin) Limited

Behavior based authentication for touch screen devices

A method, system, and one or more computer-readable storage media for behavior based authentication for touch screen devices are provided herein. The method includes acquiring a number of training samples corresponding to a first action performed on a touch screen of a touch screen device, wherein the first action includes an input of a signature or a gesture by a legitimate user.
Microsoft Technology Licensing, Llc

Tap to initiate a next action for user requests

Embodiments may relate to intuitive user-interface features for a head-mountable device (HMD), in the context of a hybrid human and computer-automated response system. An illustrative method may involve a head-mountable device (HMD) that comprises a touchpad: (a) sending a speech-segment message to a hybrid response system, wherein the speech-segment message is indicative of a speech segment that is detected in audio data captured at the HMD, and wherein the speech segment is associated with a first user-account with the hybrid response system, (b) receiving a response message that includes a response to the speech-segment message and an indication of a next action corresponding to the response to the speech-segment message, (c) displaying a screen interface that includes an indication of the response, and (d) while displaying the response, detecting a singular touch gesture and responsively initiating the at least one next action.
X Development Llc

Enlarging or reducing an image on a display screen

A method, and associated apparatus and system and program product, for enlarging or reducing an image. The image is displayed on a display screen.
International Business Machines Corporation

Portable device and controlling screen thereof

A portable device and a method for controlling a screen thereof, which move a displayed screen corresponding to a movement distance of a touch gesture detected from a touch screen, are provided. In an aspect, the portable device and method move the displayed screen corresponding to the movement distance of the touch gesture detected from the touch screen and a stored setting.
Samsung Electronics Co., Ltd.

User account switching interface

Example implementations relate to switching user accounts. For example, a method includes displaying, at a display of a mobile device, a first user account.
Hewlett-packard Development Company, L.p.

Method and system for controlling an illumination device and related lighting system

A method for controlling an illumination device is provided. The method includes obtaining an image of an illumination device, thereby capturing an illumination pattern generated by the illumination device based on a visible light communication technique.
General Electric Company

Data entering method and terminal

Disclosed are a data inputting method and terminal. The terminal includes: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to: extract data information from a capturing object; identify an operation gesture of a user, and input the extracted data information into a target region according to an inputting manner corresponding to the identified operation gesture, wherein the inputting manner comprises an application program to be inputted and an input format.
Zte Corporation

Method and system for processing composited images

The disclosure is related to a method and a system for processing composited images. The method is operated in a computer.
Framy Inc.

Image processing apparatus, image processing method, and program

Provided is an image processing apparatus including a hand shape recognition unit that performs hand shape recognition on an input image to detect a position and a size of a hand with a specific shape in the input image, a determination region setting unit that sets a region in a vicinity of the hand on the input image as a determination region used to recognize a gesture performed using the hand, based on the position and the size of the hand, and a gesture recognition unit that recognizes the gesture by monitoring movement of the hand to the determination region.
Sony Interactive Entertainment Inc.

Three-dimensional computer-aided-design system user interface

A three-dimensional (3D) computer-aided design (CAD) user interface (UI) is described using both two-handed and one-handed free hand gestures and poses that map to actions in the 3D CAD UI environment. Free hands may be used to directly both constrain and organically modify an object.

Image display apparatus and operation method thereof

A method of operating the video display device according to the embodiment of the present invention comprises acquiring a video of a video display device associated with a user; recognizing a gesture of the user in the acquired video; calculating a control amount with regard to the gesture based on the recognized gesture; and performing the control operation corresponding to the recognized gesture and the calculated control amount.
Lg Electronics Inc.

Method and apparatus for detecting hand gestures with a handheld controller

A method for detecting a user's hand gestures with a handheld controller. The method includes monitoring a first sensor and a second sensor.
Oculus Vr, Llc

Tactile sensation control system and tactile sensation control method

It is an object of the invention to provide a tactile sensation control system and a tactile sensation control method. The system includes: a processor to execute a program; and a memory to store the program which, when executed by the processor, performs processes of: acquiring operation area information on at least one operation area for operation by the user on the operation surface and on an operation type corresponding to the operation area, and controlling the tactile sensation on the operation surface so that the operation area in the acquired operation area information causes the user to have a tactile sensation according to the operation type corresponding to the operation area.
Mitsubishi Electric Corporation

Method for intelligently controlling controlled equipment and device

The present disclosure provides a method for intelligently controlling controlled equipment, which includes the following steps: a current scene is monitored; when a user exists in the current scene, scene information of the current scene is acquired, and the current scene is recognized according to the scene information to acquire a scene recognizing result; action information of the user is acquired, and the action information is recognized to acquire an action recognizing result; a controlled equipment matched with the scene recognizing result and the action recognizing result is controlled according to the scene recognizing result and the action recognizing result. The present disclosure further provides a device for intelligently controlling a controlled equipment.
Shenzhen Skyworth-rgb Electronic Co., Ltd.

Light control systems and methods

Provided is a light-emitting device control system, comprising a beam steering mechanism that directs a beam of light at a first surface location, wherein an illumination region is formed at the first surface location in response to the directed beam of light, a sensor that recognizes a hand gesture at the illumination region; a processor that converts data related to the hand gesture into a command signal, and a controller that instructs the beam steering mechanism to move the illumination region to a second surface location in response to the command signal corresponding to the hand gesture.

Methods and systems for controlling vehicle body motion and occupant experience

In one embodiment, one or more suspension systems of a vehicle may be used to mitigate motion sickness by limiting motion in one or more frequency ranges. In another embodiment, an active suspension may be integrated with an autonomous vehicle architecture.
Levant Power Corporation

Systems and methods for control device including a movement detector

An example image processing system and method uses a control device including a movement detector. Gesture inputs corresponding to a gesture made by moving the control device are used for animation.
Nintendo Co., Ltd.

Automated discovery and launch of an application on a network enabled device

A method, apparatus and system related to automated discovery and launch of an application on a network enabled device are disclosed. In one embodiment, a method of a client device includes determining that a networked media device sharing a local area network common with the client device has automatically detected an audio-visual data and/or an application currently being accessed by a user of the client device.

Visual task feedback for workstations in materials handling facilities

Visual task feedback for workstations in a materials handling facility may be implemented. Image data of a workstation surface may be obtained from image sensors.
Amazon Technologies, Inc.

Multi-application viewing

In one example implementation, a computing device displays a first application window and second application window adjacent to each other in a multi-application viewing area. The computing device detects a gesture in either an upper gesture detection area or a lower gesture detection area and controls either the first application window or second application window based on the gesture.
Hewlett-packard Development Company, L.p.

Slider and gesture recognition using capacitive sensing

Conventional user interfaces for sensing gestures often require physical touching of a sensor pad, or an area filled with sensor pads. These conventional sensor pads take up precious real estate on a compact mobile device, interfere significantly with other components of the mobile device, complicate design, consume power, and add cost to the final product.
Analog Devices, Inc.

Software design tool for a user interface and the administration of proximity responsive information displays in augmented reality or virtual reality environments

A method, computer program product, and system for creating and modifying an information display for an AR/VR environment, including for a display containing functional content to be responsive without the user needing to know any computer coding or web-development languages. The method, computer program product, and system allow the user to create a multitude of different tile arrangements and configurations through a simple user interface, and to position them so that they are responsive based on inputs such as apparent proximity of the viewer to the information display within the AR/VR environment or end-user gestures or verbal commands.
Livetiles Llc

Natural user interface for selecting a target element

Selecting an intended target element via gesture-dependent hit testing is provided. Aspects provide for receiving a gesture input on or proximate to a selection handle and neighboring content; performing a hit test for determining whether a gesture input contact area and a selection handle hit target area overlap and/or exceed an upper limit overlap value; performing gesture recognition for determining whether the gesture input is a static or a manipulation gesture; selecting an intended target element based on at least one of the results of the hit test and the gesture recognition; and manipulating the intended target element in accordance with the manipulation gesture.
Microsoft Technology Licensing, Llc.

Expandable application representation, activity levels, and desktop representation

Expandable application representation techniques are described. The techniques may include support of an expandable tile that may function as an intermediary within a root level (e.g., start menu or screen) of a file system.
Microsoft Technology Licensing, Llc

Optical sensor module utilizing optical designs to adjust gesture sensitive region, and related mobile apparatus

An optical sensor module is provided. The optical sensor module includes a light source, a first lens and a sensor device.
Eminent Electronic Technology Corp. Ltd.

Cursor mode switching

Methods and systems are disclosed for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
Leap Motion, Inc.

Method for controlling and calibrating a device with a gesture

A sensor of a control system can be controlled by commands based on knocking gestures. The method includes installing a housing with a sensor, generating a contact interaction, detecting the data signals corresponding to the contact interaction, determining a status data pattern, matching the detected profile with a status gesture profile associated with a command to switch the sensor to a listening status, and controlling terminal devices when the sensor is in the listening status.
Swan Solutions Inc.

Methods and systems for positioning, navigating, and manipulating documents

A computer-based method, non-transitory computer-readable medium, and system include performance of actions by a processor operating within a computing device. The actions include displaying a document; observing, using at least one sensor, at least one object associated with a user; determining, based on the observation, whether the at least one object performs one of multiple pre-defined gestures or movements; and, upon determination that the at least one object has performed one of the pre-defined gestures or movements, manipulating the document based on a pre-defined response to the one of the pre-defined gestures or movements.
Global Graphics Software Limited

Recognition system for sharing information

A system and method for sharing information between users based on recognition of the users and their associated processing devices in a scene. Interactions can be physical, verbal or a combination of physical and verbal gestures.
Microsoft Technology Licensing, Llc

Methods and systems for defining gestures for a user interface

A method performed at an electronic device with a utility for prototyping a user interface having one or more layers includes, in the utility: for each image of one or more images in the user interface, selecting an image patch, selecting a layer patch, and coupling an image output of the image patch to an image input of the layer patch; selecting a gesture patch and specifying a gesture for the gesture patch; coupling an output of the gesture patch to an input of a first layer patch; generating the user interface for display in accordance with the couplings; receiving user-interaction data for manipulating the user interface, the user-interaction data corresponding to the gesture; and in response to the user-interaction data, updating display of the user interface in accordance with the user-interaction data and the gesture patch as coupled to the first layer patch.
Facebook, Inc.

Method and system for audible delivery of notifications partially presented on an always-on display

An electronic device includes one or more processors, a touch-sensitive display and an audio output operative in a first mode to produce an audible output at a first output level, and operable in a discreet mode to produce the audible output at a second output level that is less than the first output level. The one or more processors can present a user actuation target or a portion of a notification at a location on the touch-sensitive display.
Motorola Mobility Llc

Helmet with blind spot assistant function

A helmet with blind spot assistant function includes a helmet body, a gesture sensation unit, at least one camera unit, a display unit and a control unit. According to a gesture sensation signal generated by the gesture sensation unit, the control unit generates a control signal to control and activate the camera unit to generate video information for the display unit to display the video information.
Jarvish Inc.

Devices, systems, and methods for detecting gestures using multiple antennas and/or reflections of signals transmitted by the detecting device

Examples described herein may detect gestures using multiple antennas and/or using reflected signals transmitted by the device which is also detecting the gesture. Multiple antenna detection may allow for classification of 3d gestures around a device.
University Of Washington

Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device

A method for scanning and obtaining three-dimensional (3d) coordinates is provided. The method includes providing a 3d measuring device having a projector, a first camera and a second camera.
Faro Technologies, Inc.

Hearing assistance system and method

A hearing assistance system having a hand-held audio signal wireless transmission unit for transmitting an audio signal to at least one receiver unit, and a hearing stimulating device. The transmission unit has at least one microphone for generating an audio signal, a motion sensor unit for sensing the acceleration acting on the transmission unit in three orthogonal axes and sensing the orientation of the transmission unit in space, a memory unit for storing a plurality of motion patterns of the transmission unit corresponding to gestures of a person holding the transmission unit in a hand, a control unit for identifying a motion pattern of the transmission unit from the stored motion patterns by analyzing the time dependence of an output signal of the motion sensor unit and comparing it to the stored motion patterns, and the control unit controlling operation of the transmission unit according to the identified motion pattern.
Sonova Ag

Tutorial model comprising an assistance template

Disclosed is a tutorial model including at least one frame provided with an assistance template for assisting in the recording of the frame, the recording of the frame being a rush that can be incorporated into a framework for a film or series of images or image; the invention also relates to a film or series of images or image which includes a framework and rushes incorporated into the framework, the rushes being produced from such a model; a method for designing a personalised film or series of images or image; a tutored coaching method for helping a user to record frames that can be incorporated into a framework; and a tutored coaching method for helping a user to learn professional gestures.

Virtual reality text manipulation

A virtual reality (VR) method and system for text manipulation. According to an exemplary embodiment, a VR method displays and provides a user with interaction with displayed documents in a virtual environment (VE), the VE including a VR head-mounted display and one or more gesture sensors.
Xerox Corporation

Method and server for paying commodity using device

Disclosed herein is a method for paying for a commodity using a device. The method may include the steps of (a) obtaining commodity information by scanning a barcode of a commodity, (b) recognizing a first gesture of a user, (c) sending the obtained commodity information to a server in response to the recognized first gesture, and (d) receiving purchase information about the commodity whose commodity information has been obtained from the server and outputting the purchase information.
Sk Planet Co., Ltd.

Method and apparatus for detecting an error in gesture recognition

A method and apparatus for detecting an error in gesture recognition are provided. The method includes sensing whether an effective gesture occurs in a first area for gesture recognition of a user; setting a second area and sensing an occurrence of an event due to a movement of the user, based on a result of the sensing in the first area; and detecting the error in the gesture recognition based on whether the occurrence of the event is sensed in the second area.
Samsung Electronics Co., Ltd.

Methods, apparatuses, and devices for processing interface displays

Methods and apparatuses are provided for processing interface displays. The disclosed methods and apparatuses may detect a gesture operation on a current interface of a computing device.
Guangzhou Ucweb Computer Technology Co., Ltd.

Terminating computing applications using a gesture

In general, this disclosure is directed to techniques for outputting, by a computing device and for display, a graphical user interface of an application currently executing at the computing device (582). A presence-sensitive input device detects two gestures (584, 588).
Google Inc.

Touch point recognition method and apparatus

A touch point recognition method and apparatus pertain to the field of touchscreen technologies. The touch point recognition method includes: determining, by a touchscreen terminal, a touch point invalidation policy; obtaining a touch event on a touchscreen; determining a touch point of a touch gesture within a touchscreen edge area according to the touch event; and recognizing an invalid touch point of the touch gesture within the touchscreen edge area according to the touch point of the touch gesture within the touchscreen edge area and the touch point invalidation policy.
Huawei Technologies Co., Ltd.
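
One way to picture the edge-area invalidation policy described above is a rule that discards short, nearly stationary touches confined to a screen-edge strip. The sketch below is a guess at such a policy; the edge width, travel, and duration thresholds are invented values, not taken from the application.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float
    y: float
    duration_ms: float   # how long the touch stayed down
    travel_px: float     # total distance the touch moved

def in_edge_area(p, width, height, edge_px=24):
    """True if the touch point lies inside an assumed 24 px edge strip."""
    return (p.x < edge_px or p.x > width - edge_px or
            p.y < edge_px or p.y > height - edge_px)

def is_invalid_edge_touch(p, width, height,
                          max_travel_px=10, max_duration_ms=400):
    """Assumed policy: a touch confined to the edge strip that barely moves
    and is short-lived is treated as an accidental (invalid) touch."""
    return (in_edge_area(p, width, height)
            and p.travel_px <= max_travel_px
            and p.duration_ms <= max_duration_ms)
```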

Method and device for preventing accidental touch of terminal with touch screen

Aspects of the disclosure provide a method for preventing an accidental operation of a terminal device due to an accidental touch on a touch screen. The method includes detecting status information of the terminal device, detecting a touch gesture on the touch screen, determining, according to a pre-established relationship between status information of the terminal device and motion status information of touch gestures for touch behaviors, whether the status information of the terminal device and the touch gesture correspond to a same touch behavior, and when the status information of the terminal device and the touch gesture correspond to the same touch behavior, executing an operation associated with the touch behavior.
Xiaomi Inc.

System and method for 3D position and gesture sensing of a human hand

A three dimensional touch sensing system having a touch surface configured to detect a touch input located above the touch surface is disclosed. The system includes a plurality of capacitive touch sensing electrodes disposed on the touch surface, each electrode having a baseline capacitance and a touch capacitance based on the touch input.
The Trustees Of Princeton University

Method and apparatus for adjusting an orientation, and electronic device

This application relates to the field of display technologies, and provides a method and an apparatus for adjusting an orientation, and an electronic device. The method comprises: in response to a first rotation operation performed by a user on an electronic device, performing the first rotation operation on an orientation of content displayed on the screen of the electronic device; determining whether an orientation of the screen of the electronic device is consistent with the orientation of the content after the first rotation operation is performed; and in response to that the orientation of the screen of the electronic device is inconsistent with the orientation of the content after the first rotation operation is performed, performing a second rotation operation on the content.
Beijing Zhigu Rui Tuo Tech Co., Ltd.

Method and apparatus for adjusting an orientation, and electronic device

This application relates to the field of display technologies, and provides a method and an apparatus for adjusting an orientation, and an electronic device. The method comprises: in response to a first rotation operation performed by a user on an electronic device, determining an orientation of content displayed on the screen of the electronic device; after the user performs the first rotation operation on the electronic device, determining whether an orientation of the screen is consistent with the orientation of the content; and in response to that after the user performs the first rotation operation on the electronic device, the orientation of the screen is consistent with the orientation of the content, maintaining the orientation of the content.
Beijing Zhigu Rui Tuo Tech Co., Ltd.

Method to control perspective for a camera-controlled computer

Systems, methods and computer readable media are disclosed for controlling perspective of a camera-controlled computer. A capture device captures user gestures and sends corresponding data to a recognizer engine.
Microsoft Technology Licensing, Llc

Wearable gesture control device and smart home system

The present invention relates to a gesture control method for a smart home system, characterized in that it comprises the steps of: pre-defining each gesture; associating each predefined gesture with a control command for a smart home system; making a gesture and wirelessly transmitting a control command associated with the gesture in accordance with a wireless protocol; receiving the control command and performing a control operation associated with the gesture to the smart home system.
Honeywell International Inc.

Gesture control device and method

A gesture control method is provided. The method includes: obtaining gesture images with depth information; creating a coordinate system; determining coordinates of a center of each camera, a start position and an end position of the gesture; calculating directions and values of a first angle defined from an axle through the end position to a line connecting between the start position and the end position and at least two second angles each defined a vertical axle through a center of a camera to a line connecting the center of the camera and the start position, each second angle corresponding to a camera of an electronic device; and determining an electronic device to be a controlled device, wherein the electronic device corresponds to a second angle in a same direction with the first angle having a minimum absolute difference with the first angle..
Hon Hai Precision Industry Co., Ltd.

Gesture based human machine interface using marker

The present disclosure relates to a system and method for gesture recognition for emulating a mouse for a human machine interface, wherein displacements and directions of displacement of a cursor, as well as double-click actions of a mouse, can be emulated by instinctive hand gestures. The method uses a marker as the gesture interface and therefore does not depend on hand segmentation techniques, which suffer from deficiencies related to lighting conditions, variation of skin color from person to person, and complexities of background.
Centre For Development Of Telematics

Computer-implemented gaze interaction method and apparatus

A computer-implemented method of communicating via interaction with a user-interface based on a person's gaze and gestures, comprising: computing an estimate of the person's gaze comprising computing a point-of-regard on a display through which the person observes a scene in front of him; by means of a scene camera, capturing a first image of a scene in front of the person's head (and at least partially visible on the display) and computing the location of an object coinciding with the person's gaze; by means of the scene camera, capturing at least one further image of the scene in front of the person's head, and monitoring whether the gaze dwells on the recognised object; and while gaze dwells on the recognised object: firstly, displaying a user interface element, with a spatial expanse, on the display face in a region adjacent to the point-of-regard; and secondly, during movement of the display, awaiting and detecting the event that the point-of-regard coincides with the spatial expanse of the displayed user interface element. The event may be processed by communicating a message.
Itu Business Development A/s

System and methods for on-body gestural interfaces and projection displays

A wearable system with a gestural interface for wearing on, for instance, the wrist of a user. The system comprises an ultrasonic transceiver array structure and may comprise a pico projector display element for displaying an image on a surface.
Ostendo Technologies, Inc.

Gesture-controlled MR imaging system and method

A magnetic resonance imaging system and method are provided that include user control of certain functions using physical gestures, such as hand motions or the like. The gesture control aspects can include one or more cameras, and a processor configured to detect and recognize gestures corresponding to predetermined commands and to provide signals to execute the commands.
Johns Hopkins University

Gesture-based vehicle-user interaction

A system, for use in implementing a vehicle function based on user gesture, including a hardware-based processing unit and a hardware-based storage device. The storage device includes a user-gesture determination module that, when executed by the hardware-based processing unit, determines a user gesture, made by a user proximate a vehicle, wherein the user gesture is not an under-vehicle user kick.
Gm Global Technology Operations Llc

System and method for executing gesture based control of a vehicle system

A method and system for executing gesture based control of a vehicle system include receiving at least one signal that pertains to at least one muscle movement from sensors disposed within a gear shift knob. The method and system also include determining at least one gesture based on the at least one signal.
Honda Motor Co., Ltd.

Apparatus and method to visually communicate with a vehicle

A visual communication apparatus affixed to a vehicle. The visual communication apparatus includes a smart key system that detects a key fob of a user within a keyless detection zone, a projector projecting visual indications on a projecting zone lying on a ground surface of the vehicle, a sensor optically capturing gestures of the user of the vehicle, and an electrical control unit capable of actuating key elements of the vehicle.
Aisin Technical Center Of America, Inc.

Humanoid robot with an autonomous life capability

A humanoid robot which is capable of surveying its environment, notably to determine when humans are present and to engage in activities with humans corresponding to an evaluation of their desires is provided. An operating system of the robot is configured in the robot to process the information received by extractors (sensors and processing capabilities), to list activities (gestures, dialogs, etc.).
Softbank Robotics Europe

Ultrasound imaging system touchscreen user interface

An ultrasound imaging system (102) includes a probe (104) with a transducer array (106) with at least one transducer element (108). The ultrasound imaging system further includes a console (112) with a controller (124) and an echo processor.
B-k Medical Aps

Ornamental lighting

Enhancements to ornamental or holiday lighting are disclosed including remote control ornamental illumination with color palette control whereby a user can vary the color/intensity/appearance of an individual bulb or entire light string by selecting the electronic address of the bulb and selecting its attribute. Further disclosures include: motion responsive lights which respond to sensed movement, gesture controlled lights, adjustable white color/white LED sets, connectable multi-function lights, controller to sequence lights to music or other input source, rotating projection LED light/tree top/table top unit, and remote controlled sequencing icicle lights and ornament lighting system.
Seasonal Specialties, Llc

Gesture based notification tuning for collaboration systems

A mobile device or a server may be configured to automatically define a customized mute status. Data indicative of a physical movement of the mobile device is received.
Cisco Technology, Inc.

System and method for verifying liveliness

A machine-assisted method for verifying a video presence that includes: receiving, at a computing device of an identity provider, an authentication request initially sent from a requester to access an account managed by a relying party, different from the identity provider; retrieving, from the authentication request, at least a portion of a video stream feed initially from the requester, to the computing device, the portion of video stream feed portraying a face of the requester; extracting the face of the requester from the portion of the video stream feed; providing a directive to the requester soliciting a corresponding gesture; and receiving a response gesture from the requester.
Morphotrust Usa, Llc

Touch-less switching

A light switch network comprises a plurality of light switch units, each comprising a gesture interface to sense a user gesture by receiving at least one gesture signal from a sensing zone, and configured to exchange one or more gesture status signals with at least one other switch unit in the network in relation to the received gesture signal; each switch being enabled, on receiving the gesture signal: in a first mode, to change a designated switch mode and/or state in response to the gesture signal; or in a second mode, to not change the designated switch mode and/or state according to one or more conditions associated with the status signals received from the other switch unit.
Xyz Interactive Technologies Inc.

Systems and methods for combining drawings and videos prior to buffer storage

Systems, methods, and non-transitory computer-readable media can initiate a video capture mode that provides a camera view. A touch gesture can be detected via a touch display.
Facebook, Inc.

Wearable emotion detection and feedback system

A see-through, head mounted display and sensing devices cooperating with the display detect audible and visual behaviors of a subject in a field of view of the device. A processing device communicating with the display and the sensors monitors audible and visual behaviors of the subject by receiving data from the sensors.
Microsoft Technology Licensing, Llc

Image segmentation threshold value deciding method, gesture determining method, image sensing system and gesture determining system

An image segmentation threshold value determining method comprises: defining a plurality of image regions of a first sensing image; determining a first and a second part image segmentation threshold value according to a first and a second image region of the image regions; performing a first and a second image segmentation operation on the first sensing image according to the first and the second part image segmentation threshold values to acquire a first and a second segmented image; and selecting one of the first part image segmentation threshold value and the second part image segmentation threshold value as the first image segmentation threshold value according to the first segmented image and the second segmented image.
Pixart Imaging Inc.

Unlocking electronic devices using touchscreen input gestures

A computer implemented method for detecting input gesture events on a touchscreen of an electronic device and for unlocking the electronic device is disclosed. The method may include displaying, while the electronic device is in a locked state, a plurality of guidance lines on the touchscreen of the electronic device, detecting, during an input gesture event, guidance line crossings and calculating a number of guidance line crossings detected during the input gesture event.
International Business Machines Corporation
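
The unlock decision described in this abstract reduces to counting how many guidance lines an input gesture crosses and comparing the count with the expected value. A minimal sketch of that counting step, assuming vertical guidance lines at fixed x positions (the layout of the lines is an assumption):

```python
def count_line_crossings(gesture_points, guideline_xs):
    """gesture_points: ordered (x, y) samples of the touch gesture.
    guideline_xs: x coordinates of vertical guidance lines.
    Counts each time consecutive samples straddle a guideline."""
    crossings = 0
    for (x0, _), (x1, _) in zip(gesture_points, gesture_points[1:]):
        for gx in guideline_xs:
            if (x0 - gx) * (x1 - gx) < 0:  # opposite sides of the line
                crossings += 1
    return crossings

def should_unlock(gesture_points, guideline_xs, expected_crossings):
    """Unlock only if the gesture produced exactly the expected crossing count."""
    return count_line_crossings(gesture_points, guideline_xs) == expected_crossings
```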

Thematic interactive attraction

The invention is generally directed to an attraction which draws a visitor's attention from other visual attractions by providing a stunning visual perspective, including a collection of curated images organized and facilitated by a human ambassador who provides narrative and visual content and varies the visual presentation in an interactive fashion with an audience, utilizing a gesture-based language to manufacture a bespoke experience.
Legends Attractions, Llc

Custom gestures

In one embodiment, a method includes accessing a social graph that includes user nodes and edges connecting the user nodes; identifying, based on the social graph, a set of second users corresponding to second-user nodes that are within a specified social degree of separation from a first-user node corresponding to a first user; determining, based on the social graph, that a particular feature is enabled on computing devices associated with at least a threshold number of the identified set of second users; and enabling the particular feature on a computing device associated with the first user.
Facebook, Inc.

Disambiguation of multitouch gesture recognition for 3d interaction

A multitouch device can interpret and disambiguate different gestures related to manipulating a displayed image of a 3d object, scene, or region. Examples of manipulations include pan, zoom, rotation, and tilt.
Apple Inc.

Method and system for presenting educational material

A method and system for presenting a lesson plan having a plurality of keyframes is provided. The method includes initializing a fixed variable for a first keyframe; detecting an input from a user at an interactive display device; correlating the input with a gesture from a gesture set and one or more database domains from among a plurality of database domains; displaying an image at the interactive display device when all mutable variables for the first keyframe content are determined based on the input; manipulating the image using a gesture associated with the displayed image and any associated database domains; and transitioning to a next keyframe using animation.
Mind Research Institute

Method and system for interacting with a touch screen

Disclosed is a system, method, and non-transitory computer readable storage medium for monitoring a user interface for a gesture, determining whether the gesture is exerting pressure on a screen of a computing device, when the gesture is exerting pressure on the screen, determining whether the pressure is above a threshold amount of pressure, when the pressure is above the threshold, previewing a content item in a series of content items, monitoring the user interface for a second gesture, determining whether the second gesture is traversing pixels in a predetermined direction on the screen, upon determining that the second gesture is traversing pixels, further determining whether the traversal meets a threshold number of traversed pixels, and when the traversal meets the threshold, displaying a subsequent content item in the series.
Yahoo! Inc.
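
The abstract above describes two thresholds: a pressure threshold that triggers a preview, and a traversed-pixels threshold that advances to the next content item. A compact sketch of that state logic follows; the threshold values and class interface are invented for illustration.

```python
class ContentBrowser:
    """Minimal sketch: preview on a hard press, advance on a long enough swipe."""
    PRESSURE_THRESHOLD = 0.6      # assumed normalized pressure (0..1)
    TRAVERSAL_THRESHOLD_PX = 120  # assumed minimum swipe length in pixels

    def __init__(self, items):
        self.items = items
        self.index = 0
        self.previewing = False

    def on_press(self, pressure):
        # Hard press above the threshold previews the current item.
        if pressure > self.PRESSURE_THRESHOLD:
            self.previewing = True
            return ("preview", self.items[self.index])
        return None

    def on_swipe(self, pixels_traversed):
        # A swipe that traverses enough pixels displays the next item.
        if self.previewing and pixels_traversed >= self.TRAVERSAL_THRESHOLD_PX:
            self.index = min(self.index + 1, len(self.items) - 1)
            return ("display", self.items[self.index])
        return None
```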

Method, device and terminal for recognizing a multi-finger sliding gesture

The present disclosure provides a method, a device, and a terminal apparatus for recognizing a multi-finger sliding gesture. The method includes detecting a plurality of touch events on a touch screen by a user, and acquiring touch event data corresponding to each of the touch events, the touch event data including coordinates of a plurality of touch points and the time of the touch event; calculating an average sliding rate of each of the touch points based on the touch event data; determining that the user's gesture is a multi-finger sliding gesture, if the average sliding rate of each of the touch points is greater than or equal to a preset sliding rate.
Leauto Intelligent Technology (beijing) Co. Ltd.
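
The decision rule in the abstract above is: compute each touch point's average sliding rate from its sampled coordinates and timestamps, and accept the gesture as a multi-finger slide only if every rate meets a preset minimum. A minimal sketch of one reading of that rule, with the sample layout and the 300 px/s preset rate assumed:

```python
import math

def average_sliding_rate(samples):
    """samples: time-ordered (x, y, t_seconds) tuples for one touch point.
    Rate is taken as total path length over elapsed time (one interpretation;
    per-sample averaging would be another)."""
    path = sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:]))
    elapsed = max(samples[-1][2] - samples[0][2], 1e-6)
    return path / elapsed  # pixels per second

def is_multi_finger_slide(touches, preset_rate_px_s=300.0):
    """touches: one sample list per finger; the preset rate is an assumed value."""
    return (len(touches) >= 2 and
            all(average_sliding_rate(t) >= preset_rate_px_s for t in touches))
```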

Electronic apparatus and operating method thereof

An electronic apparatus according to an embodiment of the present disclosure includes a housing and a display exposed through one surface of the housing. The display is configured to detect a touch or gesture input and has a first radius.
Samsung Electronics Co., Ltd

Human interface for vehicle automation

The invention concerns a system by which intelligent land vehicle, transportation and highway data can be usefully presented to, and acted on by, a driver of a vehicle, providing a bridge between the vehicles of today, and the automated vehicles of tomorrow. Preferred embodiments utilize displayed sensor-based imagery that the driver interacts with by touch, gesture or other methods in order to confirm or reject alert signals provided by sensors and to confirm or understand the operation of sensory systems of the vehicle.

Removable protective cover with embedded proximity sensors

A proximity sensor including a housing including a printed circuit board, the housing configured to be repeatedly attached to and detached from an electronic device, a linear array including interleaved light emitters and photodiode detectors mounted on the printed circuit board, lenses mounted in the housing directing light beams emitted by the light emitters towards an airspace outside the housing, and directing light beams reflected by one or more objects in the airspace towards the photodiode detectors, and a processor generating information regarding a plurality of different gestures performed by the one or more objects in the airspace, based on reflected light detected by the photodiode detectors, and communicating the information to the electronic device as input to the electronic device, when the housing is attached to the electronic device.
Neonode Inc.

Method, device and terminal for recognizing a multi-finger pinching-in or pinching-out gesture

The present disclosure provides a method, device, and terminal apparatus for recognizing a multi-finger pinching-in or pinching-out gesture. The method includes detecting a plurality of touch events on a touch screen by a user, and acquiring touch event data corresponding to each touch event, the touch event data including coordinates of a plurality of touch points and the time period of the touch event; calculating an average sliding rate of each touch point and a value of reduced or increased distance between any two of the touch points based on the touch event data; and determining that the user's gesture is a multi-finger pinching-in or pinching-out gesture, if the average sliding rate of each touch point is greater than or equal to a preset sliding rate and the value of reduced or increased distance between any two of the touch points is greater than or equal to a preset value of distance variation.
Leauto Intelligent Technology (beijing) Co. Ltd.
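
The pinch classifier described above adds a second test to the sliding-rate check: the distance between every pair of touch points must shrink (pinch-in) or grow (pinch-out) by at least a preset amount. A self-contained sketch of one reading of that rule, with all thresholds assumed:

```python
import math
from itertools import combinations

def avg_rate(samples):
    """Average sliding rate (px/s) of one touch point given (x, y, t) samples."""
    path = sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:]))
    return path / max(samples[-1][2] - samples[0][2], 1e-6)

def classify_pinch(touches, preset_rate=300.0, preset_distance=40.0):
    """touches: per-finger (x, y, t) sample lists; thresholds are assumed values."""
    if len(touches) < 2 or not all(avg_rate(t) >= preset_rate for t in touches):
        return None
    changes = []
    for a, b in combinations(touches, 2):
        start = math.hypot(a[0][0] - b[0][0], a[0][1] - b[0][1])
        end = math.hypot(a[-1][0] - b[-1][0], a[-1][1] - b[-1][1])
        changes.append(end - start)
    if all(c >= preset_distance for c in changes):
        return "pinch_out"          # all inter-finger distances grew
    if all(c <= -preset_distance for c in changes):
        return "pinch_in"           # all inter-finger distances shrank
    return None
```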

Press hard and move gesture

A method is provided. The method may include obtaining force information regarding an input force applied by an input object to a sensing region of an input device.
Synaptics Incorporated

Combined grip and mobility sensing

By correlating user grip information with micro-mobility events, electronic devices can provide support for a broad range of interactions and contextually-dependent techniques. Such correlation allows electronic devices to better identify device usage contexts, and in turn provide a more responsive and helpful user experience, especially in the context of reading and task performance.
Microsoft Technology Licensing, Llc

Impact and contactless gesture inputs for electronic devices

A docking station configured to mate to an electronic device enables methods of interacting with the electronic device by impacting (e.g., knocking) on a table on which the device and/or the docking station are disposed and by means of contactless gestures. The electronic device may remain in a powered off state while the docking station continuously monitors for user input.
Apple Inc.

Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command

Provided are an augmented reality, virtual reality and mixed reality eyeglass communication device and method via eye movement tracking and gestures. The eyeglass communication device may comprise an eyeglass frame, and a right earpiece and a left earpiece connected to the frame.

Method and device for transmitting/receiving data using hdmi

The present invention is related to a method and an apparatus for transmitting and receiving data by using HDMI (high-definition multimedia interface). A method and an apparatus according to the present invention comprise requesting reading of EDID (extended display identification data) from a sink device when the sink device is connected; receiving from the sink device EDID including capability information of the sink device, wherein the capability information indicates whether the sink device is capable of processing gesture information; requesting gesture information indicating a predetermined gesture which may be recognized or extracted from the sink device on the basis of the EDID; and receiving the gesture information from the sink device.
Lg Electronics Inc.

Gesture control using depth data

One embodiment provides a method, including: receiving, from at least one sensor of a band shaped wearable device, depth data, wherein the depth data is based upon a gesture performed by a user and wherein the depth data comprises data associated with a distance between the gesture and the band shaped wearable device; identifying, using a processor, the gesture performed by a user using the depth data; and performing an action based upon the gesture identified. Other aspects are described and claimed.
Lenovo (singapore) Pte. Ltd.

Photo-based unlock patterns

Embodiments described herein may help to provide a lock-screen for a computing device. An example method involves: (a) displaying an image and an input region that is moveable over the image, (b) based on head-movement data, determining movement of the input region with respect to the image, (c) during the movement of the input region, receiving gesture data corresponding to a plurality of gestures, (d) determining an input pattern, wherein the input pattern comprises a sequence that includes a plurality of locations in the image, wherein each location in the sequence is a location of the input region in the image at or near a time of a corresponding one of the gestures, (e) determining whether or not the input pattern matches a predetermined unlock pattern, and (f) if the input pattern matches the predetermined unlock pattern, then unlocking the computing device.
Google Inc.
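
The unlock check in this abstract is a sequence comparison: the image locations the input region occupied when each gesture arrived must match the stored unlock pattern. A minimal sketch, assuming locations are compared with a fixed pixel tolerance (the tolerance is an invented parameter):

```python
import math

def matches_unlock_pattern(input_locations, unlock_pattern, tolerance_px=40):
    """input_locations / unlock_pattern: ordered (x, y) locations in the image.
    The input matches if it has the same length and each location falls within
    an assumed pixel tolerance of the corresponding stored location."""
    if len(input_locations) != len(unlock_pattern):
        return False
    return all(math.hypot(ix - ux, iy - uy) <= tolerance_px
               for (ix, iy), (ux, uy) in zip(input_locations, unlock_pattern))
```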

Tactile sensation control system and tactile sensation control method

It is an object of the invention to provide a tactile sensation control system and a tactile sensation control method, which allow a user to perform convenient operation with no visual concentration on a display screen. The system includes: a processor to execute a program; and a memory to store the program which, when executed by the processor, performs processes of: detecting an operation by the user to the operation surface, and controlling, when it is detected in the detecting that the user is operating one of an icon operation and a gesture operation, the tactile sensation on the operation surface while the user is operating the one of the icon operation and the gesture operation so that a tactile sensation on an area receiving the one of the icon operation and the gesture operation changes with a lapse of time in accordance with a predetermined tactile sensation change rule.
Mitsubishi Electric Corporation

Frameless tablet

In order for a user to be able to hold a narrow-frame or even frameless handheld device on its front touch-screen display, configure a full-screen grip area on the front touch-screen display when no human hands are holding and/or touching the touch-screen display. The grip area is ready for holding and will be reduced to be much smaller when a human hand holds the handheld device on the grip area of the touch-screen display.

Method and system for assessment of cognitive function based on mobile device usage

A system and method that enables a person to unobtrusively assess their cognitive function from mobile device usage. The method records on the mobile device the occurrence and timing of user events comprising the opening and closing of applications resident on the device, the characters inputted, touch-screen gestures made, and voice inputs used on those applications, performs the step of learning a function mapping from the mobile device recordings to measurements of cognitive function that uses a loss function to determine relevant features in the recording, identifies a set of optimal weights that produce a minimum of the loss function, creates a function mapping using the optimal weights, and performs the step of applying the learned function mapping to a new recording on the mobile device to compute new cognitive function values.
Mindstrong, Llc
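
Read in its simplest form, the learning step above — choose weights that minimize a loss between device-usage features and measured cognitive scores, then apply the learned mapping to new recordings — resembles a regularized linear regression. The sketch below shows that reading only; the feature set, loss, and regularization are assumptions, not the filer's actual model.

```python
import numpy as np

def fit_mapping(features, scores, l2=1.0):
    """Ridge-regression weights minimizing ||Xw - y||^2 + l2*||w||^2.
    features: (n_samples, n_features) usage features (e.g. typing latency,
    app-switch rate); scores: measured cognitive-function values."""
    X = np.asarray(features, dtype=float)
    y = np.asarray(scores, dtype=float)
    n_feat = X.shape[1]
    w = np.linalg.solve(X.T @ X + l2 * np.eye(n_feat), X.T @ y)
    return w

def predict_score(w, new_features):
    """Apply the learned mapping to the features of a new recording."""
    return float(np.asarray(new_features, dtype=float) @ w)
```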

Multi-factor user authentication framework using asymmetric key

A multi-factor user authentication framework using asymmetric key includes a host device, a user agent, a gesture system, and an authentication system. The multiple factors include a user credential as well as a user gesture that indicates that the user is present.
Microsoft Technology Licensing, Llc

Method and process for determining whether an individual suffers a fall requiring assistance

A method for monitoring an individual in a dwelling so as to know when such individual falls or indicates the need of assistance. A plurality of 3d motion and sound sensors are located in the dwelling and provide data to a computerized monitoring system.
Cerner Innovation, Inc.

Systems, apparatuses and methods for using virtual keyboards

Methods and systems are disclosed for interacting with advertisements on a virtual keyboard. A user can manipulate the advertisement based on performing one or more types of gestures or key presses on or near the virtual keys.
Oversignal, Llc

System and method for authentication with a computer stylus

A method for securing operation of a computing device operated with a stylus includes recognizing a pre-defined gesture performed by a stylus on a touch screen, the pre-defined gesture defined as a user command to lock an item displayed on the touch screen, determining a location of the gesture, determining identity of the stylus, locking an item displayed at the location determined, and recording identity of the stylus. A method for operating a computing device with a stylus includes receiving a command with a stylus to add restricted annotations to a document, receiving identity of the stylus, linking an annotation to the identity, restricting display of the annotation on the document to a computing device receiving input from the stylus; and displaying the document absent the at least one annotation on a computing device on which input from the stylus is not received.
Microsoft Technology Licensing, Llc

Method and apparatus for performing an operation using intensity of a gesture in an electronic device

The embodiments herein provide a method for performing an operation in an electronic device. The method includes displaying at least two objects.
Samsung Electronics Co., Ltd.

Dial control for touch screen navigation

A computing device includes a hardware processor and a machine-readable storage medium storing instructions. The instructions may be executable to: display, on a touch screen, a first screen image of a user interface of the computing device; detect a first touch gesture on the first screen image, where the first touch gesture is associated with a dial control including a plurality of control options; and in response to a detection of the first touch gesture: blur the first screen image; present the dial control over the first screen image; in response to a rotation of the first touch gesture, rotate the dial control to select a first control option of the plurality of control options; and in response to a selection of the first control option, present additional information in a second portion of the touch screen.
Hewlett Packard Enterprise Development Lp

Mobile operating system

A mobile operating system includes a “smart dynamic icon”, a “quick voice assistant”, a “quick slide assistant”, “smart gesture”, a “full screen application interface”, a “global application icons interface”, “quick shut down” and an “important contacts application”. If an application is in an update status, the “smart dynamic icon” automatically displays this application's icon on a home screen.

Method and apparatus for recognizing a touch drag gesture on a curved screen

A method and an apparatus for recognizing a touch drag gesture on a curved screen are provided. A method for recognizing a touch drag gesture on a curved screen may include: dividing the curved screen into a plurality of areas; setting a plurality of threshold values, where each threshold value corresponds to a gesture start direction in the plurality of areas; detecting a gesture start point based on infrared images received from an infrared camera disposed to face the curved screen; determining an area where the gesture start point exists from among the plurality of areas; determining a gesture start direction in the area where the gesture start point exists; selecting a threshold value that corresponds to the gesture start direction in the area where the gesture start point exists from among the plurality of threshold values; calculating a length of a trajectory from the gesture start point to a gesture end point; and recognizing the touch drag gesture based on the trajectory when the length of the trajectory is greater than the selected threshold value.
Hyundai Motor Company
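
The per-area, per-direction threshold idea can be sketched in a few lines. The two-area layout, direction rule and threshold values below are invented for illustration; only the overall structure (look up a threshold by area and start direction, compare it with the trajectory length) comes from the abstract.

    import math

    # thresholds[(area, start_direction)] -> minimum trajectory length (pixels), assumed values
    THRESHOLDS = {
        ("left", "right"): 40, ("left", "left"): 60,
        ("right", "right"): 60, ("right", "left"): 40,
    }

    def area_of(point, screen_width=800):
        # Divide the curved screen into two areas (the patent allows many more).
        return "left" if point[0] < screen_width / 2 else "right"

    def start_direction(start, nxt):
        return "right" if nxt[0] >= start[0] else "left"

    def trajectory_length(points):
        return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

    def recognize_drag(points):
        start = points[0]
        area = area_of(start)
        direction = start_direction(start, points[1])
        threshold = THRESHOLDS[(area, direction)]
        return trajectory_length(points) > threshold

    print(recognize_drag([(100, 50), (130, 52), (170, 55)]))  # True: length ~70 > 40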

Motion-based view scrolling with augmented tilt control

Systems and methods for motion-based scrolling of a contents view on a display whose screen view is smaller than the contents view are described. During an augmented tilt control mode, scrolling follows changes in a primary tilt direction aligned with the scrolling direction, while a secondary tilt along a direction perpendicular to the scrolling direction modifies at least one scrolling control parameter.
Innoventions, Inc.
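
One way to read the augmented tilt idea is sketched below: primary tilt drives the scroll step, secondary tilt modulates a control parameter (here, the gain). The gains and the linear mapping are assumptions for illustration only.

    def scroll_step(primary_tilt_deg, secondary_tilt_deg,
                    base_gain=2.0, speed_per_degree=0.1):
        # Secondary (perpendicular) tilt adjusts a scrolling control parameter.
        gain = base_gain * (1.0 + speed_per_degree * abs(secondary_tilt_deg))
        # Primary tilt, aligned with the scrolling direction, sets the step itself.
        return gain * primary_tilt_deg

    position = 0.0
    for primary, secondary in [(5, 0), (5, 10), (-3, 20)]:
        position += scroll_step(primary, secondary)
        print(f"scroll position: {position:.1f}")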

Positional based movements and accessibility of features associated with a vehicle

Methods and systems are presented for accepting inputs into a vehicle or other conveyance to control functions of the conveyance. A vehicle control system can receive gestures and other inputs.
Autoconnect Holdings Llc

Control method and apparatus, electronic device and computer storage medium

A control method may include: obtaining surface information of a contact surface between the user's limbs and a control apparatus by detecting the contact surface, where the surface information represents a gesture to be recognized and the control apparatus is fixed to the user's limbs; recognizing the gesture based on at least the surface information to obtain a recognition result; and controlling a controlled apparatus based on the recognition result. A control apparatus, an electronic device and a computer storage medium are also provided.
Zte Corporation
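
A rough sketch of that pipeline: contact-surface features are matched against stored gesture signatures, and the recognized gesture is mapped to a command. The feature names, templates and nearest-neighbour rule below are invented, not the patent's method.

    import math

    # Reference surface signatures: (contact area in cm^2, pressure centroid offset), assumed
    GESTURE_TEMPLATES = {
        "fist": (12.0, 0.2),
        "open_hand": (25.0, 0.8),
    }

    COMMANDS = {"fist": "pause", "open_hand": "play"}

    def recognize(surface_features):
        # Pick the template closest to the measured contact-surface features.
        return min(GESTURE_TEMPLATES,
                   key=lambda g: math.dist(GESTURE_TEMPLATES[g], surface_features))

    def control(surface_features):
        gesture = recognize(surface_features)
        return COMMANDS[gesture]

    print(control((13.1, 0.3)))  # "pause": the features sit closest to the "fist" template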

Gesture recognition device

The present invention provides a recognition device capable of recognizing a two-dimensional or three-dimensional gesture of a user by using a light receiving element. The device of one embodiment of the present invention comprises: a light receiving unit having a plurality of light receiving elements arranged therein; and a control unit for determining a direction in which a change in the amount of light of the plurality of light receiving elements is sensed by a gesture of a user.
Lg Innotek Co., Ltd.
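
As a toy illustration of sensing direction from changes in received light: the element whose light level drops first or most indicates where the hand entered the field. The one-row sampling model below is an assumption, not the device's actual control-unit logic.

    def swipe_direction(frame_before, frame_after):
        # Per-element change in received light (a hand's shadow lowers it).
        deltas = [after - before for before, after in zip(frame_before, frame_after)]
        darkest = min(range(len(deltas)), key=lambda i: deltas[i])
        mid = len(deltas) / 2
        return "left-to-right" if darkest < mid else "right-to-left"

    # Four light receiving elements; the hand first shadows the rightmost one.
    print(swipe_direction([100, 100, 100, 100], [98, 99, 80, 40]))  # right-to-left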

Body gesture control system for button-less vaping

A method of detecting a hand-to-mouth gesture (HMG) with an e-vaping device includes detecting movements of the e-vaping device; generating quaternions based on the detected movements; generating movement features based on the generated quaternions; applying the generated movement features to a classifier; and determining whether the detected movements correspond to an HMG based on an output of the classifier.
Altria Client Services Llc
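
The pipeline above (orientation quaternions, movement features, classifier) can be sketched as follows. The feature set and the threshold "classifier" are placeholders standing in for whatever model the patent actually uses.

    import math

    def quaternion_angle(q1, q2):
        # Rotation angle between two unit quaternions (w, x, y, z), a simple movement feature.
        dot = abs(sum(a * b for a, b in zip(q1, q2)))
        return 2 * math.acos(min(1.0, dot))

    def movement_features(quaternions):
        # Total rotation and peak per-step rotation over the sample window.
        steps = [quaternion_angle(a, b) for a, b in zip(quaternions, quaternions[1:])]
        return {"total_rotation": sum(steps), "peak_rotation": max(steps)}

    def classify_hmg(features, total_thresh=1.0, peak_thresh=0.3):
        # Stand-in classifier: a large, fast rotation is treated as a hand-to-mouth gesture.
        return (features["total_rotation"] > total_thresh
                and features["peak_rotation"] > peak_thresh)

    window = [(1, 0, 0, 0), (0.92, 0.38, 0, 0), (0.7, 0.7, 0, 0)]
    print(classify_hmg(movement_features(window)))  # True for this synthetic window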

Facial gesture recognition and video analysis tool

Embodiments disclosed herein may be directed to a video communication server. In some embodiments, the video communication server includes: at least one memory including instructions; and at least one processing device configured for executing the instructions, wherein the instructions cause the at least one processing device to perform the operations of: determining a time duration of a video communication connection between a first user of a first user device and a second user of a second user device; analyzing video content transmitted between the first user device and the second user device; determining at least one gesture of at least one of the first user and the second user based on analyzing the video content; and generating a compatibility score of the first user and the second user based at least in part on the determined time duration and the at least one determined gesture.
Krush Technologies, Llc
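
A hedged sketch of combining call duration with detected gestures into a single score; the weights, cap and gesture values below are invented for illustration and are not the patent's scoring formula.

    GESTURE_WEIGHTS = {"smile": 2.0, "nod": 1.0, "frown": -1.5}

    def compatibility_score(duration_seconds, gestures):
        # Longer calls and positive facial gestures both raise the score.
        duration_part = min(duration_seconds / 60.0, 10.0)   # capped at 10 points
        gesture_part = sum(GESTURE_WEIGHTS.get(g, 0.0) for g in gestures)
        return duration_part + gesture_part

    print(compatibility_score(480, ["smile", "smile", "nod"]))  # 8.0 + 5.0 = 13.0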

Voice operated remote control

A remote control device controls content on a display device by transmission, to a wireless entertainment controller, of a voice command to change programming on the display device. The voice command may be detected by a sound detection device.
Rateze Remote Mgmt Llc

Adjusting motion capture based on the distance between tracked objects

The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become—i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame.
Leap Motion, Inc.
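
The distance-to-zoom idea can be shown with a simple mapping: the closer the tracked hand is to the other object, the tighter the frame. The linear ramp and the distance bounds below are assumptions, not Leap Motion's algorithm.

    def frame_zoom(distance, near=0.05, far=0.60, min_zoom=1.0, max_zoom=4.0):
        # Clamp the distance to [near, far] metres, then interpolate: closer -> more zoom.
        d = max(near, min(far, distance))
        t = (far - d) / (far - near)          # 0 when far apart, 1 when nearly touching
        return min_zoom + t * (max_zoom - min_zoom)

    for d in (0.60, 0.30, 0.05):
        print(f"hand-object distance {d:.2f} m -> zoom {frame_zoom(d):.2f}x")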

Mobile-optimized captcha system based on multi-modal gesture challenge and mobile orientation

In an approach to user authorization by mobile-optimized captcha, a computing device detects information suggesting a risk level. The computing device displays one or more prompts based on the risk level.
International Business Machines Corporation
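
Risk-adaptive prompting of the kind described above might look like the sketch below: a higher risk score selects a larger pool of gesture and orientation challenges. The risk bands and prompt pool are illustrative assumptions only.

    import random

    PROMPTS = {
        "low":    ["swipe right"],
        "medium": ["swipe right", "rotate the phone 90 degrees"],
        "high":   ["swipe right", "rotate the phone 90 degrees", "draw a circle"],
    }

    def choose_prompts(risk_score):
        # Map the detected risk level to a band, then present that band's prompts.
        if risk_score < 0.3:
            band = "low"
        elif risk_score < 0.7:
            band = "medium"
        else:
            band = "high"
        return random.sample(PROMPTS[band], len(PROMPTS[band]))   # shuffled order

    print(choose_prompts(0.8))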



Gesture topics:
  • Virtual Keyboard
  • Touchscreen
  • Electronic Device
  • User Interface
  • Characters
  • Display Panel
  • Touch Screen
  • Output Device
  • Input Device
  • Computing Device
  • Device Control
  • Computer Vision
  • Mobile Terminal
  • Ball Mouse
  • Navigation



This listing is a sample of patent applications related to Gesture and is only meant as a recent sample of applications filed, not a comprehensive history. There may be associated servicemarks and trademarks related to these patents. Please check with a patent attorney if you need further assistance or plan to use them for business purposes. This patent data is also published to the public by the USPTO and available for free on their website. Note that there may be alternative spellings for Gesture with additional patents listed. Browse our RSS directory or search for other possible listings.

