

Gesture patents

      

This page is updated frequently with new Gesture-related patent applications.




Detection of computerized bots and automated cyber-attack modules
Devices, systems, and methods of detecting whether an electronic device or computerized device or computer, is being controlled by a legitimate human user, or by an automated cyber-attack unit or malware or automatic script. The system monitors interactions performed via one or more input units of the electronic device.
Biocatch Ltd.


Frictionless access control system with ranging camera
An access control and user tracking system includes ranging cameras installed in the thresholds of access points. The cameras generate three-dimensional models of users passing through the access points in order to detect unauthorized individuals or hand gestures of the users, which can indicate unsafe conditions or that lights and/or equipment should be turned on. The ranging cameras can be point range-finding measurement scanning cameras, structured-light cameras, or time-of-flight cameras, and can be installed along the top or in the corners of the thresholds.
Sensormatic Electronics, Llc


Synchronization system comprising display device and wearable device, and controlling method
Disclosed are a synchronization system comprising a display device and a wearable device, and a controlling method. The synchronization system comprises: a display device for receiving a preset touch gesture, converting a displayed background image to a preset image, displaying the preset image, and transmitting a synchronization signal to a wearable device; and the wearable device for, on receiving the synchronization signal from the display device, displaying an image identical to the preset image of the display device and synchronizing data with the display device.
LG Electronics Inc.


Generating and displaying supplemental information and user interactions on interface tiles of a user interface
Technologies for displaying supplemental interface tiles on a user interface of a computing device include determining supplemental information and/or available user interactions associated with a user interface tile displayed on the user interface. A supplemental interface tile is displayed in association with the user interface tile in response to a user selecting the user interface tile.
Intel Corporation


Method and apparatus for applying free space input for surface constrained control
A free space input standard is instantiated on a processor. Free space input is sensed and communicated to the processor.
Atheer, Inc.


Portable multi-touch input device
A portable input device is described. The portable input device can wirelessly send control signals to an external circuit.
Apple Inc.


Multi-phase touch-sensing electronic device
A touch-sensing electronic device includes a housing having first, second and third touch-sensing surfaces; a substrate extensively disposed under the first, second and third touch-sensing surfaces; sensing electrodes formed on the same substrate, and having capacitance changes in response to touch operations or gestures respectively performed on or over the first, second and third touch-sensing surfaces, wherein the sensing electrodes are grouped into three sensing electrode arrays corresponding to the first, second and third touch-sensing surfaces, respectively; and a controller for generating respective control signals corresponding to the touch operations performed on or over the first, second and third touch-sensing surfaces. At least two of the three sensing electrode arrays have different configurations for performing different sensing operations.
Touchplus Information Corp.


System and method for detecting hand gesture
The present disclosure relates to a system for detecting a hand gesture and a method thereof. The system comprises a hand-held controller and a computing application.
HTC Corporation


Touchless user interface navigation using gestures
An example method includes displaying, by a display (104) of a wearable device (100), a content card (114b); receiving, by the wearable device, motion data generated by a motion sensor (102) of the wearable device that represents motion of a forearm of a user of the wearable device; responsive to determining, based on the motion data, that the user has performed a movement comprising a supination of the forearm followed by a pronation of the forearm at an acceleration less than that of the supination, displaying, by the display, a next content card (114c); and responsive to determining, based on the motion data, that the user has performed a movement comprising a supination of the forearm followed by a pronation of the forearm at an acceleration greater than that of the supination, displaying, by the display, a previous content card (114a).
Google LLC
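The acceleration comparison in the abstract above can be sketched as follows. This is a hedged illustration only: the function and label names are invented, and peak-acceleration comparison is one plausible reading of the claim, not the patented implementation.

```python
# Illustrative sketch (names assumed): a supination followed by a pronation
# advances or rewinds the card stack depending on which phase peaked harder.

def classify_wrist_gesture(supination_peak, pronation_peak):
    """Map the peak accelerations of the two phases to a navigation action."""
    if pronation_peak < supination_peak:
        return "next_card"      # gentler roll back -> show next card
    if pronation_peak > supination_peak:
        return "previous_card"  # sharper roll back -> show previous card
    return "no_action"          # equal peaks are left unhandled here
```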


Three-dimensional graphical user interface for informational input in virtual reality environment
Hand displacement data is received from sensing hardware and analyzed using a three-dimensional (3D) gesture recognition algorithm. The received hand displacement data is recognized as representing a 3D gesture.
Alibaba Group Holding Limited


Interactive media system and method

Interactive computing systems and methods are provided which enable simple and effective interaction with a user device, increasing interest and improving the user experience. The interactive system comprises a user device including a motion sensor, for receiving motion-based gestures through motion of the user device; and a controller, coupled to the motion sensor, configured to control one or more aspects of the system according to the motion-based gestures.
Tiltsta Pty Ltd

System, method, and apparatus for man-machine interaction

A man-machine interaction system, method, and apparatus are disclosed. The man-machine interaction system includes a wearable device and a display device. The wearable device includes an image acquiring module, a memory, a processor, an image projecting module, and an information transmission interface.
BOE Technology Group Co., Ltd.

Control of machines through detection of gestures by optical and muscle sensors

A material handler system including a plurality of components and material handler equipment, and methods for utilizing the same, is disclosed. A first component includes gesture command recognition and enhancement for controlling material handler equipment.
Deere & Company

Wearable wireless hmi device

A wearable gesture control interface apparatus is used to control a controllable device based on gestures provided by a user. The wearable gesture control interface apparatus includes (i) sensors configured to detect user orientation and movement and generate corresponding sensor data and (ii) a microcontroller configured to: sample the sensor data from the sensors; determine whether the sensor data from one of the sensors meets transmission criteria; and, if the sensor data meets the transmission criteria, transmit control data corresponding to all of the sensors to the controllable device.
Protocode Inc.
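One plausible reading of the sample-then-transmit logic above, sketched with invented names and a simple magnitude threshold standing in for the unspecified transmission criteria:

```python
# Sketch only: if ANY sensor's latest sample meets the (assumed) criterion,
# control data for ALL sensors is sent, as the abstract describes.

def poll_and_maybe_transmit(samples, threshold, transmit):
    """samples: dict mapping sensor name -> latest reading."""
    if any(abs(value) >= threshold for value in samples.values()):
        transmit(dict(samples))  # transmit data for all sensors, not just one
        return True
    return False
```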

Motion communication system and method

This document presents an apparatus and method for creating and displaying a three-dimensional CGI character performing signs and gestures that form a motion communication capability. The 3D CGI character is created as a digital construct that is displayed on a silhouette bearing an outline of the character.

Integrated lighting system and network

An integrated lighting system, and an integrated lighting network in which such systems are communicatively coupled to one another (for example, via various wireless transceivers), are disclosed. The systems and networks can collect data about a passing object (e.g., a person, animal, or automotive vehicle).
RF Digital Corporation

Mobile terminal

Disclosed herein are a mobile terminal and a method for controlling a mobile terminal. The mobile terminal includes a wireless communication unit configured to be connected to an external device for communication, a sensing unit configured to sense motion of the mobile terminal, a touch screen, and a controller. When a motion of the mobile terminal corresponding to a first gesture of a user wearing the mobile terminal is sensed, the controller receives information related to a first item selected in a screen of an application being executed on the external device and displays, on the touch screen, a first screen on which that information is presented in time sequence. When a motion corresponding to a second gesture of the user is sensed, the controller transmits a control signal that causes the external device to display a screen of an application corresponding to a second item selected in the first screen.
LG Electronics Inc.

Verifying identity based on facial dynamics

A computer-implemented technique is described for verifying the identity of a user using two components of face analysis. In a first part, the technique determines whether captured face information matches a previously stored structural face signature pertaining to the user.
Microsoft Technology Licensing, LLC

Gesture control method, apparatus, terminal device, and storage medium

A gesture control method, a gesture control apparatus, and a terminal device are provided to enrich the interaction manners of the terminal device. The method includes detecting a touch action performed on a touchscreen of a terminal device; obtaining a contact area of the touch action on the touchscreen and a z-axis acceleration generated when the touch action contacts the touchscreen; determining that the touch action is a joint touch action when the contact area is larger than a preset area and the z-axis acceleration is greater than a preset acceleration; identifying a gesture type corresponding to the joint touch action; and calling a preset function of the terminal device according to the gesture type.
Huawei Technologies Co., Ltd.
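The two-threshold decision rule in the abstract above can be sketched as follows; the threshold constants are invented for illustration, since the patent only says "preset":

```python
# Hedged sketch of the claimed decision rule. Constant values are assumed.

PRESET_AREA = 40.0    # assumed units: mm^2
PRESET_ACCEL = 3.0    # assumed units: g

def is_joint_touch(contact_area, z_axis_accel):
    """A touch is a joint touch only if BOTH measures exceed their presets."""
    return contact_area > PRESET_AREA and z_axis_accel > PRESET_ACCEL
```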

Creating tables using gestures

A method comprising: displaying, on a touchscreen, a digital electronic document; receiving first input from the touchscreen and determining that the first input comprises a rectangle gesture; receiving second input from the touchscreen and determining that the second input comprises a subdivision gesture that indicates dividing the rectangle; determining that the first input and the second input have been received within a time threshold; in response, automatically generating a table that comprises a plurality of cells; and automatically placing the table in the document at a location based on the first input and updating the document displayed on the touchscreen to visually show the table; wherein the method is performed by one or more computing devices.
Atlassian Pty Ltd
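The two-gesture flow above can be sketched in a few lines; the time window and the cell construction are assumptions for illustration, not taken from the claims:

```python
# Sketch only: a rectangle gesture followed by a subdivision gesture within
# a time threshold yields a table; gestures arriving too far apart do not.

TIME_THRESHOLD = 2.0  # assumed seconds between the two gestures

def build_table(rect_time, subdiv_time, rows, cols):
    """Return an empty rows x cols table, or None if the two gestures
    were not received within the time threshold."""
    if subdiv_time - rect_time > TIME_THRESHOLD:
        return None
    return [["" for _ in range(cols)] for _ in range(rows)]
```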

Image manipulation

A method includes displaying an image on a first area of a touch-sensitive electronic display and receiving touch input on a second area of the display, comprising the first area. A gesture type is detected from the touch input by detecting a larger component of motion of the touch input along one of first and second axes of the display than along the other of the axes.
Apical Ltd
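The axis test described above reduces to a one-line comparison; this minimal sketch uses assumed names and treats the net displacement as the motion component:

```python
# Minimal sketch (names assumed): the gesture is typed by whichever display
# axis carries the larger component of the touch input's motion.

def dominant_axis(dx, dy):
    """dx, dy: net displacement along the first and second display axes."""
    return "first_axis" if abs(dx) > abs(dy) else "second_axis"
```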

Expandable application representation

Expandable application representation techniques are described. The techniques may include support of an expandable tile that may function as an intermediary within a root level (e.g., start menu or screen) of a file system.
Microsoft Technology Licensing, LLC

Actuation lock for a touch sensitive input device

Touch sensitive mechanical keyboards and methods of configuring the depressibility of one or more keys of a keyboard are provided. A touch sensitive mechanical keyboard can accept touch events performed on the surface of the keys.
Apple Inc.

Automated e-tran application

Techniques for text entry using gestures are disclosed. As disclosed, a camera may capture a frame and the face of the user can be detected therein.
Microsoft Technology Licensing, LLC

Gesture experiences in multi-user environments

Systems, apparatuses, and methods may leverage technology that recognizes a set of one or more hands in one or more frames of a video signal during a first gesture control interaction between the set of one or more hands and an electronic device. Moreover, one or more additional body parts may be detected in the frame(s), wherein the additional body part(s) are excluded from the gesture control interaction.
Intel Corporation

Interaction mode selection based on detected distance between user and machine interface

An embodiment of an interaction mode selection apparatus may include a distance estimator to estimate a distance between a user and a part of a machine interface, and an interaction selector communicatively coupled to the distance estimator to select one or more active interaction modes from two or more available interaction modes based on the estimated distance. The distance estimator may include a depth sensor, a three-dimensional camera, a two-dimensional camera, an array of cameras, an array of microphones, an array of wireless access points, a beacon sensor, a proximity sensor, and/or an ultrasonic sensor.
Intel Corporation

User interaction paradigms for a flying digital assistant

Methods and systems are described for new paradigms of user interaction with an unmanned aerial vehicle (referred to as a flying digital assistant, or FDA) using a portable multifunction device (PMD) such as a smartphone. In some embodiments, a user may control image capture from an FDA by adjusting the position and orientation of a PMD.
Skydio, Inc.

Touch-gesture control for side-looking sonar systems

Techniques are disclosed for systems and methods to provide touch screen side-scan sonar adjustment for mobile structures. A side-scan sonar adjustment system includes a user interface with a touch screen display and a logic device configured to communicate with the user interface and a side-scan sonar system.
FLIR Belgium BVBA

Vehicle window with gesture control

A slider window assembly for a vehicle includes a frame portion, at least one fixed window panel that is fixed relative to the frame portion, and a movable window panel that is movable along upper and lower rails of the frame portion between a closed position and an opened position. A gesture sensing device is operable to sense a gesture of a user in the vehicle and to determine if the sensed gesture is indicative of an open window command or a close window command.
Magna Mirrors Of America, Inc.

Avatar creation and editing

The present disclosure generally relates to creating and editing user avatars. In some examples, guidance is provided to a user while capturing image data for use in generating a user-specific avatar.
Apple Inc.

Systems and methods for gesture-based control of equipment in video communication

Systems, methods, and non-transitory computer readable media are configured to obtain video data from a camera used in a video conferencing system. A user interface displaying the video data can be provided on a screen, wherein the screen is capable of receiving touch input.
Facebook, Inc.

Launching applications from a lock screen of a mobile computing device via user-defined symbols

A first task and a second task executable by a mobile computing device are associated with a first predefined symbol and a second predefined symbol, respectively. The first task and the second task are different types of tasks executable by the mobile computing device after the mobile computing device has been unlocked.

Gesture detection to pair two wearable devices and perform an action between them and a wearable device, a method and a system using heat as a means for communication

The present disclosure relates to devices and methods for initiating execution of actions and for communicating information to a user, and more particularly, to initiating execution of predefined actions in wearable devices and communication devices based on gestures made with the wearable devices and/or heat applied to a surface of the wearable devices. According to an aspect, the method relates to, in a first wearable device: detecting a first gesture of the first wearable device, the gesture being predefined in the first wearable device; broadcasting a first signal comprising information associated with the first gesture; receiving, from a second wearable device, a second signal comprising information associated with a second gesture; and initiating execution of a first action, predefined in the first wearable device, based on the first signal and the second signal.
Sony Corporation

Camera gesture clock in

A system for keeping track of an employee's attendance is described that allows the employee to be creative, which also helps morale. An employee sets up an employee account and provides identification.
Wal-Mart Stores, Inc.

Multi-modal user authentication

Various systems and methods for providing a mechanism for multi-modal user authentication are described herein. An authentication system for multi-modal user authentication includes a memory holding image data captured by a camera array, the image data including a hand of a user; an image processor to determine, based on the image data, a hand geometry of the hand, a palm print of the hand, a gesture performed by the hand, and a bio-behavioral movement sequence performed by the hand; and an authentication module to construct a user biometric template using the hand geometry, palm print, gesture, and bio-behavioral movement sequence.
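The template construction above combines four hand-derived modalities. The sketch below mirrors that structure only: the field names are invented, and the exact-match rule stands in for what a real system would do with a distance metric and tolerance.

```python
# Illustration only: reduce each modality to a feature tuple and combine the
# four into one template, as in the abstract's authentication module.

def build_template(hand_geometry, palm_print, gesture, movement_sequence):
    return {
        "geometry": tuple(hand_geometry),
        "palm": tuple(palm_print),
        "gesture": tuple(gesture),
        "movement": tuple(movement_sequence),
    }

def authenticate(enrolled, candidate):
    # Real systems compare features with a tolerance, not exact equality.
    return enrolled == candidate
```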

Biometric authentication based on gait pattern or writing motion with an inertial measurement unit

The present invention relates to using an inertial measurement unit (IMU) to record the acceleration trajectory of a person's gait, pen-less handwriting motion, or any predesignated gesture, and to convert the data into a unique biometric pattern. The pattern is unique in each case and can be used for biometric security authentication.
Hong Kong Baptist University

View switching method, touch screen, and client device

A view switching method includes: displaying a switching flag on a current view in response to a first particular gesture of a user on the current view, where the first particular gesture passes through a particular area of the current view; and switching from the current view to a target view associated with the switching flag, in response to the ending of the first particular gesture or to a second particular gesture of the user on the current view. Smooth switching to a specified view can reduce errors caused by users' mis-operations and provide users with a fast, convenient, and graceful browsing experience.
Guangzhou Ucweb Computer Technology Co., Ltd.

Device, method, and graphical user interface for force-sensitive gestures on the back of a device

An electronic device has a front side including a touch-sensitive display, a back side that does not include a display, and sensor(s) to detect contact intensities on the front and back sides. The device displays, on the touch-sensitive display, a user interface including objects. While displaying the user interface, the device detects an input on a side of the electronic device.
Apple Inc.

Method and device for sharing content

Methods and devices are provided for sharing content in the mobile internet technology field. In the method, an electronic device displays a content page including a plurality of content elements on a touch screen.
Beijing Xiaomi Mobile Software Co., Ltd.

Motion-assisted visual language for human computer interfaces

Embodiments of the invention recognize human visual gestures, as captured by image and video sensors, to develop a visual language for a variety of human-computer interfaces. One embodiment of the invention provides a computer-implemented method for recognizing a visual gesture portrayed by a part of the human body, such as a hand, face, or body.
FastVDO LLC

Eyeglasses-type wearable device and method using the same

An eyeglasses-type wearable device of an embodiment can handle various data inputs. The device includes right and left eye frames corresponding to positions of right and left eyes and nose pads corresponding to a position of a nose.
Kabushiki Kaisha Toshiba

Performing operations based on gestures

Gesture-based interaction includes displaying a first image, the first image comprising one or more of a virtual reality image, an augmented reality image, and a mixed reality image; obtaining a first gesture; obtaining a first operation based at least in part on the first gesture and a service scenario corresponding to the first image, the service scenario being the context in which the first gesture is input; and operating according to the first operation.
Alibaba Group Holding Limited

Gesture interface

A user interface apparatus, computer program, computer readable medium, and method for selecting a selectable object on a display screen is presented. The display screen displays one or more selectable objects.
Rakuten, Inc.

Coordinate system for gesture control

Embodiments of a system and method for gesture-controlled output are generally described. A method may include receiving sensor input information from a wearable device; determining, using the sensor input information, a gravity vector or a magnetic field; determining a change in horizontal angle, rotational angle, or vertical angle based on the sensor input information, the gravity vector, or the magnetic field; and determining a gesture based on the change in the horizontal angle, the rotational angle, or the vertical angle.
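One plausible realization of the angle computation implied above, offered purely as a sketch: the gravity direction is taken from an accelerometer sample, and the vertical angle is measured between the sensor's z-axis and gravity. Function names and conventions are assumptions.

```python
import math

def vertical_angle(accel):
    """Angle (radians) of the sensor z-axis away from the gravity vector,
    estimated from one accelerometer sample (ax, ay, az)."""
    ax, ay, az = accel
    return math.atan2(math.hypot(ax, ay), az)

def vertical_angle_change(sample_a, sample_b):
    """Change in vertical angle between two accelerometer samples."""
    return vertical_angle(sample_b) - vertical_angle(sample_a)
```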

Gesture detection

Apparatuses, methods, systems, and program products are disclosed for interrupting a device. A method includes detecting an auditory cue associated with a predefined gesture based on input received from one or more sensors.
Lenovo (Singapore) Pte. Ltd.

3D hand gesture image recognition method and system thereof

A 3D hand gesture recognition system includes a light field capturing unit, a calculation unit, and an output unit. The light field capturing unit is provided to capture a hand gesture action to obtain a 3D hand gesture image.
National Kaohsiung University Of Applied Sciences

Portable communication device for transmitting touch-generated messages

The invention relates to a portable communication device for transmitting touch-generated messages to at least one addressee by means of touch gestures carried out by a user on a touch-sensitive panel of the device. This device seeks to cover certain communication demands that are not satisfied by smartphones and their accessory devices; in particular, for those persons who, permanently or momentarily, are unable to access or properly use existing products.

Method and system for gesture-based interactions

Gesture-based interaction is presented, including: determining, based on an application scenario, a virtual object associated with a gesture under the application scenario, the gesture being performed by a user and detected by a virtual reality (VR) system; outputting the virtual object to be displayed; and, in response to the gesture, subjecting the virtual object to an operation associated with the gesture.
Alibaba Group Holding Limited

Systems and methods for relative representation of spatial objects and disambiguation in an interface

Implementations described and claimed herein provide systems and methods for interaction between a user and a machine. In one implementation, a system is provided that receives an input from a user of a mobile machine which indicates or describes an object in the world.
Apple Inc.

Apparatus and methods for managing blood pressure vital sign content in an electronic anesthesia record

A graphical user interface and methods for managing blood pressure vital sign content in an electronic anesthesia record on a multi-function gesture-sensitive device via gesture-based means. The graphical user interface and methods provide an improved user interface and set of functions via novel functions unique to gesture-sensitive interfaces.

Communication device

A communication device includes a structure, a controller, and sensors that detect the relative position of an object around the structure. The structure has a face unit that is one of the units of the structure.
Toyota Jidosha Kabushiki Kaisha

Display system having world and user sensors

A mixed reality system that includes a head-mounted display (HMD) that provides 3D virtual views of a user's environment augmented with virtual content. The HMD may include sensors that collect information about the user's environment (e.g., video, depth information, lighting information, etc.) and sensors that collect information about the user (e.g., the user's expressions, eye movement, hand gestures, etc.).
Apple Inc.

Traffic direction gesture recognition

Traffic direction gesture recognition may be implemented for a vehicle in response to traffic diversion signals in the vehicle's vicinity. Sensors implemented as part of a vehicle may collect data about pedestrians and other obstacles in the vicinity of the vehicle or along the vehicle's route of travel.
Apple Inc.

Method and system of hand segmentation and overlay using depth data

In a minimally invasive surgical system, a plurality of video images is acquired. Each image includes a hand pose image.
Intuitive Surgical Operations, Inc.

Instrument as well as method for operating an instrument

An instrument is described which comprises an input for receiving a signal, a data processing unit for analyzing said received signal and providing data to be displayed, and a touch enabled display screen for displaying said data to be displayed and receiving commands directed to said data processing unit. Said commands comprise commands that determine how said data is displayed on the touch enabled display screen and commands that determine operations that are performed by said instrument and/or said data processing unit.
Rohde & Schwarz Gmbh & Co. Kg

Display device and display control method

A display device includes an object detecting section, a display section, a gesture acceptance section, and a display control section. The object detecting section detects an object contained in display target data.
Kyocera Document Solutions Inc.

System and method to perform an allocation using a continuous two direction swipe gesture

A system, method, and computer-readable medium provide a gesture-based graphical user interface to determine allocation information to instruct an allocation. A gesture-based I/O device displays a graphical user interface having: an amount region configured to define an allocation amount; a plurality of source regions each configured to define an allocation source; and a plurality of destination regions each configured to define an allocation destination.
The Toronto-Dominion Bank

Three-dimensional virtualization

Three-dimensional virtualization may include receiving captured images of an entity and/or a scene, and/or capturing the images of the entity and/or the scene. The images may be connected in a predetermined sequence to generate a virtual environment.
Accenture Global Services Limited

3D document editing system

A 3D document editing system and graphical user interface (GUI) that includes a virtual reality and/or augmented reality device and an input device (e.g., a keyboard) that implements sensing technology for detecting gestures by a user. Using the system, portions of a document can be placed at or moved to various z-depths in a 3D virtual space provided by the VR device to provide 3D effects in the document.
Apple Inc.

Orthogonal signaling touch user, hand and object discrimination systems and methods

A system and method for distinguishing between sources of simultaneous touch events on a touch sensitive device are disclosed. The touch sensitive device includes row conductors and column conductors, the path of each of the row conductors crossing the path of each of the column conductors.
Tactual Labs Co.

Gesture detection device for detecting hovering and click

There is provided a gesture detection device including two linear image sensor arrays and a processing unit. The processing unit is configured to compare the sizes of pointer images in the image frames captured by the two linear image sensor arrays in the same period or different periods so as to identify a click event.
Pixart Imaging Inc.
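The size-comparison idea above can be sketched as follows: a fingertip closer to the sensors images larger, so a sharp frame-to-frame growth in pointer image size is read as a click, and a stable size as hovering. The growth ratio is an invented constant, not from the patent.

```python
# Hedged sketch: classify hover vs. click from pointer image size growth.

CLICK_GROWTH_RATIO = 1.5  # assumed growth factor marking a click

def classify_pointer(prev_size, curr_size):
    """prev_size, curr_size: pointer image sizes in consecutive frames."""
    if prev_size > 0 and curr_size / prev_size >= CLICK_GROWTH_RATIO:
        return "click"
    return "hover"
```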

Gesture detection, list navigation, and item selection using a crown and sensors

The present disclosure generally relates to methods and apparatuses for detecting gestures on a reduced-size electronic device at locations off of the display, such as gestures on the housing of the device or on a rotatable input mechanism (e.g., a digital crown) of the device, and responding to the gestures by, for example, navigating lists of items and selecting items from the list; translating the display of an electronic document; or sending audio control data to an external audio device.
Apple Inc.

Augmented-reality-based interactive authoring-service-providing system

The present invention includes: a wearable device including a head-mounted display (HMD); an augmented reality service providing terminal paired with the wearable device and configured to reproduce content corresponding to a scenario-based preset flow via a GUI, overlay corresponding objects in the three-dimensional (3D) space viewed from the wearable device when an interrupt occurs in an object formed in the content to thereby generate an augmented reality image, convert the state of each of the overlaid objects according to a user's gesture, and convert the location regions of the objects based on motion information sensed by a motion sensor; and a pointing device including a magnetic sensor and configured to select or activate an object output from the augmented reality service providing terminal.
Korea Advanced Institute Of Science And Technology

Systems and methods for recording custom gesture commands

Methods and apparatuses are disclosed for recording a custom gesture command. In an aspect, at least one camera of a user device captures the custom gesture command, wherein the custom gesture command comprises a physical gesture performed by a user of the user device; a user interface of the user device receives user selection of one or more operations of the user device; and the user device maps the custom gesture command to the one or more operations of the user device.
Qualcomm Incorporated

System and method for adapting a display on an electronic device

A technique is provided for adapting a display on an electronic device. The technique includes displaying a first user interface comprising a first plurality of graphical prompts to a user, wherein the graphical prompts are highlighted sequentially; detecting one or more pre-defined eye-blinking gestures of the user to select a highlighted graphical prompt of the first user interface; and adapting the first user interface to display a second user interface in response to detecting the one or more pre-defined eye-blinking gestures.
Wipro Limited

Gesture based control of autonomous vehicles

A triggering condition for initiating an interaction session with an occupant of a vehicle is detected. A display is populated with representations of one or more options for operations associated with the vehicle.
Apple Inc.

Vehicle operating system using motion capture

Vehicle operating systems for operating a vehicle having a driving seat for a vehicle driver and at least one passenger seat for passengers are described. The vehicle operating system may include one or more camera devices for shooting images of hand actions of the driver or images of hand actions of a passenger, and a storage device for storing operating signals corresponding to hand actions.
Thunder Power New Energy Vehicle Development Company Limited

Workout monitor interface

The present disclosure relates to systems and processes for monitoring a workout and for generating improved interfaces for the same. One example user interface detects when a workout of a particular type is started and begins generating activity data related to workout metrics associated with the type of workout selected.
Apple Inc.

Wireless directional sharing based on antenna sectors

Apparatuses and methods for sharing data between wireless devices based on one or more user gestures are disclosed. In some implementations, a wireless device may include a number of antenna elements configured to beamform signals in a plurality of transmit directions, with each of the transmit directions corresponding to one of a number of antenna sectors.
Qualcomm Incorporated

Virtual push-to-talk button

A method and apparatus for providing a virtual push-to-talk (PTT) button are provided herein. During operation, augmented reality and object recognition circuitry detects a user's fingers and a free surface near the fingers by analyzing an image captured by a camera.
Motorola Solutions, Inc

System and method for real-time transfer and presentation of multiple Internet of Things (IoT) device information on an electronic device based on casting and slinging gesture command

A novel intermediary set-top box called a “cast-sling box” (CSB) uniquely incorporates multimedia and/or IoT data casting, slinging, transcoding, referring (i.e., referral mode), rendering, and recording capabilities for seamless interoperability of various electronic devices in a multiple device environment.
Dvdo, Inc.

Integrated cast and sling system and its operation in an interoperable multiple display device environment

An integrated cast and sling system capable of intermediating seamless multimedia data and playback transfers across multiple display devices is disclosed. The integrated cast and sling system contains a cast-sling box (CSB) connected to a conventional cable or satellite set-top box and a plurality of multimedia signal sources.
Dvdo, Inc.

Method for creating a motion effect for an image

A method includes maintaining images and associated video streams; detecting a swipe gesture of a user on a touch sensitive display, wherein the swipe gesture comprises direction and speed information; and selecting an image based on the direction information of the swipe gesture. The method includes adjusting a playback of an associated video stream based on the direction and the speed information of the swipe gesture; providing the associated video stream on the display during the swipe gesture, for creating a motion effect relating to the image; and providing the image on the display after the swipe gesture.
Nokia Technologies Oy

Device for multi-angle photographing in eyeglasses and eyeglasses including the device

A device for multi-angle photographing in eyeglasses and eyeglasses including the device are disclosed. The device includes a camera mounted on the eyeglasses and configured to capture an image, at least two motors configured to drive the camera to move, and at least two mobile platforms, each of which is provided with a plurality of limiting stoppers.
Beijing Boe Optoelectronics Technology Co., Ltd.

Neural network-based inferential advertising system and method

A neural network-based inferential advertising system and method delivers interactive advertisements to advertisement recipients in accordance with user-selected restrictions that inform the behavioral information upon which advertising preference inferences are made, as well as advertising preference inferences that are made based upon interpretations made from content by a computer-implemented neural network. The behavioral information upon which advertising inferences are made may include bodily movements of advertisement recipients that are performed without physical interaction with a device.
Manyworlds, Inc.

Determining a pointing vector for gestures performed before a depth camera

A pointing vector is determined for a gesture that is performed before a depth camera. One example includes receiving a first and a second image of a pointing gesture in a depth camera, the depth camera having a first and a second image sensor, applying erosion and dilation to the first image using a 2D convolution filter to isolate the gesture from other objects, finding the imaged gesture in the filtered first image of the camera, finding a pointing tip of the imaged gesture, determining a position of the pointing tip of the imaged gesture using the second image, and determining a pointing vector using the determined position of the pointing tip.
Intel Corporation
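Once the fingertip has been located in both images, the final step of the abstract above reduces to standard stereo geometry. The sketch below covers only that last step (the erosion/dilation filtering is omitted); the function names, the base-of-finger input, and the parameter choices are assumptions for illustration, not Intel's API.

```python
import math

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard stereo relation Z = f * B / d: the same tip seen at different
    horizontal positions in the two image sensors yields its depth."""
    return focal_px * baseline_m / disparity_px

def pointing_vector(tip_xyz, base_xyz):
    """Unit vector from a reference point on the hand toward the fingertip,
    both given as (x, y, z) positions recovered from the depth data."""
    d = tuple(t - b for t, b in zip(tip_xyz, base_xyz))
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0.0:
        raise ValueError("tip and base coincide; no direction defined")
    return tuple(c / norm for c in d)
```

A 700-pixel focal length and a 10 cm baseline, for instance, place a tip with 35 px of disparity at 2 m depth.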

Identity authentication method and apparatus

Embodiments of the present disclosure disclose an identity authentication method performed at a computing device, the method including: obtaining a sequence of finger gestures on a touchpad of the computing device from a user, wherein each finger gesture has an associated pressure type on the touchpad; generating a corresponding character string according to the sequence of finger gestures; comparing the character string with a verification code of a user account associated with an application program; in accordance with a determination that the character string matches the verification code, granting the user access to the user account associated with the application program; and in accordance with a determination that the character string does not match the verification code, denying the user access to the user account associated with the application program.
Tencent Technology (shenzhen) Company Limited
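The pipeline in the abstract above (gesture sequence → character string → comparison with a stored verification code) can be sketched as follows. The gesture-to-character table is invented for illustration; the patent abstract does not specify an encoding, and the constant-time comparison is a design choice of this sketch, not a claimed feature.

```python
import hmac

# Hypothetical mapping from (gesture, pressure type) pairs to characters.
GESTURE_CHARS = {
    ("tap", "light"): "a",
    ("tap", "firm"): "A",
    ("swipe", "light"): "1",
    ("swipe", "firm"): "!",
}

def gestures_to_string(gestures):
    """Translate a sequence of (gesture, pressure) pairs into a character string."""
    return "".join(GESTURE_CHARS[g] for g in gestures)

def authenticate(gestures, verification_code):
    """Grant access only when the derived string matches the stored code."""
    candidate = gestures_to_string(gestures)
    # hmac.compare_digest gives a constant-time comparison, avoiding a
    # timing side channel on how many leading characters matched.
    return hmac.compare_digest(candidate, verification_code)
```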

Display system shared among computers

The display system is shared among a number of computers. The display system includes a display device, at least a gesture recognition device, a number of connection ports, a circuit board, a processing unit, and a switch unit.
Evga Corporation

Alternative hypothesis error correction for gesture typing

In one example, a method may include outputting, by a computing device and for display, a graphical keyboard comprising a plurality of keys, and receiving an indication of a gesture. The method may include determining an alignment score that is based at least in part on a word prefix and an alignment point traversed by the gesture.
Google Llc
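One plausible reading of the alignment score in the abstract above is a language-model term for the word prefix plus a spatial term for how closely the gesture's alignment points track the keys of that prefix. The sketch below uses that reading; the key layout, the additive combination, and the L2 penalty are all assumptions, not Google's scoring function.

```python
import math

# Illustrative centers for a few keys of an on-screen keyboard (x, y).
KEY_CENTERS = {"q": (0.0, 0.0), "w": (1.0, 0.0), "e": (2.0, 0.0), "a": (0.5, 1.0)}

def alignment_score(prefix, gesture_points, prefix_log_prob):
    """Score a word-prefix hypothesis against gesture alignment points.

    prefix_log_prob is the language-model log-probability of the prefix;
    each gesture point is penalized by its distance to the corresponding
    key center, so closer traces score higher.
    """
    spatial = 0.0
    for ch, (px, py) in zip(prefix, gesture_points):
        kx, ky = KEY_CENTERS[ch]
        spatial -= math.hypot(px - kx, py - ky)
    return prefix_log_prob + spatial
```

A gesture passing exactly through the key centers incurs no spatial penalty, so competing prefixes are then ranked purely by the language model.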

Device, method and computer program product for creating viewable content on an interactive display

A device, method and computer program product for creating viewable content on an interactive display is provided. The method includes providing a user interface on a device for creating viewable content from a collection comprising at least one multimedia content.
Microsoft Technology Licensing, Llc

Browsing and selecting content items based on user gestures

The present disclosure is directed toward systems and methods that provide users with efficient and effective user experiences when browsing, selecting, or inspecting content items. More specifically, systems and methods described herein provide users the ability to easily and effectively select multiple content items via a single touch gesture (e.g., swipe gesture).
Dropbox, Inc.

Device, method, and graphical user interface for managing concurrently open software applications

An electronic device displays a first application view at a first size. The first application view corresponds to a first application in a plurality of concurrently open applications.
Apple Inc.

Gesture recognition and control based on finger differentiation

An embodiment of a computer implemented method of performing a processing action includes detecting an input from a user via an input device of a processing device, the input including a touch by at least one finger of a plurality of fingers of the user, estimating a gesture performed by the at least one finger based on the touch, measuring at least part of a fingerprint of the at least one finger, and identifying the at least one finger used to apply the input by the user based on stored fingerprint data that differentiates between individual fingers of the user. The method also includes identifying an action to be performed based on the estimated gesture and based on the identified at least one finger, and performing the action by the processing device.
International Business Machines Corporation

Gesture-based multimedia casting and slinging command method and system in an interoperable multiple display device environment

Novel gesture-based multimedia casting and slinging command methods and systems that are configured to accommodate seamless multimedia data and playback transfers across various electronic devices in a multiple display device environment are disclosed. In one instance, a “casting” of an audio/video (AV), graphical, or photographic multimedia content from a touch-screen electronic device to a targeted device (e.g.
Dvdo, Inc.

Methods and apparatus to detect vibration inducing hand gestures

Methods and apparatus to detect vibration inducing hand movements are disclosed. An example apparatus includes a sensor to detect a vibration in a hand of a user wearing the apparatus.
Intel Corporation

Measurement of facial muscle EMG potentials for predictive analysis using a smart wearable system and method

A system includes at least one wearable device having a housing, at least one sensor disposed within the housing, at least one output device disposed within the housing, and at least one processor operatively connected to the sensors and output devices, wherein one or more sensors are configured to detect electrical activity from a user's facial muscles and to transmit a data signal concerning the electrical activity of the user's facial muscles to one or more of the processors. A method of controlling a wearable device includes determining facial muscular electrical data of a facial gesture made by a user, interpreting the facial muscular electrical data to determine a user response, and performing an action based on the user response.
Bragi Gmbh

Information processing method, terminal, and computer storage medium

The present disclosure discloses a method for rendering a graphical user interface of an online game system on a display of a terminal, including: rendering, in the graphical user interface, at least one virtual resource object; detecting whether a configuration option in the game system is enabled, entering a first control mode in accordance with a determination that the configuration option is enabled; rendering, in a skill control display area in the graphical user interface, at least one skill object corresponding to multiple character objects; detecting, from a user of the terminal, a skill release operation gesture on the at least one skill object in the skill control display area; and performing a skill release operation, on a currently specified target object, of a skill to be triggered and supported by at least one skill object corresponding to one or more of the multiple character objects.
Tencent Technology (shenzhen) Company Limited

Reducing erroneous detection of input command gestures

Embodiments of the present invention provide a system and method for reducing erroneous detection of input command gestures. The method is performed by the system and includes storing a reference time value of when a presence of a human body part is detected by a first sensor and storing a further time value of when a presence of a human body part is detected by a second sensor.
Jaguar Land Rover Limited
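The two stored time values in the abstract above suggest a simple timing gate: a gesture counts as intentional only if the second sensor fires within a plausible window after the first. The sketch below illustrates that idea; the function name and the window bounds are invented for illustration, not values from the patent.

```python
def is_intentional_gesture(first_sensor_time, second_sensor_time,
                           min_gap=0.05, max_gap=0.50):
    """Accept a command gesture only when the second sensor detects the body
    part within a plausible window (in seconds) after the first sensor did.

    A gap that is too short is likely simultaneous noise; a gap that is too
    long is likely two unrelated movements. Both are rejected.
    """
    gap = second_sensor_time - first_sensor_time
    return min_gap <= gap <= max_gap
```

A hand sweeping past both sensors 0.2 s apart would be accepted; a one-second gap would be rejected as unrelated movement.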

Arm-detecting overhead sensor for inventory system

Inventory systems may include one or more sensors capable of detecting spatial positioning of inventory holders and an arm of a worker interacting with the inventory holder. Data can be received from a sensor, a gesture of the arm can be determined from the data, and a bin location or other information can be determined based on the gesture.
Amazon Technologies, Inc.

Video monitoring system

A monitoring system includes cameras adapted to capture images and depth data of the images. A computer device processes the image signals and depth data from the cameras according to various software modules that monitor one or more of the following: (a) compliance with patient care protocols; (b) patient activity; (c) equipment usage; (d) the location and/or usage of assets; (e) patient visitation metrics; (f) data from other sensors that is integrated with the image and depth data; (g) gestures by the patient or caregivers that are used as signals or for controls of equipment, and other items.
Stryker Corporation

Portable electronic device performing similar operations for different gestures

A portable electronic device with a touch-sensitive display is disclosed. One aspect of the invention involves a computer-implemented method in which the portable electronic device: displays an application on the touch-sensitive display; and when the application is in a predefined mode, performs a predefined operation in response to each gesture of a set of multiple distinct gestures on the touch-sensitive display.
Apple Inc.

Social networking application for real-time selection and sorting of photo and video content

Systems and methods of the present disclosure include a virtual wall having a plurality of media, also referred to as entries or media items. The system lays out the entries horizontally and vertically in the virtual wall.
Piqpiq, Inc.

Processing capacitive touch gestures implemented on an electronic device

Content on a display user interface of an electronic device, such as a wearable electronic device, can be manipulated using capacitive touch sensors that may be seamlessly integrated into the housing or strap of the electronic device. The capacitive touch sensors can advantageously replace mechanical buttons and other mechanical user interface components, such as a crown, to provide industrial design opportunities not possible with the inclusion of mechanical buttons and mechanical interface components.
Apple Inc.

Gesture language for a device with multiple touch surfaces

A gesture language for a device with multiple touch surfaces is described. Generally, a series of new touch input models is described that includes touch input interactions on two disjoint touch-sensitive surfaces.
Microsoft Technology Licensing, Llc

Gesture detection

A supplemental surface area allows gesture recognition on outer surfaces of mobile devices. Inputs may be made without visual observance of display devices.
At&t Intellectual Property I, L.p.

Method and circuit for switching a wristwatch from a first power mode to a second power mode

An electronic wristwatch operable in two power modes. The wristwatch has an inertial sensor for detecting a gesture on a cover glass of the wristwatch.
Slyde Watch Sa

Display system operable in a non-contact manner

A display system operable in a non-contact manner includes a display, a wireless communication module, a plurality of user-wearable devices, and a control device. Each of the user-wearable devices is attachable to or around a user's hand, associated with a unique identifier, and is configured to detect a hand gesture made by the user's hand and wirelessly transmit data corresponding to the detected hand gesture along with the unique identifier.
Toshiba Tec Kabushiki Kaisha

Haptic effect handshake unlocking

A system that unlocks itself, another device, or electronic media enters an unlocked mode by playing a predetermined haptic effect and, in response, receiving a gesture-based interaction input from a user. The system compares the interaction input to a stored predefined interaction input and transitions to the unlocked mode if the interaction input substantially matches the stored predefined interaction input.
Immersion Corporation

User notification of powered system activation during non-contact human activation

A user-activated, non-contact power closure member system and method of operating a closure member of a vehicle are provided. The system includes at least one sensor for sensing an object or motion.
Magna Closures Inc.

Method and system for providing interactivity based on sensor measurements

There is provided a system for providing interactivity to a guest of an experiential venue, based on sensor measurement of the guest. The system comprises a sensor configured to sense a guest variable of the guest, where the sensor may be a biometric sensor, a facial recognition sensor, a voice stress analysis sensor, a gesture recognition sensor, a motion tracking sensor, or an eye tracking sensor, and may sense heart rate or another guest variable.
Disney Enterprises, Inc.

Systems and methods for prosthetic wrist rotation

Features for a prosthetic wrist and associated methods of rotation are described. The prosthetic wrist attaches to a prosthetic hand.
Touch Bionics Limited

Smart cup, and method and system therefor, for performing command control via gesture

A method for a smart cup for performing command control via a gesture, comprising: a tilt angle sensor (121) detects, in real time, an included angle θ between the smart cup and the vertical direction and an included angle α between the smart cup and a preset horizontal reference line, and sends data about the included angles to a control unit (122); the control unit (122) calculates changes in the angles according to the data about the included angles and recognizes a corresponding gesture according to the changes in the angles; and the control unit (122) matches and executes a corresponding command according to the gesture.
Bowhead Technology (shanghai) Co., Ltd.
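The angle-change step in the abstract above can be sketched as a small classifier over a time series of (θ, α) readings. Everything here is illustrative: the gesture names, the thresholds, and the reversal-counting shake heuristic are assumptions of this sketch, not values from the patent.

```python
def classify_tilt_gesture(samples, drink_threshold=45.0, shake_threshold=20.0):
    """Classify a gesture from a series of (theta, alpha) tilt readings.

    theta is the cup's angle to the vertical; alpha its angle to a preset
    horizontal reference line. Thresholds are in degrees and are guesses.
    """
    thetas = [s[0] for s in samples]
    # A large sustained change in tilt toward horizontal reads as drinking.
    if max(thetas) - min(thetas) >= drink_threshold:
        return "drink"
    # Count direction reversals in alpha as a crude shake detector.
    alphas = [s[1] for s in samples]
    reversals = 0
    for a, b, c in zip(alphas, alphas[1:], alphas[2:]):
        if (b - a) * (c - b) < 0 and abs(b - a) > shake_threshold / 4:
            reversals += 1
    if reversals >= 2:
        return "shake"
    return "idle"
```

The classifier output would then index into a command table, matching the abstract's "matches and executes a corresponding command" step.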

Wearable gesture control device & method

Novel tools and techniques are provided for implementing Internet of Things (“IoT”) functionality. In some embodiments, a wearable control device (“WCD”) might receive first user input comprising one or more of touch, gesture, and/or voice input from the user.
Centurylink Intellectual Property Llc

Personalization of experiences with digital assistants in communal settings through voice and query processing

In non-limiting examples of the present disclosure, systems, methods and devices for providing personalized experiences to a computing device based on user input such as voice, text and gesture input are provided. Acoustic patterns associated with voice input, speech patterns, language patterns and natural language processing may be used to identify a specific user providing input from a plurality of users, identify user background characteristics and traits for the specific user, and topically categorize user input in a tiered hierarchical index.
Microsoft Technology Licensing, Llc

Methods and vehicles for capturing emotion of a human driver and moderating vehicle response

Methods and systems for determining an emotion of a human driver of a vehicle and using the emotion for generating a vehicle response are provided. One example method includes capturing, by a camera of the vehicle, a face of the human driver.
Emerging Automotive, Llc

Method and system of augmented-reality simulations

In one exemplary embodiment, a method includes the step of obtaining a digital image of an object with a digital camera. The object is identified.

Dynamic determination of human gestures based on context

A system comprises a processor configured to execute instructions to receive an indication of an occurrence of a human gesture and to perform an analysis of the indication of the occurrence of the human gesture to determine contextual criteria having a relationship to the occurrence of the human gesture. The processor may determine a meaning of the human gesture based at least in part on the contextual criteria and a plurality of possible intended meanings for the human gesture.
International Business Machines Corporation

Gesture matching mechanism

A mechanism is described to facilitate gesture matching according to one embodiment. A method of embodiments, as described herein, includes selecting a gesture from a database during an authentication phase, translating the selected gesture into an animated avatar, displaying the avatar, prompting a user to perform the selected gesture, capturing a real-time image of the user, and comparing the gesture performed by the user in the captured image to the selected gesture to determine whether there is a match.
Intel Corporation

Application processing based on gesture input

Non-limiting examples of the present disclosure describe gesture input processing. As an example, a gesture input may be a continuous gesture input that is received through a soft keyboard application.
Microsoft Technology Licensing, Llc

Detecting and interpreting real-world and security gestures on touch and hover sensitive devices

“Real-world” gestures such as hand or finger movements/orientations that are generally recognized to mean certain things (e.g., an “OK” hand signal generally indicates an affirmative response) can be interpreted by a touch or hover sensitive device to more efficiently and accurately effect intended operations. These gestures can include, but are not limited to, “OK gestures,” “grasp everything gestures,” “stamp of approval gestures,” “circle select gestures,” “X to delete gestures,” “knock to inquire gestures,” “hitchhiker directional gestures,” and “shape gestures.” In addition, gestures can be used to provide identification and allow or deny access to applications, files, and the like.
Apple Inc.

Enhanced 3d interfacing for remote devices

Operating a computerized system includes presenting user interface elements on a display screen. A first gesture made in a three-dimensional space by a distal portion of an upper extremity of a user is detected while a segment of the distal portion thereof rests on a surface.
Apple Inc.

Configuring three dimensional dataset for management by graphical user interface

An approach is provided that selects three attributes that correspond to objects included in a dataset, where each of the three attributes is assigned to a different coordinate value (x, y, and z coordinates). The approach creates a simulated three dimensional (3D) scene of the objects on a display screen by using the x, y, and z coordinate values corresponding to the attributes of each of the objects.
International Business Machines Corporation

Graphical user interface for managing three dimensional dataset

An approach is provided that displays, on a two dimensional (2D) screen, a gyroscopic graphical user interface (GUI). The gyroscopic GUI provides three dimensional (3D) control of a simulated 3D scene displayed on the 2D screen.
International Business Machines Corporation

3D touch enabled gestures

Touch events are detected on a touch sensing surface coupled to a capacitive sense array and one or more force electrodes. During a plurality of scan cycles, a plurality of capacitive sense signals and one or more force signals are obtained from capacitive sense electrodes of the capacitive sense array and the one or more force electrodes, respectively, and applied to determine a temporal sequence of touches on the touch sensing surface.
Parade Technologies, Ltd.

Touch display device

A touch display device is disclosed. A touch display panel includes a source multiplexer switching the transfer of source signals to data lines.
Lg Display Co., Ltd.

Information processing device

An in-vehicle device includes a gesture detection unit for recognizing a user's hand within a predetermined range; an information control unit controlling information output to a display; and an in-vehicle device control unit receiving input from a control unit equipped in a vehicle to control the in-vehicle device. When the gesture detection unit detects a user's hand at a predetermined position, the information control unit triggers the display to display candidates of an operation executed by the in-vehicle device control unit by associating the candidates with the user's hand motions.
Clarion Co., Ltd.

Controlling navigation of a visual aid during a presentation

Methods, systems, and computer program products for controlling navigation of a visual aid during a presentation are provided. Aspects include obtaining a presenter profile that includes associations between gestures of a presenter and desired actions for the visual aid and receiving indications of one or more movements of a presenter during the presentation.
International Business Machines Corporation

Interaction and management of devices using gaze detection

User gaze information, which may include a user line of sight, user point of focus, or an area that a user is not looking at, is determined from user body, head, eye and iris positioning. The user gaze information is used to select a context and interaction set for the user.
Microsoft Technology Licensing, Llc

Vehicle parking control

Methods and systems are provided for performing an automated parking operation of a vehicle. The methods and systems determine a position of a wireless driver device located outside of the vehicle.
Gm Global Technology Operations Llc

Gesture-activated remote control

A gesture-based control for a television is provided that runs in the background of a computing device remote from the television, where the control is activated by a gesture. Advantageously, the user need not interrupt any task in order to control the television.
Google Inc.

Method and system for dynamically interactive visually validated mobile ticketing

Systems and methods for interaction-based validation of electronic tickets. In some embodiments, the system renders a first visually illustrative scene on an interactive display screen of a mobile device, the first visually illustrative scene responsive to a pre-determined gesture performed at a predetermined location on the interactive display screen.
Moovel North America, Llc

Augmented reality display device with deep learning sensors

A head-mounted augmented reality (AR) device can include a hardware processor programmed to receive different types of sensor data from a plurality of sensors (e.g., an inertial measurement unit, an outward-facing camera, a depth sensing camera, an eye imaging camera, or a microphone) and to determine an event of a plurality of events using the different types of sensor data and a hydra neural network (e.g., face recognition, visual search, gesture identification, semantic segmentation, object detection, lighting detection, simultaneous localization and mapping, relocalization).
Magic Leap, Inc.

Gesture masking in a video feed

Various systems and methods for processing video data, including gesture masking in a video feed, are provided herein. The system can include a camera system interface to receive video data from a camera system; a gesture detection unit to determine a gesture within the video data, the gesture being performed by a user; a permission module to determine a masking permission associated with the gesture; and a video processor.
Intel Corporation

Gesture based captcha test

One embodiment provides a method, including: providing, using a processor, a user challenge over a network, wherein the user challenge is associated with a predetermined gesture to be performed by a user; obtaining, using a processor, user image data; determining, using the user image data, that a user has performed the predetermined gesture; and thereafter providing the user access to information. Other aspects are described and claimed.
Lenovo (singapore) Pte. Ltd.

Mobile terminal

A mobile terminal including a wireless communication processor configured to provide wireless communication; a touch screen; and a controller configured to display an area of an omnidirectional image on the touch screen, display a guideline on the touch screen for guiding a movement of the omnidirectional image on the touch screen, in response to a scrolling gesture on the touch screen having a first direction corresponding to a direction of the guideline, move the display area of the omnidirectional image in the first direction, and in response to the scrolling gesture on the touch screen having a second direction different than the direction of the guideline, move the display area of the omnidirectional image along the guideline in the first direction instead of the second direction.
Lg Electronics Inc.

Proxy gesture recognizer

An electronic device displays one or more views. A first view includes a plurality of gesture recognizers.
Apple Inc.

Gesture recognition device, gesture recognition method, and information processing device

Provided are a gesture recognition device, a gesture recognition method and an information processing device for making it possible to quickly recognize a gesture of a user. The gesture recognition device includes a motion information generator that generates body part motion information by performing detection and tracking of a body part, a prediction processor that makes a first comparison of comparing the generated body part motion information with previously stored pre-gesture motion model information and generates a prediction result regarding a pre-gesture motion on the basis of a result of the first comparison, and a recognition processor that makes a second comparison of comparing the generated body part motion information with previously stored gesture model information and generates a result of recognition of the gesture represented by a motion of the detected body part on the basis of the prediction result and a result of the second comparison.
Mitsubishi Electric Corporation
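The two-stage matching in the abstract above (predict from pre-gesture models, then confirm against full gesture models) can be sketched as nearest-model lookups over simple feature vectors. The L1 distance, the agreement rule, and the threshold are all simplifying assumptions of this sketch, not Mitsubishi's method.

```python
def recognize_with_prediction(motion, pre_models, gesture_models, threshold=1.0):
    """Two-stage gesture recognition over motion feature vectors.

    pre_models and gesture_models map gesture names to stored feature
    vectors. A gesture is reported only when the pre-gesture prediction and
    the full-model match agree and the final distance is small enough.
    """
    def dist(a, b):
        # Simple L1 distance between feature vectors.
        return sum(abs(x - y) for x, y in zip(a, b))

    predicted = min(pre_models, key=lambda name: dist(motion, pre_models[name]))
    best = min(gesture_models, key=lambda name: dist(motion, gesture_models[name]))
    if predicted == best and dist(motion, gesture_models[best]) <= threshold:
        return best
    return None  # prediction and recognition disagree, or match is too weak
```

In a real system the prediction stage would run on earlier frames than the confirmation stage, which is what lets the device start responding before the gesture completes.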

System and method for distant gesture-based control using a network of sensors across a building

A gesture-based interaction system for communication with an equipment-based system includes a sensor device and a signal processing unit. The sensor device is configured to capture at least one scene of a user to monitor for at least one gesture of a plurality of possible gestures, conducted by the user, and output a captured signal.
Otis Elevator Company

System and method for distant gesture-based control using a network of sensors across a building

A gesture and location recognition system and method are provided. The system includes a sensor device that captures a data signal of a user and detects a gesture input from the user from the data signal, wherein a user location can be calculated based on a sensor location of the sensor device in a building and the collected data signal of the user, a signal processing device that generates a control signal based on the gesture input and the user location, and in-building equipment that receives the control signal from the signal processing device and controls the in-building equipment based on the control signal.
Otis Elevator Company

Device manipulation using hover

An apparatus may be manipulated using non-touch or hover techniques. Hover techniques may be associated with zooming, virtual feedback, authentication, and other operations.
Microsoft Technology Licensing, Llc

System and method for communicating inputs and outputs via a wearable apparatus

A system is disclosed comprising one or more of a wearable controller, a wearable apparatus such as a “smart” glove, a mobile device or mobile computer, and operating software, preferably in wireless communication with each other. In one embodiment, these components allow a user to create inputs through their gestures, movements and contacts with other surfaces, which facilitates the ability of the user to perform, edit, remix and produce musical compositions, or enhance a virtual/augmented reality or gaming environment, for example.

Motor control system based upon movements inherent to self-propulsion

The systems and methods described herein provide hands-free motor control mechanisms based on the natural and inherent movements associated with an activity of interest, and can be combined with gesture communication based upon defined movements by the participant.
Medici Technologies, Llc

Surgical microscope with gesture control and method for gesture control of a surgical microscope

The present invention relates to a surgical microscope (1) with a field of view (9) and comprising an optical imaging system (3) which images an inspection area (11) which is at least partially located in the field of view (9), and to a method for a gesture control of a surgical microscope (1) having an optical imaging system (3). Surgical microscopes (1) of the art have the disadvantage that an adjustment of the microscope parameters, for instance the working distance, field of view (9) or illumination mode, requires the surgeon to put down his or her surgical tools, look up from the microscope's eyepiece (5) and perform the adjustment by operating the microscope handles.
Leica Instruments (singapore) Pte. Ltd.

Gesture-based control and usage of video relay service communications

A method and system are disclosed for enabling members of the deaf, hard-of-hearing, or speech-impaired (D-HOH-SI) community to start and control communications conducted through a video relay service (VRS) without the need for a remote-control device. A combination of standard-ASL and non-ASL hand commands is interpreted using “hand recognition” (similar to “face recognition”) to control the operation of a VRS system through multiple operating modes.
Purple Communications, Inc.

Control system and control processing method and apparatus

The complex operation and low control efficiency involved in controlling home devices, such as lights, televisions, and curtains, are reduced with a control system that senses the presence and any actions, such as hand gestures or speech, of a user in a predetermined space. In addition, the control system identifies a device to be controlled, and the command to be transmitted to the device in response to a sensed action.
Alibaba Group Holding Limited

Smart electronic device

A smart electronic device is provided for a multi-user environment to meet more convenient life requirements of a plurality of users. The smart electronic device has a camera, a microphone, a processing circuit, a network interface and a projector.
Xiamen Eco Lighting Co. Ltd.

Command processing using multimodal signal analysis

A first set of signals corresponding to a first signal modality (such as the direction of a gaze) during a time interval is collected from an individual. A second set of signals corresponding to a different signal modality (such as hand-pointing gestures made by the individual) is also collected.
Apple Inc.

Contemporaneous gesture and keyboard entry authentication

A restricted access device such as a cellphone, a tablet or a personal computer, analyzes contemporaneous keyboard inputs of a password and gestures to authenticate the user and enable further access to applications and processes of the restricted access device. The gestures may be facial gestures detected by a camera or may be gestures made by an avatar rendered on a display of the device.
International Business Machines Corporation

Touch keyboard using language and spatial models

A computing device outputs for display at a presence-sensitive display, a graphical keyboard comprising a plurality of keys, receives an indication of at least one gesture to select a group of keys of the plurality of keys, and determines at least one characteristic associated with the at least one gesture to select the group of keys of the plurality of keys. The computing device modifies a spatial model based at least in part on the at least one characteristic and determines a candidate word based at least in part on data provided by the spatial model and a language model, wherein the spatial model provides data based at least in part on the indication of the at least one gesture and wherein the language model provides data based at least in part on a lexicon.
Google Llc
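A toy sketch of combining a spatial model with a language model as the abstract describes, assuming touch points and key centres live in a shared 2-D coordinate space; the Gaussian spatial model, `sigma`, the key layout and the word probabilities are all stand-ins invented for illustration:

```python
import math

def spatial_log_prob(touch_xy, word, key_centers, sigma=0.08):
    """Sum of Gaussian log-likelihoods of each touch point given the
    intended key centre (a stand-in for the patent's spatial model)."""
    lp = 0.0
    for (x, y), ch in zip(touch_xy, word):
        kx, ky = key_centers[ch]
        lp += -((x - kx) ** 2 + (y - ky) ** 2) / (2 * sigma ** 2)
    return lp

def best_candidate(touch_xy, lexicon, key_centers, lm):
    """Combine spatial and language-model scores to pick a candidate word."""
    def score(word):
        return spatial_log_prob(touch_xy, word, key_centers) + math.log(lm[word])
    return max((w for w in lexicon if len(w) == len(touch_xy)), key=score)
```

The language-model term is what rescues noisy taps: a touch halfway between two keys resolves to whichever word is more probable in the lexicon.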

Electronic device and method for controlling the electronic device

The present disclosure proposes an electronic device and method for controlling the electronic device. The electronic device includes a display and a processor configured to detect a first gesture in a predefined area of the display while a first window is currently displayed in full screen on the display and upon detection of the first gesture instruct the display to display a gallery of previously opened windows; wherein the processor is further configured to subsequently upon detecting a second gesture in the predefined area on the display.
Huawei Technologies Co., Ltd.

System and method for 3D position and gesture sensing of a human hand

A three dimensional touch sensing system having a touch surface configured to detect a touch input located above the touch surface is disclosed. The system includes a plurality of capacitive touch sensing electrodes disposed on the touch surface, each electrode having a baseline capacitance and a touch capacitance based on the touch input.
The Trustees Of Princeton University

Radar-based gesture sensing and data transmission

This document describes techniques and devices for radar-based gesture sensing and data transmission. The techniques enable, through a radar system, seamless and intuitive control of, and data transmission between, computing devices.
Google Llc

Method and apparatus for selecting between multiple gesture recognition systems

A method and apparatus for selecting between multiple gesture recognition systems includes an electronic device determining a context of operation for the electronic device that affects a gesture recognition function performed by the electronic device. The electronic device also selects, based on the context of operation, one of a plurality of gesture recognition systems in the electronic device as an active gesture recognition system for receiving gesturing input to perform the gesture recognition function, wherein the plurality of gesture recognition systems comprises an image-based gesture recognition system and a non-image-based gesture recognition system.
Google Technology Holdings Llc
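The context-based selection could be sketched as a simple policy function; the specific context signals (light level, camera availability, battery) and thresholds are invented examples of contexts that plausibly affect an image-based recognizer:

```python
def select_recognizer(context):
    """Pick the active gesture recognition system from the operating
    context (illustrative policy: camera unusable -> non-image system)."""
    low_light = context.get("lux", 1000) < 10          # too dark for a camera
    camera_busy = context.get("camera_in_use", False)  # camera claimed elsewhere
    battery_low = context.get("battery_pct", 100) < 15 # image pipeline too costly
    if low_light or camera_busy or battery_low:
        return "non-image-based"   # e.g. IR, ultrasound, or IMU recognizer
    return "image-based"
```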

Radar-based gestural interface

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for providing a gestural interface in vehicle. In one aspect, movement data corresponding to a gesture of a driver of a vehicle is received from a radar receiver arranged to detect movement at the interior of the vehicle.
Google Inc.

Information display device and information display method

An information display device includes a display control unit, a gesture detection unit, a gesture identification unit that makes identification of an operator's gesture based on gesture information and outputs a signal based on a result of the identification, a distance estimation unit that estimates a distance between the operator and a display unit, and an identification function setting unit that sets gestures identifiable by the gesture identification unit so that the number of gestures identifiable by the gesture identification unit when the estimated distance is over a first set distance is smaller than the number of gestures identifiable by the gesture identification unit when the estimated distance is less than or equal to the first set distance.
Mitsubishi Electric Corporation
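The distance-dependent gesture set reduces to a small lookup; the gesture names and the 2-metre first set distance below are invented, only the "fewer identifiable gestures beyond the first set distance" rule comes from the abstract:

```python
def identifiable_gestures(distance_m, first_set_distance=2.0):
    """Return the set of gestures the identification unit should accept:
    a smaller set when the operator is beyond the first set distance."""
    near = ["swipe", "point", "pinch", "rotate", "wave"]
    far = ["swipe", "wave"]  # only coarse gestures survive at long range
    return near if distance_m <= first_set_distance else far
```

Shrinking the candidate set at range is a practical robustness trick: fine gestures (pinch, rotate) are unreliable to classify from far away, so removing them cuts misrecognitions.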

Apparatus and method to navigate media content using repetitive 3d gestures

In a method for detecting a repetitive three-dimensional gesture by a computing device, a three-dimensional gesture sensor detects a plurality of positions corresponding to a finger movement. The computing device determines whether the plurality of positions contain a gesture cycle by: comparing at least two non-adjacent positions in the plurality of positions; and upon determining that the at least two non-adjacent positions match, determining that the plurality of positions contain the gesture cycle.
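The cycle test described above — matching two non-adjacent sampled positions — can be sketched directly; the tolerance and minimum index gap are invented parameters:

```python
def contains_cycle(positions, tol=0.1, min_gap=2):
    """Detect a repetitive gesture cycle: two non-adjacent sampled 3-D
    positions that match (within tol) imply the finger returned to an
    earlier point, closing one cycle."""
    def close(a, b):
        return all(abs(x - y) <= tol for x, y in zip(a, b))
    for i in range(len(positions)):
        for j in range(i + min_gap, len(positions)):
            if close(positions[i], positions[j]):
                return True
    return False
```

A circular finger motion returns to its starting point, so the first and last samples match; a straight swipe never revisits an earlier position and produces no cycle.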

Driving support device, driving support system, and driving support method

In a driving support device, an image output unit outputs an image including a vehicle object representing a vehicle and a peripheral situation of the vehicle, to a display unit. An operation signal input unit receives a gesture operation by a user that involves moving of the vehicle object in the image displayed on the display unit.
Panasonic Intellectual Property Management Co., Ltd.

Physical gesture input configuration for interactive software and video games

Technologies are described for configuring user input using physical gestures. For example, a user can be prompted to perform a physical gesture to assign to a software application command (e.g., an action within a video game or a command in another type of application).
Microsoft Technology Licensing, Llc

Smart watch for indicating emergency events

A portable device including a gesture recognizer module for automatically detecting a specific sequence of gestures is described. The portable device may be used to detect a health, safety, or security related event.

Method for transmitting image and electronic device thereof

A method for transmitting an image and an electronic device thereof are provided. An image transmission method of an electronic device includes displaying a message transmission/reception history with at least one other electronic device, sensing a selection of a camera execution menu, displaying a preview screen of a camera within a screen in which the message transmission/reception history is displayed, detecting a touch on the displayed preview screen, if the displayed preview screen is touched, capturing an image of a subject, detecting a gesture for the captured image, and, if the gesture for the captured image is detected, transmitting the captured image to the at least one other electronic device according to the detected gesture.
Samsung Electronics Co., Ltd.

Method for adding contact information, and user equipment

The present disclosure provides a method for adding contact information, and user equipment. The method includes: receiving gesture information input by a user on a communication interface of an instant messaging application, recognizing contact information in communication information according to the gesture information, and adding the contact information to an address book of user equipment.
Huawei Technologies Co., Ltd.

Method and apparatus for password management

Systems, methods, and a security management apparatus for password management, including determination of the identity of a service requesting a security token for access to the service. The security management apparatus generates personal identification data based on a personal identification input such as a touch selection or gesture, in order to access a service on a secured device.
Huami Inc.

Motion and gesture-based mobile advertising activation

The presentation of advertisements to a user on a mobile communications device is disclosed. A first external input corresponding to a triggering of an advertisement delivery is received on a first input modality.
Adtile Technologies Inc.

Device and method for operating a device

A method for operating a device, wherein a graphical user interface is generated and displayed on a display area. The user interface has at least one operating object assigned to an application program for controlling the device.
Volkswagen Aktiengesellschaft

User interface input for handheld and mobile devices

Methods, systems, and computer readable media for receiving user input. According to one example method for receiving user input, the method includes identifying gestures from directional movements of a user's fingers on a touch sensor, mapping the gestures to alphanumeric characters, and outputting alphanumeric characters, including defining a set of the gestures as different sequences of at least two of: a contact event, a no contact event, a hold event, a finger movement in a first direction, a finger movement in a second direction.
The Trustees Of The University Of Pennsylvania
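Mapping event sequences to characters is a table lookup; the particular sequences and the letters they produce below are entirely invented (the abstract only fixes the event vocabulary: contact, no-contact, hold, and two movement directions):

```python
# Illustrative (invented) mapping from sequences of touch events to
# characters; "move_right"/"move_left" stand in for the first and
# second movement directions named in the abstract.
EVENT_MAP = {
    ("contact", "no_contact"): "a",
    ("contact", "hold", "no_contact"): "b",
    ("contact", "move_right", "no_contact"): "c",
    ("contact", "move_left", "no_contact"): "d",
}

def decode(events):
    """Map a recorded event sequence to its character, or None if unknown."""
    return EVENT_MAP.get(tuple(events))
```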

Interface scanning for disabled users

Systems and processes for scanning a user interface are disclosed. One process can include scanning multiple elements within a user interface by highlighting the elements.
Apple Inc.

Touch event model for web pages

One or more touch input signals can be obtained from a touch sensitive device. A touch event model can be used to determine touch and/or gesture events based on the touch input signals.
Apple Inc.

Method for determining display orientation and electronic apparatus using the same and computer readable recording medium

A method for determining display orientation is provided. The method includes the following steps.
Htc Corporation

User-defined virtual interaction space and manipulation of virtual cameras in the interaction space

The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3d) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3d sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures.
Leap Motion, Inc.

Route navigation method and system, terminal, and server

A route navigation method includes: at a first terminal in communication with a second terminal via a navigation server: obtaining a starting point and a destination that are set by a user; displaying, on a navigation interface, the starting point and the destination; drawing, on the navigation interface, route information from the starting point to the destination according to a swipe gesture of the user on the navigation interface; and sending a route forwarding instruction to the navigation server, wherein sending the route forwarding instruction includes: sending the route information to the navigation server, wherein the navigation server is configured to determine a navigation path from the starting point to the destination according to the route information, and sending a request for the navigation server to forward the navigation path to the second terminal to prompt another user to arrive at the destination from the starting point in accordance with the navigation path.
Tencent Technology (shenzhen) Company Limited

Audio-based device control

Some disclosed systems may include a microphone system having two or more microphones, an interface system and a control system. In some examples, the control system may be capable of receiving, via the interface system, audio data from two or more microphones of the microphone system, of determining a gesture location based, at least in part, on the audio data and of controlling one or more settings of the system based on the gesture location.
Qualcomm Incorporated

Mobile terminal and control method therefor

The present invention provides a mobile terminal and a control method using an intuitive gesture as an input to perform a function. Specifically, the present invention provides a mobile terminal including a sensing unit for sensing movement of the mobile terminal, a wireless communication unit for transmitting/receiving a radio signal to/from an external terminal and a control unit for sensing a distance from the mobile terminal to the external terminal based on the strength of the received radio signal, and performing a specific function when the sensed movement corresponds to specific movement and the sensed distance is within a specific range.
Lg Electronics Inc.
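A sketch of the distance-plus-movement trigger, estimating distance from signal strength with the standard log-distance path-loss model (the calibration constants, gesture name and 2-metre range are invented; the abstract only says distance is sensed from received signal strength):

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss estimate: distance in metres from received
    signal strength, where tx_power_dbm is the expected RSSI at 1 m and
    n is the path-loss exponent (2.0 in free space)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def should_trigger(movement, rssi_dbm, expected="shake", max_range_m=2.0):
    """Perform the function only when the sensed movement matches the
    specific movement AND the external terminal is within range."""
    return movement == expected and rssi_to_distance(rssi_dbm) <= max_range_m
```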

Electronic device and control method

An electronic device includes a non-contact detection sensor, a display, and a controller. The display displays a first window and a second window.
Kyocera Corporation

Method for device interaction and identification

A method is provided for device interaction and identification by detecting similar or synchronous movements of two or more electronic devices, using movement data from the involved devices to let them interact with each other and to detect when the movement or motion of the involved devices corresponds with certain multi-device gestures or activities.
16lab Inc
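One common way to decide that two devices "moved together" is to correlate their motion streams; this is a sketch under that assumption (the abstract does not name a specific similarity measure, and the 0.9 threshold is invented):

```python
def correlation(a, b):
    """Pearson correlation of two equal-length motion-magnitude streams."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def moved_together(stream_a, stream_b, threshold=0.9):
    """Two devices are deemed part of one multi-device gesture when their
    movement streams are strongly correlated."""
    return correlation(stream_a, stream_b) >= threshold
```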

Gesture controlled calculator

A gesture controlled calculator has a touch screen controlled by a microprocessor. The touch screen receives a multiplication problem input by a user through a virtual keyboard.

System for identifying and using multiple display devices

Data, particularly display data, is sent to a particular peripheral device, particularly a display device, from a computer device, such as a mobile device. The method involves determining an identifier of each peripheral device and receiving a user identification identifying a particular peripheral device.
Displaylink (uk) Limited

Systems and methods for shared broadcasting

Systems, methods, and non-transitory computer-readable media can provide an interface that includes a first region and a second region, wherein a live content stream being broadcasted is presented in the first region, and wherein information corresponding to users viewing the live content stream is presented in the second region. A determination is made that a first user operating the computing device has performed one or more touch screen gestures with respect to at least one user identifier in the second region, the user identifier corresponding to a second user.
Facebook, Inc.

Systems, devices, and methods for dynamically providing user interface controls at a touch-sensitive secondary display

Disclosed herein are systems, devices, and methods for dynamically updating a touch-sensitive secondary display. An example method includes receiving a request to open an application and, in response, (i) displaying, on a primary display, a plurality of user interface (UI) objects associated with the application, the plurality including a first UI object displayed with associated content and other UI objects displayed without associated content; and (ii) displaying, on the touch-sensitive secondary display, a set of affordances representing the plurality of UI objects.
Apple Inc.

Multi-touch uses, gestures, and implementation

A tablet pc having an interactive display, which is touchscreen enabled, may be enhanced to provide a user with superior usability and efficiency. A touchscreen device may be configured to receive multiple concurrent touchscreen contacts.
Microsoft Technology Licensing, Llc

Method and wearable device for providing a virtual input interface

Provided is a wearable device including: an image sensor configured to sense a gesture image of a user setting a user input region; and a display configured to provide a virtual input interface corresponding to the set user input region.
Samsung Electronics Co., Ltd.

Systems and methods for content-aware selection

Systems and methods detect simple user gestures to enable selection of portions of segmented content, such as text, displayed on a display. Gestures may include finger (such as thumb) flicks or swipes as well as flicks of the handheld device itself.
Fuji Xerox Co., Ltd.

Electronic device, control method, and non-transitory computer-readable recording medium

An electronic device includes a non-contact detection sensor and a controller that executes processing related to a timer in response to a gesture detected by the non-contact detection sensor.
Kyocera Corporation

Electronic device

An electronic device includes a controller that executes processing in response to a gesture. The controller starts the processing in response to a gesture in accordance with the physical state of the electronic device.
Kyocera Corporation

Human machine interface with haptic response based on phased array lidar

A device and a method for a human machine interface (HMI). The HMI device includes a dome having a hemispherical shape, a base attached to the dome forming an inverted cup-like structure with a hollow interior, a chip-scale lidar attached on the base and positioned to scan for a motion of an object external to the dome, at least one haptic device attached to the base and connected to the dome, and an HMI controller configured to send and receive signals from the chip-scale lidar, detect and recognize a gesture based on the signal from the chip-scale lidar, and activate or deactivate the at least one haptic device.
Toyota Motor Engineering & Manufacturing North America, Inc.

Information processing method, terminal, and computer storage medium

An information processing method, comprising: displaying, in a game user interface, a first game scene; displaying a skill selection object in the game user interface, the skill selection object includes a plurality of skill slots, and each slot includes a respective skill of a plurality of skills, wherein a total number of skill slots in the skill selection object is smaller than a total number of skills in the plurality of skills; while displaying the skill selection object, detecting a swipe gesture across a predefined region; in response: ceasing to display at least one of the skills currently displayed in the skill selection object, and replacing the at least one skill with at least another skill among the plurality of skills that was not displayed in the skill selection object when the swipe gesture was detected.
Tencent Technology (shenzhen) Company Limited

Gestural control of visual projectors

Gestures may be performed to control a visual projector. When a device or human hand is placed into a projection field of the visual projector, the visual projector responds to gestures.
At&t Intellectual Property I, L.p.

Cyber reality device including gaming based on a plurality of musical programs

A system enabling the performance of sensory stimulating content including music and video using gaming in a cyber reality environment, such as using a virtual reality headset. This disclosure includes a system and method through which a performer can virtually trigger and control a presentation of pre-packaged sensory stimulating content including musical programs through gaming.
Beamz Interactive, Inc.

Systems and methods for advertising on virtual keyboards

Methods and systems are disclosed for interacting with advertisements on a virtual keyboard. An advertisement appears in a position that is proximate to a virtual key of the virtual keyboard.
Oversignal, Llc

Method and system for gesture-based confirmation of electronic transactions

A method for electronically transmitting data based on a physical gesture includes: storing at least one gesture pair, wherein each gesture pair includes at least a physical gesture and an associated data conveyance, the physical gesture being stored as one or more data points telegraphing three-dimensional movement; capturing a plurality of movement data points based on movement of one or more motion capturing devices; identifying a specific gesture pair where at least one of the captured plurality of movement data points corresponds to the included physical gesture; establishing a communication channel with an external computing device; and transmitting the associated data conveyance included in the identified specific gesture pair to the external computing device using the established communication channel.
Mastercard International Incorporated
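A minimal sketch of the gesture-pair lookup, assuming gestures are stored as equal-length sequences of 3-D points and matched by mean pointwise distance (the matching rule, tolerance and the example conveyance names are invented):

```python
def mean_distance(a, b):
    """Mean pointwise Euclidean distance between two equal-length
    sequences of 3-D points."""
    return sum(
        sum((p - q) ** 2 for p, q in zip(pa, pb)) ** 0.5 for pa, pb in zip(a, b)
    ) / len(a)

def match_conveyance(captured, gesture_pairs, tol=0.2):
    """Find the stored gesture pair whose physical gesture matches the
    captured movement points and return its associated data conveyance."""
    for gesture_points, conveyance in gesture_pairs:
        if len(gesture_points) == len(captured) and mean_distance(
            captured, gesture_points
        ) <= tol:
            return conveyance
    return None
```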

Systems and methods for displaying an image capturing mode and a content viewing mode

Embodiments are also provided for displaying an image capturing mode and a content viewing mode. In some embodiments, one or more live images may be received from an image capturing component on a mobile device.
Dropbox, Inc.

Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same

The present disclosure describes projecting a structured light pattern projected onto a surface and detecting and responding to interactions with the same. The techniques described here can, in some cases, facilitate recognizing that an object such as a user's hand is adjacent the plane of a projection surface and can distinguish the object from the projection surface itself.
Heptagon Micro Optics Pte. Ltd.

Information processing method, terminal, and computer storage medium

An information processing method includes: performing rendering on a graphical user interface to obtain at least one virtual resource object; when detecting a skill-release trigger gesture on at least one skill object, performing rendering to obtain a skill-release supplementary control object, having a skill-release control halo object and a virtual joystick object; when detecting a dragging operation on the virtual joystick object, controlling a skill-release location of the skill object to be correspondingly adjusted; and determining whether the virtual joystick object is out of a threshold range and, when the virtual joystick object is not out of the threshold range, selecting a target character object satisfying a first preset policy from at least one character object within a skill releasable range of the skill object according to a detected release operation of the dragging operation, and performing a skill-release operation on the target character object.
Tencent Technology (shenzhen) Company Limited

Gesture based interface system and method

A user interface apparatus for controlling any kind of a device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures.
Eyesight Mobile Technologies Ltd.

Automated learning and gesture based interest processing

A system, method and program product for processing user interests. A system is provided that includes: a gesture management system that receives gesture data from a collection device for an inputted interest of a user; a pattern detection system that receives and analyzes behavior data associated with the inputted interest; an interest affinity scoring system that calculates an affinity score for the inputted interest based on the gesture data and an analysis of the behavior data; a dynamic classification system that assigns a dynamically generated tag to the inputted interest based on an inputted context associated with the inputted interest; and a user interest database that stores structured interest information for the user, including a unique record for the inputted interest that includes the affinity score and dynamically generated tag.
International Business Machines Corporation

Smart playable flying disc and methods

This disclosure is directed to a smart playable flying disc and methods for determining associated motion parameters. The playable device may capture and transmit sensor data including motion data to a computing device.
Play Impossible Corporation

Motorized shoe with gesture control

An article of footwear includes a motorized tensioning system, sensors, and a gesture control system. Based on information received from one or more sensors, the gesture control system may detect a prompting gesture and enter an armed mode for receiving further instructions.
Nike, Inc.

Smart lighting system and method

A smart lighting system for a vehicle is described. The smart lighting system includes at least one signal generator, and a lighting stem assembly.

Electronic device with gesture actuation of companion devices, and corresponding systems and methods

An electronic device includes a biometric sensor, such as a fingerprint sensor, to identify biometric input. One or more processors are then operable to identify at least one paired device and at least one companion device operating within a wireless communication radius.
Motorola Mobility Llc

Capturing smart playable device and gestures

This disclosure is generally directed to capturing aspects of a smart playable device. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Information processing device

An information processing device includes a communication interface that is communicable with an image forming device, a camera interface configured to receive image data generated by a camera, and a controller. The controller is configured to, based on the image data received through the camera interface, determine whether or not a predetermined gesture is performed by a user during a predetermined period of time after controlling the communication interface to transmit a print command to the image forming device.
Kabushiki Kaisha Toshiba

Smart playable device and charging systems and methods

This disclosure is generally directed to a smart playable device and charging systems and methods. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Combining gesture and voice user interfaces

A system includes a microphone providing input to a voice user interface (VUI), a motion sensor providing input to a gesture-based user interface (GBI), an audio output device, and a processor in communication with the VUI, the GBI, and the audio output device. The processor detects a predetermined gesture input to the GBI, and in response to the detection, decreases the volume of audio being output by the audio output device and activates the VUI to listen for a command.
Bose Corporation
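The gesture-then-listen handoff described above can be sketched as a small state holder; the class, the "raise_hand" gesture name and the volume levels are all invented for illustration:

```python
class GestureVoiceBridge:
    """On a predetermined gesture, duck the audio output and activate the
    voice UI to listen for a command (per the abstract; names invented)."""

    def __init__(self, volume=0.8, duck_to=0.2):
        self.volume = volume
        self.duck_to = duck_to
        self.vui_listening = False

    def on_gesture(self, gesture):
        if gesture == "raise_hand":  # the predetermined gesture
            self._previous_volume = self.volume
            self.volume = min(self.volume, self.duck_to)  # duck playback
            self.vui_listening = True                     # start listening

    def on_command_done(self):
        """Restore playback once the voice command has been handled."""
        self.volume = self._previous_volume
        self.vui_listening = False
```

Ducking before listening serves two purposes: the user hears that the device is ready, and the microphone is no longer fighting the device's own audio output.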

Synthetic data generation of time series data

A method of generating synthetic data from time series data, such as from handwritten characters, words, sentences, mathematics, and sketches that are drawn with a stylus on an interactive display or with a finger on a touch device. This computationally efficient method is able to generate realistic variations of a given sample.
University Of Central Florida Research Foundation, Inc.

Electronic device with gesture actuation of companion devices, and corresponding systems and methods

An electronic device includes a biometric sensor, such as a fingerprint sensor, that identifies biometric input received at the biometric sensor. One or more processors operable with the biometric sensor identify one or more companion devices operating within a wireless communication radius of the electronic device.
Motorola Mobility Llc

Incremental multi-word recognition

In one example, a computing device includes at least one processor that is operatively coupled to a presence-sensitive display and a gesture module operable by the at least one processor. The gesture module may be operable by the at least one processor to output, for display at the presence-sensitive display, a graphical keyboard comprising a plurality of keys and receive an indication of a continuous gesture detected at the presence-sensitive display, the continuous gesture to select a group of keys of the plurality of keys.
Google Inc.

Pressure-based gesture typing for a graphical keyboard

A computing device is described that outputs, for display, a graphical keyboard comprising a plurality of keys. The computing device receives an indication of a first gesture selecting a first sequence of one or more keys from the plurality of keys, and an indication of a second gesture selecting a second sequence of one or more keys from the plurality of keys.
Google Inc.

Interactive content for digital books

A graphical user interface (gui) is presented that allows a user to view and interact with content embedded in a digital book, such as text, image galleries, multimedia presentations, video, html, animated and static diagrams, charts, tables, visual dictionaries, review questions, and three-dimensional (3d) animation. Various touch gestures can be used to move through images and multimedia presentations, play video, answer review questions, manipulate three-dimensional objects, and interact with html.
Apple Inc.

System and method for touch/gesture-based device control

A system and method for document processing includes a three-dimensional touch interface, a processor and associated memory. The processor generates thumbnail image data from received electronic document data and document format data corresponding to the electronic document data and the thumbnail image data.
Toshiba Tec Kabushiki Kaisha

Automated door

An automated door-opening device is disclosed that includes a first sensor disposed on the outside of the door. The first sensor is adapted to recognize a predetermined pattern of a gesture made by a patron.

Gesture based input system in a vehicle with haptic feedback

A vehicle haptic feedback system includes a haptic actuator, a detection device, and a controller. The haptic actuator is configured to provide haptic feedback to a vehicle driver.
Immersion Corporation

Smart playable device, gestures, and user interfaces

This disclosure is generally directed to a smart playable device and systems and methods of interacting with the playable device. The playable device can include any device that is suitable for sports, games, or play, such as balls, discs, staffs, clubs, and the like.
Play Impossible Corporation

Motor-activated multi-functional wrist orthotic

A multifunctional wrist orthotic comprising an electromyography (emg) sensor having at least two electrodes for attachment to a wrist of a user, an inertial measurement unit (imu), a microcontroller unit (e.g., an Arduino® Mini) connected to the imu, and a power supply unit. The microcontroller unit is configured to perform two-tiered gesture recognition, with the first tier comprising a fine gesture sensed by the emg sensor and the second tier comprising a gross gesture sensed by the imu.
Purdue Research Foundation
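The two-tiered scheme in the abstract above can be sketched as combining a gross gesture from the imu with a fine gesture from the emg sensor. The thresholds, units, and gesture labels below are assumptions for illustration, not values from the patent.

```python
# Minimal sketch of two-tiered gesture recognition: a gross wrist
# orientation from the IMU combined with fine muscle activity from the
# EMG sensor. Thresholds and labels are hypothetical.

def classify_gross(imu_accel_z):
    """Gross tier: coarse wrist orientation from IMU z-acceleration (g units)."""
    return "palm_up" if imu_accel_z > 0.5 else "palm_down"

def classify_fine(emg_rms):
    """Fine tier: muscle activity from EMG RMS amplitude (assumed mV scale)."""
    return "grip" if emg_rms > 0.3 else "relax"

def recognize(emg_rms, imu_accel_z):
    """Combine both tiers into a single compound gesture label."""
    return f"{classify_gross(imu_accel_z)}/{classify_fine(emg_rms)}"
```

In practice each tier would be a trained classifier over windowed sensor features rather than a single threshold; the sketch only shows how the two tiers compose.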

Lighting commanding method and an asymmetrical gesture decoding device to command a lighting apparatus

A gesture-based lighting command method for controlling a lighting apparatus, together with an asymmetrical, non-facing gesture decoding device for commanding a lighting source. The device decodes translational gesture motions of a heat-emitting object, e.g., a human hand, at distances of up to 1.5 meters, in geometrical planes that need not be coplanar with the plane of the decoding device. It comprises a casing with an embedded electronic controller and two dual pir sensors, geometrically arranged in a special manner and rotated 2α degrees relative to one another, fitted with specially designed fresnel lenses, their beam axes forming an angle of 2γ degrees with each other and an angle β with the vertical.
Cwj Power Electronics Inc.

Methods and systems to perform at least one action according to a user's gesture and identity

The present invention discloses methods and systems for performing at least one action at a system according to a user's gesture information. The required steps comprise: capturing the user's gesture information with a mobile apparatus, wherein the apparatus comprises an antenna, a processor, a storage medium, and at least one accelerometer having at least three axes; comparing the gesture information against one or more predefined gestures at the mobile apparatus; and, when the gesture information matches a predefined gesture, selecting a first identity based on that predefined gesture and sending encrypted information to a system through a reader, wherein the encrypted information comprises the predefined gesture information, the first identity, a timestamp, and a device identity.
Pismo Labs Technology Ltd

Fire alarm inspection application and user interface

A system and method for facilitating inspection of fire alarm systems includes a graphical user interface rendered on a touchscreen display of a mobile computing device receiving selections of inspection results. The graphical user interface includes a testing pane, which indicates devices that are currently being tested, and a selection pane, which indicates devices yet to be tested.
Tyco Fire & Security Gmbh

Enhancing video chatting

A method for a computing device to enhance video chatting includes receiving a live video stream, processing a frame in the live video stream in real-time, and transmitting the frame to another computing device. Processing the frame in real-time includes detecting a face, an upper torso, or a gesture in the frame, and applying a visual effect to the frame.
Arcsoft Inc.

Biometric, behavioral-metric, knowledge-metric, and electronic-metric directed authentication and transaction method and system

A system to authenticate an entity and/or select details relative to an action or a financial account using biometric, behavior-metric, electronic-metric and/or knowledge-metric inputs. These inputs may comprise gestures, facial expressions, body movements, voice prints, sound excerpts, etc.
Nxt-id, Inc.

Floating soft trigger for touch displays on electronic device

A portable electronic device (100) having a touch screen (112) with a floating soft trigger icon (175) for enabling various functions of the electronic device (100), such as bar code reading, capturing rfid data, capturing video and images, calling applications, and/or placing phone calls. The floating trigger icon (175) is displayed on the touch screen (112) to enable easy identification and access of the trigger icon (175).
Datalogic Usa, Inc.

Systems and methods for adaptive gesture recognition

Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors.
Sling Media Inc.
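The adaptive approach in the abstract above (try the current gesture parameters first, then perform additional processing with other factors) can be sketched as a strict-then-relaxed matcher. The swipe detector, its parameter names, and its thresholds are illustrative assumptions.

```python
# Sketch of adaptive gesture recognition: attempt a match with strict
# parameters first, then fall back to relaxed parameters. Thresholds in
# pixels are hypothetical.

STRICT = {"min_distance": 50, "max_off_axis": 10}
RELAXED = {"min_distance": 30, "max_off_axis": 25}

def match_swipe(dx, dy, params):
    """A horizontal swipe: long enough along x, small enough drift along y."""
    return abs(dx) >= params["min_distance"] and abs(dy) <= params["max_off_axis"]

def recognize_swipe(dx, dy):
    """Try strict parameters; if no match, retry with relaxed parameters."""
    if match_swipe(dx, dy, STRICT):
        return "swipe (strict)"
    if match_swipe(dx, dy, RELAXED):
        return "swipe (relaxed)"
    return None
```

An adaptive system might also update the strict thresholds toward a user's observed movement statistics over time; the sketch shows only the two-pass matching.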

Mechanism to create pattern gesture transmissions to create device-sourcing emergency information

A system may comprise a registration device configured to register patterns for users; a recording device configured to record a received pattern, as an electronic pattern, wherein the recording device recognizes the received pattern as one of the registered patterns; a receiving device configured to observe human movement patterns with a camera, transform the observed human movement patterns to an electronic signal, and receive the recognized registered pattern from the recording device by a first wireless transmission; a forwarding device configured to transmit the electronic signal and the received recognized registered pattern to an alert service by a second wireless transmission; and an alert service configured to receive the electronic signal and the received recognized registered pattern from the forwarding device and to transmit them to a second electronic device by a third wireless transmission.
International Business Machines Corporation

Dynamic user interactions for display control

The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures.
Leap Motion, Inc.

Gesture-based user interface

A computer-implemented method for enabling gesture-based interactions between a computer program and a user is disclosed. According to certain embodiments, the method may include initiating the computer program.
Capital One Services, Llc

User interface device, vehicle including the same, and method of controlling the vehicle

A user interface device includes: an output device having an output region predefined around an output unit; an acquisition unit acquiring information about a user's gesture performed around the output region; and a controller determining an area of a shielded region which is shielded by the user's gesture in the output region based on the acquired information and controlling output of the output device.
Hyundai Motor Company

Multimodal haptic effects

Embodiments generate haptic effects in response to a user input (e.g., pressure based or other gesture). Embodiments receive a first input range corresponding to user input and receive a haptic profile corresponding to the first input range.
Immersion Corporation

Method and system for smart home control based on smart watches

A method for controlling a smart home using a smart watch is disclosed. The method includes: detecting whether the smart watch has entered a sensing range of the smart home; detecting, after the smart watch has entered the sensing range of the smart home, whether the smart watch has established a wireless connection with the smart home; turning on, after the smart watch has established the wireless connection with the smart home, a smart-home-control function of the smart watch; and, while controlling the smart home using the smart-home-control function, recognizing hand gestures of the user using the smart watch and controlling the smart home through the wireless connection to switch the current working state of the smart home based on the recognized hand gestures.
Huizhou Tcl Mobile Communication Co.,ltd



Gesture topics:
  • Virtual Keyboard
  • Touchscreen
  • Electronic Device
  • User Interface
  • Characters
  • Display Panel
  • Touch Screen
  • Output Device
  • Input Device
  • Computing Device
  • Device Control
  • Computer Vision
  • Mobile Terminal
  • Ball Mouse
  • Navigation



This listing is a sample of recent patent applications related to Gesture and is not meant as a comprehensive history. There may be associated servicemarks and trademarks related to these patents. Please check with a patent attorney if you need further assistance or plan to use them for business purposes. This patent data is also published by the USPTO and available for free on their website. Note that there may be alternative spellings for Gesture with additional patents listed. Browse our RSS directory or search for other possible listings.

