

Gesture patents

      

This page is updated frequently with new Gesture-related patent applications.




Touch gesture control of video playback
A method of touch gesture control of video playback is disclosed. The method includes providing a video item for playback. The method also receives an indication of a touch gesture made by a user of a mobile user device.
Google Inc.


System and method for preserving video clips from a handheld device
A system and method for recording video that combines video capture, touch-screen and voice-control technologies into an integrated system that produces cleanly edited, short-duration, compliant video files that exactly capture a moment after it has actually occurred. The present invention maintains the device in a ready state that is always ready to capture video up to n seconds or minutes in the past (where n depends on available system memory). This enables the user to run the system indefinitely without having to worry about running out of storage.

Realtime capture exposure adjust gestures
Disclosed herein are systems, devices, methods, and non-transitory computer-readable storage media for enabling semi-manual media capture. Semi-manual media capture can involve calculating optimal exposure settings in an auto-exposure loop, displaying a scene with optimal exposure settings in real time, receiving a manual adjust gesture, and adjusting the scene, in real time, based on the manual adjust gesture.
Apple Inc.


Methods and systems for showing perspective in market data
Methods and systems are disclosed for providing perspective to market data. In one embodiment, market data is displayed in a three-dimensional perspective that emphasizes important aspects of market data while minimizing less important aspects.
Geneva Technologies, Llc


Executing a default action on a touchscreen device
A computer-implemented method for executing a default action on a touchscreen device is provided. The method includes receiving a touch input from a user on a touchscreen device and determining a context associated with the touch input.
Google Inc.


Continuous circle gesture detection for a sensor system
A method for detecting a continuous circle gesture has the following steps: receiving vectors representative of an object movement from an object detection unit; determining from the received vectors a sequence of velocity vectors or an approximation thereof; estimating an angle between subsequent velocity vectors; and determining a rotation direction.
Microchip Technology Incorporated
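
The circle-gesture entry above describes a concrete pipeline: position samples become velocity vectors, the angle between successive velocity vectors is estimated, and the accumulated angle gives the rotation direction. The following Python sketch only illustrates that idea; the finite-difference velocities and the half-turn threshold are assumptions, not Microchip's implementation.

```python
# Hedged sketch: positions -> velocity vectors -> signed angles between
# successive velocities -> rotation direction from the accumulated angle.
import math
from typing import List, Tuple

def circle_rotation(positions: List[Tuple[float, float]]) -> str:
    """Return 'cw', 'ccw', or 'none' for a stream of 2-D position samples."""
    # Velocity vectors: finite differences of successive positions.
    velocities = [(x2 - x1, y2 - y1)
                  for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
    total_angle = 0.0
    for (vx1, vy1), (vx2, vy2) in zip(velocities, velocities[1:]):
        # Signed angle between subsequent velocity vectors via atan2 of
        # the cross product (rotation sense) and dot product (alignment).
        cross = vx1 * vy2 - vy1 * vx2
        dot = vx1 * vx2 + vy1 * vy2
        total_angle += math.atan2(cross, dot)
    if abs(total_angle) < math.pi:      # assumed threshold: about half a turn
        return "none"
    return "ccw" if total_angle > 0 else "cw"

# Example: samples on a counter-clockwise circle.
samples = [(math.cos(2 * math.pi * k / 16), math.sin(2 * math.pi * k / 16))
           for k in range(17)]
print(circle_rotation(samples))  # -> 'ccw'
```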


Method and gesture-based searching
A content searching technique includes sensing at least three types of input gestures from a user, each of the at least three input gestures representative of a respective one of a search term, a class of assets to be searched, and a location to be searched. The sensed input gestures are translated into the search term, the class of assets to be searched, and the location to be searched.
Thomson Licensing


Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments
The technology disclosed relates to a method of realistic rotation of a virtual object for an interaction between a control object in a three-dimensional (3d) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3d) sensory space and generating for display a 3d solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a two sub-component free-form gesture of the control object in the 3d sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3d solid control object model.
Leap Motion, Inc.


User interface, a means of transportation and a method for classifying a user gesture performed freely in space
A user interface, a computer program product, a signal sequence, a means of transportation and a method for classifying a user gesture performed freely in space. A first gesture and/or a second gesture may be detected by a sensor.
Volkswagen Aktiengesellschaft


Method of providing handwriting style correction function and electronic device adapted thereto
A method of providing a handwriting style correction function and an electronic device adapted to the method are provided. The electronic device includes: a touch screen; a processor electrically connected to the touch screen; and a memory electrically connected to the processor.
Samsung Electronics Co., Ltd.


Vehicle operating system using motion capture

Vehicle operating systems for operating a vehicle having a driving seat for a vehicle driver and at least one passenger seat for passengers are described. The vehicle operating system may include one or more camera devices for shooting images of hand actions of the driver or images of hand actions of a passenger, and a storage device for storing operating signals corresponding to hand actions.
Thunder Power Hong Kong Ltd.

Three-dimensional gesture sensing method and touch sensing device using the same

Disclosed is a three-dimensional gesture sensing method. The three-dimensional gesture sensing method comprises the following steps.
Pixart Imaging Inc.

System for gaze interaction

A method and system for assisting a user when interacting with a graphical user interface combines gaze based input with gesture based user commands. A user of a computer system without a traditional touch-screen can interact with graphical user interfaces in a touch-screen like manner using a combination of gaze based input and gesture based user commands.
Tobii Ab

Control of an aerial drone using recognized gestures

A method, system, and/or computer program product controls movement and adjusts operations of an aerial drone. A drone camera observes an aerial maneuver physical gesture by a user.
International Business Machines Corporation

Method and system for real-time positioning of smart device, and determining the motion gesture of mobile phone

The present invention discloses a method for real-time positioning of a smart device, comprising: getting tri-axial acceleration sequences truncated in turn by using a preset time period in the world coordinate system individually; acquiring a dominant frequency fstep by applying a fast Fourier transform (FFT) to the truncated z-axis acceleration sequence; using a band-pass filter with a pass band [fstep−0.5 Hz, fstep+0.5 Hz] to filter the truncated tri-axial acceleration sequences individually; comparing the time-domain waveform from the filtered z-axis acceleration sequence with a preset step threshold, and counting each peak above the threshold as a step to acquire the user's step number in the present time period; determining the moving direction of the smart device in the time period by calculating the filtered x-axis and y-axis acceleration sequences; and determining the current position of the smart device by calculating the user's step number and the moving direction of the smart device.
Guangzhou Hkust Fok Ying Tung Research Institute
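
The step-counting stage of the entry above is spelled out in enough detail to sketch: an FFT yields the dominant frequency fstep of the z-axis acceleration, a band-pass filter of [fstep−0.5 Hz, fstep+0.5 Hz] is applied, and peaks above a step threshold are counted. A rough Python/SciPy sketch under assumed values for the sampling rate, filter order and threshold (none of which come from the abstract):

```python
# Hedged sketch of the step-counting stage only: dominant frequency via
# FFT, band-pass around it, then peak counting against a step threshold.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_steps(acc_z: np.ndarray, fs: float = 50.0,
                step_threshold: float = 1.0) -> int:
    """acc_z: z-axis (vertical) acceleration in the world frame."""
    # Dominant frequency f_step from the FFT magnitude (ignore the DC bin).
    spectrum = np.abs(np.fft.rfft(acc_z - acc_z.mean()))
    freqs = np.fft.rfftfreq(len(acc_z), d=1.0 / fs)
    f_step = freqs[1:][np.argmax(spectrum[1:])]
    # Band-pass [f_step - 0.5 Hz, f_step + 0.5 Hz], as in the abstract.
    low = max(f_step - 0.5, 0.1)
    high = f_step + 0.5
    b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, acc_z)
    # Each peak above the threshold counts as one step.
    peaks, _ = find_peaks(filtered, height=step_threshold)
    return len(peaks)
```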

Vehicle control system

A vehicle control system includes a vehicle control apparatus mounted on a vehicle, a first portable device (electronic key) performing communication with the vehicle control apparatus, and a second portable device (smartphone) performing communication with the first portable device. The second portable device transmits, based on a predetermined operation by a user, a notification signal notifying about the operation to the first portable device.
Omron Automotive Electronics Co., Ltd.

Implantable remote control

The present application discloses systems, methods, and articles of manufacture for controlling one or more functions of a device utilizing one or more tags. In one example, a method for controlling one or more functions of a medical device includes scanning a data interface of the medical device for signals induced wirelessly by one or more gestures made with one or more tags associated with a recipient of the medical device and controlling one or more functions of the medical device based on the wirelessly induced signals..

Method and system for detecting an input to a device

System and method for detecting an input for an apparatus inside a vehicle, wherein at least one moving user device generates a changing electromagnetic field. A detection unit detects at least one field parameter of the electromagnetic field, wherein the detected field parameter is a function of the geometric arrangement of the user device relative to the detection unit.
Volkswagen Aktiengesellschaft

Controlling electronic devices based on wireless ranging

A wireless communication device may wirelessly control an object, such as a physical device, directly or through interaction with a virtual representation (or placeholder) of the object situated at a predefined physical location. In particular, the wireless communication device may identify an intent gesture performed by a user that indicates intent to control the object.
Apple Inc.

Methods and systems for a gesture-controlled lottery terminal

A method including providing a lottery terminal that includes a graphical user interface and a motion capture device to facilitate a user to play in a lottery. The method further includes displaying, on the graphical user interface, a first image with content that includes drawing lottery tickets, instant lottery tickets, lottery games, dynamically-generated animations, and advertisements and detecting, by the motion capture device, a gesture of the user in a three dimensional space surrounding the lottery terminal.
Intralot S.a. - Integrated Lottery Systems And Services

Depth-based feature systems for classification applications

Human computer interfaces (hci) may allow a user to interact with a computer via a variety of mechanisms, such as hand, head, and body gestures. Various of the disclosed embodiments allow information captured from a depth camera on an hci system to be used to recognize such gestures.
Youspace, Inc

Hybrid mobile interactions for native apps and web apps

There is disclosed a system, including apparatus, methods and computer programs, for running native software applications (apps) and html5 web-based apps on a computing device, particularly a mobile computing device, in a multitasking mode of operation. In one embodiment, touch screen displays having one or more browsers are adapted to run one or more html5 apps, and receive input from hand gestures.
Intel Corporation

System and redirection and processing of audio and video data based on gesture recognition

An information handling system includes a redirection module and an audio/video module. The redirection module receives an audio data frame and an image data frame, analyzes the image data frame for a trigger event.
Dell Products, Lp

Smartpad split screen desktop

A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad.
Z124

System and providing visual feedback related to cross device gestures

An electronic device includes a touch sensitive display and a processor in communication with the touch sensitive display. The device detects that a second electronic device is in a defined position proximate the electronic device and detects a touch gesture including a swipe tracing a path across the touch sensitive display between a first position proximate the second electronic device and a second position away from the second electronic device.
Nanoport Technology Inc.

User gesture for controlling user output in content display system

There is disclosed a computer device having a display for displaying to a user at least one content item, wherein the display is provided on a user interface which is configured to detect a user input in the form of a swipe action over the displayed content item, wherein an audio output is responsive to the direction of swipe such that the volume of the audio output varies depending on the direction of swipe by a user.. .
Piksel, Inc.

Presentation of a control interface on a touch-enabled device based on a motion or absence thereof

Techniques are described herein that are capable of causing a control interface to be presented on a touch-enabled device based on a motion or absence thereof. A motion, such as a hover gesture, can be detected and the control interface presented in response to the detection.
Microsoft Technology Licensing, Llc

Systems and methods for a virtual reality editor

A system includes processors, a head mounted display, a hand-tracking input device, and an editor engine. The editor engine performs operations including identifying a set of virtual assets, each virtual asset includes data associated with a 3d object, creating a card tray within a virtual environment, creating one or more virtual cards within the card tray, the one or more virtual cards including a first virtual card, the first virtual card is configured with a first interaction mechanic including a triggering gesture and an associated gesture response, the triggering gesture allows the user to interact with the first virtual card within the virtual environment by performing the triggering gesture, detecting performance of the triggering gesture by the user on the first virtual card using the first hand-tracking input device, and based on detecting performance of the triggering gesture, performing the gesture response associated with the first interaction mechanic..
Unity Ipr Aps

Methods and systems for performing medical procedures and for accessing and/or manipulating medically relevant information

A system permits a medical practitioner to interact with medically relevant information during a medical procedure. The system comprises: a projector for projecting a user interface menu image onto a projection surface; a three-dimensional optical imaging system for capturing three-dimensional location information for objects in a sensing volume which includes the projection surface; and a controller connected to receive the three-dimensional location information from the three-dimensional optical imaging system and configured to interpret one or more gestures made by the practitioner in a space between the three-dimensional optical imaging system and the projection surface based on a location of the gesture relative to the projected user interface menu image.
The University Of British Columbia

Compound gesture-speech commands

A multimedia entertainment system combines both gestures and voice commands to provide an enhanced control scheme. A user's body position or motion may be recognized as a gesture, and may be used to provide context to recognize user generated sounds, such as speech input.
Microsoft Technology Licensing, Llc

System and controlling a piece of equipment of a motor vehicle

The system is characterised in that, the vehicle comprising a head-up-type display (34), said system comprises a gesture recognition device (26) that can supply data representative of a gesture made by the driver of the vehicle, and a control module (28) for processing the data, the control module (28) being designed to control the display of at least one visual feedback (30, 32) on the display upon reception of said data representative of the gesture made by the driver.
Valeo Comfort And Driving Assistance

Method and providing interactive content

Ways to provide interactive media content are described. Interactive displays (110) may include a sensing element (130) that is able to sense user movements in order to collect motion data.
Thomson Licensing

Methods, systems, and products for gesture-activation

Methods, systems, and products recognize a gesture to control a device or appliance. A camera generates an image of a user performing the gesture.
At&t Intellectual Property I, L.p.

Method for controlling electronic equipment and wearable device

A method for controlling an electronic equipment and a wearable device are provided, respectively. The method for controlling the electronic equipment includes the following steps.
Industrial Technology Research Institute

Tremor correction for gesture recognition

Motion detection computing devices may have difficulty determining precise motions of a user who suffers from unintended movement, such as tremors, associated with a physical or medical condition. Aspects described herein relate to motion compensation for detected motion input, e.g., 3d motion, from such users.
Comcast Cable Communications, Llc

Electronic apparatus with touch input determining mechanism

An electronic apparatus comprising an environment sensing device, a display, a touch sensing device, and a processing module comprising a main controller and a sub-controller is disclosed. The main controller performs a predetermined function based on a touch input.
Htc Corporation

Floor estimation for human computer interfaces

Human computer interfaces (hci) may allow a user to interact with a computer via a variety of mechanisms, such as hand, head, and body gestures. Various of the disclosed embodiments allow information captured from a depth camera on an hci system to be used to recognize such gestures.
Youspace, Inc

Driving support device, driving support system, and driving support method

In a driving support device, an image output unit outputs an image including a vehicle object representing a vehicle and a peripheral situation of the vehicle, to a display unit. An operation signal input unit receives a gesture operation by a user that involves moving of the vehicle object in the image displayed on the display unit.
Panasonic Intellectual Property Management Co., Ltd.

Wearable device controlled vehicle systems

An example of a system to control an in-vehicle system includes a wearable device, a vehicle communications platform operatively disposed in a vehicle, a control module in communication with the vehicle communications platform, and the in-vehicle system. The wearable device is for recognizing a hand gesture.
General Motors Llc

Sensor-based action control for mobile wireless telecommunication computing devices

The present disclosure describes systems and methods in which one or more existing onboard sensors on a mobile wireless telecommunication computing device, such as a smartphone, are recruited to detect a condition in which an action should be initiated, and the mobile wireless telecommunication computing device uses a wireless signal to identify the desired action. For example, a magnetometer on a smartphone may be used to detect the presence of a nearby magnet as a condition in which an action should be initiated, or an accelerometer and gyroscope on a smartphone may be used to detect a gesture as a condition in which an action should be initiated.
Overair Proximity Technologies Ltd.

Facial gesture recognition and video analysis tool

Embodiments disclosed herein may be directed to a video communication server. In some embodiments, the video communication server includes: at least one memory including instructions; and at least one processing device configured for executing the instructions, wherein the instructions cause the at least one processing device to perform the operations of: determining a time duration of a video communication connection between a first user of a first user device and a second user of a second user device; analyzing video content transmitted between the first user device and the second user device; determining at least one gesture of at least one of the first user and the second user based on analyzing the video content; and generating a compatibility score of the first user and the second user based at least in part on the determined time duration and the at least one determined gesture..
Krush Technologies, Llc

Power consumption in motion-capture systems with audio and optical signals

The technology disclosed provides systems and methods for reducing the overall power consumption of an optical motion-capture system without compromising the quality of motion capture and tracking. In implementations, this is accomplished by operating the motion-detecting cameras and associated image-processing hardware in a low-power mode (e.g., at a low frame rate or in a standby or sleep mode) unless and until touch gestures of an object such as a tap, sequence of taps, or swiping motions are performed with a surface proximate to the cameras.
Leap Motion, Inc.

Enhanced field of view to augment three-dimensional (3d) sensory space for free-space gesture interpretation

The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3d) sensory space of the gesture recognition system. The augmented 3d sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted.
Leap Motion, Inc.

Method and preventing screen off during automatic response system service in electronic device

A method of avoiding screen off during an automatic response system (ars) service is provided. The method includes enabling a proximity sensor in a call connection, detecting one of a first gesture and a second gesture during a call, and upon detecting the first gesture, disabling the proximity sensor..
Samsung Electronics Co., Ltd.

Gesture detection to pair two wearable devices and perform an action between them and a wearable device, a method and a system using heat as a means for communication

The present disclosure relates to devices and methods for initiating execution of actions and for communicating information to a user, and more particularly, to initiating execution of predefined actions in wearable devices and communication devices based on gestures made with the wearable devices and/or heat applied to a surface of the wearable devices. According to an aspect, the method relates to, in a first wearable device: detecting a first gesture of the first wearable device, the first gesture being predefined in the first wearable device; broadcasting a first signal comprising information associated with the first gesture; receiving, from a second wearable device, a second signal comprising information associated with a second gesture; and initiating execution of a first action, predefined in the first wearable device, based on the first signal and the second signal.
Sony Corporation

Order entry actions

Various embodiments disclosed herein relate to order entry. In the electronic trading process, order entry involves setting one or more order entry parameters, sending one or more order entry parameters, or both setting and sending one or more order entry parameters.
Trading Technologies International, Inc.

Gesture classification apparatus and method using emg signal

A gesture classification apparatus and method is disclosed. The apparatus may include a feature extractor configured to extract a plurality of features using an electromyogram (emg) data group obtained from an emg signal sensor including a plurality of channels, an artificial neural network including an input layer to which the emg data group corresponding to the plurality of features is input and an output layer configured to output a preset gesture corresponding to the plurality of features, and a gesture recognizer configured to recognize a gesture performed by a user and corresponding to the extracted features.
Korea Advanced Institute Of Science And Technology
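
The entry above describes a three-stage pipeline: per-channel features extracted from a multi-channel emg data group, a neural network that maps features to a gesture, and a recognizer that applies it. A minimal Python sketch of that shape; the specific features (mean absolute value, RMS, zero crossings) and the scikit-learn MLP are stand-ins, not the apparatus's actual choices.

```python
# Hedged sketch: multi-channel EMG window -> feature vector -> MLP -> label.
import numpy as np
from sklearn.neural_network import MLPClassifier

def emg_features(window: np.ndarray) -> np.ndarray:
    """window: (n_samples, n_channels) EMG segment -> flat feature vector."""
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    return np.concatenate([mav, rms, zc.astype(float)])

def train_recognizer(windows, labels):
    # Train on labelled windows of EMG data.
    X = np.stack([emg_features(w) for w in windows])
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
    return clf.fit(X, labels)

def recognize(clf, window):
    # Recognize the gesture performed in a new window.
    return clf.predict(emg_features(window)[None, :])[0]
```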

Fingerprint based smart phone user verification

A touchscreen, now incorporated in most smartphones, tablets, laptops, and similar devices, presents an effective and transparent method to incorporate continuous active user verification schemes. The touchscreen element grid structure can be used to capture information, such as a set of one-dimensional time-varying signals produced as the user's finger moves past the grid intersections points.
Ami Research & Development, Llc

User interface for redirection of print jobs

The present disclosure is directed to a method and user interface for redirecting print jobs. The method involves receiving a notification indicating that execution of a print job at a first printing device failed.
Kyocera Document Solutions Inc.

Touch interaction processing method, device, and system

A touch interaction processing method, device, and system. The touch interaction processing method includes: receiving first information sent by an electromyographic signal collection device and second information sent by a location capturing device (101); if it is determined that a time gap between a first touch start time and a second touch start time is less than a preset threshold, and a quantity of touch points that is corresponding to a hand gesture is the same as a quantity of touch points that is included in the second information, generating a touch instruction, where the touch instruction includes a device identifier of the electromyographic signal collection device, the hand gesture, and coordinate information of each touch point (103); and performing an interaction operation corresponding to the touch instruction (105)..
Huawei Technologies Co., Ltd.

Self-revealing gesture

Various embodiments provide self-revealing gestures that are designed to provide an indication of how to perform one or more different gestures. In at least one embodiment, an initiation gesture is received, relative to an object.
Microsoft Technology Licensing, Llc

Home security system with touch-sensitive control panel

A security system includes a control panel and sensors for monitoring conditions related to the security of a residential or commercial location. The control panel communicates with the sensors and displays information related to the operation of the security system.
Vivint, Inc.

In-application interactive notification

A system and method of providing information on a computing device is described. A client application can present a user interface on a display of the computing device.
Uber Technologies, Inc.

User interface for editing a value in place

A user interface element is displayed for in place editing of values within a document. For example, in response to selecting a value, a user interface is displayed near the value that receives a slide gesture for adjusting the value in place.
Microsoft Technology Licensing, Llc

Led/oled array approach to integrated display, focusing lensless light-field camera, and touch-screen user interface devices and associated processors

A system for implementing a display which also serves as one or more of a tactile user interface touchscreen, light-field sensor, proximate hand gesture sensor, and focusing lensless light-field imaging camera using image formation systems and methods such as those taught in the inventor's related patents as cited. In an implementation, an oled array or led array can comprise photosensors that can be used for light sensing as well as light emission functions.

Touch input device

A touch input device includes a first sensor having a first surface to which a touch is input and a second surface opposing the first surface, and a second sensor connected to the second surface of the first sensor and spaced apart from the first sensor in a vertical direction. The first sensor measures a first position of the touch input to the first surface. The second sensor measures a force caused by the touch input to the first surface of the first sensor and calculates a second position by applying the measured force to force and moment equilibrium equations. When the distance between the first position and the second position is greater than or less than a threshold, the touch input to the first surface of the first sensor is determined to be a shear force or a sliding gesture, respectively.
Hyundai Motor Company
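
The touch input entry above compares two position estimates: one reported by the first (capacitive) sensor and one recovered from the force sensor via force and moment equilibrium, with a large disagreement read as shear and a small one as a slide. A minimal sketch of that comparison, assuming a simple moment sign convention and an illustrative threshold:

```python
# Hedged sketch: moment equilibrium gives a second position estimate from
# the measured force and moments; the distance to the capacitive position
# decides shear vs. slide.
import math

def classify_touch(pos1, fz, mx, my, threshold=3.0):
    """pos1: (x, y) from the first (capacitive) sensor, in mm.
    fz: normal force (N); mx, my: moments (N*mm) at the second sensor,
    using the convention M = r x F with F = (0, 0, -fz) pressing down."""
    if fz <= 0:
        return "no touch"
    # Moment equilibrium: mx = -y2 * fz, my = x2 * fz.
    x2 = my / fz
    y2 = -mx / fz
    distance = math.hypot(pos1[0] - x2, pos1[1] - y2)
    return "shear force" if distance > threshold else "sliding gesture"
```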

Dynamic hover sensitivity and gesture adaptation in a dual display system

An information handling system includes a display device in a first display device housing operable to detect a touch device hovering above the display device and operable to receive touch inputs, and a processor to determine a set default sense state for the display device based on the orientation, operating application, or usage mode of the information handling system. The display device detects a first gesture, and the processor determines whether the first gesture is consistent with the set default sense state for the orientation, operating application, or usage mode of the information handling system; if so, the processor processes the first gesture with the default sense state, and the processor sets an alternative sense state if the first gesture is not consistent with the set default sense state.
Dell Products, Lp

Cross device gesture detection

A first electronic device comprising a plurality of connectors is disclosed. The device detects that at least one connector of a second electronic device has been connected to at least one of the connectors of the first electronic device.
Nanoport Technology Inc.

Press and move gesture

A method is provided. The method may include obtaining force information regarding an input force applied by an input object to a sensing region of an input device.
Synaptics Incorporated

Use of transparent photosensor array to implement integrated combinations of touch screen tactile, touch gesture sensor, color image display, hand-image gesture sensor, document scanner, secure optical data exchange, and fingerprint processing capabilities

A system and method for implementing a display with a photosensor overlay, permitting the arrangement to also serve as one or more of a proximate hand gesture sensor, light field sensor, lensless imaging camera, document scanner, fingerprint scanner, and secure optical communications interface. The resulting arrangements are accordingly advantageous for use in handheld devices such as cellphone, smartphones, pdas, tablet computers, and other such devices..

Digital cursor display linked to a smart pen

A system and a method are disclosed for calibrating writing on a writing surface to a digital document. One or more calibration parameters associated with a writing surface and a digital document of a display device are determined.
Livescribe Inc.

System and using a side camera for free space gesture inputs

An information handling system includes a camera mounted in the side edge surface for detecting gestures by a user on a detected working surface in a gesture-detecting zone next to the system, and a gesture detection system for interpreting free space gestures and initializing cursor control commands.
Dell Products, Lp

Dynamic user interactions for display control and scaling responsiveness of display objects

The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3d) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories.
Leap Motion, Inc.

Human interface device and method

A state-tracking-based gesture recognition method for a sensor system has the following steps: defining a plurality of sequential states of a finite-state machine, determining a sequence progress level (spl) for each state, mapping a state probability distribution to a (single) spl at run-time, and utilizing the mapped spl estimate as an output value of the sensor system.
Microchip Technology Germany Gmbh
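
The state-tracking entry above assigns each sequential state of the gesture finite-state machine a sequence progress level (spl) and collapses the run-time state probability distribution to a single spl output. One natural mapping, shown below as an assumption rather than the patented method, is the expected spl under that distribution:

```python
# Hedged sketch: collapse a state probability distribution to one SPL
# estimate by taking the expected value over the per-state SPLs.
def spl_output(state_probs, spl_per_state):
    """state_probs: P(state_i) summing to 1; spl_per_state: SPL of state_i."""
    return sum(p * spl for p, spl in zip(state_probs, spl_per_state))

# Four sequential states of a swipe gesture with SPLs 0, 1/3, 2/3, 1.
spls = [0.0, 1 / 3, 2 / 3, 1.0]
print(spl_output([0.1, 0.2, 0.6, 0.1], spls))  # ~0.57: gesture mostly progressed
```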

Efficient gesture processing

Embodiments of the invention describe a system to efficiently execute gesture recognition algorithms. Embodiments of the invention describe a power efficient staged gesture recognition pipeline including multimodal interaction detection, context based optimized recognition, and context based optimized training and continuous learning.
Intel Corporation

System and controlling playback of media using gestures

The playback of media by a playback device is controlled by input gestures. Each user gesture can be first broken down into a base gesture which indicates a specific playback mode.
Thomson Licensing System

Interactivity model for shared feedback on mobile devices

A system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound.
Immersion Corporation

Combination gesture game mechanics using multiple devices

Embodiments provide techniques for altering a virtual world based on combinational input gestures. Embodiments retrieve a definition for a combinational gesture within a computer game, the definition specifying physical actions to perform according to a specified timing schedule in order to successfully perform the combinational gesture.
Disney Enterprises, Inc.

Image processing device and image processing method for image correction, and non-transitory computer readable recording medium thereof

The present disclosure is to generate a high-quality image by correcting a predetermined correction target image based on a plurality of input images. In an image processing device 3, an image correcting section 160 detects a user tap gesture on a touch panel 250.
Morpho, Inc.

Touch screen gesture for perfect simple line drawings

A method for drawing shapes includes receiving first input from a touch screen display indicating a first user touch continuing for a first touch time delay, then subsequent to said first touch time delay, receiving second input from the display indicating a second user touch in a pattern on the display. The method includes correlating the first input and the second input with a drawing shape.
Lenovo Enterprise Solutions (singapore) Pte. Ltd.

Interacting with user interface elements representing files

An example method is described in which files are received by a computer system. A first user interface is displayed on a first display of the computer system.
Hewlett- Packard Development Company, L.p.

Device, method, and graphical user interface for transitioning between display states in response to a gesture

An electronic device displays a user interface in a first display state. The device detects a first portion of a gesture on a touch-sensitive surface, including detecting intensity of a respective contact of the gesture.
Apple Inc.

Method and system for providing topic view in electronic device

Embodiments herein provide a method for providing a topic view using an electronic device. The method includes detecting, by a gesture detection unit of the electronic device, a user input on content displayed on a display of the electronic device.
Samsung Electronics Co., Ltd.

Input techniques for virtual reality headset devices with front touch screens

Systems and methods for detecting a user interaction by identifying a touch gesture on a touch interface on a virtual reality headset. The touch gestures are received on a front surface that is on the opposite side of the headset's inner display screen so that correspondence between the touch location and displayed content is intuitive to the user.
Adobe Systems Incorporated

Virtual reality clamshell computing device

A virtual reality clamshell computing device includes a number of projection devices to project a three-dimensional image, a number of infrared or illumination devices to illuminate a user's hand, a number of ir sensors to detect ir wavelengths reflected off of the user's hand, a processor, and a memory. The memory includes executable code that, when executed by the processor, extracts coordinate location data from the detected ir wavelengths reflected off of the user's hand, interprets the coordinate location as a number of gestures performed by the user, and manipulates the display of the three-dimensional image based on the interpreted gestures.
Hewlett-packard Development Company, L.p

Device, device control method

A touch panel that can detect variations in electrostatic capacity in three-dimensional space detects variations in the electrostatic capacity caused by the gestures of a photographer, and an input unit performs acquisition. A lens adjustment unit performs focusing when a gesture recognition unit recognizes a gesture wherein the thumb and the forefinger of the photographer are apart; an imaging unit performs shutter release upon recognition of a gesture wherein the thumb and the forefinger of the photographer are brought into contact; the lens adjustment unit performs zooming upon recognition of a gesture wherein the thumb and the forefinger of the photographer are rotated while apart; a light-emission unit turns a flash on upon recognition of a gesture wherein one hand of an operator goes from clasped to open; and the light-emission unit turns the flash off upon recognition of a gesture wherein one hand of the operator goes from open to clasped.

Proximity activated gesture

A proximity-activated gesture circuit includes an activation receiver electrode, a transmitter electrode, additional receiver electrodes, a control circuit, and a signal processor circuit. The control circuit is configured to activate the transmitter electrode and additional receiver electrodes when a capacitance measurement by the activation receiver electrode reaches a threshold.
Microchip Technology Incorporated

Method and system for generating a synthetic database of postures and gestures

Methods and systems for generating synthetic samples of postures and gestures are provided. In one embodiment, the system may include: a database configured to store at least one sample of a posture or a gesture; a sensing device; and a computer processor configured to: derive values of parameters relating to a specific user and/or environment and generate datasets of gesture and posture samples based on the derived values.
Infinity Augmented Reality Israel Ltd.

Method and system for recommending one or more gestures to users interacting with computing device

The present disclosure relates to a method and a system for recommending one or more gestures to a user interacting with a computing device. The system receives gesture data from one or more sensors.
Wipro Limited

Gesturing proximity sensor for spa operation

A spa can include one or more variable features or functions that can be controlled by a user using a gesture recognition sensor. The gesture recognition sensor can be located proximate to the spa and optionally out of the normal range of motion of a user in a basin of the spa.
Sundance Spas, Inc.

Method and apparatus for assigning multi-channel audio to multiple mobile devices and its control by recognizing user's gesture

A method or apparatus for multi-channel audio data control using plural mobile devices, comprising the steps of automatically calculating positions of plural devices, transmitting audio data to plural devices based on the calculated positions, and executing control based on the transmitted audio channel data. Control of the data can be executed automatically or by recognized user gestures from the mobile devices.
Value Street, Ltd.

Sharing content within an evolving content-sharing zone

A user selects a content item that he wishes to send. He then performs a “sending” gesture and specifies an initial “content-sharing zone.” In order to be eligible to receive the selected content item, a receiving device must be located within the content-sharing zone.
Google Technology Holdings Llc

Animated liquid droplet environments

Embodiments herein describe a dripping system that displays an animation using different colored light sources. In one embodiment, the dripping system includes red, green, and blue light sources which can be activated individually or in combination to emit light that reflects off liquid droplets emitted by the dripping system.
Disney Enterprises, Inc.

Online detection and classification of dynamic gestures with recurrent convolutional neural networks

A method, computer readable medium, and system are disclosed for detecting and classifying hand gestures. The method includes the steps of receiving an unsegmented stream of data associated with a hand gesture, extracting spatio-temporal features from the unsegmented stream by a three-dimensional convolutional neural network (3dcnn), and producing a class label for the hand gesture based on the spatio-temporal features..
Nvidia Corporation
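
The entry above classifies hand gestures with a three-dimensional convolutional neural network over spatio-temporal features. A compact PyTorch sketch of a 3d cnn that maps a short clip to class scores; the recurrent part named in the title and the handling of unsegmented streams are omitted, and the layer sizes are arbitrary assumptions:

```python
# Hedged sketch: a small 3D CNN that scores gesture classes for a clip.
import torch
import torch.nn as nn

class Gesture3DCNN(nn.Module):
    def __init__(self, num_classes: int = 25):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),   # (C, T, H, W) input
            nn.ReLU(),
            nn.MaxPool3d(2),                               # halve T, H, W
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                       # global spatio-temporal pool
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip: (batch, 3, frames, height, width)
        x = self.features(clip).flatten(1)
        return self.classifier(x)                          # class scores per clip

model = Gesture3DCNN()
scores = model(torch.randn(2, 3, 8, 64, 64))   # two 8-frame RGB clips
print(scores.shape)                            # torch.Size([2, 25])
```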

Partial gesture text entry

A graphical keyboard including a number of keys is output for display at a display device. The computing device receives an indication of a gesture to select at least two of the keys based at least in part on detecting an input unit at locations of a presence-sensitive input device.
Google Inc.

The method for controlling audio data by recognizing user gesture and position using multiple mobile devices

A method or apparatus for multi-channel audio data control using plural mobile devices, comprising the steps of detecting user presence through an assigned frequency, transmitting audio data to the device nearest the user based on the user presence information, and recognizing a user gesture to control the audio data. A remote controller held by the user may also have priority to control audio data.
Value Street, Ltd.

Method and gesture identification

A gesture identification method is provided for use in a terminal. The terminal includes a touch screen, at least one proximity sensor being distributed in the touch screen and having a transmitter and a receiver.
Beijing Xiaomi Mobile Software Co., Ltd.

Expanding a 3d stack of floor maps at a rate proportional to a speed of a pinch gesture

A digital map of a geographic area is displayed via a user interface, and a 3d representation of a multi-story building located in the geographic area is displayed on the digital map. The 3d representation includes multiple stacked floor maps corresponding to the floors of the multi-story building.
Google Inc.

Method of adjusting display area of electronic book contents

Provided is a method of adjusting a display area of electronic book contents. The electronic book contents include tags that divide texts of the electronic book contents into sentence or paragraph units.

Administration of web page

Manipulation of a web page displayed through a first device as a function of user interaction with a second device is contemplated. The manipulation may include operating the second device as a touchscreen or other gesture-based controllable device and automatically providing corresponding navigation within the web page as a function of interactions registered through the second device..
Cable Television Laboratories, Inc.

Swipe-based confirmation for touch sensitive devices

Techniques are disclosed for providing a swipe-based delete confirmation mode in electronic touch sensitive devices. The user can engage the delete confirmation mode by performing a delete command, which causes the device to display a delete confirmation swipe gesture prompt.
Barnes & Noble College Booksellers, Llc

Method and recognizing gesture

A method for recognizing a gesture includes the following. When light going into an ambient-light sensor is obstructed and no touch operation on the touch screen is detected, it is detected whether the ambient-light sensor satisfies a preset change condition, the preset change condition being that the light going into the ambient-light sensor changes from a non-obstructed state, in which the light is not obstructed from going into the ambient-light sensor, to an obstructed state, in which the light is obstructed from going into the ambient-light sensor, and then changes from the obstructed state back to the non-obstructed state. When the ambient-light sensor satisfies the preset change condition, a position of the ambient-light sensor is determined, and an operation gesture of the user is recognized according to the position of the ambient-light sensor.
Beijing Xiaomi Mobile Software Co., Ltd.
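
The recognition logic in the entry above reduces to a small state machine per ambient-light sensor: while no touch is detected, a transition from non-obstructed to obstructed and back again triggers a gesture at that sensor's position. A minimal Python sketch with an assumed lux threshold and assumed sensor positions:

```python
# Hedged sketch: not-obstructed -> obstructed -> not-obstructed, with no
# active touch, is recognized as a gesture at the sensor's position.
OBSTRUCTED_LUX = 5.0   # assumed: below this the sensor is treated as covered

class AmbientLightGesture:
    def __init__(self, position):
        self.position = position        # e.g. "top-left" of the screen
        self.state = "clear"            # clear -> covered -> back to clear

    def update(self, lux, touch_active):
        if touch_active:                # abstract: only when no touch is detected
            self.state = "clear"
            return None
        covered = lux < OBSTRUCTED_LUX
        if self.state == "clear" and covered:
            self.state = "covered"
        elif self.state == "covered" and not covered:
            self.state = "clear"
            # Change condition met: recognize a gesture at this sensor's position.
            return f"swipe over {self.position}"
        return None

sensor = AmbientLightGesture("top-left")
for lux in [120, 3, 2, 110]:
    event = sensor.update(lux, touch_active=False)
    if event:
        print(event)      # -> "swipe over top-left"
```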

Gesture pre-processing of video stream using a markered region

Techniques are disclosed for processing a video stream to reduce platform power by employing a stepped and distributed pipeline process, wherein cpu-intensive processing is selectively performed. The techniques are particularly well-suited for hand-based navigational gesture processing.
Intel Corporation

Active region determination for head mounted displays

Some augmented reality (ar) and virtual reality (vr) applications may require that an “activity region” be defined prior to their use. For example, a user performing a video conferencing application or playing a game may need to identify an appropriate space in which they may walk and gesture while wearing a head mounted display without causing injury.
Eonite Perception Inc.

Method of obtaining gesture zone definition data for a control system based on user input

The invention is directed at a method of obtaining gesture zone definition data for a control system based on user input, wherein said user input is obtained through a mobile communication device external to said control system, the method comprising: receiving, by an image capture device, images of a space, and determining from the images, by a controller, a location data of the mobile communication device; providing, through the mobile communication device, a feedback signal in response to said determining of the location data, the feedback signal providing feedback information on said location data; receiving, via an input unit of the mobile communication device, an input signal indicative of an instruction command, and determining, by the controller, based on said instruction command, the gesture zone definition data. The invention is further directed at a method of operating a mobile communication device, to a computer program product, and to a control system..
Philips Lighting Holding B.v.

Gesture ambiguity determination and resolution

Systems, apparatuses, methods, and program products are disclosed. An apparatus may include a processor configured to determine whether a user-input gesture is an ambiguous gesture, present a set of candidate gestures corresponding to the user-input gesture, in response to the user-input gesture being an ambiguous gesture, and resolve the ambiguous gesture based on user-input.
Lenovo (singapore) Pte. Ltd.

Wearable consumer device

According to an embodiment, a wearable device includes a frame, a first circuit board within the frame, and a display over and coupled to the first circuit board. The wearable device further includes a second circuit board electrically coupled to the first circuit board, and a mm-wave gesture sensing system mounted on the second circuit board..
Infineon Technologies Ag

Exercise equipment with improved user interaction

Methods and systems are presented for accepting inputs into a treadmill or other exercise equipment to control functions of the treadmill or exercise equipment. An exercise control system can receive gestures and other inputs.
Flextronics Ap, Llc

Method for delivering contextual healthcare services and electronic device supporting the same

The present disclosure relates to a method for delivering a contextual healthcare service and an electronic device supporting the same. The method for delivering a contextual healthcare service according to various embodiments of the present disclosure may include the operations of generating at least one contextual group including at least one message having an identical context in an ongoing chat and displaying the at least one contextual group in the ongoing chat.
Samsung Electronics Co., Ltd

Method of verifying user intent in activation of a device in a vehicle

Vehicular systems and related methods offer improved user control over activations of vehicular components. User intent to perform an action can be detected using one or more sensors.
Huf North America Automotive Parts Mfg. Corp.

Vehicle gesture recognition system and method

Embodiments of vehicle gesture recognition systems and methods are disclosed. An example vehicle gesture recognition system comprises a data interface configured for receiving 2d image data from a 2d sensor and/or from a portable device camera via a portable device interface.
Harman Becker Automotive Systems Gmbh

System and method for authenticating user

Provided is a user authentication method using a natural gesture input. The user authentication method includes recognizing a plurality of natural gesture inputs from image data of a user, determining the number of the plurality of natural gesture inputs as the total number of authentication steps, determining a reference ratio representing a ratio of the number of authentication steps requiring an authentication pass to the total number of authentication steps, determining an actual ratio representing a ratio of the number of authentication steps where authentication has actually passed to the total number of authentication steps, and performing authentication on the user based on a result obtained by comparing the actual ratio and the reference ratio.
Electronics And Telecommunications Research Institute
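
The comparison at the heart of the entry above is simple arithmetic: with the N recognized gesture inputs taken as N authentication steps, a reference ratio (steps required to pass / N) is compared with the actual ratio (steps that passed / N). A sketch of that check; how an individual step is judged to have passed, and the use of >= as the comparison, are assumptions:

```python
# Hedged sketch: authenticate when the actual pass ratio reaches the
# reference ratio defined by the required number of passing steps.
def authenticate(pass_flags, required_steps):
    """pass_flags: per-step booleans for the recognized natural gestures.
    required_steps: number of steps that must pass (defines the reference ratio)."""
    total = len(pass_flags)
    reference_ratio = required_steps / total
    actual_ratio = sum(pass_flags) / total
    return actual_ratio >= reference_ratio

# Five gesture steps, four of which must pass: 4/5 passed -> authenticated.
print(authenticate([True, True, False, True, True], required_steps=4))  # True
```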

Shared password protection within applications

Various techniques are disclosed for managing and modifying data items. In some embodiments, a first data item can be selected for password protection via establishing an active secured user session according to a set of user credentials.
Apple Inc.

System to facilitate and streamline communication and information-flow in health-care

Processes and systems for facilitating communications in a health care environment are provided. In one example, a process includes receiving a trigger from a wearable computer device to communicate with a medical application interface.

Ergonomic digital collaborative workspace apparatuses, methods and systems

The digital workspace ergonomics apparatuses, methods and systems (“dwe”) transform user multi-element touchscreen gestures via dwe components into updated digital collaboration whiteboard objects. In one embodiment, the dwe obtains user whiteboard input from a client device participating in a digital collaborative whiteboarding session.
Haworth, Inc.

Method for calling out music playlist by hand gesture

The invention discloses a method for calling out a music playlist by hand gesture, comprising the following steps: entering the playing interface of the music player, setting up the corresponding plane coordinate system according to the playing interface, and setting the positive directions of the x-axis and y-axis; defining a hand gesture area in the playing interface; and conducting an action_move operation in the hand gesture area to call out the music playlist.
Fiio Electronics Technology Co., Ltd.

Gesture-controlled tabletop speaker system

A speaker system includes a case, an audio input, speakers, an accelerometer, and a computer processor. The audio input is structured to receive a program audio signal from an audio device.
Avnera Corporation

Neural network for keyboard input decoding

In some examples, a computing device includes at least one processor; and at least one module, operable by the at least one processor to: output, for display at an output device, a graphical keyboard; receive an indication of a gesture detected at a location of a presence-sensitive input device, wherein the location of the presence-sensitive input device corresponds to a location of the output device that outputs the graphical keyboard; determine, based on at least one spatial feature of the gesture that is processed by the computing device using a neural network, at least one character string, wherein the at least one spatial feature indicates at least one physical property of the gesture; and output, for display at the output device, based at least in part on the processing of the at least one spatial feature of the gesture using the neural network, the at least one character string.. .
Google Inc.

Touch operation terminal

The present invention is applicable to the field of terminal technologies, and provides a touch operation method and apparatus for a terminal. The method includes: acquiring a touch gesture entered by a user on a screen; loading a display control in a first screen area corresponding to the touch gesture; loading a display interface of a second screen area onto the display control, where at least some different interface elements exist in a display interface of the first screen area and the display interface of the second screen area; and acquiring an operation instruction entered by the user on the display control, and operating, on the display control, the display interface of the second screen area according to the operation instruction.
Huawei Technologies Co., Ltd.

Method of character selection that uses mixed ambiguous and unambiguous character identification

Systems, devices and methods are disclosed for selection of characters from a menu using button presses and button presses that incorporate swipe gestures. In one embodiment, a button press ambiguously identifies a pair of characters in the menu.
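
The entry above mixes ambiguous and unambiguous selection: a plain button press identifies a pair of characters, while a press that incorporates a swipe identifies a single character. A small Python sketch of that distinction; the example menu and the swipe-direction mapping are illustrative assumptions:

```python
# Hedged sketch: a press returns a two-character ambiguity (to be resolved
# later, e.g. by a language model); a press with a swipe picks one member.
BUTTON_PAIRS = {1: ("a", "b"), 2: ("c", "d"), 3: ("e", "f")}  # example menu

def interpret(button, swipe=None):
    pair = BUTTON_PAIRS[button]
    if swipe is None:
        return pair                                       # ambiguous: both candidates
    return pair[0] if swipe == "left" else pair[1]        # unambiguous pick

print(interpret(2))            # ('c', 'd')  -- ambiguous press
print(interpret(2, "right"))   # 'd'         -- press + swipe
```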

Drag-and-set user interface for appliances

An appliance is provided that includes a control unit, a plurality of components, and a touchscreen in which the plurality of components and touchscreen are configured to operate under control of the control unit. A graphical user interface (gui) may be generated for display by the touchscreen.
Electrolux Home Products, Inc.

A method for controlling a hearing device via touch gestures, a touch gesture controllable hearing device and a method for fitting a touch gesture controllable hearing device

A method for controlling a hearing device via touch gestures carried out by a wearer of the hearing device. As part of the gesture a finger is swiped across first and second sound inlets of the hearing device.
Sonova Ag

Gesture control interacting with a mobile or wearable device utilizing novel approach to formatting and interpreting orientation data

The aim of the present invention is to provide a method to solve the common drift problems and 3d orientation errors related to the use of orientation data of a mobile or wearable device and target system and for using mobile or wearable device as a human-machine interface (hmi) using inertial measurement units (imus) and potentially other sensors (for example cameras and markers, radar systems) as input data to convert user's motion into an interaction, a pointer on the screen or a gesture. The contribution of this invention is a solution for well-known problems related to the use of imus and motion sensors as input devices for user interaction in general, as well as specific embodiments and application scenarios of these methods where wearable and/or mobile devices are used to control specific interfaces..
16lab Inc.

Gesture control module

A gesture-control interface is disclosed, comprising a camera, an infrared led flash, and a processor that identifies the hand pose or the motion of the hand.. .

Gesture control interacting with a mobile or wearable device

The aim of the present invention is to provide a low-power gesture control method for mobile and wearable devices for interacting with target devices. Furthermore, this invention presents an approach to solving the modality switching problems known from the prior art, where one modality can be used to activate another.
16lab Inc.

Interactive mirror

The present invention relates to an interactive mirror comprising a reflective mirror surface, at least one display integrated into the mirror surface, at least one active infrared sensor integrated into the mirror, and a processor adapted to determine a person's motions and/or gestures made in front of the mirror on the basis of the data captured by the at least one sensor.
Iconmobile Gmbh

Gesture control device with fingertip identification

A gesture-control interface is disclosed, comprising a camera, an infrared led flash, and a processor that identifies the finger motion in the image.
Antimatter Research, Inc.

System and method for feature activation via gesture recognition and voice command

Various embodiments of the present disclosure provide a system and method for activating a vehicle feature through gesture recognition and voice command by an authorized user. Generally, a vehicle control system of the present disclosure obtains gesture commands and voice commands from an authorized user and communicates with a body control module to complete the vehicle feature.
Ford Global Technologies, Llc

Surgical instrument with orientation sensing

A surgical instrument comprises a body assembly and an end effector. The body assembly includes a control module, an orientation sensor communicatively coupled to the control module, and an energy component.
Ethicon Llc

Resource allocation based on available resources via interactive interface

Embodiments of the invention are directed to a system, method, or computer program product for providing resource allocation based on available resources via an interactive resource interface. In this way, the invention provides a comprehensive integrated platform for identification, continual monitoring and optimal allocation of resources on a mobile device.
Bank Of America Corporation

Systems and methods for enabling transitions between items of content

A computing device for facilitating access to items of content is configured to enable communication with a companion device. The companion device includes a user interface including a touch panel.
Opentv, Inc.

Systems and methods for enabling transitions between items of content based on multi-level gestures

A computing device for facilitating access to items of content is configured to enable communication with a companion device. The companion device includes a user interface including a touch panel.
Opentv, Inc.

Display apparatus and control methods thereof

A display apparatus includes a communication interface configured to communicate with another display apparatus, a display configured to display contents being shared with the other display apparatus and a video call user interface (ui) for a video call with a user of the other display apparatus, and a processor, in response to at least one of a gesture and a voice of the user included in video call data received from the other display apparatus satisfying a predetermined condition, configured to control the display to change a size of the video call ui displayed on the display.. .
Samsung Electronics Co., Ltd.

Cross device information exchange using gestures and locations

A method, apparatus and software related product (e.g., a computer readable memory) are presented for exchanging information between two or more devices when they are in a close proximity using gestures and web technologies. According to an embodiment, the identification of one or more devices using gestures is asynchronous, so that the two or more devices do not have to be shaken together at the same time synchronously, which is one advantage over conventional approaches.
Excalibur Ip, Llc

Identification of computerized bots and automated cyber-attack modules

Devices, systems, and methods of detecting whether an electronic device or computerized device or computer, is being controlled by a legitimate human user, or by an automated cyber-attack unit or malware or automatic script. The system monitors interactions performed via one or more input units of the electronic device.
Biocatch Ltd.

Voice control method and apparatus

This patent disclosure relates to the field of communications, and discloses a voice control method and a device thereof. Some embodiments of the present disclosure include the following steps: generating, according to collected voice information, a corresponding instruction for execution, and generating a corresponding graph, where the corresponding graph is used to display a recognition result for the voice information; embedding the generated corresponding graph into a view page, and displaying, in a current human-computer interaction interface, a corresponding graph generated according to most recently collected voice information; and if a gesture sliding operation is detected in the human-computer interaction interface, displaying, in the human-computer interaction interface, a corresponding graph indicated by the gesture sliding operation, and executing a corresponding instruction of the indicated corresponding graph.
Le Shi Zhi Xin Electronic Technology (tianjin) Limited

Smart home control using modular sensing device

A modular sensing device and method of operating a smart home device includes initiating a control mode from a plurality of modes on the modular sensing device, where the control mode determines a manner in which user gestures are interpreted. Based on initiating the control mode, a connection with the smart home device can be established.
Sphero, Inc.

Detection of hand gestures using gesture language discrete values

(d) estimating which of the hand gestures best matches the runtime sequence depicted in the timed images by optimizing score functions using the estimation terms for the runtime hand datasets.

Translation of gesture to gesture code description using depth camera

A system of injecting a code section to a code edited by a graphical user interface (gui) of an integrated development environment (ide), comprising: a memory storing a dataset associating each code segment with one hand pose feature or hand motion feature; an imager adapted to capture images of a hand while an ide being executed on a client terminal; and processor for executing code of an application, comprising: code instructions to identify at least one of the features and at least one discrete value of the identified features from an analysis of the images; code instructions to select at least one of the code segments associated with the identified features; and code instructions to add automatically a code section generated based on the code segments and the discrete value to a code presented by a code editor of the ide.. .
Microsoft Technology Licensing, Llc

Quick gesture input

A computer-implemented user interface method for a computing device is disclosed. The method includes associating each of a plurality of telephone keys with a direction of each key relative to a center of a telephone keypad, receiving a contact from a user of the device at a location on a touchscreen display of a computing device and an input at a direction relative to the location of the user contact, and causing a telephone number to be entered on the computing device based on the direction of each key relative to the center of the telephone keypad corresponding to the direction relative to the location of the user contact..
Google Inc.
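
One plausible reading of the direction-based entry described above is mapping the flick angle to the key whose offset from the keypad centre best matches it. The layout offsets, tap threshold and coordinate convention below are assumptions:

    import math

    # Direction of each key from the centre of a 3x4 keypad (assumed unit offsets,
    # y grows downward as in screen coordinates).
    KEY_OFFSETS = {"1": (-1, -1), "2": (0, -1), "3": (1, -1),
                   "4": (-1, 0),  "5": (0, 0),  "6": (1, 0),
                   "7": (-1, 1),  "8": (0, 1),  "9": (1, 1), "0": (0, 2)}

    def key_for_flick(dx, dy, tap_threshold=10):
        if math.hypot(dx, dy) < tap_threshold:
            return "5"                       # a near-stationary contact is the centre key
        angle = math.atan2(dy, dx)
        best = None
        for key, (ox, oy) in KEY_OFFSETS.items():
            if (ox, oy) == (0, 0):
                continue
            diff = abs(math.atan2(oy, ox) - angle)
            diff = min(diff, 2 * math.pi - diff)
            if best is None or diff < best[1]:
                best = (key, diff)
        return best[0]

    print(key_for_flick(40, -40))   # up-right flick -> "3"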

Out-of-band commissioning of a wireless device through proximity input

A method of performing out-of-band commissioning is provided. The method may include enabling a pairing mode on a commissioning device, generating a gesture code on the commissioning device, receiving a gesture input on a node device, verifying an agreement between the gesture code and the gesture input, and commissioning the node device based on the agreement..
Disruptive Technologies Research As
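
A hedged sketch of the agreement check between the generated gesture code and the gesture observed on the node device; the code alphabet and exact-match rule are assumptions:

    import secrets

    def generate_gesture_code(length=4):
        """A random sequence of tap directions the user must reproduce on the node (assumed alphabet)."""
        return [secrets.choice(["up", "down", "left", "right"]) for _ in range(length)]

    def verify_agreement(code, observed):
        return len(code) == len(observed) and all(a == b for a, b in zip(code, observed))

    code = generate_gesture_code()
    print(code)
    print(verify_agreement(code, list(code)))      # True -> commission the node
    print(verify_agreement(code, ["up"] * 4))      # almost certainly False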

Pull-down gesture processing method, device, and system

The present disclosure discloses a pull-down gesture processing method, device and system. The method comprises: refreshing the current display interface in response to the pull-down distance of a user's pull-down gesture on a touch display screen being greater than a first set distance and smaller than or equal to a second set distance, wherein the second set distance is greater than the first set distance; and exiting the current display interface in response to the pull-down distance of the pull-down gesture being greater than the second set distance.
Guangzhou Ucweb Computer Technology Co., Ltd.
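
The two-threshold behaviour described above reduces to a simple comparison; the pixel values chosen below are assumptions:

    # Refresh when the pull-down distance exceeds the first threshold but not
    # the second; exit the interface when it exceeds the second.
    FIRST_SET_DISTANCE = 120    # assumed values, in pixels
    SECOND_SET_DISTANCE = 320   # must be greater than the first

    def handle_pull_down(distance):
        if FIRST_SET_DISTANCE < distance <= SECOND_SET_DISTANCE:
            return "refresh"
        if distance > SECOND_SET_DISTANCE:
            return "exit"
        return "ignore"

    print(handle_pull_down(200))   # -> "refresh"
    print(handle_pull_down(400))   # -> "exit"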

Electronic device and method of displaying information in response to a gesture

A method includes displaying information associated with a first application on a touch-sensitive display of an electronic device. A gesture is detected on the touch-sensitive display, which gesture indicates a request to display information associated with a second application.
Blackberry Limited

Mobile device camera viewfinder punch through effect

In an example embodiment, an application is executed on a mobile device, causing generation of application graphical output in an application layer. The application layer is rendered on a touchscreen display.
Futurewei Technologies, Inc.

Hand gesture api using finite state machine and gesture language discrete values

2) code instructions to associate the unique logical sequence with application functions per the instructions for initiating execution of the functions during the application runtime in response to detection of the unique logical sequence from analysis of images depicting movement of user's hand(s).

Touch-based link initialization and data transfer

This disclosure is directed to touch-based link establishment and data transfer. In one embodiment, a gesture drawn on the surface of a touch-sensitive display may trigger a device to engage in link establishment, to advertise the availability of data to share, to receive shared data etc.
Intel Corporation

Accessing additional search results functionality using gestures

Techniques include transmitting a search query to a search system and receiving search results from the system. Each result may include an access url specifying a first state of a software application (app), the first state associated with an entity and a function performed for the entity, and a function url specifying a second state of a software app, the second state associated with the same entity and a different function performed for the entity.
Quixey, Inc.

Systems and methods for enabling transitions between items of content based on swipe gestures

A computing device for facilitating access to items of content is configured to enable communication with a companion device. The companion device includes a user interface including a touch panel.
Opentv, Inc.

Information processing device

An information processing device, includes: a gesture detection unit that recognizes gestures by a user; an output information control unit that controls output information to a display unit; and a device control unit, wherein: if the gesture detection unit has detected that the user has raised a hand for a certain time period, then the output information control unit displays upon the display unit a plurality of candidates for selection in association with gestures for operation; and if the gesture detection unit has detected a predetermined gesture by the user, the device control unit considers that, among the candidates for selection displayed upon the display unit, a candidate corresponding to the gesture that has been detected has been selected.. .
Clarion Co., Ltd.

Input device and input method using the same

An input device and an input method using the same are provided. The input device is adapted to a computing device and includes a stylus body, a tip sensing module, a gesture sensing module, a processor and a wireless module.
Asustek Computer Inc.

Gesture control

Gesture control uses electromagnetic power signatures. A signal is received and a power of the signal is determined.
At&t Intellectual Property I, L.p.

Non-line-of-sight radar-based gesture recognition

This document describes techniques and devices for non-line-of-sight radar-based gesture recognition. Through use of the techniques and devices described herein, users may control their devices through in-the-air gestures, even when those gestures are not within line-of-sight of their device's sensors.
Google Inc.

Radar-based gesture-recognition through a wearable device

This document describes techniques and devices for radar-based gesture-recognition through a wearable device. The techniques enable an easy-to-use input interface through this wearable radar device, in contrast to small or difficult-to-use input interfaces common to wearable computing devices.
Google Inc.

Method for evaluating gestures

It is the object of the invention to lower the structural requirements relating to the determination of gestures.

System and method for inputting gestures in a 3d scene

The present disclosure discloses a system and a method for inputting a gesture in a 3d scene. The system comprises a gesture acquiring unit configured to simultaneously acquire at least two channels of video stream data in real time at different angles for a user's gesture; a gesture recognizing unit configured to recognize a gesture shape varying in real time from the at least two channels of video stream data; a gesture analyzing unit configured to analyze the gesture shape varying in real time to obtain corresponding gesture motion; and a gesture displaying unit configured to process the gesture motion into a 3d image and display the 3d image in the 3d scene in real time.
Qingdao Goertek Technology Co., Ltd.

Modular sensing device implementing state machine gesture interpretation

A modular sensing device can include an inertial measurement unit to generate sensor data corresponding to user gestures performed by a user, a mode selector enabling the user to select a mode of the modular sensing device out of a plurality of modes, and one or more output devices to generate output based on the user gestures and the selected mode. The modular sensing device can further include a controller to implement a plurality of state machines.
Sphero, Inc.

Modular sensing device for controlling a self-propelled device

A wearable device can be worn by a user, and can include one or more sensors to detect user gestures performed by the user. The wearable device can further include a wireless communication module to establish a communication link with a self-propelled device, and a controller that can generate control commands based on the user gestures.
Sphero, Inc.

Modular sensing device for processing gestures

A modular sensing device can generate output via one or more output devices. The modular sensing device can include an inertial measurement unit that generates sensor data corresponding to user gestures performed by a user, a wireless communication module, a mode selector enabling the user to select a mode of the modular sensing device out of a plurality of modes, and one or more output devices configured to generate output based on the user gestures and the selected mode.
Sphero, Inc.

Hand gesture recognition for cursor control

A system for hand gesture recognition is described herein. The system includes a display, camera, memory, and processor.
Intel Corporation

Gestures visual builder tool

(3) code instructions to generate a code segment defining the one or more hand pose/motion features records through the discrete pose/motion values respectively.

Electrical device for hand gestures detection

(4) initiate action(s) to the controlled unit. The action(s) are associated with selected hand gesture(s) based on the estimation.

Multimodal interaction using a state machine and hand gestures discrete values

(2) code instructions to associate the logical sequence with the application function(s) for initiating an execution of the application function(s) during runtime of the application in response to detection of the logical sequence by analyzing a captured data depicting a user during runtime.

Wearable interactive gaming device

The present invention relates to wearable interactive gaming device and a method for enabling interactive gaming. The wearable interactive gaming device includes at least one wearable unit wearable on a player.

Three-dimensional object tracking to augment display area

In some examples, a surface, such as a desktop, in front or around a portable electronic device may be used as a relatively large surface for interacting with the portable electronic device, which typically has a small display screen. A user may write or draw on the surface using any object such as a finger, pen, or stylus.
Microsoft Technology Licensing, Llc

Apparatus and method for disambiguating information input to a portable electronic device

An electronic device with multiple user interfaces configured such that more than one interface ambiguously responds to a user gesture intended as input to the device. To remove ambiguity, the device may operate in one of a plurality of input modes, in which outputs of different ones of the user interfaces are selectively processed.
Infinite Potential Technologies Lp

Automated vehicle operation based on gesture to pedestrian

A gesture detection system suitable to operate an automated vehicle includes a gesture-detection-device, a pedestrian-detection-device, and a controller. The gesture-detection-device is used to detect a gesture made by an occupant of a host-vehicle.
Delphi Technologies, Inc.

Method and apparatus for external operation of an actuator of a vehicle

It is the object of the invention to simplify the operation of an actuator of a vehicle by a gesture.

Method and system for gesture-based device access

A system and method for completing document processing operations between a user device and a document processing device, such as a multifunction peripheral, includes capturing a user gesture at the user device. This gesture is digitized and associated with one or more electronic documents, one or more targeted multifunction peripherals, and instructions for processing of the document.
Toshiba Tec Kabushiki Kaisha

Gesture-based signature authentication

Embodiments of the invention are generally directed to systems, methods, devices, and machine-readable mediums for implementing gesture-based signature authentication. In one embodiment, a method may involve recording a first gesture-based signature and storing the recorded first gesture-based signature.
Intel Corporation

Touchless management system

A touchless management method and system can include: a server beacon, the server beacon including a gesture sensor, a motion sensor, a managed sensor, a server beacon mass storage, and a server beacon power transceiver; detecting gesture data from the gesture sensor; recording sensor data with the managed sensor; a power station including a power station power transceiver, a station control unit, upload coordinator, and a station storage unit; sending a packet from the server beacon to the power station; prioritizing the packet; uploading a message including the sensor data to the power station; and uploading the message to a database server.. .
Washsense, Inc.

Text functions in augmented reality

Various systems and methods for implementing text functions in augmented reality are described herein. A system for implementing text functions in augmented reality includes a display to display a field of view to a user of the system; a gesture detection module to detect a selection gesture performed by a user of the system, the selection gesture defining a selection area in the field of view; a camera array to capture an image of the selection area; a text module to perform a text operation on text identified in the image; and a presentation module to present an indication of the text operation to the user..

Gesture-aware friendship bands

Systems and methods may identify local gesture data in a wearable device including a wrist-worn form factor and identify remote gesture data in a wireless transmission received by the wearable device. Additionally, a loyalty tracker may be incremented based on a correlation between the local gesture data and the remote gesture data.

Method and device for distinguishing finger and wrist

The present disclosure relates to the technical field of gesture recognition, and a method and apparatus for distinguishing between fingers and a wrist are disclosed. In some embodiments of the present disclosure, the following steps are included: acquiring a hand image, where the hand image includes fingers, a wrist, and an arm; calculating a dividing line that passes through a palm center location of the hand image and is perpendicular to a main axis of the hand image; acquiring respective relationships, between an area and a circumference, of two image areas formed by dividing the hand image by using the dividing line; and determining the image areas in which the fingers and the wrist are respectively located, according to the relationships between the area and the circumference.
Le Shi Zhi Xin Electronic Technology (tianjin) Limited
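
A rough numpy sketch of the dividing-line idea above: split a binary hand mask by a line through the palm centre perpendicular to the main axis, then compare each half's area-to-circumference relationship. The specific decision rule (fingers have the lower ratio) is my assumption, not necessarily the application's criterion:

    import numpy as np

    def area_and_perimeter(mask):
        """Pixel count and boundary-pixel count of a binary mask."""
        area = int(mask.sum())
        padded = np.pad(mask, 1)
        interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                    padded[1:-1, :-2] & padded[1:-1, 2:])
        perimeter = int((mask & ~interior).sum())
        return area, perimeter

    def label_sides(mask, palm_center, axis_direction):
        """Split the mask by the line through palm_center perpendicular to the
        main axis; 'fingers have the lower area/circumference ratio' is assumed."""
        ys, xs = np.mgrid[:mask.shape[0], :mask.shape[1]]
        proj = ((xs - palm_center[0]) * axis_direction[0] +
                (ys - palm_center[1]) * axis_direction[1])
        halves = (mask & (proj > 0), mask & (proj <= 0))
        ratios = []
        for half in halves:
            area, perim = area_and_perimeter(half)
            ratios.append(area / max(perim, 1))
        return ("fingers", "wrist") if ratios[0] < ratios[1] else ("wrist", "fingers")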

Apparatus and method for recognizing hand gestures in a virtual reality headset

A virtual reality (vr) headset is configured to be worn by a user. The vr headset comprises: i) a forward-looking vision sensor for detecting objects in the forward field of view of the vr headset; ii) a downward-looking vision sensor for detecting objects in the downward field of view of the vr headset; and iii) a controller coupled to the forward-looking vision sensor and the downward-looking vision sensor.
Samsung Electronics Co., Ltd.

User terminal device, and mode conversion method and sound system for controlling volume of speaker thereof

A user terminal apparatus is disclosed. The user terminal apparatus includes a touch screen which senses a multi gesture that is performed by using at least two fingers or other input tools, and a controller which provides an individual volume control mode by which a volume of one speaker apparatus is independently controllable with respect to a volume of the remainder of a plurality of speaker apparatuses, and which is convertible into a group volume control mode in order to combine a plurality of speaker apparatuses into a group such that volumes of the plurality of speaker apparatuses can be jointly controlled in response to the multi gesture sensed via the touch screen while the individual volume control mode is provided..
Samsung Electronics Co., Ltd.

Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader

While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays a character input area and a keyboard, the keyboard including a plurality of key icons. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters.
Apple Inc.

Gesture recognition method for a touchpad

A gesture recognition method has steps of detecting a first number of first objects touching a touchpad, detecting a second number of second objects tapping the touchpad when the first number of first objects still touches the touchpad, determining that a shortest distance between the first number of first objects and the second number of second objects is less than a preset spacing distance, and enabling a gesture function. Accordingly, gestures provided through the foregoing gesture recognition method are advantageous in being user-friendly, relaxed, convenient and smooth in operation..
Elan Microelectronics Corporation
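
A direct, illustrative reading of the proximity check that enables the gesture function (coordinates and the preset spacing value are assumed):

    import math

    PRESET_SPACING = 30.0   # assumed spacing threshold

    def shortest_distance(resting, tapping):
        return min(math.dist(a, b) for a in resting for b in tapping)

    def gesture_enabled(resting, tapping):
        """Enable the gesture function only when tapping contacts land close
        enough to the contacts already resting on the touchpad."""
        return bool(resting) and bool(tapping) and shortest_distance(resting, tapping) < PRESET_SPACING

    print(gesture_enabled([(10, 10), (20, 12)], [(35, 14)]))   # True: about 15 < 30
    print(gesture_enabled([(10, 10)], [(200, 10)]))            # False: too far apart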

Multi-user content presentation system

One or more embodiments of the disclosure provide systems and methods for providing content presentations to users of a content presentation system. A content presentation generally includes a plurality of content items provided by one or more users of the content presentation system.
Facebook, Inc.

Use rotate gesture to quickly scroll and select in place

The present disclosure describes a method to select an item from a list of items. Upon the user touching a selectable item, a new selection interface is shown, including a scrolling list of a subset of the selectable items and an indicator that the user can rotate to select.
Vimo Labs Inc.

Copy-paste history on a mobile device

This disclosure provides a method, system, and computer-readable medium for maintaining a history of copied objects using a mobile device and providing a menu listing the history of copied objects when a preconfigured gesture is received by the mobile device. The history of copied objects includes text objects, such as words, phrases, sentences, or complete paragraphs, and non-text objects, such as images, sounds, movies, and other such non-text objects.
Successfactors, Inc.

Operating apparatus, control method therefor, and storage medium storing program

An image on an operating surface is captured, and first image data are acquired. A gesture performed by a user on the operating surface is recognized on the basis of the first image data.
Canon Kabushiki Kaisha

Flexible display sensing

Systems, apparatuses and methods may provide for a flexible display detection system to detect movement (e.g., flexing) of a flexible display screen on a device and to interpret the movement as a gesture or selection of a device mode. Embodiments may utilize bendable sensors located adjacent to the flexible display, and may divide a display into multiple panel sections (e.g., three panels) connected to one another via hinging portions of the flexible display.

Gesture-based controls via bone conduction

Concepts and technologies are disclosed herein for utilizing bone conduction to detect gestures. According to one aspect, a device can generate a signal and send the signal to a sensor network that is connected to a user.
At&t Intellectual Property I, L.p.

Gesture based user interface

A gesture based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single dimension region..
Apple Inc.

Detection device, detection method, and computer readable storage medium

A detection device includes a sensor configured to emit light and detect an object by detecting the light reflected from the object, and a processor configured to determine, when the object is detected in a first region that is narrower than the range the light reaches, that a motion of the object is a gesture input for the detection device.
Fujitsu Limited

Virtual reality system with control command gestures

A virtual reality system that uses gestures to obtain commands from a user. Embodiments may use sensors mounted on a virtual reality headset to detect head movements, and may recognize selected head motions as gestures associated with commands.
Ariadne's Thread (usa), Inc. (dba Immerex)

Method, system and smart glove for obtaining immersion in virtual reality system

Disclosed are a method, system and smart glove for obtaining immersion in a virtual reality system. The method includes: when an image capturing device captures a case where a finger of a user makes a gesture, a virtual reality client obtains the gesture of the finger and determines whether a target preset gesture identical to the gesture of the finger exists or not, and if yes, sends touch sensation demand information corresponding to the target preset gesture and identification information of the finger to a smart hand wearable device; and the smart hand wearable device converts the received touch sensation demand information into a response action and makes it act on the corresponding finger, such that the finger produces a sensation corresponding to the response action..
Le Shi Zhi Xin Electronic Technology (tianjin) Limited

Apparatus and method to visually communicate with a vehicle

A visual communication apparatus is affixed to a vehicle. The visual communication apparatus includes a smart key system that detects a key fob of a user within a keyless detection zone, a projector projecting visual indications on a projecting zone lying on a ground surface of the vehicle, a sensor optically capturing gestures of the user of the vehicle, and an electrical control unit capable of actuating key elements of the vehicle.
Aisin Technical Center Of America, Inc.

Distribution system, distribution method, and distribution device

A distribution system is provided, the distribution system including a mobile communication terminal for detecting a gesture of a user by using a gesture detection device that is wirelessly connected, and a distribution device that can communicate with the mobile communication terminal, the distribution system including an accepting unit configured to accept registration of first acceleration data expressing a variation in acceleration within a predetermined time, and advertisement data indicating an advertisement to be displayed on the mobile communication terminal; a distributing unit configured to distribute the first acceleration data and the advertisement data to the mobile communication terminal; a comparing unit configured to compare the distributed first acceleration data with second acceleration data expressing the gesture of the user sent from the gesture detection device; and a first outputting unit configured to output the advertisement data, when the comparing unit makes the comparison.. .
Moff, Inc.

Adaptive group interactive motion control 2d and 3d video

Adaptive system and method for interacting with and navigating 2d and 3d audiovisual content by interpreting the side to side and/or up and down physical gestures and sounds of a multi-person audience. The system may be used in a movie theatre, stadium, music arena, or other venue where an audience views the same view screen.
Audience Entertainment Llc

Cloud-based custom metric/timer definitions and real-time analytics of mobile applications

A method for real-time capture of analytics from real users of a native mobile application (app) includes storing a custom metric/timer definition for a native mobile application (app) in a configuration file on a server. The custom metric/timer definition includes one or more identifiers of an element or object of the native mobile app selected by touch gesture input via a user interface on a mobile device running the native mobile app in a special mode.
Soasta, Inc.

System and method for gesture-based management

A system includes a first mobile device configured to initiate communication with at least one other mobile device. The first mobile device includes a status indicator configured to provide a persistent visual indication to a user of the status of a mute function of the first user device during the active communication.
Intel Corporation

Identity authentication method and apparatus, terminal and server

A method, an apparatus, a terminal, and a server for identity authentication are disclosed. The method includes: receiving dynamic face authentication prompt information sent by a server during identity authentication of a user; obtaining gesture recognition information of the dynamic face authentication prompt information by recognizing a facial gesture presented by the user; and sending the gesture recognition information to the server to enable the server to confirm that the identity authentication is successful for the user in response to verifying that the gesture recognition information is consistent with the dynamic face authentication prompt information.
Alibaba Group Holding Limited

Motion and gesture-based mobile advertising activation

The presentation of advertisements to a user on a mobile communications device is disclosed. A first external input corresponding to a triggering of an advertisement delivery is received on a first input modality.
Adtile Technologies Inc.

Crowd gesture recognition

Various systems and methods for implementing crowd gesture recognition are described herein. A system for implementing crowd gesture recognition includes an accelerometer; a gyrometer; a gesture detection circuit to: detect an air gesture performed by a user of the system based on data from the accelerometer and gyrometer; and parameterize an intensity of the air gesture; a processor subsystem to determine a transmission frequency band and a transmission strength based on the air gesture and the intensity of the air gesture; and a transducer to transmit a signal on the transmission frequency band with the transmission strength..
Intel Corporation
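
A hedged sketch of parameterising gesture intensity from inertial samples and mapping it to a band and strength; the bands, weighting and thresholds are invented for illustration:

    import math

    BANDS = [(0.0, "2.4 GHz"), (1.5, "5 GHz"), (3.0, "60 GHz")]   # (min intensity, band), assumed

    def gesture_intensity(accel_samples, gyro_samples):
        """RMS magnitude of accelerometer and gyrometer samples, combined with an assumed weighting."""
        rms = lambda seq: math.sqrt(sum(x * x + y * y + z * z for x, y, z in seq) / len(seq))
        return rms(accel_samples) + 0.5 * rms(gyro_samples)

    def plan_transmission(intensity):
        band = max(b for b in BANDS if intensity >= b[0])[1]
        strength = min(1.0, intensity / 5.0)          # normalised transmit strength
        return band, strength

    intensity = gesture_intensity([(0.2, 0.1, 2.5)], [(0.5, 0.0, 0.1)])
    print(plan_transmission(intensity))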

Pattern password with variable hint pattern

A device unlock pattern (“pattern password”) is static in that the same pattern is entered each time to unlock a device. Due to this repetition, a pattern password may be discovered by an application that captures touchscreen gestures, by inspection of fingerprints or smudges on a screen, or simply by an onlooker that views the pattern password being entered.
Ca, Inc.

Customizable gestures for mobile devices

Users are enabled to define and modify mappings between (1) gestures and (2) actions performed by one or more computing devices in response to a device detecting performance of a gesture. A generalized gesture-to-action mapping framework allows users to intuitively define and modify such mappings.
Yahoo! Inc.

Cascaded touch to wake for split architecture

Aspects of the disclosure are related to a touch controller for use in a device having a processor and a touch sensor panel, the touch controller being coupled to the processor and the touch sensor panel, and further comprising: an analog front-end (afe), wherein the afe is configured to generate raw touch image data based on electrical signals generated by the touch sensor panel in response to one or more detected touches thereto; a coarse processing element configured to, in response to the processor being set to a sleep mode, coarsely process the raw touch image data to generate sparse data; and an embedded memory configured to store at least the sparse data, wherein the touch controller is configured to transmit a signal to the processor to wake the processor up and transmit the stored sparse data and new touch image data to the processor, wherein the processor performs gesture recognition based on the sparse data and the new touch image data.. .
Qualcomm Incorporated

Method for interaction with terminal and electronic apparatus for the same

The present application discloses a method for interaction with terminal and an electronic apparatus for the same. The method includes: determining whether a downward acceleration of a gesture is greater than a default threshold value when the gesture is detected under state of a displayed interface, wherein the displayed interface comprises: a replying information and recognition result interface, a replying information full screen interface, or a replying information full screen extension interface after record of speech in a speech recognition interface is detected to be finished; determining an operation type corresponding to the gesture, according to determination of whether the downward acceleration of the gesture is greater than the default threshold value; and executing an interaction corresponding to the operation type, according to the operation type..
Le Shi Zhi Xin Electronic Technology (tianjin) Limited

Centering gesture to enhance pinch-to-zoom gesture on touchscreens

A computing device detects movement of two contact positions on a touchscreen of the computing device as a pinch-to-zoom gesture. While detecting the movement of the two contact positions on the touchscreen, the computing device detects a third stationary contact position on the touchscreen of the computing device, as a centering gesture related to the pinch-to-zoom gesture.
Lenovo Enterprise Solutions (singapore) Pte. Ltd.
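
A small sketch of the behaviour as described: the pinch distance sets the zoom factor, while a third, stationary contact (when present) supplies the zoom centre instead of the pinch midpoint. Function and parameter names are illustrative:

    import math

    def zoom_from_pinch(p0_start, p1_start, p0_end, p1_end, center_contact=None):
        start = math.dist(p0_start, p1_start)
        end = math.dist(p0_end, p1_end)
        factor = end / start if start else 1.0
        if center_contact is not None:
            cx, cy = center_contact                     # centering gesture wins
        else:
            cx = (p0_end[0] + p1_end[0]) / 2
            cy = (p0_end[1] + p1_end[1]) / 2
        return factor, (cx, cy)

    # Pinch out while holding a third, stationary finger on the point of interest:
    print(zoom_from_pinch((100, 100), (200, 200), (80, 80), (220, 220), center_contact=(150, 90)))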

Systems and methods for identifying dominant hands for users based on usage patterns

Systems, methods, and non-transitory computer-readable media can detect a set of swiping touch gestures from a user. The set of swiping touch gestures can be analyzed to determine at least one respective movement property for each swiping touch gesture in the set of swiping touch gestures.
Facebook, Inc.
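
Purely as an illustration of using a movement property to guess handedness (this heuristic and its threshold are assumptions, not the application's model):

    # Aggregate the signed horizontal drift of near-vertical swipes and guess
    # handedness from its sign; sign convention and threshold are assumed.

    def horizontal_drift(swipe):
        (x0, y0), (x1, y1) = swipe[0], swipe[-1]
        return x1 - x0 if abs(y1 - y0) > abs(x1 - x0) else None   # only near-vertical swipes

    def guess_dominant_hand(swipes, threshold=5.0):
        drifts = [d for d in map(horizontal_drift, swipes) if d is not None]
        if not drifts:
            return "unknown"
        mean = sum(drifts) / len(drifts)
        if abs(mean) < threshold:
            return "unknown"
        return "right" if mean > 0 else "left"

    print(guess_dominant_hand([[(100, 50), (112, 400)], [(90, 60), (104, 380)]]))  # -> "right"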

Input device

A control section allows a screen of a display section to display a plurality of numeric keys, detects a touch gesture on the individual numeric key through a touch panel to allow an input of a numeric number corresponding to the numeric key subjected to the touch gesture, and detects an action on each of hard keys to accept an instruction to perform a function corresponding to the hard key subjected to the action. When a predetermined action is performed on one or more individuals of the hard keys, the control section assigns numeric numbers to the respective hard keys and detects an action on the individual hard key to allow an input of the numeric number corresponding to the hard key subjected to the action..
Kyocera Document Solutions Inc.

Primary device that interfaces with a secondary device based on gesture commands

An incoming call from a remote device can be received by a primary device. The primary device can determine a numerical count of detected user gestures.
Google Technology Holdings, Llc.

System for hand gesture detection

A system for hand gesture detection is provided, comprising: a wrist wear adapted to be worn about a wrist of a user of the system and including a set of skin electrodes adapted to face the wrist; an impedance measurement circuit adapted to measure at least a first impedance in a first portion of the wrist and a second impedance in a second portion of the wrist which second portion is circumferentially displaced in relation to said first portion, wherein the first impedance is measured via a first electrode group including four skin electrodes of said set of skin electrodes and the second impedance is measured via a second electrode group including four skin electrodes of said set of skin electrodes, and a processing circuit adapted to detect a hand gesture of the user based on the first and the second impedance measured by the impedance measurement circuit.. .
Stichting Imec Nederland
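
A minimal sketch of classifying the two impedance channels against pre-recorded gesture templates with a nearest-centroid rule; the template values are invented:

    TEMPLATES = {                 # (first impedance, second impedance) in ohms, assumed values
        "fist":   (410.0, 525.0),
        "open":   (455.0, 480.0),
        "pinch":  (430.0, 560.0),
    }

    def classify(z1, z2):
        """Return the gesture whose template is closest to the measured pair."""
        dist2 = lambda t: (z1 - t[0]) ** 2 + (z2 - t[1]) ** 2
        return min(TEMPLATES, key=lambda name: dist2(TEMPLATES[name]))

    print(classify(415.0, 530.0))   # -> "fist"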

Dynamic effects processing and communications for wearable devices

Processing techniques and device configurations for performing and controlling output effects at a plurality of wearable devices are generally described herein. In an example, a processing technique may include receiving, at a computing device, an indication of a triggering gesture that occurs at a first wearable device, determining an output effect corresponding to the indication of the triggering gesture, and in response to determining the output effect, transmitting commands to computing devices that are respectively associated with a plurality of wearable devices, the commands causing the plurality of wearable devices to generate the output effect at the plurality of wearable devices.

Two-step gesture recognition for fine-grain control of wearable applications

The present disclosure provides methods, devices, systems, and computer program products for providing fine-grain gesture-based control of wearable applications. Methods are provided for multi-step gesture-based control of wearable applications, in which an initial, easy-to-recognize gesture places the device in a state where subtle gestures can be identified to control navigation and interactivity on the device, relying on the user being able to view the device.
Sap Se
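
A hedged sketch of the two-step idea: an easy-to-recognize arming gesture switches the recogniser into a fine-grain state in which subtle gestures are accepted. Gesture names and actions are assumptions:

    class TwoStepRecognizer:
        ARMING = "double_twist"                                     # assumed arming gesture
        SUBTLE = {"tilt_left": "previous", "tilt_right": "next", "small_lift": "select"}

        def __init__(self):
            self.armed = False

        def feed(self, gesture):
            if not self.armed:
                if gesture == self.ARMING:
                    self.armed = True
                    return "armed"
                return None                        # subtle motion ignored while idle
            action = self.SUBTLE.get(gesture)
            if action is None:
                self.armed = False                 # unknown input drops back to idle
            return action

    r = TwoStepRecognizer()
    print([r.feed(g) for g in ["tilt_left", "double_twist", "tilt_right"]])
    # -> [None, 'armed', 'next']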

Hand skeleton comparison and selection for hand and gesture recognition with a computing interface

Hand skeletons are compared to a hand image and selected. The hand skeletons are used for hand and gesture recognition with a computing interface.
Intel Corporation

Free-form drawing and health applications

Various systems and methods for implementing free-form drawing for health applications are described herein. A system for implementing a health application includes a user interface module to receive, at a user device, a plurality of parameters including a free-form gesture path, the free-form gesture path representing an air gesture performed by a user of the user device; and a control module to adjust a fitness routine of the user based on the plurality of parameters..

Gesture management system

For storing gesture definitions and evaluating expressions that reference the gesture definitions, an expression evaluation engine evaluates the expressions to determine whether movements of a user satisfy the expressions. The expression evaluation engine receives expressions in user or application requests, or the expression evaluation engine may automatically evaluate the expressions when a gesture recognition system receives updated information about tracked body parts of the user.
Palantir Technologies, Inc.

Method of operating a control system and control system therefor

A method of operating a control system for controlling a device, the control system comprising a motion capture equipment, and a controller for providing control signals for controlling one or more device functions of the device, the method comprising the steps of: capturing, by the motion capture equipment, motion picture images of a space and providing the motion picture images to the controller; analyzing, by the controller, the motion picture images for detecting user input from a user in the space, and detecting by the controller a gesture performed by the user; and providing, by the controller in response to said detecting of the gesture, a control signal to the device for controlling a selected device functions of said one or more device functions; wherein said analyzing is performed by the controller by monitoring one or more gesture zones in said motion picture images, each gesture zone being associated with one respective device function of said plurality of device functions, and wherein for providing said control signal the controller determines the gesture zone wherein the gesture is detected for establishing the selected device function to control.. .
Koninklijke Philips N.v.
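
An illustrative sketch of the gesture-zone lookup: each zone in the camera image is bound to one device function, and a detected gesture is routed to the function of the zone it occurred in. Zone geometry and function names are assumptions:

    GESTURE_ZONES = {
        "lamp_toggle":   (0, 0, 320, 240),     # (x, y, w, h) in image coordinates, assumed
        "blind_up":      (320, 0, 320, 240),
        "volume_mute":   (0, 240, 640, 240),
    }

    def zone_for_point(x, y):
        for function, (zx, zy, zw, zh) in GESTURE_ZONES.items():
            if zx <= x < zx + zw and zy <= y < zy + zh:
                return function
        return None

    def control_signal(gesture, x, y):
        function = zone_for_point(x, y)
        return None if function is None else (function, gesture)

    print(control_signal("wave", 400, 100))    # -> ('blind_up', 'wave')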

Systems and methods for controlling an unmanned aerial vehicle

Systems and methods for controlling an unmanned aerial vehicle recognize and interpret gestures by a user. The gestures are interpreted to adjust the operation of the unmanned aerial vehicle, a sensor carried by the unmanned aerial vehicle, or both.
Gopro, Inc.

Drone delivery of coffee based on a cognitive state of an individual

Coffee or other drink, for example a caffeine containing drink, is delivered to individuals that would like the drink, or who have a predetermined cognitive state, using an unmanned aerial vehicle (uav)/drone. The drink is connected to the uav, and the uav flies to an area including people, and uses sensors to scan the people for an individual who has gestured that they would like the drink, or for whom an electronic analysis of sensor data indicates to be in a predetermined cognitive state.
International Business Machines Corporation

Systems and methods of an adaptive interface to improve user experience within a vehicle

The present disclosure relates to a computer-readable device that causes a processor to perform operations for interpreting a user request, such as tactile input, gestures, and speech, to a vehicle system. The operations include receiving an input data package comprising user communications.
Gm Global Technology Operations Llc

Controlled lamp device

A controlled lamp device (1), comprising: a lamp housing (2) with a light exit opening (3), a sensor unit (4) for detecting a contactless manual intervention (15, 16-1, 16-2, 17) of an operator of the lamp device in an intervention region (5, 5-1, 5-2), and an evaluation and control device (6) for evaluating the intervention of the operator detected by the sensor unit and for influencing a control parameter for the operation of the lamp device depending on a result of the evaluation, wherein the sensor unit (4) is provided in and/or on the light housing (2) laterally adjacent to the light exit opening (3), and the evaluation and control device (6) is designed such that it only influences the control parameter if the evaluation carried out by the evaluation and control device (6) shows that a predefined path in an intervention region has been covered in a gesture-like manner during the intervention of the operator.. .
Steinel Gmbh

Motion-based authentication for a gesture-based device

A motion-based authentication method is operative in a mobile computing device having a display interface and that includes an accelerometer. Normally, the device software includes a locking mechanism that automatically locks the display interface after a configurable timeout.
Logmein, Inc.

Data transmission controlling device and data transmission controlling method for a mobile device

An embodiment of the present disclosure relates to the technical field of data transfer between a mobile device and a television, and discloses a data transmission controlling device for a mobile device and a data transmission controlling method for a mobile device. The data transmission controlling device for a mobile device includes: a detector, configured to detect at least one of a moving track of the mobile device, a sliding gesture on a screen of the mobile device, and a key instruction of the mobile device; a first transceiver; and a processor, configured to control the first transceiver to send a uniform resource locator of a video currently being played on the mobile device and a timestamp of the current playback, when at least one of a particular moving track, a particular sliding gesture, and a particular key instruction is detected.
Le Shi Internet Information & Technology Corp., Beijing

Method and electronic apparatus for adjusting viewing angle of a smart television playing panorama videos

Disclosed are a method and an electronic apparatus for adjusting a viewing angle of a smart television playing panorama videos, wherein the method is applied to a terminal apparatus and includes: displaying a touch control region for adjusting the viewing angle when opening of an application program for controlling the smart television is detected; detecting a hand gesture input onto the touch control region to determine a viewing angle adjustment parameter corresponding to the detected hand gesture input; and sending the viewing angle adjustment parameter to the smart television by communication with the smart television, so as to adjust the viewing angle. The disclosure uses a smart television with bluetooth and wifi communication functions to connect to the internet, and adjusts a panoramic play parameter of the smart television via a terminal apparatus communicating with the smart television.
Le Shi Internet Information Technology Corp. Beijing

Method, system and device for navigating in a virtual reality environment

A method, a system, and a device for navigating in a virtual reality scene, using body parts gesturing and posturing are provided herein. The method may include: projecting a synthetic 3d scene, into both eyes of a user, via a near eye display, so as to provide a virtual reality view to the user; identifying at least one gesture or posture carried out by at least one body part of said user; measuring at least one metric of a vector associated with the detected gesture or posture; applying a movement or action of said user in virtual reality environment, based on the measured metrics; and modifying the virtual reality view so as to reflect the movement or action of said user in the virtual reality environment..
Facebook, Inc.

Systems and methods for implementing retail processes based on machine-readable images and user gestures

Systems and methods for implementing retail processes based on machine-readable images and user gestures are disclosed. According to an aspect, a method includes capturing one or more images including a machine-readable image and a user gesture.
Toshiba Global Commerce Solutions Holdings Corporation

Information processing apparatus, control method, and program

There is provided an information processing apparatus including circuitry configured to initiate display of a virtual object, based on a gesture operation, starting from a point of origin and moving towards a target point; and continue to display the virtual object in display motion after the gesture operation, wherein a path of travel of the virtual object or a display characteristic of the virtual object is determined based on a positional relationship between the virtual object and another object that is a real object located in proximity to the path of travel of the virtual object.. .
Sony Corporation

Use of accelerometer input to change operating state of convertible computing device

A convertible computing device has an accelerometer to detect tapping gestures on the device, and a mode sensor to determine whether the device is in a laptop mode or a tablet mode. The device includes a first physical human input sensor to change an operating state of the device between an off state and a non-off state and a second physical human input sensor to change an operating state of the computing device.
Google Inc.

Systems and methods for position-based haptic effects

One illustrative system disclosed herein includes a sensor configured to detect a gesture and transmit an associated sensor signal. The gesture includes a first position at a distance from a surface and a second position contacting the surface.
Immersion Corporation

Methods and apparatus using gestures to share private windows in shared virtual environments

Methods and apparatus using gestures to share private windows in shared virtual environments are disclosed herein. An example method includes detecting a gesture of a user in a virtual environment associated with a private window in the virtual environment, the private window associated with the user, determining whether the gesture represents a signal to share the private window with another, and, when the gesture represents a signal to share the private window, changing the status of the private window to a shared window..
Google Inc.

Smart watch and gesture input method for the smart watch

The present disclosure provides a smart watch and a gesture input method for the smart watch. The method begins acquiring gesture data upon receiving the user's gestures, collects the gesture data over a continuous time section, and finds, from the prestored correspondence between gesture data and text, the proximate text for the collected data; this proximate text is the final output text.
Huizhou Tcl Mobile Communication Co., Ltd.

Click response processing method, electronic device and system for motion sensing control

Embodiments of the present invention disclose a click response processing method for motion sensing control, including: s101: when a push gesture instruction for target content is received, acquiring a transfer point corresponding to the push gesture instruction; s102: triggering a down event corresponding to the push gesture instruction according to the transfer point, and determining and saving information of the transfer point; s103: when a pull gesture instruction is received, directly invoking the information of the transfer point, and triggering an up event corresponding to the pull gesture instruction based on the information of the transfer point; and s104: completing a click event for the target content and outputting a result. Embodiments of the present invention further disclose a electronic device and a motion sensing control system.
Le Shi Zhi Xin Electronic Technology (tianjin) Limited
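
A small sketch of the push/pull click synthesis described above: the push gesture records a transfer point and emits a down event; the later pull gesture reuses the saved point for the up event, completing one click. The dispatch target is an assumption:

    class MotionClickSynthesizer:
        def __init__(self, dispatch):
            self.dispatch = dispatch          # e.g. a UI toolkit's event queue (assumed)
            self.transfer_point = None

        def on_push(self, x, y):
            self.transfer_point = (x, y)
            self.dispatch("down", x, y)

        def on_pull(self):
            if self.transfer_point is None:
                return                        # pull without a preceding push is ignored
            x, y = self.transfer_point
            self.dispatch("up", x, y)
            self.dispatch("click", x, y)
            self.transfer_point = None

    events = []
    synth = MotionClickSynthesizer(lambda kind, x, y: events.append((kind, x, y)))
    synth.on_push(120, 340)
    synth.on_pull()
    print(events)   # [('down', 120, 340), ('up', 120, 340), ('click', 120, 340)]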

Method and device for controlling operation components based on somatosensory

Disclosed are a method and an electronic device for controlling an operation component based on somatosensory input. The method comprises: detecting gesture control information for the operation component; analyzing an operation event triggered by the gesture control information, wherein the operation event includes a down event, a move event and an up event, and setting the move event as an invalid event when the move event is generated between the down event and the up event; and determining that the down event and the up event form a click event so as to complete control of the operation component. The present disclosure avoids responding to other events formed by the move event, accurately completes somatosensory control of the operation component, and improves the success rate of triggering the corresponding operations from the gesture control information.
Le Shi Zhi Xin Electronic Technology (tianjin) Limited

Intelligent gesture based word sentence augmentation and systems for the implementation thereof

Disclosed herein is a system comprising a user interface comprising an edit region in operative communication with a processor; where the processor is in operative communication with one or more modules; where the processor is operative to receive from the user interface a selection of words in the form of a sentence; use a grammar test to determine if the sentence is grammatically accurate; to parse the sentence and offer a user a choice of words to improve an accuracy of the sentence; where the choice of words is based upon a weighted probability of several possible words that can improve the accuracy of the sentence; and permitting the user to install his/her word choice in the sentence by performing an action involving one or more of swiping, tilting, steering or tapping of the user interface.. .
International Business Machines Corporation

Ultrasonic noise based sonar

The invention relates to a device with a microphone and a speaker or transducer and processing means to process audio signals from the microphone and for the transducer. Electronic devices, and especially mobile devices, offer several user interfaces, of which the touch screen has revolutionized the market in the past few years.
Sound Solutions International Co., Ltd.



Gesture topics:
  • Virtual Keyboard
  • Touchscreen
  • Electronic Device
  • User Interface
  • Characters
  • Display Panel
  • Touch Screen
  • Output Device
  • Input Device
  • Computing Device
  • Device Control
  • Computer Vision
  • Mobile Terminal
  • Ball Mouse
  • Navigation



    This listing is a sample of patent applications related to Gesture and is only meant as a recent sample of applications filed, not a comprehensive history. There may be associated servicemarks and trademarks related to these patents. Please check with a patent attorney if you need further assistance or plan to use these for business purposes. This patent data is also published to the public by the USPTO and available for free on their website. Note that there may be alternative spellings for Gesture with additional patents listed. Browse our RSS directory or Search for other possible listings.

