
Gesture patents


This page is updated frequently with new Gesture-related patent applications.

Recent Gesture-related patent applications, listed by date and application number:
Luminaire system having touch input for control of light output angle
The invention relates to a luminaire system comprising at least one illumination unit (1) comprising at least one light emitting element arranged to emit light (5) out of at least one light emission surface (3); at least one input unit (4) arranged for detecting a finger gesture of a user, and at least one driver unit arranged for driving the at least one illumination unit (1) in accordance with the finger gesture detected by the at least one input unit (4). The luminaire system is characterized in that the at least one input unit (4) is further arranged for detecting an input angle (alpha) between the finger and the at least one input unit (4) and in that the at least one driver unit is further arranged for driving the at least one illumination unit (1) for emitting light out of the at least one light emission surface (3) under an output angle (beta) which corresponds with the detected input angle (alpha).
Metatronics B.V.
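The alpha-to-beta mapping this claim describes can be sketched in a few lines. The identity mapping and the 0-60 degree clamp below are invented for illustration; the claim only requires that the output angle "correspond with" the detected input angle.

```python
# Toy sketch of the claimed mapping: the finger's input angle (alpha)
# against the input unit drives the light output angle (beta).
# The identity map and the clamp limits are assumptions, not the claim.

def output_angle(alpha_deg, min_beta=0.0, max_beta=60.0):
    """Map a detected finger input angle (alpha) to a light output
    angle (beta), clamped to the luminaire's physical range."""
    beta = float(alpha_deg)
    return max(min_beta, min(max_beta, beta))
```

Angles outside the luminaire's physical range are simply clamped rather than rejected, so the driver always receives a valid output angle.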

Method and system for speed and directional control of responsive frame or asset
Disclosed are method and apparatus either of which enables a user to capture a burst of photographs or other asset, such as for example collections of audio recordings, and subsequently display the captured burst of photographs and/or other asset such as sound (which together may form content) on a mobile device. The user interacts with the content by way of gesture or action, controlling the speed and directional movement of the collection of frames and/or other assets.
Dialapp, Inc.

Method and system for protecting against mobile distributed denial of service attacks
A DDoS attack mitigation system is implemented by a DDoS attack mitigation central processing server configured to execute server-side machine instructions and a mobile communication device configured to execute device-side machine instructions. The server-side machine instructions include a reverse proxy traffic handler and a user-interactive DDoS attack mitigation scheme handler for issuing DDoS attack mitigation challenges and authenticating the users' authenticating actions.
Nxlabs Limited

Gesture-based signature authentication
Embodiments of the invention are generally directed to systems, methods, devices, and machine-readable mediums for implementing gesture-based signature authentication. In one embodiment, a method may involve recording a first gesture-based signature and storing the recorded first gesture-based signature.
Intel Corporation

Method and processing value documents
A method for processing value documents comprises the following steps: by means of a camera device an action of an operator of a value-document processing apparatus is captured. The captured image data are processed by means of an image processing device and at least one predetermined gesture is extracted from the processed image data.
Giesecke & Devrient GmbH

Gesture tracking and classification
A method of tracking the position of a body part, such as a hand, in captured images, the method comprising capturing (10) colour images of a region to form a set of captured images; identifying contiguous skin-colour regions (12) within an initial image of the set of captured images; defining regions of interest (16) containing the skin-coloured regions; extracting (18) image features in the regions of interest, each image feature relating to a point in a region of interest; and then, for successive pairs of images comprising a first image and a second image, the first pair of images having as the first image the initial image and a later image, following pairs of images each including as the first image the second image from the preceding pair and a later image as the second image: extracting (22) image features, each image feature relating to a point in the second image; determining matches (24) between image features relating to the second image and image features in each region of interest in the first image; determining the displacement within the image of the matched image features between the first and second images; disregarding (28) matched features whose displacement is not within a range of displacements; determining regions of interest (30) in the second image containing the matched features which have not been disregarded; and determining the direction of movement (34) of the regions of interest between the first image and the second image.
The University Of Warwick
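The frame-to-frame step in this claim (match features, discard matches whose displacement is out of range, rebuild the region of interest from the survivors) can be sketched without any computer-vision library. The point pairs and the displacement limits below are invented for illustration; a real system would use feature descriptors (e.g., via OpenCV) rather than bare coordinates.

```python
# Sketch of the displacement-filtered tracking step described above.
# Inputs are matched feature points from two consecutive frames; the
# min/max displacement limits are assumed values, not from the patent.

def track_region(prev_points, curr_points, min_disp=1.0, max_disp=40.0):
    """Keep matches whose displacement lies in [min_disp, max_disp] and
    return the bounding box of the surviving points as the new region
    of interest, or None if tracking is lost."""
    kept = []
    for (x0, y0), (x1, y1) in zip(prev_points, curr_points):
        disp = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if min_disp <= disp <= max_disp:
            kept.append((x1, y1))
    if not kept:
        return None  # no plausible motion: region lost this frame
    xs = [p[0] for p in kept]
    ys = [p[1] for p in kept]
    return (min(xs), min(ys), max(xs), max(ys))
```

Discarding sub-threshold displacements suppresses sensor jitter, while the upper bound rejects spurious matches that jump across the image.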

Skin-based approach to virtual modeling
A design engine for designing an article to be worn on a human body part (input canvas) in a virtual environment. A virtual model engine of the design engine is used to generate and modify a virtual model of the input canvas and a virtual model of the article based on skin-based gesture inputs detected by an input processing engine.
Autodesk, Inc.

Method and processing voice input
Disclosed herein are a method and electronic device. The electronic device includes a first sensor configured for detecting a gesture and a second sensor for detecting a sound, and at least one processor.
Samsung Electronics Co., Ltd.

Information processing terminal and method, program, and recording medium
A controller detects whether a gesture, input on said input unit, is in a gesture input region where gesture input is accepted on one of the display screens of the display unit, and displays gesture input regions for the display screen when the gesture is not detected in the gesture input region.


Method and system for viewing stacked screen displays using gestures

An intuitive technique for inputting user gestures into a handheld computing device is disclosed, allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon, wherein a window stack for application windows and/or desktops can be navigated and sequentially displayed according to the window stack ordering without disturbing or changing this ordering.


Application association processing method and apparatus

Embodiments of the present invention provide an application association processing method and apparatus. The method includes: detecting a first operation instruction; and when it is determined that the first operation instruction is to perform a first preset operation on first content displayed on a display interface, displaying prompt information of second content associated with the first content, where the first preset operation is an operation gesture preset by the user, and the prompt information is used to indicate that an association relationship exists between the first content and the second content. When the first content is an application icon, the second content is an icon of at least one control included in an application corresponding to the application icon; or, when the first content is an icon of a control, the second content is an icon of an application to which the control belongs.
Huawei Device Co., Ltd.


Enable dependency on picker wheels for touch-enabled devices by interpreting a second finger touch gesture

Methods and apparatus, including computer program products, are provided for finger gestures. In one aspect there is provided a method, which may include detecting a first finger gesture proximate to or making contact with a graphical user interface element representative of a first picker wheel presented on a user interface; detecting a second finger gesture proximate to or making contact with the user interface, the second finger gesture detected during a time period comprising a time when the first finger gesture is proximate to or making contact with the first picker wheel; changing, when the second finger gesture is detected, a second picker wheel from an independent mode to a dependent mode; and updating, when in the dependent mode, the second picker wheel based on a selection value made via the first finger gesture at the first picker wheel.
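The mode switch this abstract describes (a second touch, made while the first finger holds a picker wheel, flips the other wheel into a dependent mode that tracks the first wheel's selection) can be sketched as a small state machine. The class names, the months/days example, and the `derive` mapping are all invented for illustration.

```python
# Hypothetical sketch of the two-finger picker-wheel dependency above:
# a second touch during the first finger's contact switches the second
# wheel from independent to dependent mode and keeps it synchronized.

class PickerWheel:
    def __init__(self, values):
        self.values = values
        self.index = 0
        self.dependent = False

    def select(self, index):
        self.index = index

class PickerPair:
    def __init__(self, first, second, derive):
        self.first = first
        self.second = second
        self.derive = derive  # first wheel's value -> second wheel's index

    def on_second_touch_during_first(self):
        """Second finger touches while the first holds the first wheel."""
        self.second.dependent = True
        self._sync()

    def on_first_wheel_select(self, index):
        self.first.select(index)
        if self.second.dependent:
            self._sync()

    def _sync(self):
        self.second.select(self.derive(self.first.values[self.first.index]))
```

A usage example: a month wheel and a day-count wheel, where dependent mode keeps the day count consistent with the selected month.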


Apparatus, communication of touch sensor information

Techniques and mechanisms to communicate touch sensor information via physical layer (PHY) circuitry that provides a high-speed, low-voltage interface. In an embodiment, a source device and a sink device each include a respective differential PHY (D-PHY) and respective protocol logic to support a touch serial interface protocol.


Distinguishing between touch gestures and handwriting

An approach is provided for receiving user inputs at a touch-screen of a device, with each user input including a set of input properties. Based on the input properties, an intended input type is identified from a number of input types, including a handwriting type and a gesture type.
Lenovo (Singapore) Pte. Ltd.


Gesture inferred vocabulary bindings

The subject disclosure relates to annotating data based on gestures. Gestures include user interaction with a client device or client software.
Microsoft Technology Licensing, LLC


Gesture recognition apparatus, vehicle having the same, and controlling the vehicle

A gesture recognition apparatus, a vehicle having the same, and a method for controlling the vehicle are disclosed. A gesture recognition apparatus includes a collection unit having a single sense region, configured to collect information regarding a user gesture conducted in the sense region.
Hyundai Motor Company


Method and device for remote control of a function of a vehicle

A method and a device for the remote control of a function of a vehicle is disclosed. A wireless communication connection is established between a portable operating device and the vehicle.
Daimler AG


Gesture recognition method in vehicle using wearable device and vehicle for carrying out the same

A method and system for enabling a vehicle to recognize a gesture of a driver using a wearable device and executing a function that corresponds to the recognized gesture is provided. The gesture recognition method in a vehicle includes performing a wireless connection with a wearable device, and receiving gesture information sensed by the wearable device from the connected wearable device.
Hyundai Motor Company


Technologies for robust two-dimensional gesture recognition

Technologies for performing two-dimensional gesture recognition are described. In some embodiments the technologies include systems, methods, and computer readable media for performing two-dimensional gesture recognition on one or more input images.


Gesture assistive zoomable selector for screen

A method of pointing at a first object on a screen of an infotainment system is provided. The first object is displayed on the screen, and a location of a second object related to a user, together with a first predetermined gesture executed by the second object, can be detected by one or more sensors.
Alpine Electronics, Inc.


Gesture based power management system and method

A gesture based power management method for a wearable device is provided. The method includes establishing a mapping relationship between a set of pre-defined gestures including at least a lookup gesture and a set of power management functions including at least a function for turning on a display screen of the wearable device.
TCL Research America Inc.


Autonomous vehicle interaction with external environment

Arrangements relate to the interaction between an autonomous vehicle and an external environment of the autonomous vehicle. Such interaction can occur in various ways.
Toyota Motor Engineering & Manufacturing North America, Inc.


Entry assist system for a motor vehicle

An entry assist system for a motor vehicle includes a door, a door operation module to open and close the door and a starter actuator for starting an engine of the motor vehicle. The entry assist system further includes a control subsystem.
Ford Global Technologies, Llc


NUI video conference controls

A system and method providing gesture controlled video conferencing includes a local capture device detecting movements of a user in a local environment and an audio/visual display. A processor is coupled to the capture device and a remote capture device and a remote processor at a remote environment via a network.
Microsoft Technology Licensing, LLC


Augmented reality remote control

An augmented reality (AR) device places manual controls and virtual displays onto a surface of a controllable electronic device (CED), or next to the CED as viewed through the AR device, allowing the user to manipulate the controls and view feedback via the virtual displays associated with the controllable device. The AR device overlays an image on a surface of the CED with virtual control objects, and virtual feedback image(s) are displayed on a surface of the CED or adjacent to it.
Vizio Inc.


Position capture input apparatus, system and method therefor

According to various embodiments, a position capture input system uses a camera to capture an image of a displayed graphical user interface that may be partially obstructed by an object, such as a user's hand or other body part. The position capture input system also includes a software component that causes a computing device to compare the captured image with a displayed image to determine which portion, if any, of the graphical user interface is obstructed.


Camera timer

Disclosed are devices, systems, methods, and non-transitory computer-readable storage media for displaying useful countdown timers on a media capture device. A media capture device can increase contrast between the countdown timer and the video of a scene to be captured, and adjust the position and size of a counter displayed on the device based on whether the object is determined to be closer or farther than a predetermined threshold distance.
Apple Inc.


Remote controller utilized with charging dock for controlling mobile device

Provided is a console unit for controlling a mobile device. The console unit includes a remote control unit and a tower section.
Analogix Semiconductor, Inc.


Multi-purpose application launching interface

A computer and a computer-implemented method with a user interface for displaying and queueing notifications in a multi-purpose application environment are provided. The method includes displaying an application launching interface comprising a plurality of applications in response to a user gesture, wherein the application launching interface is hidden from display prior to the user gesture; displaying a notification associated with one of the plurality of applications to the user; and queueing an action when the user provides a queueing gesture for the notification, wherein the queued action is displayed for later performance when selected by the user.
Google Inc.
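The queue-for-later flow in this abstract (a queueing gesture on a notification defers its action, and queued actions are listed for later performance) can be sketched as a tiny dispatcher. The class and gesture names are invented for illustration; the abstract does not specify them.

```python
# Hypothetical sketch of the notification queueing behavior above:
# a "queue" gesture defers the notification's action; any other
# gesture performs it immediately.

class NotificationCenter:
    def __init__(self):
        self.queued = []

    def on_notification(self, app, action, gesture):
        """Return the action to perform now, or None if it was queued."""
        if gesture == "queue":
            self.queued.append((app, action))
            return None
        return action

    def pending(self):
        """Queued actions, displayed for later performance."""
        return list(self.queued)
```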


Method, apparatus and system for gesture based security

A method, system, token and scanning device for gesture-based security are provided. The token includes an information storage module, such as an RFID system or a microchip system of a contactless smart card, and a fiducial marker, such as a unique optically recognizable pattern.
Sierra Wireless, Inc.


Integrated medical record system using hologram technology

In one aspect of the invention, a computer-implemented method may include maintaining, in a computer system, a plurality of relationships among users in an aggregated data network. A user may receive a plurality of data points, where at least one of the data points was requested via a relationship inquiry made using a Health Insurance Portability and Accountability Act (HIPAA) compliant security clearance user interface.


Application launching and switching interface

Techniques for application launching and switching are provided. An example method includes receiving an interactive gesture at a computing device; when the interactive gesture matches a predefined gesture, determining a current context of the computing device based at least on one or more tasks, the tasks including previously performed tasks at the computing device or predicted future tasks to be performed at the computing device; based on the determined context, identifying one or more software applications, the software applications including executing applications, terminated applications or uninstalled applications, to perform the one or more tasks; and displaying one or more user interface elements representing the software applications, where the user interface elements are selectable to instantiate the identified software applications.
Google Inc.


Method and system for mobile device airspace alternate gesture interface and invocation thereof

A computing device, or electronic personal display, includes a housing and a touch screen display. The housing includes a 3-dimensional motion sensor operational within an airspace thereof.
Kobo Incorporated


Electronic device and navigating pages of electronic device

In a method for navigating pages of an operation object of an electronic device, a corresponding relationship between gestures of a user and page functions of the operation object is preset. When the operation object comprises more than one page, the method detects a gesture of the user on a touch screen of the electronic device.
Chiun Mai Communication Systems, Inc.


Pointer projection for natural user input

A method to identify a targeted object based on eye tracking and gesture recognition. The method is enacted in a compute system controlled by a user and operatively coupled to a machine vision system.
Microsoft Technology Licensing, LLC


Method and system for invocation of mobile device acoustic interface

The mobile computing device, or electronic personal display, includes a housing and a touch screen display providing a touch-based gesture interface. The housing includes an acoustic sensor operational to receive acoustic input generated at a tactile interface thereon.
Kobo Incorporated


Method for performing operation on intelligent wearing device by using gesture, and intelligent wearing device

A method for performing an operation on an intelligent wearing device by using a gesture, and an intelligent wearing device is presented. First, an operation gesture is identified according to an operation gesture signal.
Huawei Technologies Co., Ltd.


Apparatus for gesture recognition, vehicle including the same, and gesture recognition

A gesture recognition apparatus may execute a command by recognizing a user's gesture. The gesture recognition apparatus includes a gesture sensor that detects a position and movement of an object in space, and a cover that includes a contact surface which is positioned away from the gesture sensor by a predetermined distance and is brought into contact with the object.
Hyundai Motor Company


Remote controller for controlling mobile device

Provided is a remote control for controlling a mobile device. The remote control includes a communications transceiver configured to communicate with the mobile device and an actuator for receiving a user input while the remote control is communicatively coupled to the mobile device.
Analogix Semiconductor, Inc.


Apparatus for recognizing gesture using infrared ray and method thereof

Disclosed are an apparatus and a method of recognizing a gesture using an infrared ray. The present invention provides an apparatus of recognizing a gesture, including: a sensing unit which detects a gesture using an infrared sensor to obtain a sensing value from the sensing result; a control unit which performs gesture recognition to which an intention of a user is reflected in accordance with a predetermined recognizing mode based on the obtained sensing value; and a storing unit which stores the recognizing mode when the gesture recognition set in advance by the user is performed, in which the recognizing mode includes a first recognizing mode in which the gesture is directly recognized and a second recognizing mode in which the gesture is recognized after recognizing a hold motion for determining start of the gesture recognition.
Hyundai Mobis Co., Ltd.
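The two recognizing modes in this abstract map naturally onto a small state machine: the first mode acts on a gesture immediately, while the second requires a hold motion before each gesture to signal the start of recognition. The class, event labels, and hold-rearming behavior below are illustrative assumptions.

```python
# Minimal sketch of the two recognizing modes described above. Events
# are strings: "hold" for the hold motion, anything else a gesture label.

class IRGestureRecognizer:
    DIRECT, HOLD_FIRST = 1, 2

    def __init__(self, mode):
        self.mode = mode
        # Direct mode is always armed; hold-first mode arms on "hold".
        self.armed = (mode == self.DIRECT)
        self.recognized = []

    def feed(self, event):
        if event == "hold":
            if self.mode == self.HOLD_FIRST:
                self.armed = True
            return
        if self.armed:
            self.recognized.append(event)
            if self.mode == self.HOLD_FIRST:
                self.armed = False  # assumption: each gesture needs a new hold
```

The hold requirement lets the sensor ignore incidental hand movement; only motion deliberately preceded by a hold is interpreted as a command.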


Interactivity model for shared feedback on mobile devices

A system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound.
Immersion Corporation


Wearable wireless HMI device

A wearable gesture control interface apparatus is used to control a controllable device based on gestures provided by a user. The wearable gesture control interface apparatus includes (i) sensors configured to detect user orientation and movement and generate corresponding sensor data and (ii) a microcontroller configured to: sample the sensor data from the sensors; determine whether the sensor data from one of the sensors meets transmission criteria; and, if the sensor data meets the transmission criteria, transmit control data corresponding to all of the sensors to the controllable device.


Gesture recognition user interface for an aerosol delivery device

An aerosol delivery device is provided that includes a housing, motion sensor and microprocessor. The motion sensor is within the housing and configured to detect a defined motion of the aerosol delivery device caused by user interaction with the housing to perform a gesture.
R. J. Reynolds Tobacco Company


Multiple soil-topography zone field irrigation user interface system and method

A field irrigation interface display method constituted of: receiving an indication of an irrigation status of a respective one of a plurality of soil-topography zones of a field; controlling a display of a user device to display a graphical illustration of the field split into the plurality of soil-topography zones; controlling the display of the user device to display thereover an informational graphical illustration associated with the received indication of the respective soil-topography zone; controlling the display of the user device to display a first actionable graphical illustration of a first irrigation attribute of the respective soil-topography zone; and responsive to a user gesture at any one of the displayed first actionable graphical illustrations, outputting a first irrigation adjustment signal arranged to adjust the amount of irrigation provided by a particular one of a plurality of irrigation device sets to the respective soil-topography zone.
Cropx Technologies Ltd.


Controlling a camera with face detection

An example of an electronic device with a first camera and a second camera is described. The first camera receives a first image stream, and the second camera receives a second image stream that includes a face of a user of the electronic device.
Intel Corporation


User-friendly transaction interface

Methods and systems for facilitating electronic transactions on a user device are described. User-friendly graphical user interfaces (GUIs) are provided with minimal text and more pictures and images.
eBay Inc.


Robot cleaner and controlling a robot cleaner

A robot cleaner and a method for controlling a robot cleaner are provided. The robot cleaner may include a casing, a drive disposed in the casing, a camera disposed in the casing to acquire an image of a user's gesture, and a controller that extracts an image including the user's arm from the image acquired by the camera, determines an angle and a direction of the arm expressed by the user's gesture from the arm image, determines the intention expressed by the determined arm angle, and controls the drive based on the determined intention.
LG Electronics Inc.


Mobile terminal and controlling method thereof

A watch type mobile terminal and a controlling method thereof are disclosed, by which the watch type mobile terminal can be controlled through voice. The present disclosure includes a touch input unit configured to receive a touch input, a wireless communication unit configured to perform wireless communication, a sensing unit configured to sense a movement of the mobile terminal, a microphone configured to receive a sound, and a controller configured to activate the microphone when a preset first gesture input is detected; if a user voice is received via the microphone while the touch input unit is touched, data is controlled to be transmitted to a target indicated by the user voice, and if the user voice is received via the microphone while the touch input unit is not touched, a function indicated by the user voice is controlled to be executed on the mobile terminal.
LG Electronics Inc.


Gesture-based visualization of data grid on mobile device

A mobile device user may quickly and naturally explore data received in a data grid from a remote database. A mobile device engine receives the database data including dimensions and measures, in a grid format comprising rows and columns of numerals.


Systems and methods for controlling viewport movement in view of user context

As part of a technique for positioning viewports over interactive digital maps, a digital map of a geographic area is provided via a user interface of a computing device. The currently visible portion of the digital map is displayed in a viewport.
Google Inc.


Partial detect mode

The disclosure relates to a touch sensitive system comprising a touch sensitive panel defining a touch surface, a plurality of emitters configured to emit light into the panel for propagation in the panel, a plurality of detectors configured to detect the light propagating in the panel, a plurality of distributed control devices each configured to control operation of a segment of emitters and detectors, a main control unit configured to control the distributed control devices. The touch sensitive system is configured to be set in a partial detect mode in which mode a first of the distributed control devices is configured to be active and to control a first emitter to emit light in a partial region of the panel coincident with a partial area of the touch surface.
FlatFrog Laboratories AB


Motion component dominance factors for motion locking of touch sensor data

An image jaggedness filter is disclosed that can be used to detect the presence of ungrounded objects such as water droplets or coins, and delay periodic baseline adjustments until these objects are no longer present. To do otherwise could produce inaccurate normalized baseline sensor output values.
Apple Inc.


Mobile device and displaying information

A mobile device control method for displaying information on a touch screen of a mobile device is provided. The method includes determining a type of a cover for the mobile device, the cover having a screen projection portion, detecting a gesture or a trigger, and displaying a screen corresponding to a current state of the mobile device on the touch screen depending on the type of the cover, in response to the detection of the gesture or the trigger.
Samsung Electronics Co., Ltd.


Touch sensitive edge input device for computing devices

A narrow strip for use in conjunction with a computing device which enables touch sensitive edge functionality in response to fingers in contact with the strip. The edge strip allows a user to control aspects of the computing device by using various touches and gestures without occluding the face of the computing device.


Content selection in a pen-based computing system

A method of selecting content using a pen-based computing system. Gestures generated by a user with a smart pen on a writing surface are captured and used to select content.
Livescribe, Inc.


Gesture based control application for data sharing

Receiving user gesture input commands and interpreting the commands to conduct presentation level control system processing and related presentation communications includes, in one example, detecting an input gesture command via a controller and processing the input gesture command via a processor. The example may also include retrieving at least one data file object responsive to the processed input gesture command, and transmitting the at least one media object to a remote device.
AMX LLC


Mid-air gesture input method and apparatus

The present invention discloses a mid-air gesture input method and apparatus. In embodiments of the present invention, when a writing start command is detected, gesture images are collected, and a position of a fingertip in each frame of the gesture images is acquired; a writing trajectory is generated according to the acquired positions of the fingertip in the gesture images; and when a writing end command is detected, text recognition is performed on the generated writing trajectory, to obtain text corresponding to the writing trajectory.
Shenzhen TCL New Technology Co., Ltd.
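The trajectory-building step this abstract describes (between a writing-start and a writing-end command, the fingertip position from each frame is appended to a trajectory, which is then handed to a text recognizer) can be sketched as a small collector. The class name and the stubbed `recognize` callback are invented for illustration; real text recognition is out of scope here.

```python
# Sketch of the mid-air writing loop above: collect fingertip positions
# between start and end commands, then pass the trajectory to a
# recognizer. The recognizer is a caller-supplied stub (assumption).

class MidAirWriter:
    def __init__(self, recognize):
        self.recognize = recognize   # trajectory -> text
        self.trajectory = None       # None means "not writing"

    def start(self):
        """Writing start command detected: begin a new trajectory."""
        self.trajectory = []

    def on_frame(self, fingertip):
        """Per-frame fingertip position (or None if not detected)."""
        if self.trajectory is not None and fingertip is not None:
            self.trajectory.append(fingertip)

    def end(self):
        """Writing end command detected: recognize and reset."""
        points, self.trajectory = self.trajectory, None
        return self.recognize(points)
```

Frames where the fingertip is not detected are simply skipped, so brief detection dropouts do not corrupt the trajectory.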


Switch operating device, mobile device and operating a switch by a non-tactile push-gesture

Switch operating device (100) with a gesture sensor for operating a switch (103) with a non-tactile push-gesture performed with a heat-emitting part. The gesture has an approach phase (111) during which the part approaches the sensor, a waiting phase (113) during which the part remains proximate to the sensor, and a withdrawal phase (112) during which the part is moved away from the sensor.
Pyreos Ltd.


Personal display systems

A personal display system may include an LED panel having a controller, with a plurality of displayable patterns stored in a memory of the controller. The controller may be in communication with an information device, such as a smart phone, such that a communication from the information device causes the controller to display a selected one of the displayable patterns on the LED panel.
Fos Labs, Inc.


Gesture-based editing of 3d models for hair transplantation applications

Methods and systems are provided for gesture-based editing of three-dimensional (3D) models of real targets, for example, for use in planning hair transplantation procedures. According to some embodiments of the methodology disclosed, 3D control points on an initial default model are matched automatically with a drawing of an outline of a target feature that a user wishes to define, and deformed appropriately to quickly and accurately modify and update the initial default model into a resulting fitting model of the real target.
Restoration Robotics, Inc.


User interface for comparing items using gestures

In an example embodiment, a method of presenting marketplace listings is provided. Search parameters are received from a user interface on an electronic device.
Ebay Inc.


Object search method and apparatus

An object search method and apparatus, where the method includes receiving voice input and gesture input from a user; determining, according to the voice input, a name of a target object for which the user expects to search and a characteristic category of the target object; extracting characteristic information of the characteristic category from an image area selected by the user by means of the gesture input; and searching for the target object according to the extracted characteristic information and the name of the target object. The solutions provided in the embodiments of the present disclosure can provide a user with a more flexible search manner and reduce restrictions on the application scenario during a search.
Huawei Technologies Co., Ltd.


Supporting different event models using a single input source

In at least some embodiments, input provided by a single source generates events representing multiple source types through a mapping process; e.g., a touch input generates both touch and mouse events.
Microsoft Technology Licensing, Llc


Device, method, and graphical user interface with a dynamic gesture disambiguation threshold

An electronic device with a display, a touch-sensitive surface, one or more processors, and memory detects a first portion of a gesture, and determines that the first portion has a first gesture characteristic. The device selects a dynamic disambiguation threshold in accordance with the first gesture characteristic.
Apple Inc.


Enlarging or reducing an image on a display screen

A method, and an associated apparatus, system, and program product, for enlarging or reducing an image. The image is displayed on a display screen.
International Business Machines Corporation


User interface for mobile device to navigate between components

A method, system, apparatus, and computer program product provide the ability to navigate between components in a computer-aided design (cad) mobile drawing application. A drawing is opened in the cad mobile drawing application on a mobile device.
Autodesk, Inc.


Virtual measurement tool for a wearable visualization device

Disclosed is a technique for generating and displaying a virtual measurement tool in a wearable visualization device, such as a headset, glasses, or goggles equipped to provide an augmented reality and/or virtual reality experience for the user. In certain embodiments, the device generates the tool by determining multiple points, each at a different location in the three-dimensional space occupied by the user, based on input from the user, for example by use of gesture recognition, gaze tracking and/or speech recognition.


Method for providing graphical user interface and electronic device for supporting the same

An electronic device, according to certain embodiments of the present disclosure, includes: a display module that displays a plurality of image items; and a processor that, when a swipe gesture input with respect to a specific image item among the plurality of image items is detected, controls the display module to display high-level items or low-level items of the specific image item. Other embodiments are provided.
Samsung Electronics Co., Ltd.


Electronic apparatus and method of displaying a screen of the electronic apparatus

An electronic device and a method of displaying a screen of the electronic device are provided. The method includes detecting a user gesture for unlocking a sleep state, determining a screen display direction based on the detected user gesture, and displaying the screen based on the determined screen display direction.
Samsung Electronics Co., Ltd.


Method and apparatus for recognizing a touch gesture

A method and an apparatus for recognizing a touch gesture are disclosed, in which the apparatus may obtain a depth image in which a touch object and a background area are captured, detect a touch input applied by the touch object to the background area in a touch detection area, and recognize a touch gesture associated with the touch input by tracking a change in the touch input.
Samsung Electronics Co., Ltd.
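The depth-based touch test described in the abstract can be sketched as follows. The representation (depth maps as pixel-to-millimetre dicts, the tolerance value, the function name) is our own illustration, not the patent's method:

```python
# Minimal sketch of depth-image touch detection: a touch is registered where
# the touch object's depth comes within a small tolerance of the captured
# background depth inside the touch detection area.

def detect_touch(depth, background, region, tol=8):
    """Return pixels in `region` where the object lies on the background.

    depth, background: dicts mapping (x, y) -> depth in millimetres
    region: iterable of (x, y) pixels forming the touch detection area
    tol: maximum depth difference (mm) still counted as contact
    """
    touches = []
    for p in region:
        if p in depth and abs(depth[p] - background[p]) <= tol:
            touches.append(p)
    return touches

# A fingertip 5 mm above the background counts as touching; 100 mm does not.
background = {(0, 0): 1000, (1, 0): 1000}
depth = {(0, 0): 995, (1, 0): 900}
touched = detect_touch(depth, background, region=[(0, 0), (1, 0)])
```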


Gesture multi-function on a physical keyboard

A computer keyboard includes position sensors such as capacitive sensors on each of the keys for monitoring positions of fingertips on the keys. A processor receives an indication of contact of a finger on a surface of a key, including an identity of the key and an indication of a position of the contact on the surface.
At&t Intellectual Property I, L.p.


User interface device, user interface method, program, and computer-readable information storage medium

To allow for easy entry of a plurality of characters by handwriting gestures in the air, a user interface device includes: template data storage means for storing template data indicating changes in a predetermined writing position when a gesture to write each of a plurality of characters in the air is made; position obtaining means for sequentially obtaining the predetermined writing position when a user makes gestures to sequentially write characters in the air; similarity evaluation information output means for, every time the predetermined writing position is obtained by the position obtaining means, sequentially outputting similarity evaluation information indicating a similarity between the data to be evaluated (a predetermined number of the predetermined writing positions, taken in order from the most recently obtained) and the template data related to each of the plurality of characters; and character string determination means for determining a character string related to the gestures of the user based on the sequentially output similarity evaluation information related to each of the plurality of characters.
Rakuten, Inc.


Mobile terminal

There is disclosed a mobile terminal including a body comprising a predetermined flexible portion; a display provided in the body to output image information and to receive an input touch gesture; a plurality of actuators provided in the body to change a shape of the body; and a controller that, when an event is generated, controls the actuators in accordance with an operation condition set corresponding to the event. The operation condition comprises at least one of: a driving actuator whose shape is changed, a shape-variation level of the driving actuator, a shape-variation speed of the driving actuator, and a shape-variation frequency of the driving actuator. The body of the mobile terminal includes the actuators to change the shape of the mobile terminal.
Lg Electronics Inc.


System and method for remote virtual reality control of movable vehicle partitions

A method for remote virtual reality control of movable vehicle partitions includes displaying a graphic model of at least a portion of a vehicle on an output device. The output device is located remotely from the vehicle and the vehicle has one or more movable vehicle partitions.
Honda Motor Co., Ltd.


Formatting and navigating graphed information

A method in an electronic device may include determining, with a processor, a quantity of group markers in a data set, and determining, with the processor and based on the quantity of group markers, a suggested view from a plurality of available views, each view in the plurality of available views diagrammatically depicting at least a portion of the data set. Also, a processor-implemented method of navigating between portions of a data set may include: displaying, on a touchscreen display, a diagrammatical depiction of a first portion of the data set; detecting a continuous arc gesture at the touchscreen display; determining a direction of the arc gesture; selecting a second portion of the data set based on the determined direction of the arc gesture; and displaying a diagrammatical depiction of the second portion of the data set.
Keithley Instruments, Inc.
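One common way to determine the direction of a continuous arc gesture from touch samples is the sign of the 2-D cross product over three successive points. The sketch below is our own construction, not the patent's algorithm; in math convention (y up), a positive cross product curls counter-clockwise — on a screen with y down, the interpretation flips:

```python
# Determine arc-gesture direction from three successive touch points via the
# sign of the 2-D cross product, then step to the next/previous data portion.

def arc_direction(p0, p1, p2):
    """Return 'ccw', 'cw', or 'straight' for the turn p0 -> p1 -> p2."""
    cross = ((p1[0] - p0[0]) * (p2[1] - p0[1])
             - (p1[1] - p0[1]) * (p2[0] - p0[0]))
    if cross > 0:
        return "ccw"
    if cross < 0:
        return "cw"
    return "straight"

def select_portion(current, direction, total):
    """Advance to the next portion on ccw, previous on cw (wrapping)."""
    step = 1 if direction == "ccw" else -1 if direction == "cw" else 0
    return (current + step) % total
```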


Gesture based input system in a vehicle with haptic feedback

A vehicle haptic feedback system includes a haptic actuator, a detection device, and a controller. The haptic actuator is configured to provide haptic feedback to a vehicle driver.
Immersion Corporation


Channel selection interface for a vehicle

Disclosed herein is, inter alia, a channel selection interface that includes a plurality of sectors, each sector including a number of tunable service identifiers divided among the plurality of sectors, and a plurality of channel markers that separate the plurality of sectors. The channel selection interface provides an even allocation of channels for a gesture recognition interface.
Toyota Motor Engineering & Manufacturing North America, Inc.


Wearable device and communication method using the wearable device

A wearable device and a communication method using the wearable device are provided. The method may include recognizing a gesture of a user by sensing at least one of a motion and a biosignal occurring in or around a portion of the user to which the wearable device is attached. A wireless communication connection is established between the wearable device and at least one of an external device, an internal device, or another wearable device based on the recognized gesture.
Samsung Electronics Co., Ltd.




This listing is a sample of recent patent applications related to Gesture; it is only meant as a recent sample of applications filed, not a comprehensive history. There may be associated servicemarks and trademarks related to these patents. Please check with a patent attorney if you need further assistance or plan to use them for business purposes. This patent data is also published to the public by the USPTO and available for free on their website. Note that there may be alternative spellings for Gesture with additional patents listed. Browse our RSS directory or search for other possible listings.


