This page is updated frequently with new gesture-related patent applications.
Method of processing video data, device, computer program product, and data construct
The invention relates to a method of processing video data, a device (102) and a computer program product for implementing said method, and a data construct including video data processed by said method. The method processes unprocessed video data into processed video data, said unprocessed video data being provided by picking up (112) sequential images of a situation or scene (100), and includes the steps of: applying a motion and gesture recognition technology (114) in real time to said situation or scene; identifying undesirable image contents contained in said unprocessed video data, based on a result of said motion and gesture recognition, said undesirable image contents preferably including inappropriate body expression (128-132) such as obscene gestures or indecent exposures, and providing content information relating to any identified undesirable image contents; and using said content information to produce said processed video data.
Unify GmbH & Co. KG
Systems and methods for determining meaning of cultural gestures based on voice detection
In some embodiments, control circuitry may detect a voice communication from a user, using voice detection circuitry, during playback of a media asset being consumed by the user. Control circuitry may then identify an accent characteristic of the voice communication.
Rovi Guides, Inc.
Hierarchical networked command recognition
Systems, methods, and computer-readable storage media including computer-readable instructions and/or circuitry for hierarchical networked command recognition may implement operations including, but not limited to: receiving one or more signals indicative of at least one of speech or one or more gestures from a network-connected device; attempting to identify one or more spoken words or gestures based on the one or more signals; selecting, from a plurality of processing entities, at least one processing entity for interpreting the one or more signals based on a result of the attempted identification; interpreting, by the selected at least one processing entity, one or more user commands based on the one or more spoken words or gestures; and generating one or more device control instructions based on the one or more user commands.
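The selection step in the abstract above can be sketched as a simple confidence-based router: try local identification first, then pick the processing entity accordingly. This is a minimal illustration under assumptions — the confidence threshold and the entity interfaces are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: route a speech/gesture signal to a local or remote
# processing entity based on the result of the attempted identification.
LOCAL_CONFIDENCE_THRESHOLD = 0.8  # assumed routing threshold

def select_entity(recognition_result, local, remote):
    """Pick a processing entity based on identification confidence."""
    if recognition_result["confidence"] >= LOCAL_CONFIDENCE_THRESHOLD:
        return local
    return remote

def recognize(signal, attempt_identify, local, remote):
    """Return a device control instruction for the incoming signal."""
    result = attempt_identify(signal)              # attempted identification
    entity = select_entity(result, local, remote)  # hierarchical selection
    command = entity(result["words"])              # interpretation step
    return {"instruction": command}
```

A confident local result stays on-device; a low-confidence one is escalated, which matches the hierarchical flavor of the claim.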
System and method for gesture-based control of a vehicle door
A control system for a vehicle door includes a power assist device coupled between a door and a door opening of the vehicle and a sensor. The system further includes a controller that receives a signal from the sensor, identifies a recognized user from image data within the signal, interprets a control gesture by the recognized user from video data within the signal, and causes the power assist device to move the door in response to the control gesture.
Ford Global Technologies, LLC
Real-time virtual reflection
Techniques are provided for a real-time virtual reflection and for video interaction with virtual objects in a system. According to one embodiment, a user's image is captured by a video camera, and is outputted to a visual display in reverse, providing a mirror-image feedback to the user's movements.
Facecake Marketing Technologies, Inc.
Systems and methods for social networking with shared reward feature
A method connects a first individual to a second individual. The method includes associating a first account on a first social network with the first individual and allowing the first individual to send a gesture on the first social network directed towards a second account.
Wireless communication beacon and gesture detection system
A secure payment system and method for enabling transactions with a merchant without communicating account or payment information to the merchant, by using a broadcast beacon and a payment system that may create a user experience similar to NFC tap payment systems without using NFC communications.
Method and user interface (UI) for customized user access to application functionalities
A method and a device for direct launching of an interface to selected functionalities of an application are provided. The method includes receiving a user input gesture on an application icon, displaying a plurality of user interface (UI) elements on the application icon in response to the received user input gesture, wherein each of the plurality of UI elements corresponds to at least one functionality of an application, receiving a user selection of at least one UI element of the plurality of UI elements, and launching the functionality corresponding to the selected UI element.
Samsung Electronics Co., Ltd.
Vehicle and method of controlling the same
A vehicle and a method of controlling the same are provided. The vehicle includes a display configured to display a user interface (UI) having a plurality of objects and a touch input device that has a concave region configured to detect a touch gesture; an object is selected by detecting a touch gesture moving from an edge of the concave region to its center.
Kia Motors Corporation
Display processing method and display processing device
A display processing method and a display processing device are described. The method includes receiving a first gesture operation on a touch display region of an electronic device, when a first application interface of a first application installed in the electronic device is displayed in the touch display region; determining whether the first gesture operation satisfies a first predetermined condition or not; generating a first sub interface to be displayed in a first sub region of the touch display region and a second sub interface to be displayed in a second sub region of the touch display region when it is determined that the first gesture operation satisfies the first predetermined condition; and displaying the first sub interface in the first sub region, and displaying the second sub interface in the second sub region.
Beijing Lenovo Software Ltd.
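The split decision in the abstract above can be sketched as follows. This is a minimal illustration under assumptions: the "first predetermined condition" is modeled here as a two-finger horizontal drag longer than a threshold, and the split is an even vertical bisection — neither detail comes from the patent.

```python
# Assumed condition: a two-finger drag at least this long triggers a split.
THRESHOLD_PX = 200

def satisfies_split_condition(gesture):
    """gesture: dict with 'fingers' count and 'dx' horizontal travel (px)."""
    return gesture["fingers"] == 2 and abs(gesture["dx"]) >= THRESHOLD_PX

def split_region(region, gesture):
    """Return (first_sub_region, second_sub_region), or None if no split.

    region: (x, y, width, height) of the touch display region.
    """
    if not satisfies_split_condition(gesture):
        return None
    x, y, w, h = region
    half = w // 2
    return (x, y, half, h), (x + half, y, w - half, h)

# A qualifying gesture splits a 1080x720 region into two sub-regions.
left, right = split_region((0, 0, 1080, 720), {"fingers": 2, "dx": 320})
```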
Systems and methods for manipulating a virtual environment
One variation of a method for manipulating virtual objects within a virtual environment includes: receiving a touch image from a handheld device, the touch image comprising representations of discrete inputs into a touch sensor integrated into the handheld device; extracting a first force magnitude of a first input at a first location on a first side of the handheld device from the touch image; extracting a second force magnitude of a second input at a second location on a second side of the handheld device from the touch image, the second side of the handheld device opposite the first side of the handheld device; transforming the first input and the second input into a gesture; assigning a magnitude to the gesture based on the first force magnitude; and manipulating a virtual object within a virtual environment based on a type and the magnitude of the gesture.
Input device, vehicle including the same, and method of controlling the same
A touch input device includes a display configured to display a character which is inputted by a user. A touch unit has a concave shape and is configured to receive a command for deleting the inputted character.
Hyundai Motor Company
Control of non-destructive testing devices
A non-transitory, computer-readable medium includes computer-executable code having instructions. The instructions are configured to receive data relating to an environment, construct an image of the environment based on the received data, and display the image on a touch-screen device.
General Electric Company
Calibrating vision systems
Methods, systems, and computer program products calibrate a vision system. An image is received of a human gesture that frames a display device.
AT&T Intellectual Property I, L.P.
Portable device pairing with a tracking system
In embodiments of portable device pairing with a tracking system, a pairing system includes a portable device that generates device acceleration gesture data responsive to a series of motion gestures of the portable device. The pairing system also includes a tracking system that is configured for pairing with the portable device.
Microsoft Technology Licensing, LLC
Movement detection apparatus for detecting a hand movement
The invention relates to a movement detection apparatus (1) for detecting a hand movement, such as a hand gesture, which may be used for controlling a computer or another device. A light emitting device emits light into tissue at the wrist (5) of a person, and a light detection device detects light which has travelled through the tissue at the wrist and generates a light detection signal based on the detected light, wherein a hand movement determination unit determines the hand movement based on the light detection signal.
Koninklijke Philips N.V.
Mobile terminal and control method thereof
Provided are a mobile terminal capable of sensing a user's gesture and a control method thereof. The mobile terminal includes a control unit configured to store data for executing a preset control operation when a preset gesture is applied, a sensing unit configured to, when the preset gesture is applied again, calculate a first sensing value based on the preset gesture, and a wireless communication unit configured to receive a second sensing value calculated in an external terminal by the preset gesture from the external terminal, wherein when the first sensing value and the second sensing value are within a preset specific range, the control unit executes the preset control operation on the basis of the data.
LG Electronics Inc.
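The matching step in the abstract above — execute the stored operation only when the local and external sensing values fall within a preset range of each other — reduces to a small comparison. A minimal sketch, assuming a symmetric absolute-difference tolerance (the tolerance value is illustrative, not from the patent):

```python
TOLERANCE = 0.1  # assumed "preset specific range"

def should_execute(first_sensing_value, second_sensing_value,
                   tolerance=TOLERANCE):
    """True when the locally calculated value and the value received from
    the external terminal agree closely enough to trigger the operation."""
    return abs(first_sensing_value - second_sensing_value) <= tolerance
```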
An intelligent wristband system is disclosed, the system comprising a wearable wristband configured to be worn by a user; a control unit within the wristband; and a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system.
Method for enhancing adaptability of an interactive surface environment
The present disclosure relates to a method for enhancing adaptability of an interactive surface environment having a plurality of objects. The method comprises receiving at least one user gesture performed on a target object from the plurality of objects.
Gesture recognition of ink strokes
One embodiment provides a method, including: accepting, on a touch surface, ink stroke data; identifying, using a processor, that the ink stroke comprises a stroke change; determining, using the processor, if the stroke change is within a predetermined zone; interpreting, using the processor, the ink stroke as a gesture command if the stroke change is within the predetermined zone; and executing, based on the gesture command, at least one action. Other aspects are described and claimed.
Lenovo (Singapore) Pte. Ltd.
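The decision in the abstract above can be sketched as: find where the stroke changes, and treat the stroke as a gesture command only when that change falls inside a predetermined zone. In this illustration the "stroke change" is modeled as a horizontal direction reversal and the zone is a fixed rectangle — both are assumptions, not details from the patent.

```python
GESTURE_ZONE = (0, 0, 100, 100)  # assumed zone: x, y, width, height

def find_stroke_change(points):
    """Return the point where the stroke reverses horizontal direction,
    or None if it never does."""
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        if (cur[0] - prev[0]) * (nxt[0] - cur[0]) < 0:
            return cur
    return None

def interpret(points):
    """Classify an ink stroke as ordinary ink or a gesture command."""
    change = find_stroke_change(points)
    if change is None:
        return "ink"
    zx, zy, zw, zh = GESTURE_ZONE
    if zx <= change[0] <= zx + zw and zy <= change[1] <= zy + zh:
        return "gesture_command"
    return "ink"
```

A reversal inside the zone yields a command; the same reversal elsewhere stays plain ink.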
Multi-sensor remote signaling control of unmanned vehicles
An apparatus includes a wearable device having a multi-sensor detector to sense operator gestures directed at an unmanned vehicle (uv). The multi-sensor detector includes at least two sensors to detect motion and direction of the operator gestures with respect to operator hand movement, operator hand movement with respect to the earth, rotational movement of the operator hand, and finger movement on the operator hand.
Northrop Grumman Systems Corporation
Pivotable interior mirror for a motor vehicle
A pivotable interior mirror for a vehicle includes a mirror surface, for example in the form of a first surface of a wedge mirror, a sensor device suitable for a light propagation-time measurement, for sensing at least one gesture of a driver of the vehicle and/or light incident on the first surface of the wedge mirror, and a pivoting device for pivoting the mirror surface from a normal position into at least one dimming position, in which blinding of the driver by light reflected at the mirror surface is reduced, wherein the pivoting device can be activated in dependence on at least one output signal of the sensor device. Further, a vehicle may include such an interior mirror and a method for using such an interior mirror is described.
SMR Patents S.à.r.l.
Gesture enhanced input device
Various embodiments disclosed herein are directed to a virtual player interface such as button deck for a gaming device. The interface includes a touch screen display which displays one or more button icons.
Bally Gaming, Inc.
Gesture control earphone
An earphone and a media player system are provided. The earphone may include: a first sensor unit and a second sensor unit respectively mounted on a left portion and a right portion of the earphone, adapted to sense gestures and generate signals according to the sensed gestures, where the left portion and the right portion are disposed on two sides of a user's head when the user wears the earphone; a processing device, adapted to translate the signals into control instructions to control a media player; and an interface, adapted to transmit the control instructions to the media player.
Harman International Industries, Incorporated
Systems and methods for social networking
A method of delivering an advertisement and/or a message includes associating an account on a first social network with a first individual and allowing the first account to send a gesture to a second entity, wherein the second entity is associated with a license plate. The license plate can be associated with an advertiser.
Wirelessly identifying participant characteristics
A system and method for identifying persons near a mobile device includes a wireless signaling system with an incoming wireless signal receiver, a device motion sensing system including at least a first sensor, and a controller configured to determine whether the device is being moved in accordance with a predetermined gesture and to responsively enter a personnel data collection mode. In the personnel data collection mode, the device may transmit a query, receive an identification signal from at least one other device, and identify a user of that device based on the received identification signal.
Motorola Mobility LLC
System and method for the translation of sign languages into synthetic voices
A system and method for the translation of sign languages into synthetic voices. The present invention refers to the field of assistive technologies and comprises an instantaneous communication system between hearing- and speech-impaired individuals and hearing individuals.
Input methods for a device having a multi-language environment
Text input is corrected on a touch-sensitive display by presenting a list of candidate words in the interface which can be selected by touch input. The candidate list can include candidate words having two or more character types (e.g., roman, kana, kanji).
System and method for appliance control via a personal communication or entertainment device
A system for use in controlling operating functions of a controllable device includes a hand-held device and an intermediate device in communication with the hand-held device and the controllable device. The hand-held device is adapted to receive a gesture based input and to transmit a signal having data representative of the gesture based input.
Universal Electronics Inc.
Method of operating a content searching function and electronic device supporting the same
An electronic device and a content locating method are provided. The electronic device includes a memory configured to store a circular gesture user interface (UI) including a circular track object related to control of playback of a video; and a processor connected to the memory, wherein the processor is configured to receive a touch gesture event while the circular gesture UI is displayed on a display, and move and display an indicator along an arc path of the circular track object corresponding to an arc path passing through a plurality of areas in which the touch gesture event occurs.
Samsung Electronics Co., Ltd.
An example method is provided in accordance with one implementation of the present disclosure. The method includes displaying a first screen on a first display of an electronic device and a second screen on at least one second display connected to the electronic device.
Hewlett-Packard Development Company, L.P.
Touch input device and method of controlling the same
A touch input device and control method thereof, wherein the touch input device includes a swiping input unit configured to receive a selection of a character through a swiping gesture by a user, a gesture input unit placed on an area different from that of the swiping input unit and in which a user inputs a gesture through a touch, and a controller configured to determine that the selected character is input when a touch gesture is input from the swiping input unit to the gesture input unit.
Hyundai Motor Company
A method includes presenting a ui of a first application on a screen of a computing device and detecting a user input. For example, the detected user input may be an input tracing a continuous path on the screen of the computing device, and the path may include a first gesture extending from a first location to a second location on the screen followed by a second gesture extending from the second location to a third location on the screen.
Methods and apparatus, including computer program products, are provided for gesture detection on a user interface such as a touchscreen. In one aspect there is provided a method.
Touch alphabet and communication system
A touch alphabet and communication system is provided. The communication system uses a predetermined set of touch gestures, such as fingertip touch patterns performable on keyless touch-sensitive surfaces, to express the user's desired communication.
Keyboard and touch screen gesture system
A method at an electronic device including a touch-sensitive display for receiving touch input and a keyboard comprising a plurality of buttons, the method comprising: detecting actuation of a button on at least one of the plurality of buttons; detecting a touch input at the touch-sensitive display while the button is actuated; and responding to the touch input, wherein the response to the touch input while the button is actuated differs from the response to touch input detected while the button is not actuated.
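The core behavior in the abstract above — the same touch maps to a different response depending on whether a keyboard button is held — can be sketched in a few lines. The two action names are illustrative assumptions, not from the patent.

```python
def respond_to_touch(touch_point, button_actuated):
    """Return the action for a touch, modified by a held keyboard button."""
    if button_actuated:
        return ("select_block", touch_point)  # e.g. a shift-style selection
    return ("move_cursor", touch_point)       # plain touch behavior
```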
Presented are a method and system for processing a gesture performed by a user of an input device. The method comprises detecting the gesture and determining a distance of the input device from a predetermined location.
Hewlett-Packard Development Company, L.P.
Chart dual-y resize and split-unsplit interaction
Systems and methods are disclosed that, in various embodiments, improve chart performance by allowing users to interactively split and unsplit charts with a dual y-axis using dragging gestures or a button.
Oracle International Corporation
Pen writing on one-dimensional capacitive touch sensor
A touch panel includes a base, which is a liquid crystal module serving as a ground; a flexible dielectric layer over the base; and a one-dimensional pattern layer with sensor cells positioned in the same layer over the flexible dielectric layer. The sensor cells form a sensor array, and each of the sensor cells is individually controlled and sensed via an independent sensing line, wherein press sensing control is conducted according to a capacitance change resulting from a distance change between the sensor array and the base in response to an external force, and touch or gesture-based sensing control is conducted according to a capacitance change in the sensor array without involving the base.
Touchplus Information Corp.
User interface and method for signaling a 3D position of an input means during gesture detection
A user interface and a method for signaling the position of an input mechanism with respect to an area for 3D gesture detection are provided. The method includes the following steps: an input mechanism of a user is detected in the area for 3D gesture detection, and the position of the input mechanism is signaled by an indicator in the area of the edge of a display unit of the user interface.
Camera view control using unique nametags and gestures
Embodiments disclosed herein provide systems, methods, and computer readable media for controlling a camera view using unique nametags and gestures. In a particular embodiment, a method provides identifying a plurality of items at a location from video captured of the location by the video camera and associating a unique nametag of a plurality of unique nametags to each item of the plurality of items.
Method and apparatus for controlling a photographing function based on a gesture of a user
A photographing apparatus including a sensor, a touchscreen, and a controller is disclosed. The sensor is configured to detect that a user approaches or comes within a predetermined proximity of the photographing apparatus.
Samsung Electronics Co., Ltd.
Portable electronic device for photo management
A portable electronic device with a touch screen display for photo management is disclosed. One aspect of the invention involves a computer-implemented method in which the portable electronic device displays an array of thumbnail images corresponding to a set of photographic images.
Method and apparatus for updating a firmware of an apparatus
A method is disclosed, comprising: receiving motion information indicative of an input gesture by way of at least one motion sensor comprised by the apparatus; determining that the input gesture is a firmware update gesture, the firmware update gesture being indicative of a directive to update a firmware of the apparatus; sending a firmware download request to a separate apparatus based, at least in part, on determining that the input gesture is the firmware update gesture; receiving firmware update information from the separate apparatus based, at least in part, on the firmware download request; and updating the firmware of the apparatus based, at least in part, on the firmware update information.
Nokia Technologies Oy
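The gesture-gated update flow described above can be sketched as a single handler: classify the motion input and only proceed to download and apply firmware when it matches the update gesture. This is a minimal sketch under assumptions — the gesture label and the callable-based transport are hypothetical stand-ins for the motion classifier and the link to the separate apparatus.

```python
FIRMWARE_UPDATE_GESTURE = "double_shake"  # assumed gesture label

def handle_motion(motion_info, classify, download_firmware, apply_firmware):
    """Return True if a firmware update was performed for this motion."""
    gesture = classify(motion_info)
    if gesture != FIRMWARE_UPDATE_GESTURE:
        return False                  # not a firmware update gesture
    firmware = download_firmware()    # request to the separate apparatus
    apply_firmware(firmware)          # update based on received information
    return True
```

Gating the download on a recognized gesture keeps ordinary motion from triggering network traffic or an update.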
Method and system for managing applications running on smart device using a wearable device
A method and a system for managing applications running on one or more smart devices are provided. The method includes displaying a plurality of application icons on a wearable device, wherein each icon from the plurality of application icons represents an active application on the smart device connected to the wearable device, receiving a touch gesture on one or more application icons from the plurality of icons, and triggering the smart device to perform an event comprising an interaction between the active applications represented by the one or more application icons in response to the touch gesture.
Samsung Electronics Co., Ltd.
Processing device having a graphical user interface for an industrial vehicle
A processing device having a graphical user interface includes a housing having a touch screen display that receives touch gesture commands from a vehicle operator. Still further, a set of controls is arranged on a front face of the housing.
Crown Equipment Corporation
Processing touch gestures in hybrid applications
The present disclosure is directed towards systems and methods for receiving and processing user inputs with respect to hybrid computing applications. For example, systems and methods described herein involve detecting one or more user inputs of a touch gesture provided by a user and selectively channeling the one or more user inputs to a non-native element of the hybrid application or a native element of the hybrid application.
Adobe Systems Incorporated
Method and terminal for displaying a plurality of pages, method and terminal for displaying a plurality of applications being executed on the terminal, and method of executing a plurality of applications
A method of displaying a plurality of pages on a screen of a terminal is provided. The method includes detecting a user's gesture that requests movement of the plurality of pages, identifying a movement mode relating to the movement of the plurality of pages, and moving and displaying a first page displayed on the screen and a second page connected to the first page displayed on the screen according to the identified movement mode, wherein the movement mode is one of a discrete mode and a continuous mode.
Samsung Electronics Co., Ltd.
Electronic device and controlling the same
An electronic device is provided. The electronic device includes a display configured to display at least one object, a sensor configured to detect a gesture, and a processor configured to move a pointer from a first position to a second position on the display, corresponding to a moving distance of the gesture, and when the pointer meets a certain condition, move the pointer to a third position on the display.
Samsung Electronics Co., Ltd.
Systems and methods for remapping three-dimensional gestures onto a finite-size two-dimensional surface
A method for operating a real-time gesture based interactive system includes: obtaining a sequence of frames of data from an acquisition system; comparing successive frames of the data for portions that change between frames; determining whether any of the portions that changed are part of an interaction medium detected in the sequence of frames of data; defining a 3D interaction zone relative to an initial position of the part of the interaction medium detected in the sequence of frames of data; tracking a movement of the interaction medium to generate a plurality of 3D positions of the interaction medium; detecting movement of the interaction medium from inside to outside the 3D interaction zone at a boundary 3D position; shifting the 3D interaction zone relative to the boundary 3D position; computing a plurality of 2D positions based on the 3D positions; and supplying the 2D positions to control an application.
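The last few steps above — projecting tracked 3D positions into 2D screen coordinates through a finite interaction zone, shifting the zone when the medium crosses its boundary — can be sketched as follows. The zone and screen dimensions and the clamp-based shift rule are illustrative assumptions, not details from the patent.

```python
def project_to_2d(pos, zone_origin, zone_size=(0.4, 0.3), screen=(1920, 1080)):
    """Map the (x, y) of a position inside the zone to screen pixels."""
    nx = (pos[0] - zone_origin[0]) / zone_size[0]
    ny = (pos[1] - zone_origin[1]) / zone_size[1]
    return (round(nx * screen[0]), round(ny * screen[1]))

def track(positions, zone_origin, zone_size=(0.4, 0.3)):
    """Return 2D positions for tracked 3D positions, shifting the zone
    whenever the interaction medium exits it at a boundary."""
    out = []
    ox, oy = zone_origin
    for x, y, _z in positions:
        # clamp the zone origin so the current position stays inside it
        ox = min(max(ox, x - zone_size[0]), x)
        oy = min(max(oy, y - zone_size[1]), y)
        out.append(project_to_2d((x, y), (ox, oy), zone_size))
    return out
```

A position that leaves the zone drags the zone with it, so the mapped 2D point saturates at the screen edge instead of going off-screen.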
Dynamic, free-space user interactions for machine control
Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement.
Leap Motion, Inc.
Display apparatus and control method thereof
A display apparatus includes a display configured to display an image; a communicator configured to communicate with an input apparatus, the input apparatus including at least one sensor that senses a user's gesture input; and at least one processor configured to, in response to the gesture input being sensed by the sensor of the input apparatus, determine a command which corresponds to the sensed gesture input among a plurality of commands corresponding to a plurality of functions supported by the display apparatus, and implement an operation corresponding to the determined command, and wherein the command corresponding to the gesture input is based on at least one among content and a first user interface displayed on the display, or a user's input prior to the gesture input. Thus, the display apparatus implements operations in response to a user's gesture input, thereby providing more familiar and closer interaction with a user.
Samsung Electronics Co., Ltd.
Building space control
Methods, devices, and systems for building space control are described herein. One device includes a memory, and a processor configured to execute executable instructions stored in the memory to receive a recording of a gesture interaction with a virtual control element associated with a setting of a space in a building, analyze the recorded gesture for gesture characteristics, and modify the setting of the space based on the virtual control element and the gesture characteristics.
Honeywell International Inc.