|| List of recent Gesture-related patents
| Method and apparatus for user interaction|
The subject matter discloses a method of screen navigating; the method comprises the steps of classifying a gesture of a body organ as an action of screen navigation; capturing an image of a body organ; analyzing said image to determine whether the image matches said gesture of said body organ; and executing said action of screen navigation if said image matches said body organ gesture.
| Method of displaying calendar and electronic device therefor|
A method and an apparatus for controlling output of a calendar in an electronic device are provided. The method includes displaying an electronic calendar, detecting a gesture on the displayed electronic calendar, updating the displayed electronic calendar to a calendar which has a difference by a predetermined period according to a period of the displayed electronic calendar when the detected gesture corresponds to a predetermined gesture, and displaying the updated calendar, wherein the predetermined period corresponds to a period which is longer than that of the displayed electronic calendar.
| Data display method and apparatus|
A data display method and apparatus display data efficiently on the screen of an electronic device equipped with a touchscreen. The data display method includes setting a scroll rate according to a touch movement distance; detecting a touch gesture in a first region of the touchscreen; scrolling icons in the first region at the scroll rate in response to the touch gesture; and displaying, in a second region of the touchscreen, detailed information associated with at least one icon newly displayed according to the scroll.
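The mapping described above can be sketched in a few lines. This is an illustrative sketch, not the patented method; the linear scale factor and the function names are assumptions chosen for demonstration.

```python
# Sketch: derive a scroll rate from a touch movement distance and
# advance icon positions by rate * dt. The scale factor is an
# illustrative assumption, not taken from the patent.

def scroll_rate(move_distance_px: float, scale: float = 2.0) -> float:
    """Return a scroll rate (px/s) proportional to the touch movement distance."""
    return max(0.0, move_distance_px) * scale

def scroll_icons(positions: list[float], move_distance_px: float, dt: float) -> list[float]:
    """Move every icon position by the scroll rate over a time step dt."""
    rate = scroll_rate(move_distance_px)
    return [p - rate * dt for p in positions]
```

A longer touch movement thus produces proportionally faster scrolling of the icon strip.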
| Dynamic user interface for navigating among gui elements|
In one example, a computing device executes a plurality of application processes, each of which has an associated graphical user interface element. The computing device renders a common graphical user interface on a presence-sensitive screen.
| Gesture-based navigation using visual page indicators|
Example embodiments relate to gesture-based navigation using visual page indicators. In example embodiments, a computing device detects a held user input and a movement of the held input in a first direction.
| Electronic apparatus, processing system, and computer readable storage medium|
Provided is an electronic apparatus that performs an appropriate process according to a gesture of a subject person, the electronic apparatus including: a first input unit that inputs a detection result of a biosensor detecting a change in biological information of a person; a second input unit that inputs a recognition result of a recognition device recognizing an action of the person; and a processor that performs a process according to the action of the person based on input results of the first and second input units.
| System for evaluating infant movement using gesture recognition|
A system and method for measuring the movement of one or more limbs of an infant using a video system for the purpose of determining whether the infant suffers from or is at risk of suffering from a medical condition such as cerebral palsy.
| Heuristic-based approach for automatic payment gesture classification and detection|
A system and method for automatic classification and detection of a payment gesture are disclosed. The method includes obtaining a video stream from a camera placed above at least one region of interest, the region of interest being used to classify the payment gesture.
| Protocol for communications between platforms and image devices|
In accordance with some embodiments, a protocol permits communications between platforms and image devices. This allows, for example, the platform to specify particular types of information that the platform may want, the format of information the platform may prefer, and other information that may reduce the amount of processing in the platform.
| Queue group leader identification|
A system and method to identify the leader of a group in a retail, restaurant, or queue-type setting (or virtually any setting) through recognition of payment gestures. The method comprises acquiring initial video of a group, developing feature models for members of the group, acquiring video at a payment location, identifying a payment gesture in the acquired video, defining the person making the gesture as the leader of the group, and forwarding/backtracking through the video to identify timings associated with leader events (e.g., entering, exiting, ordering, etc.).
A variety of actions are controlled based on an operator's various finger gestures on the touch screen disposed on the display surface of the display means, for example: making contact with a finger (tap); making two consecutive contacts with a finger (double-tap); making contact with a finger and moving the finger without releasing it (drag); making contact with a finger and maintaining the contact for a predetermined time or longer (touch-and-hold); making simultaneous contact with two fingers and increasing the spacing between the fingers (pinch-out) or decreasing the spacing (pinch-in); and making simultaneous contact with two fingers and moving the fingers in parallel (double-drag).
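The gesture vocabulary above can be approximated by a small classifier over touch samples. This is a hypothetical sketch, not the patented logic; the thresholds and function names are assumptions picked for illustration.

```python
# Sketch of a single-touch classifier distinguishing tap, drag and
# touch-and-hold from (t, x, y) samples of one contact, plus a
# two-finger pinch check from initial/final spacing. All thresholds
# are illustrative assumptions.

HOLD_MIN_S = 0.8     # contact at least this long with little motion -> touch-and-hold
MOVE_MIN_PX = 10.0   # motion beyond this distance -> drag

def classify(samples):
    """samples: list of (t, x, y) for one finger, from touch-down to touch-up."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    duration = t1 - t0
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if dist >= MOVE_MIN_PX:
        return "drag"
    if duration >= HOLD_MIN_S:
        return "touch-and-hold"
    return "tap"

def classify_two_finger(start_gap: float, end_gap: float) -> str:
    """Pinch detection from the spacing between two simultaneous contacts."""
    if end_gap > start_gap * 1.2:
        return "pinch-out"
    if end_gap < start_gap * 0.8:
        return "pinch-in"
    return "double-drag"
```

A double-tap would be recognized one level up, by checking whether two consecutive "tap" results fall within a short time window.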
| Method to select word by swiping capacitive keyboard|
A method for an electronic device having a keyboard and a display includes receiving an input reflecting selection of one or more of the keys; displaying, at a location on the display, one or more characters associated with the one or more selected keys, wherein the location corresponds to a region of the keyboard determined based on a subsequent candidate input character that is based on the one or more characters associated with the one or more selected keys; and detecting a swipe input associated with the determined region. An electronic device including a display, a keyboard, a memory, and a processor, the processor being configured to execute the method, is also provided.
| Text recognition apparatus and method for a terminal|
A text recognition apparatus and method for a portable terminal are provided for recognizing, as text, a text image selected by a pen on a screen image. The text recognition method of the present invention includes displaying an image; configuring a recognition area on the image in response to a gesture made with a pen; recognizing text in the recognition area; displaying the recognized text and action items corresponding to the text; and executing, when one of the action items is selected, an action corresponding to the selected action item.
| Method for actuating a tactile interface layer|
A method for actuating a tactile interface layer for a device that defines a surface with a deformable region, comprising the steps of detecting a gesture of the user along the surface of the tactile interface layer that includes a movement of a finger of the user from a first location on the surface to a second location on the surface; interpreting the gesture as a command for the deformable region; and manipulating the deformable region of the surface based on the command.
| Apparatus and method for controlling key input|
A key input control apparatus includes a gesture recognizing unit which detects whether or not an input of a touch event generated on a screen corresponds to a predetermined gesture, and a gesture area identifying unit which identifies a predetermined area where the input of the touch event corresponding to the predetermined gesture is generated.
| Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium|
A gesture recognition apparatus for recognizing a gesture of a user from a moving image in which the user is photographed is provided, the gesture recognition apparatus comprising: a determination part configured to determine a type of the gesture; and a recognition area definition part configured to define a recognition area, which is an area where the gesture is recognized in a whole area of the moving image, based on the type of the gesture determined by the determination part.
| Method and apparatus for extracting three-dimensional distance information from recognition target|
A method and apparatus for extracting three-dimensional distance information from a recognition target are provided, which enable a gesture input from a user to be correctly recognized using distance information from the recognition target while efficiently saving the power required for detection of the gesture input. The method includes determining whether a recognition target exists within a predetermined range; when the recognition target exists within the predetermined range, generating a 3D image of the recognition target; and calculating a distance to the recognition target by using the 3D image.
| Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium|
A gesture recognition apparatus for recognizing a gesture of a user from a moving image in which the user is photographed is provided, the gesture recognition apparatus comprising: a sight line direction estimation part configured to estimate a sight line direction of the user; a determination part configured to determine that the user intends to start the gesture when an angle formed by a first predetermined direction and the sight line direction is less than a predetermined value in a predetermined period; and a notification part configured to notify the user that the determination is made, when the determination part determines that the user intends to start the gesture.
| Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium|
A gesture recognition apparatus for recognizing a gesture of a hand of a user from a moving image in which action of the hand of the user is photographed is provided, the gesture recognition apparatus comprising: a face detector configured to detect a face of the user; a shape identification part configured to identify whether the hand is a right hand or a left hand; and a performer specification part configured to specify a person, who is closest to the identified hand and whose face is located on a right side of the identified hand, as a performer of the gesture when the identified hand is the right hand, and specify a person, who is closest to the identified hand and whose face is located on a left side of the identified hand, as the performer of the gesture when the identified hand is the left hand.
| Information system|
An information system includes a camera located at a central position in a horizontal direction in front of two users who face the camera and who are aligned side by side; a gesture recognizing unit configured to recognize a gesture of a hand of a user based on a target image captured by the camera; a determining unit that determines whether a central point in a width direction of an arm in the target image is at a right side or at a left side with respect to a center of the hand in the target image; and an operation-user determining unit configured to determine whether the performer of the gesture is the right-side or the left-side user.
| Smart signage system|
The present patent application is directed to a smart signage system. In one aspect, the smart signage system includes at least a signage device; at least a signage display being connected to the at least one signage device and configured to display contents stored in the signage device; and a plurality of client devices being in wireless communication with the at least one signage device.
| Head mounted display and method of controlling digital device using the same|
Disclosed is a method of receiving a gesture input of a user using a head mounted display (HMD) and synthetically controlling a digital device using the received gesture input. The method includes detecting whether or not the HMD is worn by a user; detecting a positional state of an external digital device linked with the HMD, the positional state including a first state in which the external digital device is located in a preset view-angle region of the HMD and a second state in which the external digital device is not located in the view-angle region; detecting a gesture input of the user; determining at least one digital device to be controlled based on whether or not the HMD is worn by the user and the detected positional state; and controlling a display object of the determined digital device under application of the gesture input.
| Biblical board game|
A biblical-based board game includes a board having multiple spaces along which players can move, or climb, as they get questions or tasks correct. The questions and tasks can be divided into four categories: multiple-choice questions; fill-in-the-blank questions; word exclusion tasks, where one player gets their teammate to say a phrase or word without the first player mentioning excluded words; and using drawing or gestures to get their teammate to say a word or phrase.
|System and method for disabling secure access to an electronic device using detection of a unique motion|
A system and method for providing secure authorization to an electronic device by combining two or more authentication factors processed at substantially the same time, where at least one of the factors is a “tolerant” factor. By combining two factors, such as facial recognition and a screen gesture, these can be analyzed at substantially the same time, except when a unique or individualized motion is detected.
|Gesture based polling using an intelligent beverage container|
Systems, devices, and methods are provided for conducting a polling event. A central server computer system determines a polling event is to be conducted and associates one or more inputs from a beverage container with a corresponding selection.
|Systems and methods for editing a computer application from within a runtime environment|
Embodiments allow a runtime environment to link to an editing environment. An object or other feature may be identified for editing in a runtime environment using a specific tool or gesture.
|Screen display control method of electronic device and apparatus therefor|
A method and apparatus for zooming a screen in or out and displaying it according to a gesture of a user are provided. The method includes sensing a gesture input; determining whether the gesture input corresponds to a predetermined pattern of a first semicircle or semi-oval shape; and zooming the image displayed on the screen in or out and displaying the zoomed-in or zoomed-out image on the screen, wherein, when the gesture input matches the pattern of the first semicircle or semi-oval shape, the zoom ratio is in proportion to the radius of the first semicircle or the radius of the long or short axis of the first semi-oval.
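The radius-to-zoom mapping can be sketched as follows. This is an illustrative approximation, not the patented algorithm: the radius is estimated as the mean distance of the gesture's sampled points from their centroid, and the proportionality constant `px_per_unit_zoom` is an assumption.

```python
import math

# Sketch: estimate the radius of a roughly semicircular gesture from
# its sampled points, then derive a zoom ratio proportional to that
# radius (1.0 = no zoom). Constants are illustrative assumptions.

def gesture_radius(points):
    """Mean distance of the sampled points from their centroid."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sum(math.hypot(x - cx, y - cy) for x, y in points) / len(points)

def zoom_ratio(points, px_per_unit_zoom: float = 100.0) -> float:
    """Zoom ratio proportional to the gesture radius."""
    return 1.0 + gesture_radius(points) / px_per_unit_zoom
```

A wider arc therefore yields a larger zoom ratio, matching the proportionality the abstract describes.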
|Data processing device and method of performing data processing according to gesture operation|
The present invention appropriately determines a detected gesture operation so as to perform data processing according thereto. In the present invention, a CPU judges the processing status at the time of detection of a gesture operation performed on a touch panel and, after determining the gesture operation type according to the processing status, performs data processing according to the operation type.
A method for a user interface includes displaying a first item from a list of items on a screen, detecting, using a processor, a gesture comprising a circular motion, and, in response to detecting the gesture, displaying a second item from the list of items on the screen in place of the first item.
|Toggle gesture during drag gesture|
Methods and systems for providing input to a computing device based on a toggle gesture performed during a drag gesture are provided. A drag gesture can be performed on a touch screen to manipulate a user interface object.
|Method of controlling a list scroll bar and an electronic device using the same|
A method controls a list scroll bar, including displaying a partial area of a list having at least one first item, and a first list scroll bar to scroll the list; expanding and displaying a first index area having at least one first index corresponding to the at least one first item in the first list scroll bar; downscaling a second index area having at least one second index corresponding to at least one second item different from the at least one first item and displaying a representative second index representing the second index area in the first list scroll bar; and upon detecting a gesture to select the representative second index, expanding and displaying the second index area in the first list scroll bar while downscaling the first index area and displaying a representative first index representing the first index area in the first list scroll bar.
|Electronic device and method for changing page count according to a duration of input touch|
An electronic device and method for navigating content to a desired page based on a duration of touch input include receiving a touch input at a particular region of displayed contents, counting and displaying pages toward the desired page while the touch input is in contact with the display screen, reaching and displaying the desired page when the touch input is lifted, and further turning to the page including the content according to a further gesture detected thereon.
|Portable device and guide information provision method thereof|
A portable device and guide information providing method thereof for providing guide information in response to a hovering gesture input made with a pen are provided. The method includes detecting a hovering gesture input, acquiring guide information corresponding to a currently running application, and displaying the guide information in response to the hovering gesture input.
|Apparatus and method for controlling electronic book in portable terminal|
An apparatus and a method for controlling an electronic book in a portable terminal are provided. The method includes displaying a particular page of an electronic book selected from pre-stored electronic books, when an electronic book fore-edge display gesture is input, displaying a fore-edge of the electronic book while displaying the particular page, when a page turn gesture is input to the displayed fore-edge, determining a page based on the page turn gesture, and displaying the determined page..
|Near field communications-based soft subscriber identity module|
Using near field communications (NFC) to provision a user equipment (UE) with subscriber identity module (SIM) data for accessing a wireless services provider's network. An NFC gesture initiates an NFC link between an NFC device and a UE containing NFC circuitry.
|Handheld device document imaging|
A method of stitching frames of a video sequence to image a target document. The method comprises capturing a group of frames of a video sequence using an image sensor of a handheld device having a display; during the capturing, analyzing the video sequence to iteratively select a group of the frames, each member of the group depicting a different segment of a target document; during the capturing, sequentially presenting a plurality of maneuvering indications, each maneuvering indication being presented after a certain frame depicting a certain segment is captured and being indicative of a maneuvering gesture required to bring the image sensor to capture another frame depicting another segment, complementary and adjacent to the certain segment; and stitching members of the group to create a mosaic image depicting the target document as a whole.
|Gesture recognition system and method|
A gesture recognition system includes an extended-depth-of-field (EDOF) lens, an image sensor, and a processing unit. The image sensor successively captures image frames through the EDOF lens.
|Computer user interface system and methods|
Systems and methods may provide user control of a computer system via one or more sensors. Also, systems and methods may provide automated response of a computer system to information acquired via one or more sensors.
|Mobile terminal and control method thereof|
A mobile terminal and a control method thereof are disclosed. The mobile terminal includes a display and a controller configured to execute a function corresponding to a user gesture acquired through a stylus when at least one of the body and tip of the stylus does not come into contact with the display.
|Method for operation of pen function and electronic device supporting the same|
An apparatus and method for operation of a pen function in an electronic device. A pen recognition panel recognizes a touch pen according to a set mode.
|Digital workspace ergonomics apparatuses, methods and systems|
The digital workspace ergonomics apparatuses, methods and systems (“DWE”) transform user multi-element touchscreen gestures via DWE components into updated digital collaboration whiteboard objects. In one embodiment, the DWE obtains user whiteboard input from a client device participating in a digital collaborative whiteboarding session.
|Input device with hand posture control|
A gesture detection system according to embodiments includes an input device including a touch sensitive surface and a contact-free detection system; and a controller configured to determine characteristic parameters for the position in three-dimensional space of a user input object and select an operational mode of the input device based on the position of the user input object.
|Method and apparatus for constructing a home screen in a terminal having a touch screen|
A method of configuring a home screen in an electronic device includes displaying the home screen including one or more objects on the touch screen; detecting a first touch gesture requesting a display of a list of function items related to the home screen in a state where the home screen is displayed; displaying the list of function items in response to the first touch gesture for selection; detecting a second touch gesture selecting a specific function mode in a state where the list of function items is displayed; and displaying an editing screen on the touch screen for selection.
|Portable electronic device and automatic unlocking method thereof|
A portable electronic device having a suspend mode and a work mode is illustrated. The portable electronic device includes a body, a touch sense module and an identifying module.
|Scaling of gesture based input|
The invention relates to a method for providing an input to a device, comprising the steps of sensing a gesture-based interaction, classifying the gesture-based interaction and, depending on the classification, inputting an instruction to the device, wherein the instruction depends on the realization characteristics of the gesture-based interaction, in particular the trajectory and/or speed and/or duration in time of the gesture-based interaction, and scales with at least one parameter independent of the gesture-based interaction. The invention also relates to a corresponding device.
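The idea of an instruction that depends on the gesture's realization characteristics yet scales with a gesture-independent parameter can be sketched as below. This is a hypothetical illustration, not the claimed method; the choice of zoom level as the independent parameter and all names are assumptions.

```python
# Sketch: a scroll instruction whose magnitude grows with the gesture's
# trajectory length and speed, and scales with a parameter independent
# of the gesture (here, the target view's zoom level), so one flick
# moves a comparable portion of the visible content at any zoom.

def scroll_instruction(trajectory_px: float, duration_s: float, zoom_level: float):
    """Return (action, amount) derived from the gesture's realization
    characteristics, scaled by the gesture-independent zoom level."""
    speed = trajectory_px / duration_s if duration_s > 0 else 0.0
    amount = speed * duration_s / zoom_level  # equals trajectory_px / zoom_level
    return ("scroll", amount)
```

Doubling the zoom level halves the scroll amount for the same physical gesture, which is the kind of gesture-independent scaling the abstract describes.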
|Mobile terminal and controlling method thereof|
A mobile terminal and controlling method thereof are disclosed, which facilitate flexible utilization of the display screen space of the mobile terminal in consideration of the user's convenience and necessity. The present invention includes displaying a 1st screen on a 1st region within a touchscreen; if a 1st gesture is detected, forming a 2nd region within the touchscreen and displaying at least one portion of the 1st screen of the 1st region at the timing point of detecting the 1st gesture as a 2nd screen; when a touch input to the 1st region is detected or the 2nd region is formed, changing the 1st screen of the 1st region into a 3rd screen automatically; if a 2nd gesture is detected, generating a merged screen including at least one portion of the 3rd screen of the 1st region and at least one portion of the 2nd screen of the 2nd region; and displaying the merged screen on a 3rd region.
|Single contact scaling gesture|
Methods and systems for providing input to a computing device based on a single contact scaling gesture are provided. A scaling gesture can be performed on a touch-sensitive panel to zoom in or out of a displayed image, for example.
|Visual object manipulation|
In one example, a method includes outputting, at a first location of a presence-sensitive display of a computing device, a first graphical object and receiving an indication of a first touch gesture detected at a second location of the presence-sensitive display. The method may further include, in response to receiving the indication of the first touch gesture, outputting, at the second location, a second graphical object and receiving an indication of a second touch gesture originating within a predetermined distance of the second location and moving towards the first location.
|Methods, apparatuses, and computer program products for determination of the digit being used by a user to provide input|
Methods, apparatuses, and computer program products are herein provided for determination of the digit being used by a user to provide input. A method may include receiving user input defining a slide gesture from a digit of a user on a touchscreen.
|Single-gesture mobile computing device operations|
A mobile computing device comprising a user interface and a touch button is provided. A mobile computing device operation is adapted to occur upon the touch button being engaged by a touching device and the touching device sliding to the user interface prior to being removed from the mobile computing device.
|Rolling gesture detection using an electronic device|
An electronic device with one or more processors and memory detects a button press of a respective button of a plurality of buttons that include a first button that corresponds to a first type of operation and a second button that corresponds to a second type of operation. The device determines, in conjunction with detecting the button press, a rolling gesture metric corresponding to performance of a rolling gesture comprising rotation about a longitudinal axis of the electronic device.
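A rolling gesture metric of the kind described above can be sketched as the net rotation about the device's longitudinal axis, integrated from gyroscope angular-rate samples. This is an illustrative sketch, not the patented metric; the sampling scheme and the threshold are assumptions.

```python
# Sketch: integrate angular-rate samples (rad/s) about the device's
# longitudinal axis into a rolling-gesture metric, and compare the net
# rotation against an illustrative threshold.

def rolling_metric(rates_rad_s: list[float], dt_s: float) -> float:
    """Net rotation (rad) about the longitudinal axis over the samples."""
    return sum(r * dt_s for r in rates_rad_s)

def is_rolling_gesture(rates_rad_s: list[float], dt_s: float,
                       threshold_rad: float = 0.5) -> bool:
    """True if the accumulated rotation is large enough either way."""
    return abs(rolling_metric(rates_rad_s, dt_s)) >= threshold_rad
```

Evaluating the metric in conjunction with which button was pressed would then select between the first and second operation types.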
|Enhanced detection of gesture|
The enhanced detection of a waving engagement gesture, in which a shape is defined within motion data, the motion data is sampled at points that are aligned with the defined shape, and, based on the sampled motion data, positions of a moving object along the defined shape are determined over time. It is determined whether the moving object is performing a gesture based on a pattern exhibited by the determined positions, and an application is controlled upon determining that the moving object is performing the gesture.
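The pattern test over positions-along-a-shape can be sketched for the one-dimensional case: a wave shows up as repeated direction reversals with sufficient travel. This is a hypothetical illustration, not the patented criteria; both thresholds are assumptions.

```python
# Sketch: given positions of a moving object sampled along a defined
# (here one-dimensional) shape over time, detect a waving pattern by
# counting direction reversals and checking total travel. Thresholds
# are illustrative assumptions.

def is_wave(positions: list[float], min_reversals: int = 2,
            min_travel: float = 20.0) -> bool:
    """True if the positions trace a back-and-forth (waving) pattern."""
    reversals, direction = 0, 0
    for prev, cur in zip(positions, positions[1:]):
        step = cur - prev
        if step == 0:
            continue
        d = 1 if step > 0 else -1
        if direction and d != direction:
            reversals += 1
        direction = d
    travel = max(positions) - min(positions)
    return reversals >= min_reversals and travel >= min_travel
```

A steady one-way sweep produces no reversals and is rejected, while a back-and-forth motion of adequate amplitude is accepted.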
|Flexible apparatus and control method thereof|
A flexible apparatus is provided. The flexible apparatus includes a sensor configured to sense bending of the flexible apparatus, and a controller configured to perform an operation corresponding to a rubbing gesture when it is determined, based on a result of the sensing, that a rubbing gesture of rubbing a plurality of different areas of the flexible apparatus has been performed.
|Application control in electronic devices|
A portable electronic device is provided, comprising a display screen area for providing visual feedback and for receiving gesture inputs, and a switching controller to enable switching between multiple applications that have been executed on the device. The switching controller is adapted to interact with an operating system on the device and includes a number of software components that interact with components native to the operating system, and the device further comprises a processor for invoking procedures relating to the particular components of the switching controller. The switching controller comprises a task management component for maintaining an ordered list of tasks that are running on the device and allowing task status to be changed. A method is also provided for controlling switching between a plurality of applications in a portable electronic device having a display screen, the method including generating an ordered list of the plurality of applications that are running on the device and controlling switching between the applications on the basis of the list.
|Computer vision gesture based control of a device|
A system and method are provided for controlling a device based on computer vision. Embodiments of the system and method of the invention are based on receiving a sequence of images of a field of view; detecting movement of at least one object in the images; applying a shape recognition algorithm on the at least one moving object; confirming that the object is a user hand by combining information from at least two images of the object; and tracking the object to control the device.
|Method of controlling function execution in a mobile terminal by recognizing writing gesture and apparatus for performing the same|
Methods and apparatus are provided for executing a function of a mobile terminal by recognizing a writing gesture. The writing gesture that is inputted on a touchscreen of the mobile terminal is detected.
|Interactive virtual display system|
An “interactive virtual display,” as described herein, provides various systems and techniques that facilitate ubiquitous user interaction with both local and remote heterogeneous computing devices. More specifically, the interactive virtual display uses various combinations of small-size programmable hardware and portable or wearable sensors to enable any display surface (e.g., computer display devices, televisions, projected images/video from projection devices, etc.) to act as a thin client for users to interact with a plurality of heterogeneous computing devices regardless of where those devices are located relative to the user.
|Accessible data visualizations for visually impaired users|
Systems and methods are provided, at an accessible electronic device having a visual display with a touch-sensitive surface, for displaying on the visual display a graphic visualization having a plurality of graphic elements, and in response to detecting a navigation gesture by a finger on the touch-sensitive surface, selecting one of the plurality of graphic elements and outputting accessibility information associated with the selected graphic element. Systems and methods are also provided for generating computer code for converting a data set into a graphic visualization annotated with accessibility information.