User Interface Concepts For AR Head Worn Wearable Devices
Head worn devices will redefine how we access and manage information in the post-PC wearables era. Initially, I expect head worn wearables to achieve wide adoption in the enterprise and first responder markets. The list of advantages that hands-free access to work-related information provides is vast, and at the top of that list is the one advantage every industry values above all else: safety. At the following location, you will find a description of the head worn wearables display architecture called Video See-Through Augmented Reality that I believe will dominate enterprise adoption starting in 2017.

However, the head worn form factor will require a very different set of User Interface elements than those we are used to with PCs, laptops, or tablets. The very advantage a head worn device provides to enable safer work nullifies traditional mouse and keyboard input methods: head worn wearable devices must be hands-free. With that in mind, let's explore some leading UI concepts Kopin has developed as part of its Golden-i HMD platform. Below is a graphical representation of Golden-i from granted patent D714,786S, which I have marked up to highlight design advantages.

Here are full color renderings of the Golden-i 3.8M.

So now that we have established the basic form factor of the Golden-i Video See-Through AR device, let's review how a user would interact with such a device using the publicly available intellectual property Kopin has developed, accessible via its patent filings.


Hands Free User Interface for Head Mounted Displays
“REMOTE CONTROL OF HOST APPLICATION USING MOTION AND VOICE COMMAND” – Granted patent 9,235,262, filed May 5, 2010.

This foundational patent covers many aspects of interacting with the HMD using head gestures, hand gestures, and voice input.

On-screen cursor control via head and hand gestures, plus pan and zoom for large virtual workspaces (a sketch of this interaction follows the quoted passages below):

  • “A remote control microdisplay device that uses an input device such as a head tracking accelerometer or a camera to detect movements such as head movements, hand motions and/or gestures with optional voice commands to control the parameters of a field of view for the microdisplay such as a field of view within a larger virtual display area associated with a host application.”


  • “… the user can use hand gestures or head movements to cause the cursor to move around (left, right, up, down) within the virtual display.”


  • “… use hand or head movements to position the cursor at a particular hyperlink of interest. Using the voice command “SELECT”, the selected hyperlink e.g., “About USPTO” is then activated.”


  • Keyboard shortcuts in common existing applications executed via voice commands.


  • Alternate menu placement to accommodate focal issues with the display pod.
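
Taken together, the passages above describe a cursor steered by head motion inside a virtual display larger than the microdisplay's field of view, with spoken commands such as “SELECT” for activation. Below is a minimal Kotlin sketch of that interaction model; the class, the degrees-to-pixels mapping, and the panning rule are my own assumptions, not Kopin's implementation.

    // Hypothetical sketch: a head-steered cursor inside a virtual display
    // larger than the physical field of view, plus a spoken "SELECT".
    data class HeadDelta(val yawDeg: Float, val pitchDeg: Float)

    class VirtualDisplay(val width: Int, val height: Int, val viewW: Int, val viewH: Int) {
        var cursorX = width / 2
        var cursorY = height / 2
        var viewX = 0   // top-left corner of the visible field of view
        var viewY = 0

        // Translate a head movement into cursor motion (degrees -> pixels).
        fun onHeadMove(d: HeadDelta, pxPerDeg: Float = 20f) {
            cursorX = (cursorX + (d.yawDeg * pxPerDeg).toInt()).coerceIn(0, width - 1)
            cursorY = (cursorY - (d.pitchDeg * pxPerDeg).toInt()).coerceIn(0, height - 1)
            panToKeepCursorVisible()
        }

        // Pan the field of view so the cursor never leaves the visible region.
        private fun panToKeepCursorVisible() {
            viewX = viewX.coerceIn(cursorX - viewW + 1, cursorX).coerceIn(0, width - viewW)
            viewY = viewY.coerceIn(cursorY - viewH + 1, cursorY).coerceIn(0, height - viewH)
        }

        // A spoken "SELECT" activates whatever sits under the cursor.
        fun onVoiceCommand(cmd: String, activate: (Int, Int) -> Unit) {
            if (cmd.equals("select", ignoreCase = true)) activate(cursorX, cursorY)
        }
    }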



“ADVANCED REMOTE CONTROL OF HOST APPLICATIONS USING MOTION AND VOICE COMMANDS” – Granted patent 9,122,307, filed September 16, 2011.

“HEADSET COMPUTER WITH HEADTRACKING INPUT USED FOR INERTIAL CONTROL” – Granted patent 9,134,793, filed March 13, 2013.

Additional UI functionality added to foundational patent 9,235,262:
Increase to full 6-axis motion tracking for head gesture input.
Implementation of labels overlaid on existing applications to enable voice control of those applications (sketched below):

  • “Here, the HMD device imposes a software overlay onto the presentation of information to the user. The overlay adds a graphic picture and/or text associated with a spoken command, hand gesture, or head movement needed to initiate an action indicated by icons, buttons, sliders, drop down lists, or other objects on the screen.”
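
As a rough illustration of the overlay idea, here is a minimal Kotlin sketch that assigns spoken labels to an existing application's controls and dispatches recognized speech to them. The class, the label scheme, and the println stubs are hypothetical, not drawn from the patent.

    // Hypothetical sketch: overlaying spoken-command labels on an existing
    // application's controls so a legacy UI becomes voice-operable.
    data class Control(val id: String, val action: () -> Unit)

    class VoiceOverlay(controls: List<Control>) {
        // Assign each on-screen control a short spoken label, e.g. "say 1".
        private val labels: Map<String, Control> =
            controls.mapIndexed { i, c -> "say ${i + 1}" to c }.toMap()

        // Render the labels next to their controls (stubbed as println here).
        fun draw() = labels.forEach { (label, c) -> println("[$label] -> ${c.id}") }

        // When the recognizer hears a label, fire the underlying control.
        fun onSpeech(utterance: String) {
            labels[utterance.lowercase().trim()]?.action?.invoke()
        }
    }

    fun main() {
        val overlay = VoiceOverlay(listOf(
            Control("File menu") { println("File menu opened") },
            Control("Zoom slider") { println("Zoom slider focused") }
        ))
        overlay.draw()
        overlay.onSpeech("say 2")   // prints "Zoom slider focused"
    }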

Use of head gestures and voice inputs to scroll through lists and make selections (sketched below):

  • “Here, however, one can use the head-tracking inputs of the HMD device 100 to operate these user interface elements: for example, a move head left/right or up/down can cause scrolling through the list or selecting other lists. These movements can be combined with a following spoken command to select an item in the list (e.g., “Click” or “Select”).”
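
The quote maps head movement to scrolling and a spoken “Click” or “Select” to activation. A minimal Kotlin sketch of that pattern follows, assuming a discrete gesture recognizer and speech recognizer already exist; the names are mine, not the patent's.

    // Hypothetical sketch: a list driven by head-tracker input, with the
    // highlighted item confirmed by a spoken "Select" or "Click".
    enum class HeadGesture { UP, DOWN, LEFT, RIGHT }

    class VoiceList(private val items: List<String>) {
        private var index = 0

        fun onHeadGesture(g: HeadGesture) {
            index = when (g) {
                HeadGesture.UP -> (index - 1).coerceAtLeast(0)
                HeadGesture.DOWN -> (index + 1).coerceAtMost(items.size - 1)
                else -> index   // left/right could move to sibling lists instead
            }
            println("highlighted: ${items[index]}")
        }

        // Returns the chosen item when a selection command is spoken.
        fun onVoice(cmd: String): String? =
            if (cmd.equals("select", true) || cmd.equals("click", true)) items[index] else null
    }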

Implementation of multiple desktop screens, along with methods for switching the view between them (one possible method is sketched below).
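
The filings' switching mechanics are not quoted here, so this Kotlin sketch shows just one plausible method: cycling between desktops with left and right head swipes.

    // Hypothetical sketch: cycling through virtual desktop screens with
    // left/right head swipes (one possible switching method, my assumption).
    class DesktopSwitcher(private val desktops: List<String>) {
        private var current = 0

        fun onHeadSwipe(right: Boolean): String {
            val n = desktops.size
            current = if (right) (current + 1) % n else (current - 1 + n) % n
            return desktops[current]   // the desktop now shown on the microdisplay
        }
    }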


Selecting and Editing Text Via Voice and Gesture
“TEXT SELECTION USING HMD HEAD-TRACKER AND VOICE COMMAND” – Granted patent 9,383,816, filed November 13, 2014.
Hands-free editing of text visible on the display pod will be a critical task for all HMD users (a sketch of the two-point selection flow follows the quotes):

  • “The method and system enables an end-user to select sections of text without requiring the use of a mouse cursor control input device.”
  • “The user moves the Pointer 411, using the changes to head orientation or voice commands described herein, to either the start or end character of the portion or passages of text (text selection) that he wishes to make subject to the current operation.”
  • “Once the user has properly positioned the Point 411, he issues a voice command, for example “Place Point One …”
  • “… user moves the pointer 411, using head motion and/or voice commands as described herein, to the corresponding counterpart or pairing position to the first Point …”
  • “The user issues a pertinent voice command (for example “Place Point Two”).”
  • “… HSC 100/module 9036 places the second Point 602 (P2) and highlights 604 the displayed text between the two points …”
  • “The system 100 may then present options 450 to carry out functions on the text selection …”
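
Pulling the quoted steps together: head motion positions a pointer over a character, “Place Point One” and “Place Point Two” anchor the endpoints, and the span between them becomes the selection. Below is a minimal Kotlin sketch that models the document as a flat string; the class and method names are mine, not the patent's.

    // Hypothetical sketch: two-point text selection via head motion and
    // "Place Point One" / "Place Point Two" voice commands.
    class TextSelector(private val text: String) {
        var pointer = 0   // character index steered by head motion
            private set
        private var p1: Int? = null
        private var p2: Int? = null

        fun onHeadMove(deltaChars: Int) {
            pointer = (pointer + deltaChars).coerceIn(0, text.length - 1)
        }

        fun onVoice(cmd: String) {
            when (cmd.lowercase()) {
                "place point one" -> p1 = pointer
                "place point two" -> p2 = pointer
            }
        }

        // The highlighted span between the two points, once both are placed;
        // options (cut, copy, etc.) would then operate on this selection.
        fun selection(): String? {
            val a = p1 ?: return null
            val b = p2 ?: return null
            return text.substring(minOf(a, b), maxOf(a, b) + 1)
        }
    }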


“TEXT EDITING WITH GESTURE CONTROL AND NATURAL SPEECH” – Patent application 20150187355, filed December 19, 2014.
A comprehensive filing regarding the use of speech to edit visible text, including “Text Dictation Post-Processing” to assist with spelling and grammar (a sketch follows the quotes):

  • “… techniques for automatically correcting dictated text using resources from the user’s own environment. Such techniques include the use of auto-correction algorithms (based on a standard language dictionary and/or the user’s personal dictionary.”
  • “… use of speech commands to edit the message as a whole. Such commands include global find and replace commands, for example.”
  • “… providing editing assistance regarding the transcription text.”
  • “… editing assistance may include suggested alternatives to one or more portions of the transcription text.”
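
Here is a minimal Kotlin sketch of the post-processing idea, assuming correction dictionaries keyed by misspelled word; the merge rule (the user's personal dictionary overrides the standard one) and the find-and-replace signature are my assumptions, not the filing's.

    // Hypothetical sketch: dictation post-processing that corrects transcribed
    // words against a standard dictionary merged with the user's personal one,
    // plus a global find-and-replace edit command.
    class DictationPostProcessor(
        standardDict: Map<String, String>,   // common misspelling -> correction
        personalDict: Map<String, String>    // the user's own vocabulary wins
    ) {
        private val corrections = standardDict + personalDict

        fun autoCorrect(transcript: String): String =
            transcript.split(" ").joinToString(" ") { corrections[it.lowercase()] ?: it }

        // e.g. spoken: "replace acme with Acme Corp"
        fun findAndReplace(text: String, target: String, replacement: String): String =
            text.replace(target, replacement)
    }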



Moving Objects On-Screen Hands Free
“HEAD-TRACKING BASED TECHNIQUE FOR MOVING ON-SCREEN OBJECTS ON HEAD MOUNTED DISPLAYS (HMD)” – Patent application 20150220142, filed January 30, 2015.
How does one move objects visible on the optical pod without using hands or a mouse? The filing describes a method for selecting and moving on-screen objects using head gestures and voice commands (sketched after the quotes):

  • “With head-tracking enabled, a pointer is displayed on screen … This pointer responds to head-tracking.”
  • “When the user moves the pointer so that it hovers over a displayed object or command, module 9036 (or instructions in memory 9120) displays to the user that a “grab” action is available. At this stage, the user can issue a voice command (for example, “grab object”) …”
  • “The user can then position the object in a new place, and can issue another voice command (for example “place object”), and the cursor control software 9036 fixes the object in the new location.”
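
A minimal Kotlin sketch of that grab/place flow follows. The exact-match hover test and the object map are simplified stand-ins for whatever hit-testing the cursor control software (module 9036 in the filing) actually performs.

    // Hypothetical sketch: "grab object" attaches the hovered object to the
    // head-tracked pointer; "place object" fixes it at its new position.
    data class Pos(val x: Int, val y: Int)

    class ObjectMover(private val objects: MutableMap<String, Pos>) {
        private var pointer = Pos(0, 0)
        private var grabbed: String? = null

        fun onHeadMove(p: Pos) {
            pointer = p
            grabbed?.let { objects[it] = p }   // a grabbed object tracks the pointer
        }

        // Exact position match stands in for a real hover hit-test.
        private fun hovered(): String? =
            objects.entries.firstOrNull { it.value == pointer }?.key

        fun onVoice(cmd: String) {
            when (cmd.lowercase()) {
                "grab object" -> grabbed = hovered()
                "place object" -> grabbed = null   // object stays where it was placed
            }
        }
    }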



Hands Free Navigation of Web Pages and Existing Legacy Applications

“SEARCHLIGHT NAVIGATION USING HEADTRACKER TO REVEAL HIDDEN OR EXTRA DOCUMENT DATA” – Granted patent 9,377,862, filed March 13, 2013.
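
No passages from this filing are quoted here, so the Kotlin sketch below is inferred from the title alone: a head-steered “searchlight” region that reveals hidden or extra document data (toy annotations, in this model) only where it currently points.

    import kotlin.math.abs

    // Hypothetical sketch: only annotations inside the head-steered
    // searchlight region become visible.
    data class Annotation(val line: Int, val note: String)

    class Searchlight(private val annotations: List<Annotation>, private val radius: Int) {
        var centerLine = 0   // document line the searchlight points at
            private set

        fun onHeadMove(deltaLines: Int) {
            centerLine += deltaLines
        }

        fun visibleAnnotations(): List<Annotation> =
            annotations.filter { abs(it.line - centerLine) <= radius }
    }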

“HEAD TRACKING BASED GESTURE CONTROL TECHNIQUES FOR HEAD MOUNTED DISPLAYS” – Patent application 20150138074, filed November 13, 2014.
Head gestures for dialogue box interaction (sketched below):

  • “Notification dialogue boxes can be acknowledged by head nodding or ticking movement in the user interface. Question dialog boxes can be assured by head nods or shakes in the user interface. Head swiping is also a recognizable form of user input through a head tracker of the HSC.”
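
A minimal Kotlin sketch of those interactions; the gesture and dialogue types are hypothetical simplifications, and the SWIPE value is included but unused since the quote only notes that head swipes are recognizable input.

    // Hypothetical sketch: a nod acknowledges a notification; a nod or shake
    // answers a question dialogue yes/no.
    enum class HeadMotion { NOD, SHAKE, SWIPE }

    sealed class Dialog {
        data class Notification(val text: String) : Dialog()
        data class Question(val text: String) : Dialog()
    }

    // Returns the user's answer, or null if the gesture doesn't apply.
    fun handleGesture(dialog: Dialog, motion: HeadMotion): Boolean? = when (dialog) {
        is Dialog.Notification -> if (motion == HeadMotion.NOD) true else null  // acknowledged
        is Dialog.Question -> when (motion) {
            HeadMotion.NOD -> true      // "yes"
            HeadMotion.SHAKE -> false   // "no"
            else -> null
        }
    }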



Conclusion
This review is by no means exhaustive, but it does demonstrate the comprehensive nature of Kopin's development of user interface elements and systems for head worn wearables.

Enterprise head worn devices with display pods will require UI elements that can be easily integrated with existing applications and operating systems in a user-friendly manner, supporting worker safety and reducing training requirements. What you see assembled here is just that: wearables UI middleware, known as the Golden-i OS, which sits on top of the Windows or Android operating systems to provide a complete hands-free UI layer for head worn devices.


Please note: All diagrams on this page have been reproduced from publicly available USPTO filings and links have been provided to the original files. Please review original files for complete information. Patent filings noted above are property of Kopin Corporation, all rights reserved.