“HEAD TRACKING BASED GESTURE CONTROL TECHNIQUES FOR HEAD MOUNTED DISPLAYS” – Patent application 20150138074, filed November 13, 2014.
Head gestures for dialogue box interaction:
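To make the dialogue-box idea concrete, here is a minimal sketch of how a short window of head-angle samples might be classified as a nod (accept) or a shake (dismiss). The axis names, thresholds, and command mapping are my own illustrative assumptions, not Kopin's claimed implementation:

```python
# Illustrative sketch: classify a short head-motion window as a "nod"
# (accept) or "shake" (dismiss) for hands-free dialogue box interaction.
# Thresholds and return values are assumptions for demonstration only.

def classify_dialog_gesture(pitch_deg, yaw_deg, threshold=10.0):
    """Return 'accept', 'dismiss', or None from head-angle samples (degrees).

    A nod shows a large pitch swing with little yaw; a shake is the reverse.
    """
    pitch_range = max(pitch_deg) - min(pitch_deg)
    yaw_range = max(yaw_deg) - min(yaw_deg)
    if pitch_range >= threshold and pitch_range > 2 * yaw_range:
        return "accept"   # vertical nod -> confirm the dialog
    if yaw_range >= threshold and yaw_range > 2 * pitch_range:
        return "dismiss"  # horizontal shake -> cancel the dialog
    return None           # ambiguous motion: leave the dialog open

# A clear nod: pitch oscillates while yaw stays nearly still.
nod = classify_dialog_gesture([0, -12, 3, -11, 1], [0, 1, -1, 0, 1])
```

Requiring one axis to dominate the other (the `2 *` factor) is a cheap way to reject diagonal head motion that is neither gesture.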
This review is by no means exhaustive, but it does demonstrate the comprehensive nature of Kopin's development of head-worn wearable user interface elements and systems.
Enterprise head-worn devices with display pods will require UI elements that integrate easily with existing applications and operating systems in a user-friendly manner, supporting worker safety and reducing training requirements. What you see assembled here is just that: wearables UI middleware, known as the Golden-i OS, that sits on top of the Windows or Android operating systems to provide a complete hands-free UI layer for head-worn devices.
Please note: All diagrams on this page have been reproduced from publicly available USPTO filings and links have been provided to the original files. Please review original files for complete information. Patent filings noted above are property of Kopin Corporation, all rights reserved.
User Interface Concepts For AR Head Worn Wearable Devices
Head worn devices will re-define how we access and manage information in the post-PC wearables era. Initially, I expect head worn wearables to achieve wide adoption in the enterprise and first responder markets. The list of advantages that access to hands-free, work-related information provides is vast. At the top of this list is the one advantage every industry values above all else: safety. At the following location, you will find a description of the head worn wearables display architecture called Video See-Through Augmented Reality that I believe will dominate enterprise adoption starting in 2017.
However, the head worn form factor will require a very different set of User Interface elements than we are used to with PCs, laptops, or tablets. The very advantage a head worn device provides to enable safer work nullifies traditional mouse and keyboard input methods: head worn wearable devices must be hands-free. With that in mind, let's explore some leading UI concepts Kopin has developed as part of their Golden-i HMD platform. Below is a graphical representation of Golden-i from granted patent D714,786S, which I have marked up to highlight design advantages.
Here are full color renderings of the Golden-i 3.8M.
So now that we have established the basic form factor of the Golden-i Video See-Through AR device, let's review how a user would interact with such a device, drawing on the publicly available intellectual property Kopin has developed, accessible via its patent filings.
Hands Free User Interface for Head Mounted Displays
“REMOTE CONTROL OF HOST APPLICATION USING MOTION AND VOICE COMMAND” - Granted patent 9,235,262, filed May 5, 2010.
“HEADSET COMPUTER WITH HEADTRACKING INPUT USED FOR INERTIAL CONTROL” – Granted patent 9,134,793, filed March 13, 2013.
Additional UI functionality added to foundational patent 9,235,262:
Increase to full 6-axis motion tracking for head gesture input.
Implementation of labels overlaid on existing applications to enable voice control of existing applications:
Use of head gestures and voice inputs to scroll through lists and make selections:
Implementation of multiple desktop screens and methods the user can use to switch view between desktop screens.
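The overlaid-label concept above can be sketched in a few lines: every actionable element in a legacy application is assigned a spoken label, so saying the label activates the element without the application itself being voice-aware. The element names, command phrasing, and callback shape below are my own illustrative assumptions, not the patent's claimed method:

```python
# Illustrative sketch of voice-label overlays on an existing application:
# each clickable element gets a spoken label ("select 1", "select 2", ...)
# and a recognized utterance dispatches the matching action.

def build_label_overlay(elements):
    """Map spoken labels to UI actions for a list of (name, action) pairs."""
    return {f"select {i}": action
            for i, (name, action) in enumerate(elements, start=1)}

def handle_utterance(overlay, utterance):
    """Dispatch a recognized voice command; return True if it was handled."""
    action = overlay.get(utterance.strip().lower())
    if action is None:
        return False
    action()
    return True

clicked = []
elements = [("Open", lambda: clicked.append("Open")),
            ("Save", lambda: clicked.append("Save"))]
overlay = build_label_overlay(elements)
handle_utterance(overlay, "Select 2")   # activates the second element
```

Numbered labels keep the recognizer's vocabulary tiny and fixed, which matters for accuracy in noisy industrial environments.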
Selecting and Editing Text Via Voice and Gesture
“TEXT SELECTION USING HMD HEAD-TRACKER AND VOICE COMMAND”– Granted patent 9,383,816, filed November 13, 2014.
Hands-free editing of text visible on the display pod will be a critical task for all HMD users:
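As a rough illustration of the head-tracker-plus-voice selection idea, here is a small sketch in which head motion drives a character cursor and voice commands mark the span's start and end. The command words and the cursor model are my assumptions for demonstration, not the granted patent's claims:

```python
# Illustrative sketch of hands-free text selection: the head tracker moves
# a character cursor, and voice commands anchor and complete the selection.

class TextSelector:
    def __init__(self, text):
        self.text = text
        self.cursor = 0      # character index driven by head tracking
        self.anchor = None   # set by the "select" voice command

    def move_cursor(self, delta):
        """Head yaw mapped to a character-index delta, clamped to the text."""
        self.cursor = max(0, min(len(self.text), self.cursor + delta))

    def voice(self, command):
        """Handle a recognized command; return the selected span on 'end select'."""
        if command == "select":
            self.anchor = self.cursor
        elif command == "end select" and self.anchor is not None:
            lo, hi = sorted((self.anchor, self.cursor))
            self.anchor = None
            return self.text[lo:hi]
        return None

sel = TextSelector("hands-free head worn display")
sel.move_cursor(11)              # head turn walks the cursor to index 11
sel.voice("select")              # anchor the start of the selection
sel.move_cursor(9)               # continue to index 20
span = sel.voice("end select")   # returns the selected text
```

Sorting anchor and cursor before slicing lets the user sweep the head in either direction and still get a valid span.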
“TEXT EDITING WITH GESTURE CONTROL AND NATURAL SPEECH” – Patent application 20150187355 filed December 19, 2014.
Comprehensive filing regarding the use of speech to edit visible text including “Text Dictation Post-Processing” to assist with spelling and grammar:
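To give a feel for what dictation post-processing involves, here is a minimal sketch that turns spoken punctuation words into symbols and re-capitalizes sentence starts. The token set and rules are illustrative assumptions on my part; the filing's actual processing is far more comprehensive:

```python
# Illustrative sketch of dictation post-processing: raw recognizer output
# is cleaned up by substituting spoken punctuation and fixing capitals.
import re

SPOKEN_PUNCTUATION = {"period": ".", "comma": ",", "question mark": "?"}

def post_process_dictation(raw):
    """Convert raw dictated text into punctuated, capitalized prose."""
    text = raw.lower()
    # Replace spoken punctuation words with symbols, absorbing the space
    # that precedes them ("valve period" -> "valve.").
    for word, symbol in SPOKEN_PUNCTUATION.items():
        text = re.sub(r"\s*\b" + word + r"\b", symbol, text)
    # Capitalize the first letter of each sentence.
    parts = re.split(r"([.?!]\s*)", text)
    return "".join(p[:1].upper() + p[1:] for p in parts)

out = post_process_dictation("check the valve period is it sealed question mark")
```

In practice this stage would also handle homophones, domain vocabulary, and grammar, which is where the filing's "Text Dictation Post-Processing" goes well beyond this toy.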
Moving Objects On-Screen Hands Free
“HEAD-TRACKING BASED TECHNIQUE FOR MOVING ON-SCREEN OBJECTS ON HEAD MOUNTED DISPLAYS (HMD)” – Patent application 20150220142 filed January 30, 2015.
How does one move objects visible on the optical pod without using hands or a mouse pointing device? Herein is described a method using head-gesture selection and movement of on-screen objects:
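A minimal sketch of that grab-drag-drop flow might look like the following: a voice command grabs the object nearest the head-driven pointer, subsequent head motion drags it, and a second command drops it. The object names, command words, and nearest-object selection rule are illustrative assumptions, not the application's claimed technique:

```python
# Illustrative sketch of head-gesture object movement on an HMD:
# "move" grabs the object nearest the head pointer, head motion drags it,
# and "drop" releases it at the current pointer position.

class ObjectMover:
    def __init__(self, objects):
        self.objects = objects      # name -> (x, y) screen position
        self.pointer = (0, 0)       # head-tracked pointer position
        self.grabbed = None

    def point_at(self, x, y):
        """Update the head pointer; a grabbed object follows it."""
        self.pointer = (x, y)
        if self.grabbed:
            self.objects[self.grabbed] = (x, y)

    def voice(self, command):
        if command == "move":       # grab the object nearest the pointer
            self.grabbed = min(
                self.objects,
                key=lambda n: (self.objects[n][0] - self.pointer[0]) ** 2
                            + (self.objects[n][1] - self.pointer[1]) ** 2)
        elif command == "drop":
            self.grabbed = None

mover = ObjectMover({"gauge": (100, 200), "chart": (400, 50)})
mover.point_at(110, 190)   # look near the gauge
mover.voice("move")        # grab it
mover.point_at(300, 300)   # head motion drags the gauge
mover.voice("drop")        # release it at the new position
```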
Hands Free Navigation of Web Pages and Existing Legacy Applications
“SEARCHLIGHT NAVIGATION USING HEADTRACKER TO REVEAL HIDDEN OR EXTRA DOCUMENT DATA” – Granted patent 9,377,862, filed March 13, 2013.
This foundational patent covers many aspects of interacting with the HMD using head gestures, hand gestures, and voice input.
On-screen cursor control via head and hand gestures as well as pan and zoom for large virtual workspaces:
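The pan-and-zoom idea for large virtual workspaces can be sketched simply: head yaw and pitch offset a display-sized viewport across a virtual desktop larger than the physical display, clamped at the edges. The workspace dimensions and gain below are illustrative assumptions, not figures from the patent:

```python
# Illustrative sketch of head-driven panning over a large virtual
# workspace: yaw/pitch angles shift a display-sized viewport across a
# bigger virtual desktop, clamped so it never leaves the workspace.

def viewport_origin(yaw_deg, pitch_deg,
                    workspace=(3840, 2160), display=(1280, 720),
                    gain=20.0):
    """Return the top-left corner of the viewport inside the workspace."""
    # Center the viewport at zero head angle, then shift by angle * gain
    # (pixels per degree of head rotation).
    cx = (workspace[0] - display[0]) / 2 + yaw_deg * gain
    cy = (workspace[1] - display[1]) / 2 + pitch_deg * gain
    # Clamp so the viewport stays fully inside the virtual workspace.
    x = max(0, min(workspace[0] - display[0], cx))
    y = max(0, min(workspace[1] - display[1], cy))
    return int(x), int(y)

center = viewport_origin(0, 0)     # centered view at zero head angle
right = viewport_origin(90, 0)     # hard right turn clamps at the edge
```

Zoom would follow the same pattern by scaling the `display` rectangle before clamping; the clamp is what keeps the user from panning into empty space.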
“ADVANCED REMOTE CONTROL OF HOST APPLICATIONS USING MOTION AND VOICE COMMANDS” – Granted patent 9,122,307, filed September 16, 2011.