Monday, December 23, 2013

What I'm looking for at CES

If you like to experience sensory overload, go to CES: 3,200 exhibitors, hundreds of large-screen TVs hanging from the ceiling, celebrity appearances, flashy cars, and of course Las Vegas outside the convention center.

Here's what I'm looking to see at CES:
  • The latest and greatest in portable (man-worn) display technologies. I would love to see which are ready for prime time within a Sensics product.
  • All kinds of sensors: motion sensors, hand and finger sensors, biometric sensors, eye trackers, full body sensing, proximity sensors, embedded cameras for augmented reality. In short, anything that can be reasonably combined with a VR goggle to create the next-generation experience.
  • Companies that have VR/AR ideas, concepts and unique specifications but need help in refining these specifications and then building high-performance and affordable products around them. In the last year, my company has done several such projects and has the designs, IP and a decade of experience to help. High-end gaming goggle? Unique display or signal processing requirements? We can help.
  • Last, a chance to catch up with friends and professional acquaintances. It's not often that so many of my network are in the same city at the same time.
Whether you are 'buying' or 'selling' in these categories, drop me a note and perhaps we can meet at the show.

Happy Holidays to all. Rest well and get ready for CES 2014!

Sunday, December 15, 2013

The VR goggle as a Sensory and Computing platform

While there is still a lot of work to do on display technologies and optics to get the best possible image in front of the eyes, the real promise of virtual reality goggles is to serve as a portable sensory and computing platform.

Goggles are becoming a platform for several reasons:

  • They are a physical platform. Once you securely position goggles on the head, you now have a physical base to attach additional sensors and peripherals: cameras, trackers, depth sensors and more.
  • Portable computing is becoming ever more powerful, creating an incentive to process and analyze sensory data on the goggles rather than transmitting large amounts of information to some computing base. Furthermore, a key part of the value of goggles is their portability, so the ability to process 'locally' - on the goggle - contributes to realizing this value.
  • As goggles become increasingly immersive, the value of sensors increases as a way to tie the experience into physical objects in the immediate surroundings, as well as connect the actions and context of the user to what is happening 'inside' the display.
One could look at the following diagram - courtesy of Sensics - as a good illustration of what these sensors might be:



One could imagine several types of sensors feeding into the goggle:
  • Orientation and position sensors - whether for the head, limbs or objects such as a gaming sword
  • Cameras that might provide visible, IR or depth map information
  • Positional sensors such as GPS or indoor location sensors
  • Eye tracking sensors to understand gaze direction and potentially serve as a user interface
  • Biometric sensors such as heart rate, perspiration, blood pressure. An eye tracker can also provide pupil size which is another biometric input.
  • and more
One would then need to turn this sensor data into useful information: for instance, turn the position and orientation of various body parts into an understanding of gestures; turn a depth map into detection of nearby obstacles; or detect faces and markers in a video image.
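As a toy sketch of one such transformation (the 0.5-meter threshold and the function name are illustrative assumptions, not from any Sensics product), here is how a depth map might be turned into a nearby-obstacle report:

```python
# Toy sketch: turn a depth map (readings in meters) into a list of
# nearby-obstacle cells. Threshold and names are hypothetical.

def find_obstacles(depth_map, threshold_m=0.5):
    """Return (row, col) cells whose depth reading is closer than threshold_m."""
    obstacles = []
    for row, scanline in enumerate(depth_map):
        for col, depth in enumerate(scanline):
            if depth < threshold_m:
                obstacles.append((row, col))
    return obstacles

# A tiny 3x3 depth map: part of the scene is very close to the user.
depth_map = [
    [2.1, 2.0, 1.9],
    [1.8, 0.4, 0.3],
    [1.7, 0.4, 1.6],
]
print(find_obstacles(depth_map))  # → [(1, 1), (1, 2), (2, 1)]
```

A real implementation would of course operate on dense camera-resolution depth frames and cluster neighboring cells into objects, but the principle - raw readings in, actionable information out - is the same.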

As we have discussed before, a virtual reality abstraction layer is going to speed up this process: it frees those who turn data into information from worrying about the particular formats and features of individual sensors, letting them focus on a category of sensors instead.
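A minimal sketch of what such an abstraction layer could look like (all class names, readings and the pitch threshold are invented for illustration; this is not the actual API of any existing middleware):

```python
# Sketch of a sensor abstraction layer: the application codes against a
# sensor *category* (orientation trackers), never a specific device.
import math
from abc import ABC, abstractmethod

class OrientationTracker(ABC):
    """Category interface: any head/limb/object orientation sensor."""
    @abstractmethod
    def get_orientation(self):
        """Return (yaw, pitch, roll) in degrees."""

class VendorAHeadTracker(OrientationTracker):
    # Hypothetical device that natively reports radians; this adapter
    # hides that vendor-specific detail behind the category interface.
    def _read_raw_radians(self):
        return (0.0, 0.7853981633974483, 0.0)  # stand-in for real device I/O

    def get_orientation(self):
        return tuple(math.degrees(v) for v in self._read_raw_radians())

def is_looking_up(tracker: OrientationTracker) -> bool:
    # Application logic sees only the category, not the vendor format.
    _, pitch, _ = tracker.get_orientation()
    return pitch > 30.0

print(is_looking_up(VendorAHeadTracker()))  # → True (pitch ≈ 45°)
```

Swapping in a tracker from another vendor would then only mean writing another small adapter class; the application code above would not change.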

There are several places where this transformation could happen: on the goggle, with the help of an embedded processor (as shown in the above diagram); near the goggle, on a device such as a tablet or powerful smartphone; or at a stationary device such as a desktop PC or gaming console. Performing this transformation in or near the goggle allows running the application software in or near the goggle, leading to a truly portable solution.

What is the best place to do so? As Rufus Miles - and others - have said, "Where you stand depends on where you sit". If you are a PC vendor that is married to everything happening on the PC, you shudder at the notion of the goggle being an independent computing platform. If you are a goggle vendor, you might find that this architecture opens up additional opportunities that do not exist when you are physically and logically tied to a stationary platform. If you are a vendor making phones or tablets, this might allow you to position the phone as a next-generation portable gaming platform.

So, beyond innovations in displays and optics, I think we will see lots of interesting sensors and sensor fusion applications in 2014.

What do you think? Let me know.

Saturday, December 7, 2013

Not just micro displays: adding flat panel products to the mix

After about a decade of making virtual reality goggles and other near-eye devices based on micro displays, my company started demonstrating and shipping goggles that are based on flat-panel displays such as those found in smartphones. Many who visited the I/ITSEC training show in Orlando had a chance to experience some of our new offerings, and we were very happy with the feedback we received.

It was not a difficult decision. Given the increasing resolution, diverse supplier base and lower cost of flat panel displays (as opposed to OLED micro displays), it made sense to start applying our innovation and our expertise in goggles to this new display technology.

To date, we have exclusively used OLEDs from eMagin. There are many good folks at eMagin, and they have nice products, but given their well-documented delivery challenges and new designs made possible by flat-panel displays, they won't be our exclusive display supplier anymore.

Where and when is it best to use OLED micro displays?

  • Where physical space is limited, such as when building a simulated rifle scope
  • When the contrast and response time of OLEDs are a must (that is, until OLED flat panels become widely available for goggle use)
  • Where harsh environmental conditions - especially temperature - are required, such as in our ruggedized HMD for training
  • Where high pixel density is required
  • Where being able to purchase replacement parts for many years is important
  • When low power consumption is critical

Where and when is it best to use flat panel displays?
  • Where wide field of view is particularly important
  • When it is important to have supplier diversity
  • When cost is a major factor
No longer married to one display technology or another, we can now choose the best one for each new product.