There’s a lot of excitement about getting the full augmented reality experience from AR glasses, but the reality right now is that these headsets are too big and bulky, trying to do everything from depth mapping and object detection to taking pictures and head tracking.
Intel’s concept approach below goes the opposite way, focusing on keeping it simple, and in many ways it’s pretty compelling, if it actually works in real life as well as they portray.
Qualcomm today revealed a new reference design for a Snapdragon 845 VR headset. The headset is built on the similarly named mobile Snapdragon 845 platform that the company announced last month, which can be used for both VR and AR.
The Snapdragon 845 headset drives two 1024 x 1152 pixel displays at 120 frames per second, which is subpar compared to many existing headsets. (HTC’s Vive Pro headset offers 1400 x 1600 resolution per eye, for instance.)
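To make the resolution comparison concrete, the per-eye pixel counts implied by the figures above can be computed directly:

```python
# Per-eye pixel counts for the two headsets mentioned above.
snapdragon_845_eye = 1024 * 1152   # Qualcomm reference design, per eye
vive_pro_eye = 1400 * 1600         # HTC Vive Pro, per eye

print(snapdragon_845_eye)                            # 1179648
print(vive_pro_eye)                                  # 2240000
print(round(vive_pro_eye / snapdragon_845_eye, 2))   # 1.9
```

So the Vive Pro pushes roughly 1.9x as many pixels per eye, though the Qualcomm design's 120 Hz refresh rate is higher than most current headsets.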
Read more: link
The following is an interesting blog post about using neural networks and machine learning in conjunction with VR to explore better user interfaces.
Read more: https://blog.usejournal.com/neural-nets-vr-magic-3b066538aa5d
To date, there still isn’t a really good way to handle text input while using a VR headset. This blog post, however, covers some of the existing ideas and work in this area.
The first is from Google Labs, which uses a drum-like interface for keyboard entry. Read more about this here.
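As a rough illustration of the drumming idea (not the actual Google Labs implementation), a key only fires when the mallet tip crosses the key's top surface while moving downward fast enough, which filters out accidental lateral brushes. A minimal sketch in Python; all names and thresholds here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Key:
    char: str
    x_min: float   # left edge of the key, in meters
    x_max: float   # right edge of the key, in meters
    top_y: float   # height of the key's strike surface, in meters

def hit_key(key, prev_y, curr_y, x, min_downward_speed=0.5, dt=1/90):
    """Return True if the mallet tip crossed the key's top surface this
    frame while moving downward fast enough to count as a strike."""
    inside = key.x_min <= x <= key.x_max
    crossed = prev_y > key.top_y >= curr_y        # passed through the top
    speed = (prev_y - curr_y) / dt                # downward speed in m/s
    return inside and crossed and speed >= min_downward_speed

# Example: mallet drops from 1.05 m to 0.95 m through a key top at 1.0 m.
k = Key("a", x_min=-0.05, x_max=0.05, top_y=1.0)
print(hit_key(k, prev_y=1.05, curr_y=0.95, x=0.0))  # True
print(hit_key(k, prev_y=0.95, curr_y=1.05, x=0.0))  # False (moving up)
```

The velocity gate is the interesting design choice: it is what makes the interaction feel like drumming rather than hovering over a flat keyboard.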
Here is an open-source implementation building on that concept:
The other approach, rather than a keyboard, is to use some sort of chorded text input device that you hold in your hand:
“In the near future, you may not need to touch your phone, tablet, or keyboard when you want to type. That’s the concept behind the Tap Strap, an amazing wearable Bluetooth keyboard that converts finger movements into key presses, so you can tap out messages using any surface as a virtual keyboard.”
Read more: https://www.digitaltrends.com/mobile/tap-strap-wearable-keyboard-news/
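The core idea behind a chorded keyboard like the Tap Strap is that each combination of fingers tapped simultaneously maps to one character. A toy sketch of that mechanism in Python; the chord table is invented for illustration and is not Tap's actual alphabet:

```python
# Hypothetical chord table: frozenset of finger indices -> character.
# (Tap's real mapping differs; this only demonstrates the mechanism.)
CHORDS = {
    frozenset([0]): "a",        # thumb alone
    frozenset([1]): "e",        # index finger alone
    frozenset([0, 1]): "n",     # thumb + index together
    frozenset([1, 2, 3]): "t",  # index + middle + ring together
}

def decode(fingers):
    """Decode one tap: `fingers` is the set of fingers that touched
    the surface at the same time. Unknown chords yield '?'."""
    return CHORDS.get(frozenset(fingers), "?")

# A sequence of taps decodes to a string of characters.
print("".join(decode(tap) for tap in [[0], [1], [0, 1]]))  # aen
```

Because a chord is a set rather than a sequence, the order in which the fingers land within one tap doesn't matter, which is what makes this practical at speed on any surface.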
This last part shows an interesting concept of what coding and development could look like while in VR:
MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference, which features a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside VR during the conference. Several additional participants wearing VR headsets can watch the interview from inside the virtual studio.
Read more: https://www.roadtovr.com/siggraph-2017-meetmike-sets-impressive-new-bar-for-real-time-virtual-human-visuals-in-vr/