The following is an interesting blog post about using neural networks and machine learning in conjunction with VR to explore better user interfaces.
To date, there still isn’t a really good way to handle text input while wearing a VR headset. This post covers some of the existing ideas and work in this area.
The first is from Google Labs, which explores a drum-like interface for keyboard entry. Read more about it here.
Here is an open sourced implementation building on that concept:
The main alternative to keyboards is some form of chorded text-input interface worn on or held in the hand:
“In the near future, you may not need to touch your phone, tablet, or keyboard when you want to type. That’s the concept behind the Tap Strap, an amazing wearable Bluetooth keyboard that converts finger movements into key presses, so you can tap out messages using any surface as a virtual keyboard.”
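Chorded input works by treating each finger as one bit of a binary “chord” and mapping each finger combination to a character. Here is a minimal sketch in Python; the chord table is invented for illustration and is not the Tap Strap’s actual encoding:

```python
# Illustrative chorded-input decoder: each finger tap is one bit, and a
# combination (chord) of fingers maps to a character. This mapping is
# made up for the example; real devices use their own encodings.
CHORDS = {
    0b00001: "a",  # thumb only
    0b00010: "e",  # index only
    0b00100: "i",  # middle only
    0b01000: "o",  # ring only
    0b10000: "u",  # pinky only
    0b00011: "n",  # thumb + index
    0b00110: "t",  # index + middle
}

def decode(taps):
    """Convert a sequence of 5-bit finger chords into text."""
    return "".join(CHORDS.get(chord, "?") for chord in taps)

print(decode([0b00010, 0b00001, 0b00110]))  # -> "eat"
```

With only five fingers there are 31 possible chords, so a real device layers in context (word prediction, mode switches) to cover the full keyboard.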
This last part shows an interesting concept of what a merging of coding and development could look like in VR:
MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference. It features a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside VR during the conference. Several additional participants wearing VR headsets can watch the interview from inside the virtual studio.
For reference, here is the spec sheet of one current high-end VR headset:
- Resolution: 5120 x 1440 (2560 x 1440 per eye)
- Field of view: 150–170º (depending on IPD and focus)
- Refresh rate: 60–90 Hz
- Optics: Custom-designed, exceptionally clear lenses
- Head-mount: Basic and military-grade
- High-quality 5K resolution displays
- Unique adjustable focus optical system for stunning image quality
- DisplayPort 1.2 video cable (up to 100 m)
- Built-in soundcard with audio jack
- Integrated microphone for easy communication
- Independently adjustable lens distance (inter-pupillary distance, or IPD)
- Multicast capability to connect multiple devices to one computer
- 8 programmable buttons
- Client support for VR laboratory setup and HW/SW integration
- Regular and military-grade “nightvision” head straps available
The following is a hands-on review of the device.
Google’s latest virtual reality app lets users build colorful 3D models in VR, and it’s out today for free on the Oculus Rift and HTC Vive. Google Blocks is supposed to be intuitive enough for newcomers to use, but full-featured enough to support making artistically compelling models, like the ones Google has collected in an online gallery. Users can export objects and make them viewable online, or they can put them in 3D scenes inside and outside virtual reality. On Google’s site, visitors can even spin objects around to create downloadable animated GIFs.
A company started by ex-Nokia/Microsoft engineers is looking to address the low pixel density of current VR headsets, which stems from the lenses and large field of view. This will be interesting to follow as it develops. The approach would be good for VR, but harder to apply to AR, where headsets need to be small and light.
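To see why pixel density is the bottleneck, divide a headset’s per-eye horizontal resolution by its horizontal field of view. A rough back-of-the-envelope calculation using the 5K spec sheet above (assuming a ~160° FOV, the midpoint of the quoted 150–170° range):

```python
# Rough angular pixel density (pixels per degree) for a 5K headset.
# Assumes 2560 px of horizontal resolution per eye spread across a
# ~160 degree horizontal FOV; this ignores lens distortion, which in
# practice concentrates pixels near the center of the view.
per_eye_width_px = 2560
horizontal_fov_deg = 160

ppd = per_eye_width_px / horizontal_fov_deg
print(f"{ppd:.1f} pixels per degree")  # -> 16.0
```

For comparison, roughly 60 pixels per degree is often cited as the limit of human visual acuity, so even a 5K panel stretched across a wide FOV delivers only about a quarter of that: wide-FOV headsets trade angular sharpness for immersion.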