Oculus Research, the company’s R&D division, recently published a paper that goes deeper into its eye-tracking-assisted, multi-focal display tech, detailing the creation of something it dubs a “perceptual” testbed.
Current consumer VR headsets universally present the user with a single, fixed-focus display plane, which creates what’s known in the field as the vergence-accommodation conflict: because the display can’t provide realistic focus cues, the user can’t focus naturally on objects at different depths, making for a less realistic and less comfortable viewing experience. You can read more about the vergence-accommodation conflict in our article on Oculus Research’s experimental focal surface display.
Read more: https://www.roadtovr.com/oculus-research-reveals-new-multi-focal-display-tech/
Social is going to be one of the big draws for VR experiences once enough people have headsets. The following three videos show the first glimpses of what is to come: sitting down to watch your favorite sports with other fans, or getting together with friends and family for movies and TV shows.
There’s a lot of excitement about having the full augmented reality experience with AR glasses, but the reality right now is that these headsets are too big and bulky, trying to do everything from depth mapping and object detection to taking pictures and head tracking.
Intel’s concept approach below goes the opposite way, focusing on keeping it simple, and in many ways it’s pretty compelling, provided it works in real life as well as they portray.
Facebook’s Quill tool now lets you animate in VR.
Facebook’s own VR artistry experience, known as Quill, just got a major update that brings simple but powerful animation tools to the platform.
Quill update 1.4 brings “animated paint layers” and an “animation clip panel,” as well as “animated brush settings to control how strokes are drawn while clips are playing.”
Read more: https://vrscout.com/news/quill-animation-vr-app/#
Another high-quality technical post from Doc-Ok, working through a headset’s actual pixel resolution while accounting for the distortion and FOV of the lens. Read more here: http://doc-ok.org/?p=1677
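Doc-Ok’s post is worth reading in full; as a rough illustration of the kind of accounting involved, the simplest headline figure is average pixels per degree (panel pixels divided by the lens’s field of view), which lens distortion then redistributes non-uniformly across that field. A minimal sketch, with assumed Rift-like numbers that are not taken from the post:

```python
# Hedged sketch: average angular pixel density, ignoring lens distortion.
# Real lenses pack more pixels per degree at the center than the edges,
# which is exactly the effect Doc-Ok's analysis accounts for.

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average pixels per degree across the horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Illustrative (assumed) numbers: 1080 horizontal pixels per eye, ~90 deg FOV.
print(round(pixels_per_degree(1080, 90.0), 1))  # 12.0
```

Compare that to roughly 60 pixels per degree for 20/20 human visual acuity, and it’s clear why per-pixel detail still matters so much in current headsets.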
This 2016 VRDC talk covers the entire process, from start to finish, of making Epic Games’ Bullet Train VR Demo. Epic’s Nick Donaldson and Nick Whiting cover design considerations surrounding the user experience of adding interaction to traditionally passive experiences, and discuss where they had to diverge from their original design choices in order to match the players’ expectations of the world they interact with.