Neural Nets + VR = Magic!

The following is an interesting blog post about using Neural Networks and Machine Learning in conjunction with VR to explore better user interfaces.

A Guide To Capturing and Preparing Photogrammetry For Unity

This manual grew out of a project made in collaboration with, and funded by, the Emerging Media Lab and the Department of Geography at the University of British Columbia, to provide VR field trips built using photogrammetry techniques and showcased on the HTC Vive running in Unity. The project was also partially funded by the UBC Teaching and Learning Enhancement Fund and BCcampus Open Education.

It is written by Andrew Yao-An Lee of Metanaut, an indie VR studio based in Vancouver, BC, Canada. Some examples in this manual are from the project's first field trip location, Prospect Point in Stanley Park, Vancouver.

This manual provides best practices and a complete workflow, from capturing images to creating a reasonably good photogrammetry model of an environment (as opposed to an object) for real-time viewing in Unity. Our methodology is about presenting the captured photogrammetry mesh as-is, not about remodeling a scene based on photogrammetry meshes.

The manual is by no means complete, nor does it try to define the best way to approach creating photogrammetry models. It does, however, present one workflow for achieving good photogrammetry results.

Build your own VR display: An introduction to VR display systems

The following 3-hour video is a good high-level primer, presented as a course for students on implementing a VR system. It assumes no prior knowledge of VR and covers a broad spectrum of the fundamental components of a complete system: the graphics pipeline, positional and rotational tracking for pose estimation, and 3D audio, with in-depth details on the math and equations involved.
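As a taste of the rotational-tracking math such a course covers, a common building block is integrating gyroscope angular velocity into an orientation quaternion. This is a minimal sketch of that idea (an assumption about a typical implementation, not code from the video):

```python
import math

def integrate_gyro(q, omega, dt):
    """Integrate angular velocity omega (rad/s, body frame) into an
    orientation quaternion q = (w, x, y, z) over a small timestep dt,
    then renormalize to keep q a valid rotation."""
    w, x, y, z = q
    wx, wy, wz = omega
    # Quaternion kinematics: q_dot = 0.5 * q * (0, omega)
    dw = 0.5 * (-x * wx - y * wy - z * wz)
    dx = 0.5 * ( w * wx + y * wz - z * wy)
    dy = 0.5 * ( w * wy - x * wz + z * wx)
    dz = 0.5 * ( w * wz + x * wy - y * wx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    norm = math.sqrt(w * w + x * x + y * y + z * z)
    return (w / norm, x / norm, y / norm, z / norm)
```

For example, spinning at π rad/s about the vertical axis for one second (in 1 ms steps) yields a quaternion close to (0, 0, 0, 1), i.e. a half-turn. Real headsets fuse this gyro estimate with accelerometer and camera data to correct drift.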

Understanding Projection and Distortion in Wide-FoV HMDs such as the Pimax 8K

Another informative and well-written blog post by Doc-Ok, who goes through the technical details of properly understanding distortion in the newly announced Pimax 8K headset and its 200-degree FOV.
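Doc-Ok's post covers the specifics; as general background, HMD lens distortion is commonly approximated with a polynomial radial model, which the renderer inverts to pre-distort the image. A minimal sketch of the forward model (the coefficients in the example are illustrative, not Pimax's actual calibration):

```python
def distort(x, y, k1, k2):
    """Apply a polynomial radial distortion model (Brown-Conrady style)
    to a normalized image point (x, y) centered on the lens axis.
    Positive coefficients push points outward (pincushion);
    negative coefficients pull them inward (barrel)."""
    r2 = x * x + y * y  # squared distance from the optical center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

Because the scale factor grows with r², distortion is negligible near the center and severe at the edges, which is exactly why very wide-FoV optics make correct distortion handling so much harder.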

Google and Song Exploder deconstruct songs in virtual reality

Inside Music is the latest project to come out of Google’s Creative Labs. It’s a kind of next-level equalizer in virtual reality that uses WebVR and comes from a partnership with the music-analyzing podcast Song Exploder.

Viewed either on the web or in a VR headset, Inside Music breaks a song down into its individual components, like drums, vocals, and guitar. Each component is represented by a sphere that you click to disable that part of the song, so you can hear what each part brings to the mix. The initial experiment lets you select songs from a collection of artists including Phoenix, Natalia Lafourcade, and Clipping.
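Under the interaction, a sphere per stem boils down to summing only the enabled stems into the output. A minimal sketch of that mixing logic (hypothetical stem names; the real app is built on WebVR and the browser's audio APIs):

```python
def mix(stems, enabled):
    """Sum the samples of the enabled stems into one mono mix.
    `stems` maps stem name -> list of samples; `enabled` is the set of
    stem names whose sphere is currently switched on."""
    length = max(len(samples) for samples in stems.values())
    out = [0.0] * length
    for name, samples in stems.items():
        if name not in enabled:
            continue  # clicking a sphere removes this part from the mix
        for i, value in enumerate(samples):
            out[i] += value
    return out
```

Toggling a name in and out of `enabled` is the equivalent of clicking a sphere: the remaining stems keep playing, so you hear what the muted part was contributing.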

PROJECTIONS, Episode 23: Inside-Out Tracking with Single Camera

One of the next frontiers for virtual and augmented reality headsets is inside-out positional tracking, a problem that's yet to be fully solved. We visit the offices of Occipital, makers of the Structure Sensor, to get a demo of their inside-out tracking system that works with a single camera, and chat with Occipital's CEO about their approach to this computer vision challenge.