Intel unveiled details of its Project Alloy headset at the Intel Developer Forum 2016. What makes this headset interesting is that all the components, including the CPU and graphics processing, are incorporated into the headset itself, making it completely self-contained and wireless.
Everything needed to track your position and hands is implemented on the headset, with no external setup required. The tradeoff, I think, is that hand tracking and movement may suffer from latency and accuracy issues. For hands, this likely translates into lag and a limited ability to use them freely, since the device can only track what is visible from the headset. In the video, for example, you can see the inaccuracy when trying to pull the lever in the demo, and hands can only be used when they are directly in front of you. I'd be surprised if they could recreate a bowling experience, where you'd need to swing your hand behind you to throw the ball.
The other potential impact of not having external sensors is that your physical location in the real world is decoupled from your position in the virtual world. After moving around a bit, that seems like a recipe for running into walls.
Don't get me wrong, I'm impressed that they were able to fit all of this into the headset and make it wireless, but I think they missed the mark on being a good, usable product from the consumer standpoint versus a technology showcase.
If they combined the wireless headset capabilities, with all the CPU and GPU computation on the headset, with external tracking such as an IR Lighthouse system, that would be a killer combo.