I’ve flown home from Intel’s Developer Forum, and one of the things that really got me thinking was a chat I had with Achin Bhowmik, Ph.D., who is likely one of the smartest guys I know. He runs the group at Intel that owns the RealSense camera and is driving its Merged Reality effort.
This effort is in competition with Microsoft’s Mixed Reality effort, even though both companies are actually collaborating on what could be something far closer to teleportation than any video conferencing or telepresence offering currently on the market. Microsoft calls its version of this concept Holoportation. While Holoportation only works for one party at the moment, what Intel is talking about could work for both.
Let’s contrast Mixed Reality and Merged Reality and talk about why Intel’s Merged Reality effort could be a better solution for keeping our butts out of airplane seats.
Merged vs. Mixed Reality
Microsoft is clearly much farther along with its solution than Intel is, and the Microsoft solution fits better into where technology is today because it potentially uses less processing power, which makes the headset smaller, lighter and potentially more power efficient. With Microsoft HoloLens and Mixed Reality, the headset scans the room and renders a virtual image in it. However, the result looks “ghostly.” While that may be perfect for the industrial and scientific uses the product currently targets, it isn’t ideal when you actually want to make the person using the device feel transported someplace else.
Intel’s approach is far more powerful because it not only scans the environment, it can also render it. This allows you to alter the environment the user sees and potentially create a seamless weave between what is virtual and what is real, because everything is effectively virtual. This means that, like Mixed Reality, you can interact with physical objects in your space (and, using some kind of remote robot, interact with things in the remote space), but you’ll feel more like you are in the remote location. And because that location is rendered, you can alter it so that it appears to be something else.
Teleportation
The problem with Holoportation is that you can’t get rid of the image of the goggles. In the Holoportation video, the presenter’s daughter, who is remote, doesn’t see her father because she isn’t wearing a HoloLens; she interacts with him as a disembodied voice. The father gets a more complete experience, but he sees his daughter as a ghost, which could be pretty cool on Halloween but not that realistic the rest of the time. In addition, you have to rig both rooms with cameras, as you do with VR, so the two parties can somewhat interact, but it really isn’t real for either of them.
With Intel’s Merged Reality, you can render out the headset as part of the process. While you are at it, you can also render out your clothing, remove age from your features, render makeup, even render different hair colors and styles, and it would all look very realistic. You could imagine having women’s clothing automatically adjusted to region-specific requirements without any of the women ever having to change. You could even show up to a virtual meeting naked, though, if there were a glitch, that meeting could take an unfortunate turn. (I wonder if you’d be told at the time or after the meeting: “By the way, Sally, you do realize that if you are going to conference in naked and give a presentation, you need to actually select the ‘render clothing’ option, right?”)
Wrapping Up: What’s in Store
I think we are getting very close to something almost indistinguishable from teleportation, but without the nasty issue of what happens if you actually vaporize someone and recreate them someplace else. Merged Reality could finally make it possible to travel to another part of the world instantly and feel like we are actually there. And because Intel’s solution is self-contained (though it will likely need some heavy cloud-based services to work), it would be far easier to set up.
My expectation is that we are looking at something that won’t emerge at acceptable resolutions and costs until around 2025, but when it does, it could actually change how we interact globally and put long-distance business travel, and even some vacation travel, on the road to obsolescence.
Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+