I’m at IDF, the Intel Developer Forum, this week and, like most events of this type, it started with some upbeat music. Unlike most such events, however, the music was created with virtual instruments: aside from two physical turntables, every instrument used was virtual. The two musicians were essentially playing in the air, with Intel’s RealSense technology translating their movements into what was a rather impressive concert.
This set the stage for what was a fascinating keynote by Brian Krzanich, Intel’s CEO. Apparently Intel plans to change the world as we see it: to give users the power to be anywhere, to be anyone, and to both experience and change the story.
The idea is that you can merge reality with virtual reality, something Intel is calling mixed reality. This is very similar to what Microsoft has been calling holographics.
As Krzanich showcased during the keynote, VR is redefining how we interact with computers. He outlined what VR requires today: cameras, sensors, a headset, and a dedicated VR room, which is impractical for most of us. In response, Intel introduced Project Alloy, a self-contained headset that needs no external cameras or dedicated room; it is a direct competitor to Microsoft’s HoloLens.
You don’t need hand sensors or gloves either; the RealSense cameras in the headset show and track your own hands. You can also see obstacles in the real world, though they do tend to detract from the experience. It’s an attempt to address a very real problem with existing VR setups -- tripping over or running into stuff in the room.
The headset was demonstrated by a user sculpting a virtual object on a potter’s wheel. He used a dollar bill from his pocket as the sculpting tool, blending the real with the virtual. While the presentation was rough, you could see the potential of the project.
Then something interesting happened. Microsoft came on stage to talk about HoloLens and how the two companies are going to work together to further this experience. Apparently, Windows 10 will get an update that will allow it to run both 2D and 3D holographic applications, with one platform to run VR, AR and MR (Mixed Reality). The implication is that this could be the beginning of the end for monitors. In the future, your mixed reality headset will become every monitor, TV, or device display you’ll ever need.
Intel also showcased its 10-core Broadwell Extreme Edition processor, which allows you to edit and create in a virtual world in real time.
One of the most compelling aspects of this was being able to virtually attend a concert or sports event and view it from any angle, because Intel’s technology allows you to scan the entire event from multiple angles and stitch together a full 3D experience.
The Intel Developer Forum is, of course, an Intel event, so it wouldn’t be the same without a processor announcement. Intel showcased PCs running its next generation technology, the 7th generation, announcing that the parts were shipping and that OEMs would have hardware in market in the fall. The demonstration included 4K video editing and high-performance gaming on thin laptops that had been configured with the new processor.
RealSense is Intel’s platform for implementing gesture-based human-computer interaction, which plays into ever more intelligent machines like robots and drones. Intel showcased the Yuneec Typhoon drone, which uses RealSense technology. The drone (selling for $1,899 at the show) can avoid obstacles automatically and is the first broad-market drone to showcase this kind of limited intelligence.
Intel is announcing its new Aero Platform for UAVs, which is a complete drone system on a board. The board will be available soon for $399. Intel will also have its own complete drone developer platform due out before the end of the year.
Finally, Intel showcased Euclid, an all-in-one RealSense device the size of a candy bar. It is battery-powered and runs a full OS (Ubuntu). It is aimed at those who want to use the camera to develop smart products like seeing robots (which Intel demonstrated). This thing is tiny, yet it doubles both the number of points captured and the range of the prior version.
At the heart of autonomous driving is visual intelligence. At one point, a BMW VP drove onto the stage in a modified i3, a fully electric vehicle. The driver got out and the car then drove itself off the stage and parked by itself. BMW breaks self-driving down into five levels: currently, we are at level 3, which still involves the driver for all but short distances; level 4 allows the driver to disengage; at level 5, the driver is redundant. The requirements include multiple sensors, in-car intelligence, and changes to both the outside and inside of the car.
General Electric’s Chairman and CEO Jeff Immelt then joined Krzanich on stage to talk about IoT. GE is a massive player in this space and Immelt spoke about the convergence of horizontal technology and devices in creating solutions. He demonstrated GE Current, which is a smart cities lighting platform that resides on top of Intel’s IoT platform.
Current monitors physical traffic and can adjust city services like stoplights and pedestrian signals based on local foot traffic. Krzanich claimed that coming advances will allow you to get out of your car at your destination, after which your car will automatically find the closest parking spot and wait there until you want to be picked up again.
Intel also announced Joule, a new tiny intelligent platform for robotics and similar applications, and showcased a set of intelligent safety glasses that will be used by Airbus. These glasses give workers both audio and visual guidance on which type of part to use and where to place it, which should significantly reduce the number of aircraft assembly mistakes.
Intel sponsors a show called America’s Greatest Makers, which offers a $1M prize. Intel brought the winning team on stage with its product: a smart toothbrush that uses games to get kids to brush. Called Grush, the product will be in market by year’s end.
Intel is clearly moving beyond PCs and servers into new areas like mixed reality, autonomous cars (and hopefully autonomous flying cars) and IoT. The company closed the day with a video showcasing what the world will look like once it is done. Intel is redefining the experience of computing, and it can’t do that without developers. Today was about getting those developers excited, and I think it did.
Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+.