Yesterday, IBM announced a massive $3 billion investment effort to create processors uniquely suited to the demands of cloud and Big Data systems. The move has both strategic and timing implications for the market. Efforts like this speak to the market's maturity as well as IBM's commitment to leading it. The announcement anticipates a massive improvement in processor scaling, down to 7 nanometers, and begins to flesh out what the Big Data world will look like in 2020.
Why Do You Care About 2020?
A mistake that both technology companies and IT managers often make is focusing too heavily on the tactical, working only to solve today's problems. That tactical focus puts everyone in firefighting or whack-a-mole mode, constantly on the verge of being overwhelmed (and sometimes past it) by changes they can barely keep up with.
The reason to maintain at least a five-year view is so that strategic efforts like Big Data analytics, which consume massive resources, aren't prematurely obsolete, and so you can anticipate the world that will exist once those efforts have matured. This eye on the future often distinguishes firms that survive for decades from those that never reach their 10th anniversary. The survivors anticipated and prepared for change.
What the IBM Announcement Means
This announcement means the market has reached the point where large solution providers are beginning to build solutions from the ground up rather than cobbling together technologies designed for other purposes into a kludge that sort of works. Data analytics and cloud workloads increasingly demand a level of performance and granularity that wasn't imagined when current chip technologies were incubated, and big vendors like IBM now understand where those technologies fall short. That understanding gives them a roadmap to create something that works better.
The next generation of devices, which will access increasingly intelligent Watson-like back-end systems (think Siri, or more likely Microsoft's Cortana, on steroids), will need advanced low-power transistors that are far more efficient. Many of those devices will be wearable, and to keep heat and power costs at manageable levels, the data centers of tomorrow will find carbon nanotubes (CNTs) and tunnel field-effect transistors (TFETs) critical. Finally, graphene will be critical to the future, as silicon runs out of performance headroom and a more efficient, more powerful alternative is increasingly required.
IBM is also layering on advancements in silicon photonics, which massively improve the speed of data transport. That matters because cloud-hosted Big Data jobs tend to be fluid about where they run and must be moved depending on their relative importance to the company and their proximity to the user.
Wrapping Up: How IBM Lasts
IBM's announcement showcases both that today's cloud and Big Data analytics requirements can now be projected into the future and that the company is turning its massive R&D engine toward creating the technologies that future will require. This is a massive effort, demonstrating the kind of commitment that made IBM the longest-lasting technology vendor in the market. The firm is a survivor largely because it can step outside day-to-day tactical concerns and invest in the world of tomorrow so that it has a place in it. That's an example more firms in every segment should follow.