The enterprise is about to gain access to an entirely new level of processing power in the form of quantum computing (QC). But what, exactly, can this do for garden-variety business processes, and are there any risks?
IBM has taken the wraps off its IBM Q platform, which it touts as the first commercially available universal quantum computing service. The system will be available on the IBM Cloud and will be accompanied by a new set of APIs that make it easier for developers to write code for the new environment, plus a simulator that can model performance on a 20-qubit (quantum bit) system – about five times more powerful than existing cloud-based quantum services. Later in the year, IBM intends to release a full software development kit to streamline the creation of basic apps. (Disclosure: I provide content services for IBM.)
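For developers wondering what code for this environment actually looks like, here is a minimal sketch of building and simulating a two-qubit circuit with Qiskit, the open-source SDK that grew out of IBM’s API work. The entry points shown reflect early Qiskit releases and are illustrative rather than drawn from IBM’s announcement.

```python
# A minimal sketch of developer-facing quantum code, assuming an early
# (pre-1.0) release of the open-source Qiskit SDK; newer versions have
# changed these entry points. The circuit entangles two qubits (a Bell
# state) and runs on a local simulator.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into an equal superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits into classical bits

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                # expect roughly half '00' and half '11'
```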
The development of QC is crucial to the ongoing advancement of computer science now that Moore’s Law is starting to hit its practical limits, says Dave Turek, vice president of High Performance Computing and Cognitive Systems at IBM. In a conference call reported by Enterprise Times, Turek explained that a conventional supercomputer can simulate the quantum behavior of about 43 electrons, but not the 50-electron scale an advanced quantum system can handle. In practical terms, this means a quantum machine can fine-tune manufacturing, order-fulfillment and shipping processes to cut waste, as well as open up entirely new capabilities in cognitive computing, machine learning and Big Data analytics.
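Whether one counts electrons or qubits, the arithmetic behind that cutoff is the same: an exact classical simulation of n two-level quantum systems has to track 2^n complex amplitudes. The back-of-the-envelope sketch below is my own illustration, not a figure from Turek’s call:

```python
# Back-of-the-envelope illustration of why exact classical simulation stalls
# in the mid-40s: an n-qubit (or n-two-level-particle) state vector holds
# 2**n complex amplitudes, i.e. 16 * 2**n bytes at double precision.
for n in (43, 50):
    state_bytes = 16 * 2 ** n
    print(f"{n} two-level systems -> {state_bytes / 2**40:,.0f} TiB of memory")

# 43 systems ->    128 TiB, within reach of the largest supercomputers
# 50 systems -> 16,384 TiB (16 PiB), beyond any conventional machine
```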
The risk, of course, is that as computing power increases, so does the potential for harm. Scott Totzke, CEO of security firm ISARA Corp., notes that while quantum systems should be perfectly capable of thwarting quantum attacks, the vast installed base of conventional infrastructure will suddenly be highly vulnerable to all manner of mischief. This means the transition to quantum performance will have to be handled very carefully to prevent everything from massive data breaches to incomprehensibly large DDoS attacks. So even as QC enters the commercial sphere, leading technologists should take a hard look at how to protect today’s systems once current cryptographic algorithms like RSA and elliptic-curve cryptography become obsolete.
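To see why those algorithms fall, recall that RSA’s security rests on the difficulty of factoring the public modulus, a problem Shor’s algorithm solves efficiently on a sufficiently large quantum computer. The toy sketch below, my own illustration rather than anything from Totzke, uses deliberately tiny numbers to show how the private key drops out the moment the modulus is factored:

```python
# Toy illustration with deliberately tiny numbers: once the RSA modulus
# n = p * q is factored, the private exponent follows immediately. Shor's
# algorithm makes the factoring step cheap on a large enough quantum
# computer, which is why RSA (and, via a related attack, elliptic-curve
# schemes) are expected to become obsolete.
def trial_factor(n):
    """Brute-force factoring -- hopeless for real 2048-bit keys."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("n is prime")

e = 17                   # public exponent of a textbook-sized key
n = 61 * 53              # public modulus (3233)

p, q = trial_factor(n)   # the step a quantum computer makes cheap
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent recovered: 2753
print(f"n={n}, p={p}, q={q}, d={d}")
```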
Probably the most salient fact about QC is that it is not a more powerful version of conventional computing but an entirely new way of processing information. Intel, for instance, is investigating QC, neuromorphic chips, silicon photonics and a host of other technologies as ways to push past both Moore’s Law and the von Neumann model that governs the basic exchange between storage, processors and memory. Under a quantum paradigm, the company hopes to create new chip-level data flows that erase the distinctions between constructs like SSDs and DRAM and overcome the bottlenecks that arise on even the most powerful conventional architectures.
It’s been said that demand for computing power always runs several orders of magnitude ahead of what current technology can provide. Quantum computing has the potential to close this gap, at least for a little while.
But the human mind is nothing if not innovative. So it shouldn’t come as a big surprise that even before QC hits the mainstream there will be those who start thinking about what comes next.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.