Some Quantum Computing to Go with Your Enterprise?

Arthur Cole

Word on the street is that the quantum computer is almost upon us, so the next logical question is: How long before we see the quantum enterprise?

According to HPCwire, scientists at UC Santa Barbara have made what they call a breakthrough in quantum mechanics that could lead to the development of multidimensional, as opposed to binary, computing. In physicist-speak, the team has devised a new way of arranging “transmon qubits” in a linear array so that data can be transferred with fewer errors. A qubit (quantum bit) can exist in multiple states at once and can therefore push parallel processing to entirely new levels. The problem is that qubits are highly unstable and lose information quite readily. The Santa Barbara team says its new design can boost reliability to 99.9 percent, with advanced error correction handling the remainder.
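For readers who want a concrete picture of what “multiple states at once” means, here is a tiny NumPy sketch of a single qubit in superposition. This is just a classical toy model of the math, nothing to do with the UCSB transmon hardware itself:

```python
import numpy as np

# Toy state-vector model of one qubit (classical simulation, for intuition only).
zero = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates superposition

psi = H @ zero                                 # qubit now "holds" |0> and |1> at once
probs = np.abs(psi) ** 2                       # measurement probabilities

print(probs)  # [0.5 0.5] -- a 50/50 chance of reading out 0 or 1
```

One qubit tracks two amplitudes at once; n qubits track 2^n, which is where the “entirely new levels” of parallelism come from.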

Meanwhile, another team at UC San Diego says it can dramatically increase the speed of quantum computing by forging indirect links between the various states of a quantum particle’s “superposition.” Like nodes in a cluster, quantum states can share data through high-speed fabrics. When one state has a direct link to another, the data transfer is lightning fast, but if multiple states must be traversed, the process slows down. The San Diego team says it can bypass those multiple hops and forge instant connections between states. Exactly how this is done is, frankly, beyond my comprehension, so for now the only explanation I can offer is that it’s done by “magic.”

If all this sounds like so much science fiction, note that companies like Google are already working with quantum devices—well, maybe. The box in question is the D-Wave, a refrigerator-sized unit that houses circuits made of niobium wire. Its builders claim it is a quantum computer, although not everyone is convinced of that, and even D-Wave executives admit they are not entirely sure how it works. Nonetheless, compared to even the most advanced classical computing optimization technologies, the D-Wave reportedly shows a 3,600-fold improvement when running highly complex workloads. That’s enough for Google to set up a test lab to see how much it can throw at the machine. One tiny drawback, however, is that the niobium wire used in it has to be kept near absolute zero to remain superconducting.
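The D-Wave is built for optimization problems. To give a flavor of that style of computing, here is a purely classical simulated-annealing sketch on a made-up three-variable problem; the real machine anneals quantum-mechanically and at vastly larger scale, so treat this as an analogy only:

```python
import math
import random

# Hypothetical toy objective (an assumption for illustration, not a real
# D-Wave workload): minimize E(x) over binary variables x0, x1, x2.
def energy(x):
    return -x[0] - x[1] - x[2] + 2 * x[0] * x[1]

def simulated_anneal(steps=5000, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(3)]
    t = 2.0                                    # starting "temperature"
    for _ in range(steps):
        cand = x[:]
        cand[rng.randrange(3)] ^= 1            # flip one random bit
        dE = energy(cand) - energy(x)
        # Always accept improvements; sometimes accept worse moves early on
        if dE < 0 or rng.random() < math.exp(-dE / t):
            x = cand
        t = max(0.01, t * 0.999)               # cool down gradually
    return x, energy(x)

best, e = simulated_anneal()
print(best, e)
```

The slow cooling lets the search escape local minima early and settle into the global minimum late, which is roughly the intuition behind annealing-style optimizers, quantum or otherwise.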

Meanwhile, Google is eager to share the benefits of quantum computing with others. The company has launched the Quantum Computing Playground, a browser-based simulator that runs quantum algorithms written in its own scripting language, QScript. It can’t give you the speed of an actual quantum computer, but it can at least show you how various workloads and gate configurations should behave under certain circumstances. It also includes a number of training examples, from entry-level scenarios to some of the leading algorithms.
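Under the hood, a gate-level simulator like the Playground is just linear algebra on a state vector. As a rough sketch (in Python rather than QScript, and ignoring the Playground’s GPU acceleration), here is a two-qubit circuit that builds an entangled Bell state:

```python
import numpy as np

# Minimal two-qubit state-vector simulator (illustrative only).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT: flips qubit 1
                 [0, 1, 0, 0],                 # when qubit 0 is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I) @ state                  # Hadamard on qubit 0
state = CNOT @ state                           # entangle the pair

print(np.abs(state) ** 2)  # [0.5 0. 0. 0.5] -- only |00> and |11> remain
```

Measuring either qubit now instantly determines the other, which is the kind of behavior a playground simulator lets you poke at without a dilution refrigerator in the room.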

If current quantum computer models are even close to accurate, a commercially viable quantum processor could easily sweep away virtually all of the obstacles the enterprise now faces in the transition to scale-out virtual architectures. Big Data, in fact, would suddenly become small data.

Undoubtedly, new problems would arise, and many will be quite nettlesome until the exact nature of the quantum environment is known. But it would nevertheless launch an entirely new era of data communications, and it could very well be here a lot sooner than people realize.

May 24, 2014, 7:59 PM, Howard Barnum says:
What does "the transition to scale-out virtual architectures" mean? (I'm not an IT guy, more of a QC guy.) I suspect if I knew, I would be dubious about the claim that "a commercially viable quantum processor could easily sweep away virtually all of the obstacles" to it...
