
    Are We Closing in on the Quantum Enterprise?


    The prevailing narrative in enterprise circles these days is that things will keep getting bigger: Big Data, regional data centers, hyperscale … everything is aimed at finding the magic formula that allows organizations to deal with larger workloads at less cost.

    It is ironic, then, that one of the ways researchers are hoping to tackle this problem is by shrinking the basic computing elements – processing, storage and networking – to atomic and even sub-atomic levels in order to derive greater power and efficiency from available resources.

    So-called quantum computing (QC) has been a facet of high-performance architectures for some time, but lately there has been steadily increasing buzz about enterprise applications as well.

    IBM, for one, is a big proponent of QC, having recently announced a new memory solution that reduces the number of atoms needed to store a bit of data from about 1 million to 12. That’s not a misprint: Twelve atoms represent a 150-fold increase in storage density compared to current state-of-the-art solid-state designs. The research is an offshoot of IBM’s nanoscale processor investigations, which would naturally require a storage mechanism capable of supplying equivalent capacity from a drive considerably smaller than the Empire State Building. This is all just a proof of concept at the moment, of course, but at least IBM is confident that it has hit the floor when it comes to reliably storing data on the smallest possible footprint.
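    As a back-of-the-envelope illustration of those figures (a minimal sketch using only the numbers quoted above; the round 1-million figure and the variable names are assumptions for illustration, not IBM specifications):

        # Rough arithmetic on the atoms-per-bit figures cited in the article.
        # The ~1,000,000-atom figure for conventional media and the 12-atom
        # figure for IBM's demonstration are taken from the text above;
        # nothing here is an actual device specification.
        atoms_per_bit_conventional = 1_000_000
        atoms_per_bit_demo = 12

        reduction = atoms_per_bit_conventional / atoms_per_bit_demo
        print(f"Atoms per bit shrink by a factor of roughly {reduction:,.0f}")
        # The quoted 150-fold density figure is much smaller than this ratio;
        # areal density in a real device depends on more than the raw atom
        # count per bit.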

    A little closer to reality, however, is The Machine from HP. While still in the pre-prototype phase, the company is talking about a refrigerator-sized appliance that would hold as much computing power as a full-sized data center. It uses a combination of silicon photonics, memristor storage and an as-yet-undisclosed processor technology that essentially does away with the x86-based architectures that have dominated for so long. While a working model won’t be available for some time (2016 at the earliest), the company has released key software components such as the Linux++ emulation environment, which is designed to provide a realistic view of how the system will ultimately function.

    These and other developments have a number of wags talking about the end of Moore’s Law, but as IBM’s Brad McCredie notes, the basic idea of steadily improving computing power remains intact even if the tools and techniques used to accomplish it are changing. This won’t affect consumers much, as they will still have access to better and more powerful data solutions. Manufacturers, however, can no longer rely on the same manufacturing processes now that the limits of traditional architectures are nearly maxed out – a development that opens up the possibility of truly open hardware solutions.
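    For context, Moore’s Law is usually quoted as transistor counts doubling roughly every two years, which in rough formula form (an illustrative approximation, not anything specific to McCredie’s comments) looks like:

        N(t) \approx N_0 \cdot 2^{\,t/2}

    where N_0 is today’s transistor count and t is the number of years out. The point above is that the left-hand side can keep climbing even if the doubling no longer comes from shrinking conventional transistors.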

    Misc5

    And of course, research is ongoing in quantum computing, so it is very likely that any actual enterprise-class products that do emerge will be very different from what is happening in the lab today. A case in point is the new “Fibonacci quasiparticle,” which does away with standard ion and electron qubit structures in favor of a more error-resistant braided particle track. The underlying idea actually dates back to the late 1990s, but only recently have researchers at Cornell and Microsoft devised a braid configuration using a new class of anyons robust enough to support topological quantum computing (TQC). For now, though, the anyons can only be created in strong magnetic fields at temperatures near absolute zero.
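    For readers wondering where the “Fibonacci” name comes from, the standard textbook sketch (general background, not specific to the Cornell/Microsoft configuration described above) is the fusion rule these anyons obey:

        \tau \times \tau = 1 + \tau, \qquad d_\tau = \frac{1 + \sqrt{5}}{2} \approx 1.618

    That is, two Fibonacci anyons can fuse either to the vacuum or to another anyon, so the number of possible fusion outcomes for a collection of them grows like the Fibonacci sequence, and the associated quantum dimension is the golden ratio. Braiding such anyons around one another is, in principle, enough to approximate any quantum gate, which is what makes them a candidate for TQC.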

    It would be nice to think that a sudden, dramatic improvement in data capabilities is right around the corner, but in all likelihood the gains will continue to be incremental even if enterprise-class QC products start hitting the channel. This is not necessarily a bad thing, however, given that disruptive change often produces just as many new problems as the old ones that it solves.

    A quantum enterprise would be welcome, to be sure, but we would have to figure out how best to utilize it first.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.
