Big Processing Changes in Small Packages

    Sometimes, technology development is like a magic show. While the magician is drawing your attention with one hand, the other is pulling off the move that makes the illusion seem real.

    These days, while all attention is focused on the cloud, mobility and other macro forces hitting the enterprise, a number of innovative developments are taking place on the micro level that could, in relatively short order, fundamentally reshape the way we humans interact with the data universe.

    Not surprisingly, much of the research is aimed at maintaining Moore’s Law just a little bit longer. As circuitry continues to shrink and processing requirements continue to grow, heat generation is finally starting to put the brakes on the continual expansion of computing power. What’s needed, then, is an entirely new approach to chip design: new materials and new architectures that maintain structural integrity as circuitry moves from the micro- to the nano-level, and even smaller.

    One of the most promising efforts is IBM’s research into liquid transistor technology. The program involves highly refined nanochannels only a few atoms wide, filled with specialized oxide materials that can switch between conducting and non-conducting states depending on the voltage applied. In this way, the device does away with the traditional transistor gate in favor of a more static design that produces less heat even under heavy data loads. Prototypes are still several years away, and a number of technical issues remain to be resolved, but so far researchers say the technology seems highly efficient and extremely versatile.

    Even if IBM’s program fails to bear fruit, the development of quantum computing is inevitable, according to MIT professor Seth Lloyd. QC research is already yielding its share of Nobel Prizes, and the potential gains in computing power compared to current designs are truly mind-boggling. We’re talking atomic-scale, quark-scale and even Planck-scale computing that could shrink processing elements by many orders of magnitude and produce equally dramatic gains in computational prowess when harnessed together.

    And if that isn’t weird enough, how about computers made of living material? Researchers at Stanford University are reporting success in biological switch technology that could one day lead to DNA-based microcomputers for medical or other purposes. At the moment, the devices are capable of only rudimentary functions, but given time there is every possibility that scientists could manufacture probiotic (or antibiotic) systems tailored to an individual’s unique body chemistry. These could be used to attack or repair cells, either curing or causing some of the world’s most fatal diseases. As is often the case, technology is neither moral nor immoral – that determination is made by the hand that wields it.

    Of course, these kinds of systems are on a, shall we say, “long-term development path.” A little closer to reality is a new integrated processor/memory design that backers say could boost performance and lower operating costs within a year or so. The Hybrid Memory Cube consortium is already out with a standard spec for its highly dense cube architecture, which promises to bridge the performance gap between CPU and memory infrastructure. The group is reporting a 15-fold gain in overall performance and a 70 percent reduction in power consumption.

    The computing industry is a good example of how small changes often yield big results. For all the recent talk that data technology is approaching the upper limits of its capability and that the only way to improve performance is to increase density, it is entirely possible that a completely new compute environment is right around the corner.

    And if we are talking about quantum leaps in processing power just around the corner, then today’s struggles with Big Data and advanced cloud architectures will look quaint indeed in a very short time.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.