
Big Processing Changes in Small Packages


Written By Arthur Cole
Apr 8, 2013

Sometimes, technology development is like a magic show. While the magician is drawing your attention with one hand, the other is pulling off the move that makes the illusion seem real.

These days, while all attention is focused on the cloud, mobility and other macro forces hitting the enterprise, a number of innovative developments are taking place on the micro level that could, in relatively short order, fundamentally reshape the way we humans interact with the data universe.

Not surprisingly, much of the research is aimed at keeping Moore’s Law alive just a little bit longer. As circuitry continues to shrink and processing requirements continue to grow, heat generation is finally starting to put the brakes on the continual expansion of computing power. What’s needed, then, is an entirely new approach to chip design, built on new materials and new architectures that maintain structural integrity as circuitry moves from the micro- to the nano-level, and even smaller.

One of the most promising efforts is IBM’s research into liquid transistor technology. The program involves highly refined nanochannels only a few atoms wide and specialized oxide materials that switch between conducting and non-conducting states depending on the voltage applied. In this way, the device does away with the traditional transistor gate in favor of a more static design that produces less heat even under heavy data loads. Prototypes are still several years away, and a number of technical issues remain to be resolved, but so far researchers say the technology appears highly efficient and extremely versatile.

Even if IBM’s program fails to bear fruit, however, the development of quantum computing is inevitable, according to MIT professor Seth Lloyd. QC research is already yielding its share of Nobel Prizes, and the potential gains in computing power compared to current designs are truly mind-boggling. We’re talking atomic-scale, quark-scale and even Planck-scale computing that could shrink the processing unit by many orders of magnitude and, when such units are harnessed together, produce equally dramatic gains in computational prowess.

And if that isn’t weird enough, how about computers made of living material? Researchers at Stanford University are reporting success with biological switch technology that could one day lead to DNA-based microcomputers for medical and other purposes. At the moment, the devices are capable of only rudimentary functions, but given time there is every possibility that scientists could manufacture probiotic (or antibiotic) systems tailored to an individual’s unique body chemistry. These could be used to attack or repair cells, either curing or causing some of the world’s deadliest diseases. As is often the case, the technology itself is neither moral nor immoral – that determination is made by the hand that wields it.

Of course, these kinds of systems are on a, shall we say, “long-term development path.” A little closer to reality is a new integrated processor/memory design that backers say could boost performance and lower operating costs within a year or so. The Hybrid Memory Cube Consortium has already released a standard specification for its highly dense cube architecture, which promises to bridge the performance gap between CPU and memory infrastructure. The group is reporting a 15-fold gain in overall performance and a 70 percent reduction in power consumption.
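Taken at face value, those two figures compound into an even more striking number. Assuming both claims are measured against the same conventional memory baseline (the consortium’s implied comparison, not something spelled out here), a quick back-of-the-envelope calculation gives the combined improvement in performance per watt:

\[
\frac{15 \times \text{performance}}{0.3 \times \text{power}} \;=\; \frac{15}{0.3} \;=\; 50
\]

In other words, if the stated figures hold up, they amount to roughly a 50-fold gain in performance per watt over the baseline.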

The computing industry is a good example of how small changes often yield big results. For all the recent talk that data technology is approaching the upper limits of its capability and that the only way to improve performance is to increase density, it is entirely possible that a completely new compute environment is right around the corner.

And if quantum leaps in processing power really are that close, then today’s struggles with Big Data and advanced cloud architectures will look quaint indeed in very short order.
