Usually, when we identify trends in computing at IT Business Edge, we are referring to the immediate future: a few months or, at most, a year or two ahead. But two recent pieces of news have implications worth paying attention to over a longer timeframe. And, as is often the case with futuristic computing, the ideas are strange: One is living computers based on rotten food (actually, Escherichia coli, or E. coli), and the other involves famously weird quantum science, in which the computer's individual bits aren't required to be either a zero or a one, the bedrock of today's devices.
Several sites, including TechEye.net, report on the use of E. coli for computations. The focus of the research is on the composition of logic gates. A logic gate, another fundamental underpinning of computing, takes one or more inputs to produce a single output. Put enough of these together and a full computer task can be completed.
There are seven types of logic gates. For instance, in an "and" logic gate (AND gate), all inputs (A and B) must be "true" (signified in computer-ese as a one) for the output also to be considered true. The other three possible combinations for a two-input AND gate (two falses, true and false, false and true) all create a false. OR gates work in a similar fashion, but the condition for reaching a result of "true" is different: In an OR gate, A or B (or both) must be true for the result to be a one.
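The truth tables described above can be sketched in a few lines of Python; this is simply an illustration of the gate behavior, not anything from the research itself.

```python
# Two-input logic gates as simple functions: inputs and outputs are
# the ones and zeros ("true" and "false") described above.
def and_gate(a, b):
    # AND: output is 1 only when both inputs are 1.
    return int(bool(a) and bool(b))

def or_gate(a, b):
    # OR: output is 1 when either input (or both) is 1.
    return int(bool(a) or bool(b))

# Walk through all four input combinations for each gate.
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b}  AND={and_gate(a, b)}  OR={or_gate(a, b)}")
```

Chaining such functions together, feeding one gate's output into another's input, is how larger computing tasks are built up.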
The ones and zeros are created by higher and lower levels of electricity. That's where the advance was made. Researchers at the University of California at San Francisco, possibly doing their research in a poorly kept cafeteria, used genes inserted into E. coli strains as the logic gates. Subsequently, the gates released a chemical signal that enabled them to connect to each other as they would on a circuit board, the story says. The ultimate goal was to create a language that would, in essence, enable code to be written as it is for more traditional logic gates.
The other advance was reported by Ars Technica, which describes research published in Applied Physics Letters by English and Australian researchers. As suggested by the logic gate description above, classical computing is binary: the choices are zero or one. The status of each bit is independent of any other.
Anyone who has read anything about the quantum world probably knows what comes next. In quantum computers, the story says, quantum bits (qubits) are one and zero simultaneously. The operations that are done to the qubits don't switch them from ones to zeros or vice versa; rather, they change the probability that the qubit will eventually be in either state. The second, and related, idea is that an operation on one of the qubits affects all of the qubits in that string.
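The idea that operations shift probabilities rather than flip bits can be sketched in Python. This is a minimal illustration using real-valued amplitudes and a single standard quantum operation (the Hadamard gate, chosen here for illustration; the article does not name a specific operation).

```python
import math

# Model a single qubit as a pair of amplitudes for the states |0> and |1>.
# The probability of measuring each state is the square of its amplitude.
def probabilities(alpha, beta):
    return alpha ** 2, beta ** 2

# The Hadamard operation doesn't turn a 0 into a 1 or vice versa;
# it mixes the two amplitudes, changing the odds of each outcome.
def hadamard(alpha, beta):
    s = 1 / math.sqrt(2)
    return s * (alpha + beta), s * (alpha - beta)

alpha, beta = 1.0, 0.0               # start definitely in state |0>
alpha, beta = hadamard(alpha, beta)  # now in a superposition
p0, p1 = probabilities(alpha, beta)  # measurement odds are now 50/50
print(f"P(0)={p0:.2f}  P(1)={p1:.2f}")
```

A real qubit uses complex amplitudes, and entanglement ties multiple qubits' probabilities together, but the single-qubit sketch captures the key point: the operation changed the probabilities, not the bit itself.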
The story says that mistakes come from two areas. One is the "intrinsic uncertainty" associated with quantum operations. The other is purely physical: The quantum world is weird because it is so small. This makes it tricky (to say the least) to come up with equipment that can poke and prod the qubits without gumming things up. The remainder of the story describes what the team set out to do, which involves directional couplers and interferometers, and what it means.
It is too early to tell precisely what these new types of computers would be used for or when the research will show up in products. Quantum science already plays a role in security, but computers based on the approach would be orders of magnitude more complex.
In any case, it is important to have a general idea of what is going on. Scientists have long suspected that Moore's Law on the continuing growth of computing power and reduction in its costs would, like Brett Favre, eventually hit its physical limits. One of these seemingly strange approaches, or both, may allow Moore to play on for decades longer.