The Big Barnes Theory – The Quantum Machine


With the silicon revolution threatening to grind to a halt in the near future, Ethan Troy-Barnes considers whether the humble electron could be the future of information technology

Back in the 60s, a man named Gordon Moore noticed a trend in how information technology was evolving. Specifically, he noticed that transistors, the basic building blocks of modern computing, were becoming half the price, half the size, or double the power each year. He predicted that computer technology would continue to develop at this exponential rate, a principle that has largely held true to this day. However, in a 2005 interview with technology magazine Techworld, Moore warned of a foreseeable end to what is now known as Moore's Law: "In terms of size [of transistors] you can see that we're approaching the size of atoms, which is a fundamental barrier."

Moore believes that within "two or three generations", microchip technology will have become as small and as efficient as it's going to get, and we'll have to start looking to new technologies if computer hardware is to continue to move forward.

All is not lost, however, as computer science is joining forces with quantum physics and the biological sciences to offer a number of successor technologies to the current silicon-microchip paradigm, such as biological computing, more commonly known as 'gooware'.

Many experts believe the future lies in quantum computing. In this model, calculations are performed at the level of subatomic particles, transcending the size barrier we run into with silicon-based technology. As quantum physicist Michio Kaku explains, quantum computers are "as small as you can get in terms of information storage. You can't get smaller than an individual electron and they work by looking at the 'spin', [or] the 'orientation' of electrons."

In traditional computing, calculations are performed by electricity flowing through transistors in a circuit. Each transistor represents a 'bit' of information that can be either on or off, and is assigned a one or a zero, respectively. Manipulating these bits with simple logic operations lets a computer process information and solve problems far faster than we ever could with pen and paper.
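
To make that concrete, here's a minimal sketch in Python of what 'mathematically processing bits' actually looks like. It adds two numbers using nothing but the bit-level operations (AND, XOR, shift) that transistor circuits perform in hardware billions of times a second; the numbers and function name are just for illustration.

```python
# A toy adder built purely from bit-level logic operations,
# mimicking what transistor circuits do in hardware.

def add_bits(a: int, b: int) -> int:
    """Add two non-negative integers using only AND, XOR and shifts."""
    while b:
        carry = a & b    # AND: positions where both bits are 1 produce a carry
        a = a ^ b        # XOR: sum of the bits, ignoring carries
        b = carry << 1   # shift the carries left to add them in the next round
    return a

print(bin(6), '+', bin(3), '=', bin(add_bits(6, 3)))  # 0b110 + 0b11 = 0b1001
```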

By contrast, in a non-classical, or quantum, machine, the information is represented by subatomic particles such as individual electrons. As Kaku puts it: "If I put an electron in a magnetic field, it can 'spin' up or it can 'spin' down. [Up] would be a one and [down] would be a zero."

However, according to quantum mechanics, an electron's spin can also exist in a blend, or 'superposition', of up and down at the same time. This gives rise to a new type of 'bit', called a 'qubit' (quantum bit), which can hold a mixture of zero and one simultaneously rather than committing to one or the other. This provides us with a colossal increase in processing power, as a collection of qubits can represent a vast number of values at once, where ordinary bits can only represent one at a time.
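
To get a feel for what that means, here is a toy simulation of a single qubit (running on an ordinary computer, so it only mimics the real thing). The qubit's state is a pair of 'amplitudes' for up and down; measuring it collapses the blend into a plain one or zero, with probabilities set by those amplitudes.

```python
import random

# Toy model of one qubit: amplitudes (alpha, beta) for 'spin up' (1)
# and 'spin down' (0). The rule is |alpha|^2 + |beta|^2 = 1.

def measure(alpha: complex, beta: complex) -> int:
    """Collapse the superposition: 1 with probability |alpha|^2, else 0."""
    return 1 if random.random() < abs(alpha) ** 2 else 0

# An equal superposition: the electron is 'both' up and down until measured.
alpha = beta = complex(1 / 2 ** 0.5)

results = [measure(alpha, beta) for _ in range(1000)]
print('fraction measured as one:', sum(results) / len(results))  # roughly 0.5
```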

Basically, you get more bang for your buck, with manifold applications. Quantum computers would allow us to solve, in an instant, mathematical problems previously thought intractable, and allow for giant leaps forward in the fields of artificial intelligence and physical chemistry. Quantum machines would enable us to create vast artificial environments, providing scientists with a near-limitless ability to simulate interactions between chemicals at the atomic level. This could revolutionise the pharmaceutical industry and drug design, and have unforeseen impacts on modern medicine.

On the flip side, one especially nifty attribute of quantum computing is its ability to search through huge amounts of unsorted information effortlessly. Most modern data encryption relies on ciphers for which there is no known mathematical shortcut: the only way to decode them is to guess key after key until one works. A quantum computer could tear through those guesses, and could thus render modern digital security technologies utterly useless.
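
As a crude illustration of why that matters, the sketch below cracks a toy cipher with a made-up 16-bit key (real keys are 128 bits or longer) by classical brute force, trying every possibility in turn. Grover's algorithm, the quantum search procedure usually cited here, would need only around the square root of that number of attempts.

```python
import math

# Toy cipher: 'encryption' is just XOR with a secret key.
SECRET_KEY = 0xBEEF               # made-up 16-bit key for this demo
ciphertext = 0x1234 ^ SECRET_KEY  # an intercepted message

def brute_force(ciphertext: int, known_plaintext: int) -> int:
    """Classically, the only option is to try every possible key."""
    for key in range(2 ** 16):
        if ciphertext ^ key == known_plaintext:
            return key
    raise ValueError('key not found')

key = brute_force(ciphertext, 0x1234)
print(f'found key {key:#06x} after up to {2 ** 16} classical guesses')
print(f"Grover's search would need roughly {int(math.sqrt(2 ** 16))} quantum queries")
```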

That said, the successful implementation of what is theoretically a very promising technology is not without its pitfalls. Most significant is the problem of decoherence. In a quantum computer, all the subatomic components are arranged with the complexity and fragility of a house of cards, and such a system is extremely sensitive to external interference from the likes of heat, sound and magnetic fields. Sooner or later, something from the outside will throw the whole setup into disarray.
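
Here is a toy picture of decoherence, under the simplifying assumption that the environment does nothing but randomly jostle the qubit's phase at each time step. As the noise accumulates, the 'coherence' (the qubit's ability to interfere with itself, which quantum computing depends on) averages away towards zero.

```python
import math
import random

# Toy decoherence model: heat, vibration and stray fields give the
# qubit's phase a small random kick at every step.

def coherence_after(steps: int, kick: float = 0.2, trials: int = 2000) -> float:
    total = 0.0
    for _ in range(trials):
        phase = 0.0
        for _ in range(steps):
            phase += random.gauss(0.0, kick)  # random nudge from the environment
        total += math.cos(phase)              # this qubit's interference contribution
    return total / trials

for steps in (0, 10, 50, 200):
    print(f'after {steps:>3} noisy steps, coherence ~ {coherence_after(steps):.2f}')
```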

As a result, building a real-life quantum machine is a bit like playing a game of Jenga in gale-force winds during an earthquake. The obvious solution is to insulate the system from its surrounding environment to within an inch of its life, so that it won't come crashing to pieces every time someone in the room sneezes.

However, this approach brings its own problems. The whole point of a computer is to perform tasks for the user. For this to occur, the user needs to be able to access the system and manipulate its components. If you glue all your Jenga blocks together you can be sure your tower will stay intact, but you also won’t be playing Jenga again any time soon.

Now, while this may all sound very sci-fi, many industry leaders believe this technology is not far off. "At IBM, we're seeing breakthroughs pretty regularly, maybe every couple of weeks, so it's pretty exciting," explains Jay Gambetta, a researcher at IBM's Experimental Quantum Computing Group. "I see us building a quantum computer. It'll take a lot of work, but I don't see anything that will stop us."

However, all this raises the question: do we really want to ignite a new IT revolution? Some believe the never-ending advancement of computer technology will eventually lead to the advent of something termed a 'technological singularity'. This describes the future emergence of a digital intelligence that surpasses the human mind. Such an intelligence would be based on computer technology so advanced that, many theorise, it could rewrite and improve upon its own design as it operates, eventually self-evolving to the point where its workings would be impossible for the human mind to comprehend, eclipsing the human race altogether.

As barmy as it sounds, futurist Ray Kurzweil anticipates this could occur as soon as 2045. Something to think about next time you curse your slow internet connection while trying to stream the latest episode of The Great British Bake Off.
