Back in 2005, noted British Telecom "Futurologist" Ian Pearson hypothesized that computing power would be so great, and our ability to tap into it so advanced, that by 2050 we could effectively use the technology to store and access human consciousness. "So when you die," Pearson eloquently understated, "it's not a major career problem." Supporting his now-celebrated speculation on future pseudo-immortality, Pearson also concluded that a "conscious computer with superhuman levels of intelligence" could be ready as early as 2020.

How close will Pearson come to the truth? What will computers look like in 2050? What will they be capable of? Well, let's just say that this is the same guy who envisaged in 1999 that our pets would be robotic and our contact lenses would project HUD-like displays onto the retina – à la The Terminator – by 2010. To be fair, the latter concept may not be that far off, and Pearson is certainly a bright fellow who's been proven correct often enough that his forecasts can't be dismissed as mere pap.
We do know this much: In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on an integrated circuit – and, with it, its computing power – would double roughly every two years. His prophecy has become known as Moore's Law, and it hasn't been wrong yet. In fact, it looks like it won't be wrong for quite a few more years.
Chip Speed and Processing Power
So what's the big deal about Moore's Law? It's simple – computing speed, power, and miniaturization are the secret behind virtually all the major technological advancements we've seen so far, and will see in the near future. Just look how far we've come in the last few decades. Twenty years ago, the finest desktop computer CPUs featured perhaps 100,000 transistors and chugged along at 33MHz. Today, high-end quad-core CPUs scream along at 3GHz and brandish in excess of 800 million transistors. Indeed, some of today's transistors are so small that millions could fit on the head of a pin.
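To put numbers on that pace, here's a quick back-of-the-envelope sketch in Python. The 1990 and 2010 transistor counts are the figures quoted above; the two-year doubling period and the 2050 projection are illustrative assumptions, not a forecast:

```python
import math

# Back-of-the-envelope Moore's Law arithmetic. The 1990 and 2010 counts
# are the article's own figures; the doubling cadence and the 2050
# projection are illustrative assumptions.

transistors_1990 = 100_000        # "perhaps 100,000 transistors"
transistors_2010 = 800_000_000    # "in excess of 800 million"

# How many doublings does the observed 20-year jump represent?
observed = math.log2(transistors_2010 / transistors_1990)
print(f"Doublings from 1990 to 2010: {observed:.1f}")  # ~13, one every ~1.5 years

# Naive projection to 2050, assuming one doubling every two years
doublings_to_2050 = (2050 - 2010) / 2
projected = transistors_2010 * 2 ** doublings_to_2050
print(f"Projected transistor count in 2050: {projected:.2e}")  # ~8.4e14
```

Thirteen doublings in twenty years is actually a shade faster than Moore's two-year cadence, which is part of why the "law" has held up so well.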
In 2050, however, we will have long exhausted current design and manufacturing techniques and concepts, which up until now have involved patterning multi-layered silicon wafers with ultraviolet light in a process called photolithography. The reasons are highly technical, but suffice it to say that leading chipmakers such as Intel, already working at ridiculously sub-microscopic scales, will within the next two decades run up against a number of undeniable limitations. The current process, the materials it relies on, and the accepted laws of physics simply won't support continued miniaturization and energy efficiency once features approach the molecular scale.
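To see why that wall is unavoidable, consider a rough shrink schedule. Every number below is a round-figure assumption chosen only to illustrate the trend – a 32 nm starting node, the classic ~0.7x linear shrink per generation, and ~0.2 nm as the approximate diameter of a silicon atom:

```python
# Rough illustration of the miniaturization wall. All values are
# round-number assumptions, not industry roadmap data.

feature_nm = 32.0          # assumed starting process node
year = 2010                # assumed starting year
SILICON_ATOM_NM = 0.2      # rough diameter of a single silicon atom

while feature_nm > SILICON_ATOM_NM:
    feature_nm *= 0.7      # each generation shrinks features by ~30%
    year += 2
    print(f"{year}: ~{feature_nm:.2f} nm")

print(f"Features reach atomic scale around {year}.")
```

On those assumptions the trend collides with the size of the atoms themselves within roughly three decades; no amount of engineering cleverness shrinks a transistor below its constituent atoms.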
New PC and Computing Technologies
That's forcing scientists to look at new technologies. The bad news is that we're not really sure right now which technology will win out. In the nearer term, recently discovered materials such as graphene may replace silicon in chip wafers. Graphene, essentially a single atom-thick layer of the same graphite used in pencils, conducts electrons far faster than silicon does. In the more distant future, radical ideas such as optical computing, which uses photons of light in lieu of electrons and transistors, might be the ticket.
But by 2050, we may well be in the realm of quantum computing. This is a world best understood by the proverbial rocket scientist, though the general theory involves harnessing quantum mechanical phenomena (the same stuff that prevents us from continuing to miniaturize today's silicon-based transistors) to do good rather than evil. Instead of utilizing "bits," which can be either on or off, like a light switch, quantum computing utilizes qubits (quantum bits), which can be on, off, or in a superposition of both at once.
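The difference is easiest to see in a toy simulation. The sketch below – ordinary Python and NumPy, emphatically not real quantum hardware – represents a qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard operation for putting a qubit into an equal mix of on and off:

```python
import numpy as np

# Toy single-qubit simulation: a qubit's state is two complex amplitudes.
zero = np.array([1, 0], dtype=complex)          # the "off" state, |0>

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero
print(superposed)               # [0.707+0j, 0.707+0j]: "on" and "off" at once

# Measurement collapses the state; each outcome's probability is |amplitude|^2.
print(np.abs(superposed) ** 2)  # [0.5, 0.5]
```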
Because quantum computing takes place at the atomic level, and because a register of n qubits can represent 2^n states simultaneously, a quantum-based computer of the future could very well make today's desktop look like an abacus. The only major holdup – and it's a gargantuan one – is developing a means of controlling and stabilizing all those fragile qubits, whose delicate quantum states are destroyed by the slightest disturbance from the environment (a problem physicists call decoherence). If we can manage to do that, and we likely will before 2050 rolls around, the possibilities and the potential are staggering.
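That exponential scaling is also why nobody can simply simulate a big quantum computer on a classical one. A short sketch of the arithmetic, assuming 16 bytes to store each complex amplitude:

```python
# Storage needed to simulate an n-qubit register classically: 2**n complex
# amplitudes, assumed here at 16 bytes each (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

for n in (10, 30, 50):
    size = 2 ** n * BYTES_PER_AMPLITUDE
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if size < 1024:
            break
        size /= 1024
    print(f"{n} qubits -> 2^{n} amplitudes, ~{size:.0f} {unit}")
```

Fifty qubits already demand petabytes of classical memory, which is the sense in which a working quantum machine would make today's desktop look like an abacus.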
[Image: D-Wave quantum computing processor]