
Dynamic Blog

Moore's Law and the Future of Computing Speeds

26 August, 2020

In 1965, Gordon Moore, one of Intel's founders, made a predictive observation that became a golden rule of computing. He noted that microchips were evolving at such a rate that their power doubled, and their cost halved, approximately every two years. What soon became known as "Moore's Law" has accurately guided chip evolution for more than fifty years. It has been so consistent that tech companies have built it into long-term budgets and financial forecasting models. Experts warn, however, that we might be reaching the end of the microchip era. Let's take a look at where we are now, how we got here and what the future may hold.
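Treated as a simple compounding rule, the law is easy to sketch. The figures below (roughly 2,250 transistors in 1971, the count of Intel's first microchip) are illustrative starting points, not a real industry forecast:

```python
# Moore's Law as a rough compounding rule: transistor counts double
# about every two years. Illustrative starting point: 2,250 in 1971.
count = 2_250
year = 1971
while year < 2021:
    count *= 2   # one doubling...
    year += 2    # ...every two years

print(f"{year}: ~{count:,} transistors")
```

Twenty-five doublings later, the rule lands in the tens of billions of transistors, which is the right order of magnitude for today's densest chips.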

 

The Anatomy of a Microchip

A microchip is a small computer circuit, most commonly made from silicon, that is packed with transistors. A transistor is the most basic element of what makes a computer work. It's essentially a gate that can be either open or closed, and unique combinations of transistor states are translated into fundamental computer code. The more transistors, the more powerful the chip; and the smaller the transistors, the more of them can be packed into the same space.
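How combinations of open and closed gates become numbers can be sketched in a few lines. The mapping here (open = 0, closed = 1, read left to right) is an illustrative convention, not how any particular chip is wired:

```python
# Four transistor states, read as binary digits: closed = 1, open = 0.
gates = [1, 0, 1, 1]

value = 0
for state in gates:
    value = value * 2 + state  # shift the bits left, append the next one

print(value)  # the bit pattern 1011 is the number 11
```

Every extra transistor doubles the number of distinct patterns the group can hold, which is why packing in more of them matters so much.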

The Intel 4004 was the first commercially produced microchip. Released in 1971, it was built at a 10-micrometer scale and held 2,250 transistors. It was revolutionary technology for the time and changed the entire world of computing. To put into perspective how far we've come, IBM has developed a chip that's 5 nanometers (0.05% of the scale of the Intel 4004) and has 30 billion transistors. For comparison, the diameter of an atom is between 0.1 and 0.5 nanometers (nm).
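The scale of that shrink checks out with quick arithmetic on the figures above:

```python
intel_4004_nm = 10_000   # the 4004's 10-micrometer scale, in nanometers
ibm_chip_nm = 5          # IBM's 5-nanometer chip

print(f"{ibm_chip_nm / intel_4004_nm:.2%} of the 4004's scale")  # 0.05%
print(f"{30_000_000_000 / 2_250:,.0f}x the transistor count")
```

That second line works out to more than thirteen million times as many transistors on a single chip.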

Chips have traditionally been made of silicon because of its unique properties: as a semiconductor, its conductivity can be precisely controlled, letting it act as either a conductor or an insulator, which makes it a perfect base element for transistors. The smallest microchips in commercial production are 10nm, although working 5nm microchips have been developed and 3nm chips are being tested, with various manufacturers announcing release dates between 2021 and 2025.

The end of the microchip era may be approaching soon, however, for a simple reason: technology is reaching the limit of what silicon, as a material, can support. Transistors are becoming so small that the electricity required to cool the chips exceeds the electricity needed to power them.

 

What Comes After the Silicon Microchip?

One approach tech companies are experimenting with involves "cold computing," or "cold operation," of supercomputers. If a computer's temperature can be lowered enough, current technologies could gain another generation or two of development. For several years, Microsoft has been running an experiment known as Project Natick: huge, self-contained data centers submerged up to 100 meters below the ocean's surface off the coast of Scotland. The data centers are cooled naturally by seawater, an innovative and eco-friendly approach.

Another technique involves using liquid nitrogen to cool computer systems to around -196C, the temperature at which nitrogen liquefies. Because nitrogen is abundant in the atmosphere and can be easily captured in liquid form, experts estimate that this could add up to another decade of scaling with current technology.

Nitrogen doesn’t have to be used in liquid form to be useful, however. Next-generation semiconductors may move beyond silicon by employing hybrid materials made from two or more elements. One potential candidate is gallium nitride, made from a combination of gallium and nitrogen. These materials perform better than silicon in latency, speed and light detection/emission. So-called “compound semiconductors” will likely operate alongside regular silicon chips for the first few generations, although they have the potential to be as much as 100 times faster than the top speeds silicon can achieve.

Data is being processed at such a microscopic level that tech companies are experimenting with ways to store data on a single atom, an approach that IBM has already shown to be possible. The U.S. Army has sponsored a project in coordination with the University of Texas to use tellurium atoms as a replacement for silicon. Researchers are focused on building transistors within single atomic chains and have seen some success. One of the challenges of storing data at the atomic level is the inherent instability of matter at that scale. Using atomic chains to transmit information will likely require multiple layers that compensate for one another's variability: as each layer stabilizes the ones above and below it, a balanced composite is created.

 

The Era of Quantum Computing

Below the atomic level is the arena of quantum computing, and this is where things get interesting. Today’s chips use “bits,” long streams of optical or electrical pulses in the form of 1s and 0s. Quantum computers would use something known as “qubits,” which are subatomic particles — for example, photons or electrons. In the same way that splitting the atom generated exponentially more energy than seemed conceivable, utilizing subatomic particles to conduct computations provides capabilities far beyond what current supercomputers can do.

Manipulating qubits is a tricky process. Scientists currently use a variety of techniques to do this, from cooling atoms to nearly absolute zero (think -455 F) to trapping them with electromagnetic fields in extremely high-vacuum chambers. These techniques slow a subatomic particle enough for it to be manipulated to some extent. Unlike a classical bit, which must be either 1 or 0, a qubit can occupy a blend of both states at once. This is known as superposition. After a particle has been slowed as much as possible, scientists use microwave beams or lasers to place qubits into superposition.
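On paper, a qubit's state can be modeled as a pair of amplitudes whose squares give the measurement probabilities. The sketch below is a minimal illustration of that math (using real-valued amplitudes and a Hadamard gate as the superposition-creating operation), not a model of how real hardware is driven:

```python
import math

# A qubit as a pair of amplitudes (a, b) over the basis states |0> and |1>.
# The probability of measuring each state is the amplitude squared.
zero = (1.0, 0.0)  # a qubit prepared firmly in the 0 state

def hadamard(q):
    """Apply a Hadamard gate, which puts a basis state into equal superposition."""
    a, b = q
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)
probs = [amp ** 2 for amp in plus]
print(probs)  # roughly [0.5, 0.5]: equal odds of reading a 0 or a 1
```

Until it is measured, the qubit genuinely carries both possibilities at once; measurement collapses it to a single 0 or 1 with those probabilities.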

Think of computing with superposition as the difference between a tile (current technology) and a ball (a quantum computer using qubits in superposition). Let's say every time you expose the surface of either object to the sun, you're running a computation. If you drop both on the side of a hill, the tile will lie flat while the ball will roll away. The ball would rapidly and repeatedly direct different surfaces at the sky in the same amount of time that the tile would be able to show one surface a single time.

What makes this even more exciting is a concept known as quantum entanglement. Instead of existing in isolation, qubits can be generated in pairs that are somehow tied together, even when they're physically separated. No one fully understands how this works, but when the state of one of the qubits changes, the other instantly changes in a corresponding and predictable manner. Adding qubits doesn't just have an additive effect: each new qubit doubles the number of states the system can represent at once, so a quantum computer's capacity for rapid calculation grows exponentially with its qubit count.
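Both points, the perfectly correlated measurements and the exponential growth in state space, can be shown with a toy simulation. The Bell-pair dictionary and `measure` helper below are illustrative constructions for this sketch, not a real quantum API:

```python
import random

# An n-qubit system is described by 2**n amplitudes; each added qubit
# doubles the count.
for n in range(1, 5):
    print(n, "qubit(s) ->", 2 ** n, "amplitudes")

# A Bell pair: the entangled two-qubit state (|00> + |11>) / sqrt(2).
# Only the outcomes 00 and 11 have any probability of being measured.
bell = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

def measure(state):
    """Sample one measurement outcome with probability amplitude squared."""
    outcomes = list(state)
    weights = [amp ** 2 for amp in state.values()]
    return random.choices(outcomes, weights=weights)[0]

outcome = measure(bell)
assert outcome[0] == outcome[1]  # the two qubits always agree
```

However many times you run the measurement, the pair never disagrees: reading one qubit tells you the other instantly.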

 

The Future is Now

Experts have been predicting the natural end of Moore's Law for at least the past two generations of microchips, only to be proven wrong by fresh innovations each time. Experiments have shown that at least two more generations of chips could still be possible with current materials, technology and manufacturing techniques. Using various innovative cooling methods could add several generations of development onto the silicon chip, and new materials could extend the chip as we know it for a decade or more beyond that. We're currently living in a generation of technology that's far beyond what anyone could have dreamed even a few years ago. The thought of achieving quantum supremacy with a stable quantum supercomputer is beyond exciting, and the capabilities it would offer humanity are mind-blowing.

Moiz Bhinderwala

Moiz Bhinderwala leads the technical services and logistics teams at Dynamic. With more than 10 years of experience in the IT industry, Moiz has deep knowledge of the complex technological landscape, working closely with clients to understand their IT challenges and help design custom technical solutions to meet their business goals.


© 2020 DYNAMIC COMPUTER CORPORATION   |   ALL RIGHTS RESERVED