Source: 36Kr
Editor’s Note: If you were asked to name the most important law in the computer industry over the past 50 years, the answer would undoubtedly be Moore’s Law: at constant cost, the number of components that can be placed on an integrated circuit doubles roughly every 18 to 24 months, and performance doubles with it. Moore’s Law became a self-fulfilling prophecy that set the pace of the entire industry’s development. The exponential growth shows up not only in chips but also in the systems built on them and the communities that adopt them. For decades, computers have grown smaller and more powerful, driving tremendous advances in science. But each further step along Moore’s Law has met greater resistance, and the trend cannot continue indefinitely. Many experts predict that Moore’s Law will soon come to an end. What will happen when computers stop shrinking? What technological breakthroughs can succeed Moore’s Law? Here is how “Megatech: Technology in 2050” sees it.
In 1971, Intel, then a little-known company that would go on to become one of Silicon Valley’s most famous, released a chip called the 4004. It was the world’s first commercially available microprocessor, meaning it packed all the electronic circuitry needed for advanced number-crunching into a single tiny package. That was considered a miracle at the time: the chip integrated 2,300 tiny transistors, each about 10,000 nanometers across (a nanometer is one billionth of a meter), roughly the size of a red blood cell. A transistor acts as an electronic switch that can be flipped on and off, representing the 1s and 0s that are the basic units of information.
By 2015, Intel had become the world’s leading chip manufacturer, with revenues exceeding $55 billion that year. That year the chip giant released its Skylake chips. This time the company did not disclose exact numbers, but the best guess is that each chip contained about 1.5 to 2 billion transistors, with features spaced only about 14 nanometers apart, far too small to see, since that is well below the wavelength of visible light.
Everyone knows that modern computers are better than older ones. But just how much better is hard to convey, because no other consumer technology has improved at anything like the same pace. The standard analogy is with cars: if cars had improved at the rate of computer chips since 1971, a new model in 2015 would have a top speed of about 420 million miles per hour. That is roughly two-thirds the speed of light, or fast enough to circle the Earth in a fifth of a second. And if that were not fast enough, by the end of 2017 models twice as fast again would be arriving in showrooms.
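As a rough sanity check on that analogy, here is a minimal sketch in Python. It assumes a baseline top speed of about 100 mph in 1971 and one doubling of performance every two years; both figures are illustrative assumptions, not numbers from the text.

```python
# Back-of-the-envelope check of the car analogy.
# Assumptions (for illustration only): a 1971 car tops out at ~100 mph,
# and performance doubles once every two years.
doublings = (2015 - 1971) // 2              # 22 doublings by 2015
top_speed_mph = 100 * 2 ** doublings        # ~419 million mph

speed_of_light_mph = 670_616_629            # speed of light, in mph
earth_circumference_miles = 24_901          # Earth's equatorial circumference

print(f"{top_speed_mph / 1e6:.0f} million mph")                     # ~419
print(f"{top_speed_mph / speed_of_light_mph:.2f} of light speed")   # ~0.63, about two-thirds
print(f"{earth_circumference_miles / (top_speed_mph / 3600):.2f} s per lap")  # ~0.21 s
```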
This rapid development is the result of a phenomenon first observed in 1965 by Gordon Moore, one of Intel’s founders. Moore noted that the number of components that could be crammed onto an integrated circuit was doubling every year; the interval was later revised to two years, and Moore’s Law became a self-fulfilling prophecy that set the pace for the entire computing industry. Every year, companies such as Intel and TSMC spend billions of dollars figuring out how to keep shrinking the components that go into computer chips. Over the decades, Moore’s Law has helped build a world in which chips are embedded in everything from kettles to cars (it is chips that are gradually allowing cars to drive themselves), in which millions of people relax in virtual worlds, financial markets run on algorithms, and pundits worry that artificial intelligence will soon take all the jobs.
But this is a force that is nearly spent. Each round of shrinking gets harder: modern transistors are so small that their dimensions are measured in dozens of atoms, leaving engineers little room to maneuver. From the release of the 4004 in 1971 to mid-2016, the clock of Moore’s Law ticked 22 times. Extending the law to 2050 would require it to tick another 17 times, which implies that engineers would have to build computers from components smaller than an atom of hydrogen, the smallest element. That, as far as anyone knows, is impossible.
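To see why 17 more doublings collides with the atomic scale, here is a minimal sketch. It assumes, as an idealization, that doubling the transistor count on a fixed-size chip means shrinking linear feature dimensions by a factor of about the square root of two each time; the 0.1-nanometer figure for a hydrogen atom is likewise approximate.

```python
import math

# Idealized assumption: each doubling of transistor count on a fixed-size
# chip shrinks linear feature dimensions by roughly a factor of sqrt(2).
feature_nm = 14.0                # the Skylake-era spacing cited above
hydrogen_diameter_nm = 0.1       # a hydrogen atom is roughly 0.1 nm across

for _ in range(17):              # the 17 further doublings needed to reach 2050
    feature_nm /= math.sqrt(2)

print(f"{feature_nm:.3f} nm")            # ~0.039 nm
print(feature_nm < hydrogen_diameter_nm) # True: below atomic dimensions
```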
In any case, economics is likely to kill Moore’s Law before physics does, because the benefits of shrinking transistors are no longer what they used to be. Moore’s Law has worked thanks to a related phenomenon known as “Dennard scaling” (named after IBM engineer Robert Dennard, who described it in 1974), which holds that as their components shrink, chips become faster, consume less power, and become cheaper to produce. Smaller components, in other words, made better chips, which is why the computing industry has been able to persuade consumers to spend money on the latest models every few years. But the old magic is fading.
Shrinking chips further no longer makes them run faster or more efficiently, while the cost of the ultra-precise equipment needed to manufacture them keeps rising, eroding the economic benefits. Moore’s second law, a less celebrated cousin of the first, states that the cost of the factories (so-called “fabs”, or foundries) that make chips doubles roughly every four years. A state-of-the-art fab now costs about $10 billion. Even for Intel, that is a significant amount of money.
The result is a consensus among Silicon Valley’s experts that Moore’s Law is nearing its end. “From an economic perspective, Moore’s Law is dead,” says Linley Gwennap, who runs an analysis firm in Silicon Valley. Dario Gil, IBM’s head of research and development, is similarly frank: “I can say absolutely that the future of computing will no longer depend solely on Moore’s Law.” Bob Colwell, a former Intel chip designer, believes the industry may still be able to produce chips with features only 5 nanometers apart by the early 2020s, “but it’s hard to convince me that we can make them much smaller than that.”
In other words, the most powerful technological force of the past 50 years is approaching its end. The assumption that computers will keep getting better and cheaper at an astonishing pace has become deeply embedded in our ideas about the future: it underpins many technological predictions, from driverless cars to better artificial intelligence to ever more appealing electronic products. The end of Moore’s Law does not mean the computer revolution will stall. But it does mean that the coming decades will look very different from the past ones, because none of the alternatives on offer is as reliable or as repeatable as the relentless shrinking of the past half-century.
A smartphone with more computing power than an entire country possessed in 1971 can run for a whole day on a single charge.
Moore’s Law has made computers smaller, transforming them from giants that filled entire rooms into slim slabs that fit in a pocket. It has also made them more frugal: a smartphone with more computing power than an entire country possessed in 1971 can run for a whole day on a single charge. But its most famous effect has been to make computers faster. By 2050, when Moore’s Law is ancient history, engineers will have to draw on a whole series of other tricks to keep making computers faster.
Some wins will be relatively easy. Better programming is one of them. The breakneck pace of Moore’s Law has historically left software companies little time to streamline their products, and customers who buy faster machines every few years have further weakened the incentive to optimize: the simplest way to speed up sluggish code has been to wait a year or two for hardware to catch up. As Moore’s Law slows, the computing industry’s famously short product cycles may begin to lengthen, giving programmers more time to polish their work.
Another approach is to trade general-purpose computing power for specialized hardware. Modern chips already include dedicated circuits that accelerate common tasks, such as decompressing video, performing encryption, or rendering complex 3D graphics for video games. As computers spread into all manner of products, such specialized chips will become ever more useful. In driverless cars, for example, machine vision will become increasingly important: computers must learn to interpret images of the real world, classify objects, and extract information, all of which is computation-heavy. Specialized circuits can deliver big performance gains for such work.
IBM believes that 3D chips could allow designers to shrink the size of a supercomputer currently occupying an entire building down to the size of a shoebox.
However, for computing to keep improving at the pace everyone is used to, more radical changes will be needed. One idea for extending Moore’s Law is to take it into the third dimension. Modern chips are essentially flat, but researchers are exploring new ways to stack components. Even if the footprint of such chips can shrink no further, stacking would let designers pack in more components, just as a high-rise building accommodates more people than a low-rise building on the same plot of land.
The first such devices are already on the market: Samsung, the South Korean microelectronics giant, sells solid-state drives whose memory chips are stacked several layers high. The technology’s prospects are promising.
Modern computers put their memory just a few centimeters away from their processors. At silicon speeds a few centimeters is a long way, which means significant delays every time new data has to be fetched. 3D chips could eliminate that bottleneck by interleaving layers of logic with layers of memory. IBM believes 3D chips could eventually let designers shrink a supercomputer that today fills a building down to the size of a shoebox.
Making this approach work, however, will require some fundamental design changes. Modern chips run hot and need heat sinks and fans to cool them. 3D chips would be even worse off, because the surface area available for shedding heat grows much more slowly than the volume generating it. For the same reason, getting enough electricity and data into such a chip to keep it powered and supplied with numbers to crunch is also a problem. IBM’s shoebox supercomputer would therefore rely on liquid cooling: microscopic channels would be built into each chip, allowing coolant to flow through. The company also believes the coolant could double as a power source. The idea is to use it as the electrolyte in a flow battery, in which the electrolyte flows past fixed electrodes to generate electricity.
There are more exotic ideas, too. Quantum computing proposes to use the counterintuitive rules of quantum mechanics to build machines that can solve certain kinds of mathematical problems far faster than any conventional computer (for many other problems, though, a quantum machine would offer no advantage). Its most famous application is cracking certain types of encryption, but perhaps its most important one is faithfully simulating the quantum details of chemical processes, a problem with thousands of uses in manufacturing and industry, yet one that conventional machines find almost intractable.
Ten years ago, quantum computing was confined to exploratory research in universities. Since then, several large firms, including Microsoft, IBM, and Google, have poured money into the technology, and all predict that quantum chips should appear within the next 10 to 20 years. (Indeed, anyone interested can already experiment with one of IBM’s quantum chips remotely, programming it over the internet.)
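To give a flavor of what programming a quantum chip over the internet looks like, here is a minimal sketch using Qiskit, IBM’s open-source toolkit for its cloud-accessible quantum processors. The example only builds and simulates a two-qubit entangled “Bell state” locally; submitting the same circuit to real IBM hardware requires an account and a cloud backend, which is not shown here.

```python
# A minimal Qiskit sketch: prepare a two-qubit "Bell state", the simplest
# demonstration of quantum entanglement. Simulated locally for illustration;
# the same circuit could be submitted to an IBM quantum device over the internet.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into an equal superposition of 0 and 1
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state)   # amplitudes of ~0.707 on |00> and |11>, zero elsewhere
```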
A Canadian company called D-Wave already sells a limited quantum computer, one that can perform only a single kind of mathematical task, though it remains unclear whether this specialized machine is really faster than a non-quantum computer.
Like 3D chips, quantum computers need special care and handling. To work at all, their innards must be isolated from the outside world: a quantum computer must be chilled close to absolute zero with liquid helium and wrapped in elaborate shielding, because even the faintest pulse of heat or stray electromagnetic wave can destroy the fragile quantum states on which such machines depend.
Each of these hoped-for improvements, however, has its limits: either the gains are one-off, or they apply only to certain kinds of calculation. The great strength of Moore’s Law was its regularity: it improved everything, at a metronomic pace, every couple of years. Future progress will be patchier, harder to predict, and less steady. And unlike in the glory days, it is not clear how much of it will ever reach consumer products. Few people, after all, would want a cryogenically cooled quantum PC or smartphone; liquid cooling is too bulky, messy, and complicated for similar reasons; and even specialized logic is worth building only for tasks that are performed often enough.
But all three of these technologies could work well in data centers, which could drive another major trend in the coming decades. Traditionally, computers have been either on your desk or in your pocket. In the future, the increasingly ubiquitous connectivity provided by the internet and mobile phone networks will allow a large portion of computing power to reside in data centers, available to customers as needed. In other words, computing will become a utility, used on demand, much like today’s electricity or water.
Moving heavy computation off the devices that users hold in their hands and into distant warehouses of servers is known as “cloud computing”, and it will be one of the most important ways the industry blunts the impact of Moore’s Law’s demise. Unlike a smartphone or a PC, which can only get so big, data centers can deliver more computing power simply by being built bigger. As the world’s demand for computing keeps growing, more and more of it will take place in data centers located miles away from the users being served.
Google’s data center in The Dalles, Oregon.
This is already happening. Take Apple’s voice assistant, Siri. Decoding human speech and working out the intent behind a command such as “Siri, find me an Indian restaurant nearby” takes more computing power than an iPhone carries on board. Instead, the phone simply records the user’s voice and forwards it to a powerful computer in one of Apple’s data centers; once the remote machine has worked out a suitable response, it sends the answer back to the iPhone.
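The pattern itself is easy to sketch. The snippet below is a hypothetical illustration of that record-and-forward loop; the endpoint URL and the JSON response format are invented for the example and do not describe Apple’s actual, private Siri protocol.

```python
# Hypothetical sketch of the "record locally, compute remotely" pattern
# described above. The URL and response shape are invented for illustration.
import requests

def ask_assistant(audio_bytes: bytes) -> str:
    # The device only captures audio; the heavy lifting happens in a data center.
    response = requests.post(
        "https://example.com/speech-to-intent",    # placeholder endpoint
        data=audio_bytes,
        headers={"Content-Type": "audio/wav"},
        timeout=10,
    )
    response.raise_for_status()
    # The remote service does speech recognition and intent parsing,
    # then returns a ready-made answer for the device to display or speak.
    return response.json()["answer"]
```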
The same model can be applied far beyond smartphones. Chips are finding their way into devices not traditionally thought of as computers, from cars and medical implants to televisions and kettles, and the process is accelerating. This is the “Internet of Things” (IoT): the embedding of computing into almost every imaginable object.
Smart clothing will use home networks to tell washing machines which settings to use; smart paving stones will monitor foot traffic in cities and give governments detailed data on air pollution.
Once again, glimpses of that future are already visible: Rolls-Royce engineers, for instance, can now monitor dozens of performance indicators from jet engines while they are in flight. Smart home hubs that let homeowners control everything from lighting to kitchen appliances are already catching on with early adopters.
But for the Internet of Things to reach its full potential, some way will be needed to make sense of the torrent of data that billions of embedded chips will generate. The IoT chips themselves will not be up to the task: a chip embedded in a smart paving stone, for example, has to be as cheap as possible and consume very little power, since wiring every paving stone into the electricity grid is impractical; such chips will have to scavenge energy from ambient heat, the vibration of footsteps, or even stray electromagnetic radiation.
As Moore’s Law falters, the definition of “better” will change. Beyond the avenues described above, many others look promising. Considerable effort is going into improving the energy efficiency of computers, for instance. This matters for several reasons: consumers want their smartphones to have longer battery life; the IoT will require computers to be deployed in places where mains power is hard to come by; and computing already consumes about 2% of the world’s electricity generation.
User interfaces are another area ripe for improvement, because today’s technology is ancient. Keyboards are direct descendants of mechanical typewriters. The mouse was invented back in 1968, and the “graphical user interface” (think Windows or iOS), which replaced cryptic text commands with friendly icons and windows, is similarly long in the tooth. CERN, Europe’s particle-physics laboratory, pioneered touch screens in the 1970s.
The computer mouse first appeared in 1968.
Siri may leave your phone and become ubiquitous: artificial intelligence (and cloud computing) will enable almost any machine to be controlled simply by conversation. Samsung is already manufacturing voice-controlled televisions.
Technologies such as gesture tracking and eye tracking, currently used in virtual-reality video games, may also prove useful. Augmented reality, a close relative of virtual reality that overlays computer-generated information on the real world, will begin to blend the virtual and the physical. Google may have shelved its Glass AR headset, but something very similar will surely find its niche one day; the company is also working on electronic contact lenses that could perform similar functions far less obtrusively.
Moore’s Law cannot continue indefinitely. But as it fades, so will its significance. It mattered most when your computer was stuck in a box on your desk and was painfully slow at many of the urgent tasks asked of it. It set the metronomic pace for a huge global industry, and without it progress in computing will become harder-won, more sporadic, and less predictable. But progress will still come. The computer of 2050 will be a system of microchips embedded in everything from cabinets to cars. Most of those chips will be able to tap vast amounts of computing power wirelessly over the internet, and you will interact with many of them simply by talking. Trillions of chips scattered through the physical environment will make our world more comprehensible and more closely monitored. Moore’s Law may soon be gone. The computing revolution will not.