Recently, OpenAI’s CEO Sam Altman claimed that the limiting factor for the future development of AI will be energy, and therefore, there is an urgent need to develop nuclear fusion. In fact, as early as 2021, he invested $375 million in a nuclear fusion company, which recently stated that its first power plant is expected to be operational by 2028.
This is not the first time the energy consumption of computing has drawn attention. During the blockchain and cryptocurrency boom, many people worried about blockchain’s energy use. Although the hype has since faded somewhat, the consumption remains considerable: Bitcoin mining’s annual energy use is roughly equivalent to that of the entire Netherlands. A 2024 report from the U.S. Energy Information Administration estimates that cryptocurrency accounts for roughly 0.6% to 2.3% of annual U.S. electricity consumption.
For now, it seems unlikely that cryptocurrency will grow without bound and swallow the entire power grid. But will artificial intelligence follow that path? At the moment, ChatGPT produces about 200 million responses per day and consumes about 500,000 kilowatt-hours of electricity doing so, still a small fraction of the grid; Altman’s prediction may be somewhat self-serving. But if it does come true, and the scale and energy consumption of AI keep growing rapidly over the long term, an energy crisis is indeed possible.
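A quick back-of-the-envelope check of those figures (a rough sketch in Python; the world-electricity total is an assumed round number for scale, not from the article):

```python
# Back-of-the-envelope check on ChatGPT's reported energy figures.
responses_per_day = 200e6        # ~200 million responses per day (from the article)
energy_per_day_kwh = 500_000     # ~500,000 kWh per day (from the article)

per_response_wh = energy_per_day_kwh * 1_000 / responses_per_day
print(f"Energy per response: {per_response_wh:.1f} Wh")       # ~2.5 Wh

annual_gwh = energy_per_day_kwh * 365 / 1e6
print(f"Annualized total: ~{annual_gwh:.0f} GWh/year")        # ~183 GWh/year

# For scale only: world electricity generation is very roughly
# 30,000 TWh per year (an assumed round figure, not from the article).
world_gwh = 30_000 * 1_000
print(f"Share of world electricity: {annual_gwh / world_gwh:.4%}")  # ~0.0006%
```

At roughly 2.5 watt-hours per response, the service is a rounding error on the grid today; the question is only how fast those numbers grow.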

How Efficient Is the Human Brain?
By contrast, traditional natural intelligence, the human brain, seems far more frugal. Yet exactly how efficient the brain is remains unclear, because it is hard to find a fair benchmark for comparison.
For example, a common metric for measuring computer speed is clock frequency, which refers to how many electrical pulses a processor can generate per second. This largely determines how many basic operations a processor can complete per second.
If we apply this standard to the human brain, its clock frequency appears to be below one kilohertz: neurons can fire at most about 1,000 impulses per second, and synaptic transmission of a signal takes at least one millisecond. One kilohertz is pitiful by computer standards; the first commercial microprocessor, released in 1971, already ran about 700 times faster, and today’s mainstream processors are casually millions of times faster.
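Those ratios are easy to verify (a sketch; the Intel 4004’s 740 kHz clock and a 4 GHz modern CPU are assumed here as reference points):

```python
# Sanity-checking the clock-rate ratios quoted above.
brain_hz = 1_000          # neurons fire at most ~1,000 impulses per second
intel_4004_hz = 740_000   # Intel 4004 (1971), the first commercial microprocessor
modern_cpu_hz = 4e9       # a typical ~4 GHz desktop CPU (assumed reference point)

print(f"Intel 4004 vs. brain: {intel_4004_hz / brain_hz:.0f}x")    # 740x
print(f"Modern CPU vs. brain: {modern_cpu_hz / brain_hz:,.0f}x")   # 4,000,000x
```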
Does this mean the human brain runs at only one-millionth the speed of a contemporary computer? Obviously not, because the two differ fundamentally in architecture. Each neuron in the human brain is typically connected to thousands of others, so a single “basic operation” can involve more than a thousand inputs, nothing like a transistor with its three terminals. In fact, even among computers, clock frequencies cannot be compared casually across different architectures.
So how does the brain compare with other biological organs? The human brain accounts for about 2% of body weight yet consumes about 19% of the body’s energy at rest. That sounds extravagant, but it is not particularly special.
The liver and spleen together weigh only slightly more than the brain but account for 27% of energy consumption. The two kidneys combined weigh less than one-fifth as much as the brain, yet they consume 10% of the body’s energy, half the brain’s share. The heart likewise weighs less than one-fifth as much as the brain and consumes 7%, about a third of the brain’s share. It is to be expected that a handful of highly active organs consume most of the energy, and among them the brain is merely ordinary.
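Dividing each organ’s energy share by its mass share makes the point concrete (a sketch using the percentages above; the mass fractions marked as assumed are rough readings of the article’s own comparisons):

```python
# Energy share divided by mass share: how "hot" each organ runs,
# relative to the body average (1.0 = average tissue).
organs = {                 # (% of body mass, % of resting energy use)
    "brain":          (2.0, 19),
    "liver + spleen": (2.5, 27),  # "slightly more than the brain": ~2.5% assumed
    "kidneys":        (0.4, 10),  # "< one-fifth of the brain": ~0.4% assumed
    "heart":          (0.4, 7),   # likewise ~0.4% assumed
}
for name, (mass_pct, energy_pct) in organs.items():
    print(f"{name:<14} {energy_pct / mass_pct:5.1f}x the body average")
# brain ~9.5x, liver+spleen ~10.8x, kidneys ~25x, heart ~17.5x:
# per gram, the brain is intense but not exceptional.
```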
Today, neural networks have become the mainstream of artificial intelligence and provide a new perspective for comparison: rather than making purely hardware-level comparisons, we can compare the human brain to specific neural networks.
Of course, no existing neural network can be compared with the human brain functionally. But in terms of scale, if humanity one day unlocks the brain’s secrets, a model on the scale of the human brain might run across roughly 1,000 GPUs. At about 1 kilowatt per GPU, that is 1 megawatt in total, some 50,000 times the brain’s power draw of roughly 20 watts. (Incidentally, the world’s total floating-point computing capacity could at present support about 5 million such brain-scale models.) Again, this comparison relies on future theoretical breakthroughs; today’s neural networks are not yet comparable.
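The arithmetic, spelled out (a minimal sketch of the estimate above; the brain’s ~20 W draw is a standard approximation):

```python
# Power arithmetic for a hypothetical brain-scale model on GPUs.
n_gpus = 1_000
watts_per_gpu = 1_000    # ~1 kW per GPU, as in the estimate above
brain_watts = 20         # resting human brain, a standard ~20 W approximation

total_watts = n_gpus * watts_per_gpu
print(f"Cluster power: {total_watts / 1e6:.0f} MW")              # 1 MW
print(f"Ratio to one brain: {total_watts / brain_watts:,.0f}x")  # 50,000x
```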
Regardless, the human brain does seem to be more efficient than computers. This is certainly a product of billions of years of natural selection—primitive nervous systems must have been constrained by energy, and as energy efficiency improved, the emergence of the brain became possible. However, the current human brain may not have reached the theoretical limit.
Is the Human Brain the “Optimal Solution”?
A common misconception about evolution is that it inevitably produces optimal solutions. This is wrong on several counts:
First, there is no general notion of an optimal solution; all judgments of merit depend on the given environment, which is constantly changing.
Second, even in a stable environment, a global optimal solution may not be achievable. Evolution is, in most cases, gradual and shortsighted, often getting trapped in local optima, much like a climber who insists that every step must go uphill, ultimately stopping on a small hill and failing to reach the true highest peak.
Third, the speed of evolution is proportional to selection pressure; when selection pressure is low, reaching a local optimal solution can still take a long time, and there is no reason to believe that the current human brain has reached the peak.
Fourth, evolution is permeated by chance. How much these chance factors matter is still unsettled, but they should be more than enough to prevent any strictly optimal solution from being realized.
Some AI theorists are very concerned about whether the human brain has reached the energy efficiency limit of neural networks, as this fact determines the long-term development direction of artificial general intelligence (AGI): if the human brain is far from the theoretical limit, then AGI could surpass the human brain in the future, leading to accelerated technological progress and potentially even the birth of a technological singularity.
But if the human brain is indeed at the limit, then AGI will be severely constrained by humanity’s current energy output, and its take-off will be very slow, greatly reducing the likelihood of a singularity. It would also mean that emulating the human brain becomes the only practical route to AGI.
However, as of now, both sides of the debate lack substantial evidence. It is possible that future advancements in the field of AI will prove that the neural network route is actually a dead end, and that true AGI comes from other directions; in that case, these discussions would lose their significance.

Regardless, wrestling with computational efficiency marks real progress for human civilization, because even setting aside the limitations of neural networks, computing itself is still far from its physical limits. In this respect it differs fundamentally from nearly every natural force humanity has harnessed before.
To lift a given weight, one must apply a certain force; to reach a given speed, one must supply that much kinetic energy. Manipulating physical matter carries an irreducible energy cost, leaving little room for savings: even humanity’s least efficient machines already convert more than ten percent of their energy input into useful work, so less than a factor of ten remains to be gained. Such gains are not unimportant; they may be enough to turn around the climate crisis we face, but they are nowhere near enough to support an endlessly growing civilization.
Computation, however, manipulates not physical matter but information. The energy consumed by computation does have a theoretical floor, but it is minuscule: by Landauer’s principle, erasing one bit at room temperature costs at least about 2.9 × 10^-21 joules. The room for improvement is therefore unlike anything seen before.
The UNIVAC I of 1951 performed about 0.015 operations for every joule of energy consumed, while the 2022 supercomputer “Henri” performs about 65 billion operations on the same energy, an improvement of nearly thirteen orders of magnitude in seventy years, and still about ten orders of magnitude short of the room-temperature limit.
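Both the limit and the gaps can be reproduced from Landauer’s principle (a sketch; note that the bound is per bit erased, so treating one machine operation as one bit-erasure is only an order-of-magnitude comparison):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2).
landauer_j = k_B * 300 * math.log(2)          # at room temperature (300 K)
print(f"Landauer limit: {landauer_j:.1e} J per bit")   # ~2.9e-21 J

limit_ops = 1 / landauer_j    # ~3.5e20 bit-erasures per joule
univac_ops = 0.015            # UNIVAC I (1951), operations per joule
henri_ops = 6.5e10            # "Henri" (2022), ~65 GFLOPS per watt

print(f"UNIVAC I -> Henri: {math.log10(henri_ops / univac_ops):.1f} orders of magnitude")
print(f"Henri -> limit:    {math.log10(limit_ops / henri_ops):.1f} orders of magnitude")
```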
If we abandon the room-temperature constraint, efficiency can improve further. The lower limit on computational energy consumption is proportional to ambient temperature, so a highly advanced technological civilization would likely conduct most of its computation in outer space, using the 2.7 K of the cosmic microwave background as its baseline.
If a civilization exists for a sufficiently long time, it could even choose to wait for the universe to expand and cool to achieve higher computational efficiency—after 10^12 years, the efficiency limit for computation would increase by another 30 orders of magnitude compared to today’s limits.
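Since the limit scales linearly with temperature, both claims are easy to examine (a sketch; the 10^12-year figure rests on cosmological assumptions beyond this article, so the code only shows what temperature a 30-order gain would imply):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K

def landauer_j(temp_k: float) -> float:
    """Minimum energy (joules) to erase one bit at temperature temp_k."""
    return k_B * temp_k * math.log(2)

# Moving from room temperature to the cosmic microwave background:
print(f"300 K -> 2.7 K gain: ~{landauer_j(300) / landauer_j(2.7):.0f}x")  # ~111x

# The quoted 30-orders-of-magnitude gain after 1e12 years would require
# the background temperature to fall to roughly:
print(f"Implied temperature: {300 / 1e30:.0e} K")                         # 3e-28 K
```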
In other words, humanity still has a long way to go on computational energy consumption. Perhaps, over the past few decades, we have been spoiled by Moore’s Law, indulging in cheap, fast chips while neglecting efficiency at every level; these days even adjusting a room’s temperature can mean running a full Android system.
However, Moore’s Law is not a law of nature; it is more of an industry KPI, and actual progress in chip technology has lagged behind it since around 2010. Neural networks are not yet a major contributor to humanity’s energy consumption, but one day computation will claim that position, and we should be prepared for every possible outcome before that day arrives.
