Top 10 Technology Trends in the Global Semiconductor Industry for 2024


Throughout 2023, the semiconductor industry remained under the shadow of a downturn, but there were glimmers of hope. The emergence of ChatGPT at the beginning of the year sparked global enthusiasm for generative artificial intelligence (AIGC). The rise of AI and large models has created diverse application scenarios, greatly benefiting fields such as data centers and automotive electronics, while also posing new challenges for chip computing power, energy efficiency, memory, and integration.
It is precisely these challenges that have stimulated the development of semiconductor technologies, from materials and design to manufacturing and packaging. After the large-scale commercialization of third-generation semiconductors such as silicon carbide and gallium nitride, the fourth-generation semiconductor gallium oxide is beginning to emerge; AI chips, riding the wave of large models, have become the main battlefield for chip makers and even system vendors; to achieve greater computing power and faster memory, the commercialization of Chiplet, 3D-IC, HBM, and a series of new memory types is also on the agenda; even programmable optical computing, long regarded as a “laboratory technology,” is beginning to take aim at the linear computing portions of GPUs.
Drawing on a year of conversations with industry experts and manufacturers, the AspenCore global analyst team has summarized and analyzed the field to select the top 10 technology trends expected to emerge or develop rapidly in the global semiconductor industry in 2024.
Trend 1: AI Chips Accelerate Generative AI
2023 was a significant year for AI, following the explosion of generative AI, represented by ChatGPT, at the end of 2022 and the beginning of 2023. The term “generative AI” (also known as AIGC) was everywhere in 2023, as if it marked the dawn of the so-called “strong AI” era. Nvidia’s addition of a dedicated Transformer Engine to its data center GPUs was in fact not a 2023 development, but this early positioning clearly laid the foundation for accelerating the underlying computing power of generative AI.
The reason generative AI is described as an “explosion” is that, from the chip perspective, shortly after the emergence of models such as GPT and Stable Diffusion, almost every company targeting data center AI chips—whether for training or inference—seemed to rewrite its script: nearly all of them now promote their chips as providing computing power for generative AI and tout partnerships with, or support for, various large models. The 2023 WAIC World Artificial Intelligence Conference was practically a showcase for generative AI.
It is not only data centers: edge and endpoint AI chip companies are just as eager to talk about generative AI. From Intel’s half-year promotion of AI PCs—with 2024 PC processors fully integrating dedicated AI acceleration units—to MediaTek’s announcement at the end of 2023 of mobile generative AI chips that can run local inference of generative AI models, many chip companies serving embedded applications have joined the conversation.
In fact, even setting generative AI aside, the AI wave it triggered has greatly reignited the fervor for edge AI: traditional MCU/MPU manufacturers such as TI, Renesas, and Infineon have once again emphasized the enormous value of edge AI. This trend, along with the popularity of generative AI in data centers, PCs, and mobile phones, will continue into 2024 and develop further, opening up more possibilities for the digital transformation of society as these applications gain traction.
Trend 2: Chiplet Technology Supporting Computing Power Expansion Becomes Mainstream
With the slowdown of Moore’s Law and the growing storage and computing demands of new applications such as AI, autonomous driving, and data centers, relying solely on the continued evolution of advanced process nodes is no longer sustainable. Chiplet and three-dimensional heterogeneous integration will provide new momentum to break through the bottlenecks of integrated circuit development. In 2023, driven by chip giants such as TSMC, Samsung, and Intel, as well as supply chain companies, the Chiplet industry chain gradually matured, forming a complete ecosystem spanning Chiplet system-level design, EDA/IP, chiplets (core, non-core, IO Die, Base Die), manufacturing, and packaging.
Global semiconductor leaders are actively launching products that incorporate Chiplet technology, such as Tesla’s Dojo deep learning and model training chip, AMD’s MI300 APU accelerator, and Nvidia’s Ampere A100 GPU, and Chinese computing-power chip makers are following suit. In 2024, as large AI models continue to develop, using Chiplet technology to customize efficient computing power expansion will become a mainstream trend, and it will also be applied to board-level multi-chip interconnects and even larger-scale multi-board, multi-cabinet interconnect solutions.
However, although Chiplet is becoming one of the key technologies for meeting current computing power needs, it still faces many design challenges, including interconnect, heat dissipation, yield, warpage, passive component integration, parasitic effects, cost, and reliability. Effective integration of multiple chiplets can only be achieved through packaging technology, covering the design, production, and verification of high-density advanced packaging, high-speed channel design and verification, power delivery, heat dissipation, stress management, and reliability. At the same time, the limitations of Chiplet adoption remain evident: the field is still dominated by the vertically integrated systems of large international manufacturers, the related design ecosystems are relatively closed, and interconnect standards still need to improve.
Trend 3: HBM, Price and Volume Rise Together
With the rapid global rise of artificial intelligence and machine learning (AI/ML), high bandwidth memory (HBM, HBM2, HBM2E, HBM3) began to emerge around 2020 as the representative ultra-high-bandwidth solution. Entering 2023, the frenzied expansion of the generative AI market, represented by ChatGPT, rapidly increased demand for AI servers and drove up sales of high-end products such as HBM3.
Research by Omdia shows that HBM market revenue is expected to grow at an annual rate of 52% from 2023 to 2027, with its share of DRAM market revenue rising from 10% in 2023 to nearly 20% in 2027. Moreover, HBM3 is priced at roughly 5-6 times standard DRAM, which is why HBM accounted for only 1.7% of total DRAM shipments in 2023 yet contributed 11% of revenue. It is also why chip giants such as Nvidia, AMD, Microsoft, and Amazon are scrambling to secure supply, even at premium prices.
HBM technology, launched in 2013, is a high-performance 3D-stacked DRAM architecture whose first generation reached data transfer rates of around 1 Gbps per pin. The standard has since been updated roughly every 2-3 years, with the second generation (HBM2), third generation (HBM2E), fourth generation (HBM3), and fifth generation (HBM3E) successively breaking bandwidth and maximum data rate records. Given that the bandwidth of other memory products has only increased two to three times over the same period, the rapid development of HBM can reasonably be attributed to fierce competition among memory manufacturers.
Currently, as a major technological innovation, HBM has bright development prospects, especially in AI training applications. Note that the comparison with GDDR DRAM is made per pin: GDDR often reaches 16/18 Gbps, while HBM, even at 9.2 Gbps, still falls short on a per-pin basis; HBM compensates with a far wider interface (1,024 bits per stack), which is what delivers its aggregate bandwidth advantage. The limitations on HBM’s development mainly stem from two aspects: the interposer, and the complexity and added manufacturing cost of 3D stacking. We believe, however, that as the global memory giants become deeply involved, these challenges will eventually be resolved and competition in the HBM market will intensify.
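As a back-of-envelope illustration of the per-pin versus aggregate-bandwidth distinction above, the short sketch below multiplies per-pin data rate by bus width, assuming the standard 1,024-bit interface of an HBM stack and a 32-bit interface per GDDR6 device; the figures are illustrative rather than vendor specifications.

```python
# Back-of-envelope bandwidth comparison: per-pin rate vs. aggregate bandwidth.
# Assumes a 1024-bit interface per HBM stack and a 32-bit interface per GDDR6
# device; figures are illustrative, not vendor specifications.

def bandwidth_gb_s(rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = per-pin data rate (Gbps) x bus width / 8."""
    return rate_gbps * bus_width_bits / 8

memories = {
    "HBM2  (2.4 Gbps, 1024-bit stack)": (2.4, 1024),
    "HBM3  (6.4 Gbps, 1024-bit stack)": (6.4, 1024),
    "HBM3E (9.2 Gbps, 1024-bit stack)": (9.2, 1024),
    "GDDR6 (18 Gbps, 32-bit device)": (18.0, 32),
}

for name, (rate, width) in memories.items():
    print(f"{name}: {bandwidth_gb_s(rate, width):7.1f} GB/s")

# Despite a lower per-pin rate, a single HBM3E stack delivers roughly 1.2 TB/s,
# versus about 72 GB/s for a single GDDR6 device.
```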
Trend 4: Satellite Communication Technology Makes Significant Progress, 6G Takes Shape
In last year’s predictions, we noted that mobile satellite communication would begin rolling out in earnest in 2023. The technology has since made significant strides thanks to Huawei’s breakthroughs in RF antenna technology: with the launch of the Mate 60 Pro series, the mobile phone industry has moved from point-to-point, one-way satellite short messaging into the era of satellite calling.
In the past, attention focused mainly on 5G and its chips while satellite communications were neglected. Now a string of companies developing satellite communication chips, including Huali Chuantong and Haige, are growing rapidly.
On the SoC side, Unisoc has launched its first 5G satellite communication chip, the V8821. Compliant with the 3GPP IoT NTN R17 standard, it supports L-band maritime satellites and S-band Tiantong satellites, can be extended to other NTN satellite systems, and provides data transmission, text messaging, calling, and location sharing. Beyond direct satellite connectivity in smartphones, it can also be used in IoT, wearable products, and vehicle networks.
MediaTek introduced the MT6825 IoT-NTN chipset at MWC 2023, which connects to geostationary orbit (GEO) satellites and can be readily adapted to 3GPP NTN standard satellite networks. In August 2023, MediaTek released a 6G NTN white paper titled “Integration of Satellite Networks and Ground Networks,” which aims to deliver seamless, intelligent communication services with comprehensive coverage across land, sea, and air by making satellite and ground networks compatible and complementary.
Thus, with continued breakthroughs in satellite communication for mobile phones and IoT, the outlines of 6G are taking shape, and 2024 will be a year in which satellite communication technology flourishes.
Trend 5: Gallium Oxide Approaches Commercialization
Wide bandgap semiconductors are developing rapidly, and gallium oxide, as a fourth-generation semiconductor, is gradually coming to the fore. Compared with other fourth-generation semiconductors such as diamond and aluminum nitride, gallium oxide can be scaled to larger wafer sizes, and market forecasts suggest that the gallium oxide device market will eventually surpass that of gallium nitride devices.
Gallium oxide has five confirmed crystalline phases, of which β-gallium oxide is the most stable, and most research and development currently focuses on it. β-gallium oxide offers high breakdown field strength and significantly lower on-resistance than gallium nitride and silicon carbide, effectively reducing device conduction losses.
Gallium oxide crystals can be grown from the melt at atmospheric pressure, which gives the material a manufacturing cost advantage. Its prospects are increasingly prominent, and the market is currently dominated by two Japanese players: Novel Crystal Technology (NCT) and Flosfia. The industry has already achieved mass production of 4-inch gallium oxide wafers and expects to move to 6 inches in the coming years, while the commercialization of β-gallium oxide Schottky diodes is also accelerating.
In the power electronics market, gallium oxide overlaps with the applications of gallium nitride and silicon carbide. The adoption of automotive-grade power devices is rising year by year, opening up larger application opportunities for gallium oxide, and in the short term there is also significant potential in consumer electronics, home appliances, and high-reliability, high-performance industrial power supplies. Although gallium oxide brings new possibilities, materials such as silicon carbide and gallium nitride retain their own advantages and application domains. With continued advances in research and an expanding range of application scenarios, gallium oxide is expected to play a more significant role in the semiconductor field.
Trend 6: Upstream and Downstream Players Actively Push 3D-IC Commercialization
For more than 50 years, Moore’s Law has driven the development of the entire semiconductor industry. Now, with advanced process upgrades slowing and R&D costs soaring, transistor counts can no longer double every 18-24 months as they once did, and performance improvements face bottlenecks. Advanced packaging, however, continues to evolve, opening another door for innovation in the semiconductor industry: from MCM in the 1970s to SiP, and now to 2.5D/3D-IC heterogeneous integration.
The package- and board-level 3D integration that appeared a few years ago is no longer new; today the discussion centers on wafer-on-wafer (WoW) stacked 3D-ICs, which allow far greater bandwidth for inter-die communication. However, 3D-IC has not yet reached large-scale commercialization, mainly because of two major challenges: heat dissipation and stress between stacked dies.
The complexity, compactness, and higher density of 3D-ICs make heat dissipation more challenging than in traditional 2D chips, and the stresses that arise between stacked dies built on different processes also vary. 3D-IC requires not only dedicated die-to-die (D2D) interface IP and through-silicon via (TSV) technology for high-speed, efficient inter-die communication, but also suitable EDA tools for pre-bond thermal and stress analysis to help chip design engineers optimize the complete system integration.
Throughout 2023 the industry actively accelerated 3D-IC progress and innovative system-level deployments. For example, TSMC launched the new 3Dblox 2.0 open standard, and its 3DFabric platform lets customers freely combine 3D-IC front-end and back-end assembly and test technologies, including SoIC (System on Integrated Chips), integrated fan-out (InFO), and CoWoS; UMC, Winbond, and other semiconductor companies established a wafer-on-wafer (W2W) 3D IC project to accelerate the production of 3D packaged products, with system-level verification expected to be completed in 2024; and Synopsys partnered with Powerchip Semiconductor to launch new W2W and chip-on-wafer (CoW) solutions that enable developers to stack and bond DRAM directly onto chips.
Trend 7: Micro OLED on the Verge of Scalable Application
Although Micro LED is regarded as the most ideal display technology, it still faces challenges such as mass transfer, full-color realization, and testing and repair, making large-scale application impossible in the short term. Micro OLED, by contrast, represents a deep fusion of display and semiconductor technology, combining CMOS processes with OLED technology and inorganic with organic semiconductor materials. Despite its own technical challenges, particularly the complexity and strict integration requirements of combining CMOS and OLED processes, Micro OLED’s chances of reaching scale are greater, and will come sooner, than Micro LED’s.
Compared with Fast-LCD, the current mainstream VR/AR display technology, Micro OLED offers low power consumption, a wide operating temperature range, high contrast, and fast response, addressing nearly all of Fast-LCD’s shortcomings and making it the micro-display technology best suited to near-eye displays. Apple’s unveiling of the Vision Pro headset with Micro OLED displays at the 2023 Worldwide Developers Conference is bound to drive the technology’s commercialization.
However, Micro OLED faces two inherent technical barriers stemming from its organic light-emitting materials: brightness and lifespan. Like other OLED technologies, it suffers from burn-in and a shorter lifespan, but since Micro OLED is used mainly in consumer electronics, this drawback is comparable to that of OLED displays in smartphones and is not a major obstacle. Brightness, on the other hand, does not yet fully meet the requirements of immersive VR/AR devices.
In 2023, Micro OLED manufacturers worldwide actively expanded 8-inch and 12-inch Micro OLED production lines, some of which have already reached mass production, and more Micro OLED screens are expected to be available for virtual reality terminals in 2024. Meanwhile, since Micro LED cannot reach large-scale mass production in the short term, Micro OLED has the opportunity to become the mainstream micro-display technology for some time.
Trend 8: Programmable Optical Computing Chips Meet the Surge in Computing Power Demand
The wave of generative AI has driven a surge in demand for computing power, but as Moore’s Law approaches its limits, traditional electronic technology can hardly meet the needs of a new round of technological revolution, prompting the industry to look to light, rather than electricity, to further boost computing power.
Optical chips have existed for a long time, but the vast majority are non-programmable optical linear computing units. To scale computing power with light, the computing units must be programmable. The breakthrough in programmable optical computing chips came only in 2017, when researchers including Yichen Shen published a paper in Nature Photonics proposing a new computing architecture based on optical neural networks.
Programmable optical computing chips offer high integration, high speed and low latency, low energy consumption, a natural fit for AI matrix computations, potential for cost reduction, and excellent waveguide transmission performance. Challenges remain, however: complex computations require large numbers of optical devices, leading to more complex structures and larger die sizes; programmability requires control of every device, raising integration requirements and bringing cost, stability, and yield challenges; and ambient temperature affects computational accuracy, creating temperature control challenges.
The main route to commercializing silicon photonic computing chips is to emphasize general applicability: for example, first replacing the linear computing cores in GPUs with optical computing cores, forming a new photonic-electronic hybrid computing paradigm that minimizes customers’ learning costs and barriers to adoption.
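As a minimal conceptual sketch of this hybrid paradigm, the snippet below keeps a neural-network layer’s nonlinearity in the electronic domain while delegating the linear matrix multiply to a placeholder standing in for an optical core; optical_matmul is a hypothetical illustration, not a real device API.

```python
import numpy as np

# Conceptual sketch of a photonic-electronic hybrid layer: the linear
# (matrix-multiply) portion is offloaded to an "optical" core, while the
# nonlinearity stays electronic. optical_matmul is a hypothetical stand-in.

def optical_matmul(weights: np.ndarray, x: np.ndarray) -> np.ndarray:
    # In a real system this multiply would be performed by a programmable
    # mesh of interferometers; here we simply emulate the result numerically.
    return weights @ x

def hybrid_layer(weights: np.ndarray, x: np.ndarray) -> np.ndarray:
    y = optical_matmul(weights, x)   # linear part: optical domain
    return np.maximum(y, 0.0)        # nonlinearity (ReLU): electronic domain

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)
print(hybrid_layer(W, x))
```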
A second route is to modularize optical chip solutions to meet computing application requirements while pursuing plug-and-play optical module transmission between chips. This also involves on-chip and inter-chip optical network technologies, using light’s low latency and low energy consumption to replace electrical interconnects between modules; wafer-level optical interconnect networks can then achieve higher utilization when mapping computing tasks across different chips.
Trend 9: New Memory Technologies Transitioning from Theory to Practice
The development of the Internet of Things and artificial intelligence has led to an explosion of information, all of which must be collected, processed, transmitted, stored, and analyzed across multiple levels from the edge to the cloud. At the same time, Moore’s Law scaling is slowing rapidly and can no longer deliver simultaneous improvements in power, performance, and area/cost (PPAC).
In this context, companies of all sizes are racing to develop new hardware platforms, architectures, and designs to improve computing efficiency, and new memory technologies represented by MRAM (magnetoresistive RAM), PCRAM (phase-change RAM), and ReRAM (resistive RAM) are key areas of research for chip and system designers. These new memories provide additional tools for near-memory computing and are also building blocks for the next phase, in-memory computing.
Research indicates that whether deployed as standalone chips or embedded in ASICs, microcontrollers (MCUs), and processors, these new memories could become more competitive than today’s mainstream memory technologies. For example, replacing eFlash and SRAM in microcontrollers with embedded MRAM could cut power consumption by up to 90%; using a single-transistor MRAM cell instead of a six-transistor SRAM cell yields higher bit density and smaller die sizes, making MRAM a strong contender for edge devices. Compared with traditional NAND flash, PCRAM- or ReRAM-based storage-class memory can offer more than 10 times the access speed, making it better suited to cloud data storage.
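The bit-density argument above can be illustrated with a deliberately naive estimate based only on transistor counts per cell; the density figure used below is an assumption for illustration, and real cell areas depend on design rules, MTJ pitch, and peripheral circuitry.

```python
# Toy estimate of why a 1T MRAM bit cell can out-density a 6T SRAM bit cell.
# Real cell areas depend on design rules, MTJ pitch, and peripheral circuitry,
# so treat these numbers as purely illustrative assumptions.

SRAM_TRANSISTORS_PER_BIT = 6   # classic 6T SRAM cell
MRAM_TRANSISTORS_PER_BIT = 1   # 1T-1MTJ STT-MRAM cell

cell_area_ratio = SRAM_TRANSISTORS_PER_BIT / MRAM_TRANSISTORS_PER_BIT
print(f"Naive cell-area advantage of MRAM over SRAM: ~{cell_area_ratio:.0f}x")

# For a fixed macro area, the same naive model implies roughly 6x more bits,
# i.e. a smaller die for the same capacity, which is the density argument above.
sram_bits_per_mm2 = 2.0e6  # assumed value, for illustration only
mram_bits_per_mm2 = sram_bits_per_mm2 * cell_area_ratio
print(f"Illustrative density: SRAM {sram_bits_per_mm2:.1e} vs MRAM {mram_bits_per_mm2:.1e} bits/mm^2")
```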
However, these emerging memories also share some key issues. At the cell level, thermal stability conflicts with write current and endurance characteristics, which must be overcome through material selection, integration processes, and holistic circuit optimization; at the array level, crossbar structures suffer crosstalk caused by leakage (sneak) currents. Based on current research progress, frontier techniques such as phase-change material heterostructures and spin-orbit torque (SOT) are expected to address these challenges effectively.
Trend 10: Silicon-Based Quantum Computing Advances Toward Usability and Commercialization
Many companies and institutions researching quantum computers have focused on superconducting qubits. In recent years, however, a growing number of research institutions have turned to silicon-based quantum computing. Silicon, after all, is a readily available material, which gives the approach a natural advantage; moreover, in silicon-based quantum computing a qubit can be a single electron, so the devices can be made extremely small.
Silicon-based quantum computing lends itself better to large-scale mass production, even if its gate operation times (analogous to gate delay) are slower than those of superconducting qubits. The past two years have been fruitful: in 2022 the field achieved major technological breakthroughs, including quantum operations with extremely low error rates, giving the technology potential for large-scale practical use. Research has also demonstrated longer coherence times for spin qubits, and the platforms developed are compatible with CMOS manufacturing.
Several notable events also occurred in 2023. In June, IBM announced that quantum computers had entered the “utility” phase; in September, Australia’s Chief Scientist Cathy Foley remarked that she saw “the dawn of the quantum era”; and physicist Michelle Simmons received Australia’s highest scientific award for her development of silicon-based quantum computers.
On the commercialization front, Intel has pursued quantum computing research for years, building naturally on its accumulated expertise in silicon transistor design and manufacturing. Companies including Quantum Motion and Silicon Quantum Computing are also developing silicon quantum computers. Silicon-based quantum computing may see further commercial progress in 2024.
