From 2006 to 2013, AMD and Nvidia badly mismanaged their competition in mobile platforms, losing their standing as the world's dominant GPU suppliers while Apple gradually rose to become the most powerful mainstream producer of GPUs. This article reviews that history to examine how Apple can continue to replicate this success, in the hope of offering some insight for the tech industry.
AMD and Nvidia Repeat Intel’s Mistakes, Losing the Mobile GPU Market
Apple developed mobile CPUs without Intel's assistance, and likewise developed mobile GPUs without help from AMD or Nvidia. In both efforts, Apple's design teams built on existing processor designs: ARM's CPUs and Imagination Technologies' PowerVR GPUs.
Twenty to thirty years earlier, both PowerVR and ARM had focused on PCs. After struggling in the PC market for nearly a decade, however, both companies decided to seek a breakthrough in mobile platforms. Competition there was not yet fierce, allowing both to make significant progress, and both were adopted by Apple for the first-generation iPhone in 2007.
With substantial financial support from Apple (made possible by its profitability and the economic environment), ARM and PowerVR advanced rapidly, surpassing the competitors that had once suppressed them on the PC platform and becoming the technology leaders of the mobile platform.
Intel failed to foresee the potential of the iPhone and neglected the mobile market, allowing ARM to leap from a designer of basic mobile processors to the dominant supplier of smartphone and tablet processor designs; the history of the mobile GPU market followed a very similar arc.
Reviewing the evolution of GPUs helps forecast the future of this market and explains why Apple has left its competitors far behind over the past decade, especially in mobile devices and graphics processors.
The Birth of GPUs and the Monopoly of ATI and Nvidia
The first dedicated graphics hardware for consumers appeared in the 1980s in arcade machines and home gaming consoles. Commodore's Amiga, released a year after Apple's 1984 Macintosh, was essentially a gaming machine sold as a home computer, built around advanced graphics and video technology. The Macintosh, meanwhile, pioneered the professional graphics market of desktop publishing and CAD, which beyond gaming became the main reason to purchase expensive video hardware.
Initially, Apple and other PC manufacturers developed their own graphics hardware. Later, two companies founded in 1985—ATI and VideoLogic—began selling dedicated video hardware to other PC manufacturers to enhance graphics performance and effects.
By 1991, ATI had started selling graphics cards that accelerated graphics independently of the CPU. ATI’s video hardware significantly improved the effects of the 1993 game ‘Doom’, placing it in a leading position in a rapidly growing but increasingly competitive new market. In 1993, CPU designers from AMD and Sun founded Nvidia. The following year, some former SGI employees established 3dfx; both companies targeted the rapidly evolving hardware-accelerated graphics technology.
Microsoft made gaming (and broad graphics card support) a key strategy for promoting Windows 95. Initially its abstracted graphics APIs were at a disadvantage against cards built around 3dfx's proprietary Glide API, but as the hardware-abstraction approach matured it contributed to the downfall of many graphics card manufacturers, much as the rise of Windows itself had driven many operating systems out of the competition (excluding Apple, of course).
3dfx and VideoLogic’s PowerVR technology became the losers in this competition. VideoLogic licensed its technology to NEC, which used it to develop the 1998 Dreamcast game console for Sega. This strategy succeeded, leading VideoLogic to exit the PC graphics card market and focus on providing technology to other companies. During this period, the company rebranded as Imagination Technologies.
3dfx, once an industry leader, fell into decline. After failing to win the Dreamcast contract, 3dfx was eventually acquired by Nvidia. Thus, by the late 1990s, only two companies remained in the PC graphics card market: ATI and Nvidia. In 1999, Nvidia began promoting its GeForce product as 'the world's first GPU' (graphics processing unit), while ATI called its products 'visual processing units' (VPUs). Nvidia's terminology became the mainstream term.

Apple Enters the GPU Market
In the mid-1990s, Apple was struggling in the personal computer market. During this period it began equipping its high-end Macs with third-party graphics cards, discontinuing its own in-house graphics hardware. After Steve Jobs returned in 1997, Apple leaned further on rapidly advancing graphics technology, integrating ATI's Rage II 3D graphics accelerator into the Power Mac G3 and the following year's iMac.
When Jobs introduced the new architecture of Mac OS X, he emphasized two new strategies for Apple regarding graphics and GPUs. First, Apple would no longer develop its own 3D graphics API but would instead support the industry-standard OpenGL. Because OpenGL ran on both ATI and Nvidia hardware, this gave Apple two GPU supplier options for its Mac product line.
Second, the entire OS X user interface would be rendered through the Quartz engine. By compositing the desktop with the same 3D hardware acceleration originally developed for gaming, Apple could deliver smooth animations, real-time video transitions, alpha transparency, shadows, and full hardware acceleration of the display. This was a significant advance over the 2D pixel-raster model used by the previous generation of Mac OS and by Windows.
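To make the distinction concrete, here is a minimal, illustrative sketch of the kind of composited effect described above, written against today's Core Graphics (Quartz 2D) bindings in Swift rather than the circa-2000 APIs; the specific sizes and colors are arbitrary assumptions for illustration only.

```swift
import CoreGraphics

// Illustrative sketch only: Quartz-style drawing with alpha transparency and
// a soft shadow, the kind of effect the classic 2D raster model of Mac OS 9
// and pre-Vista Windows could not compose cheaply.
let colorSpace = CGColorSpaceCreateDeviceRGB()
guard let ctx = CGContext(data: nil, width: 200, height: 200,
                          bitsPerComponent: 8, bytesPerRow: 0,
                          space: colorSpace,
                          bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
    fatalError("Could not create a Quartz bitmap context")
}

// A soft drop shadow plus a 50%-alpha fill, drawn off-screen and later handed
// to the window server to composite with whatever lies beneath it.
ctx.setShadow(offset: CGSize(width: 4, height: -4), blur: 8)
let components: [CGFloat] = [0.2, 0.4, 0.9, 0.5]   // translucent blue
ctx.setFillColor(CGColor(colorSpace: colorSpace, components: components)!)
ctx.fillEllipse(in: CGRect(x: 40, y: 40, width: 120, height: 120))

let image = ctx.makeImage()   // a CGImage ready for compositing
```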
Previously, GPUs had been used only for video games and other graphics-heavy software, whereas Macs used the GPU to make the computer feel more 'magical', responsive, and visually appealing. Although Microsoft was far ahead in gaming, it did not ship a comparable GPU-driven graphics engine until Windows Vista at the end of 2006, which let Apple stand out in the computer market for roughly five years.
In gaming itself, however, Apple lagged far behind Microsoft. The main gaming platforms were low-end handhelds, home consoles, and high-end PCs, and the last of these was dominated by Microsoft's technology, especially its DirectX graphics APIs. The Mac's relatively small user base could not attract game developers to the platform.
As a result, high-end GPUs appeared only on Windows, and a vicious cycle took hold on the Mac: the lack of high-end GPU options kept games scarce, and the scarcity of games kept high-end GPUs away. This in turn kept adoption low among users who depended on graphics performance, such as high-end CAD users.
Solving Problems with Profit: 2000-2006
In the first decade of the 21st century, Apple, under Jobs’ leadership, addressed these issues with three strategies. The first was the iPod. The iPod significantly increased Apple’s revenue and shipments, allowing Apple to open numerous retail stores to further sell iPods and increase product exposure. The second strategy was the shift to Intel platforms in 2005, enabling Macs to run Windows systems and allowing developers to easily port popular PC games to Macs. The third was the iPhone, which ran a mobile version of OS X. The iPhone’s shipment volume far exceeded that of the iPod, bringing Apple hundreds of billions in cash flow.
The iPod itself was essentially a hard drive paired with ARM-based chips designed by PortalPlayer, a fabless chip designer; initially, Apple bought 90% of PortalPlayer's chips. When PortalPlayer failed to deliver in 2006, however, Apple dropped the partnership and turned to Samsung to supply the ARM chips for the iPod classic. Around the same time, IBM's failure to deliver the processors Apple required had prompted the switch from PowerPC to Intel.
The following year, Apple gained an even bigger reason to work with Samsung: the iPhone. Only a handful of manufacturers worldwide could supply application processors in the quality and quantity Apple needed, and Samsung was one of them.
Samsung, Texas Instruments, and Intel (with its XScale line) were all capable of producing processors that paired ARM cores with Imagination's PowerVR mobile graphics. Samsung had already proven itself as a memory and storage supplier to Apple, while Intel chose not to work with Apple.
The GPU Became Key for the iPhone
After Apple abandoned PortalPlayer, Nvidia acquired the company, hoping to regain Apple’s contract. Nvidia intended to enhance the media player’s performance with its upcoming Tegra chip. However, Apple’s ambitions were much greater.
Apple’s iPhone was not just about using a CPU to run a mobile-optimized OS X system; it also aimed to feature a rich, animated, GPU-accelerated UI. This UI concept was similar to the previous Quartz engine on OS X.
By leveraging GPU performance, OS X had been five years ahead of Windows from the start, and the new mobile operating system, iOS, was likewise far ahead of other smartphones. At the time, Nokia's Symbian, Palm OS, and BlackBerry were essentially enhanced PDA and pager systems, while Sun's JavaME and Google's Android were simplified Java virtual machines and Qualcomm's BREW a similarly limited application runtime.
At that time, only Microsoft had considered shrinking a desktop system down for mobile, but it ported exactly the parts of the desktop UI least suited to phones (parts originally borrowed from Apple): windows, mouse pointers, and desktop icons. Apple took the opposite approach; the desktop UI was the one thing it did not carry over from the Mac to the iPhone.
Emphasizing GPUs, Keeping iOS Ahead of Android
Apple's 'desktop-class' mobile operating system and development tools required a high-performance mobile processor, and this became a crucial reason for the iPhone's significant lead over competitors. After the iPhone's release, none of the incumbent mobile operating systems survived in the long run.
iOS was intuitive, visually appealing, and graphically rich (thanks to GPU acceleration), another factor in the iPhone's strong sales. Google completely failed to recognize this key to the iPhone's success.
After the iPhone's release, Google attempted to overhaul Android to mimic it, but neglected to improve graphics rendering (just as Microsoft had with Windows). GPU-accelerated rendering did not arrive until Android 3.0 in 2011, and 3.0 was a tablet-only release shipped in response to the iPad. It did not reach smartphones until Android 4.0 at the end of 2011, which spread through handsets in 2012, roughly five years after the iPhone's launch.
Apple’s emphasis on GPUs also made the iPhone highly suitable for gaming. From the beginning, games were the stars of Apple’s App Store, changing the public’s perception of Apple as ‘having no games’. The iPhone, iPod touch, and the 2010 iPad all disrupted the gaming market as independent mobile gaming devices.
AMD and Nvidia Lost Direction in Mobile GPUs
Imagination aimed its PowerVR technology at the emerging mobile device market, and in 2002 ATI launched its own Imageon mobile graphics line. In 2006, ATI was acquired by AMD, whose strategy centered on integrating ATI's desktop graphics technology with AMD's CPU technology to fight a resurgent Intel in the x86 market.
In 2008, AMD decided to abandon Imageon, selling it to Qualcomm, which rebranded it as Adreno and integrated it into the new Snapdragon application processors. This decision was undoubtedly influenced by the success of the iPhone in integrating CPU and GPU.
Meanwhile, Nvidia was fully focused on competing with ATI in desktop graphics and largely neglected the mobile platform. The company later recognized the problem and began developing Tegra, a chip originally aimed at iPod-like media players. In 2006, rumors circulated that Nvidia had secured an order from Apple, but Tegra did not reach mass production until 2009, two years after the iPod touch was released; that year it instead appeared in Microsoft's Zune HD.

Apple Begins Designing Custom Mobile Chips
By then, Apple had already invested significant manpower and resources in an internal chip design team. In 2008, Apple acquired PA Semi and quietly obtained an ARM architecture license, allowing it not only to design the logic surrounding the CPU and GPU but also to customize the cores themselves.
Samsung continued to produce components for some iOS devices as Apple’s chip supplier, but Samsung was also manufacturing ARM chips for other companies. These ARM chips typically came with basic graphics processors rather than the more expensive, higher-performance PowerVR GPUs ordered by Apple.
That basic graphics processor was ARM's Mali, which began as a GPU project at Falanx, a spin-off from a Norwegian university; after running out of funds, Falanx was sold to ARM in 2006.
Samsung Underestimates Its Own Chips
Despite having supplied ARM chips for Apple's iPods since 2006, Samsung chose Nvidia's Tegra in 2009 for the M1, its own iPod-like media player.
That same year, Samsung released the Omnia HD flagship phone with the Symbian system, responding to the iPhone 3G that had been released six months earlier. Samsung did not use its own chips but opted for Texas Instruments’ OMAP 3, which featured the new Cortex-A8 CPU and PowerVR SGX530 graphics processor. The new Palm Pre and Google’s Android 2.0 flagship Motorola Droid also used OMAP 3.
At the time, many believed the iPhone 3G would be beaten by phones built on Texas Instruments' OMAP 3, since the Samsung chip Apple was using still featured the older ARM11 CPU and PowerVR MBX Lite graphics. That summer, however, Apple released the iPhone 3GS, which moved to a Cortex-A8 CPU and the more advanced SGX535 GPU, preserving the iPhone's hardware lead in the smartphone market and powering a GPU-optimized UI that outclassed its Android competitors.
Apple’s A4 vs. Samsung’s 5 CPUs and GPUs
In mid-2010, Apple released the A4 chip, manufactured in collaboration with Samsung. Apple used this chip in the first-generation iPad and iPhone 4, while Samsung placed it in the first Galaxy S, Nexus S, and Galaxy Tab.
Samsung’s tablet, which mimicked the iPad, was released 11 months later, but Samsung had already launched two smartphones before the iPhone 4, giving Samsung a home advantage as Apple’s chip supplier.
Samsung’s designs were largely based on Apple products, with the Galaxy S closely resembling the iPhone 3GS, which ultimately led Apple to take legal action to stop Samsung from copying its designs.
The following year, Apple released a CDMA version of the iPhone 4 for Verizon. This iPhone 4 used the same A4 processor but replaced the original Infineon baseband processor with Qualcomm’s, which did not affect developers and users.
In contrast, after the Galaxy S, Samsung released five variants, each using a different CPU and GPU: two generations of Qualcomm Snapdragon with different Adreno graphics processors, an ST-Ericsson CPU paired with an ARM Mali GPU, and a Broadcom CPU with a VideoCore GPU. Across five variants of the same brand there were four different CPU architectures and GPU designs, each with completely different GPU performance.
Apple A5 vs. More Samsung CPUs and GPUs
In March 2011, Apple released the iPad 2, featuring the dual-core A5 processor and a dual-core SGX543 graphics processor. At the launch event, Jobs emphasized that Apple was the first company to mass-produce a dual-core tablet, and that the iPad 2's new GPU delivered up to nine times the graphics performance of the A4 at the same power consumption.
At that time, Google had just released Android 3.0 Honeycomb for tablets, and Motorola’s Xoom tablet, equipped with Nvidia’s new Tegra 2 chip, used this system.
Samsung also released its Android 3.0 tablet, the Galaxy Tab 10.1. It was puzzling that this tablet also used Nvidia’s Tegra 2 chip instead of Samsung’s own chips.
Samsung did not answer Apple's new A5 with a comparable part; instead it developed the Exynos 4 with a low-end Mali graphics processor and used it in one version of the Galaxy S II. Three other Galaxy S II variants used different silicon: one paired OMAP 4 with a PowerVR graphics processor, another used a Broadcom chip with a VideoCore GPU, and a third used a Snapdragon S3 with an Adreno graphics processor.
GPU Fragmentation Affects Android Hardware Manufacturers
While Apple optimized iOS for a single GPU architecture, Google’s Android had to support at least five different GPUs just in Samsung’s flagship devices, in addition to other Samsung models and devices from smaller manufacturers.
Because most of Apple’s tablets and phones are based on the same graphics architecture, Apple has the ability to invest more in next-generation graphics technology rather than exhausting resources supporting devices with different architectures that have low sales.
Additionally, due to pricing reasons, Android products that use the best GPUs often have the lowest sales. This forces Google and its partners to optimize only for the ‘lowest common denominator’. High-end products, due to their low sales, are not worth further development.
The series of failures of OMAP-based products was one of the reasons Texas Instruments decided to exit the consumer market and stop development in 2012. Nvidia also exited the smartphone market afterward.
Apple's device sales funded its investment in new high-end GPU technology, while most Android devices were low-end models equipped with Mali graphics processors. Despite their high shipment volumes, those devices generated demand only for the next round of cheap Mali chips for future low-end products.
Apple’s A5X and A6: Fast, but Not Cheap
This trend became even more apparent the following year, when Apple released the 'new iPad' with a 2048×1536 Retina display, powered by the A5X processor and a quad-core PowerVR SGX543MP4 GPU. That fall, Apple followed with the iPad 4, featuring the more powerful A6X processor and a new PowerVR SGX554MP4 GPU, while the new iPhone 5 used the A6 processor, which doubled graphics performance over the A5.
Apple pushed upmarket while Google and its Android partners raced to lower prices in 2012 and focused on the low end. Google's Nexus 7, built around the Tegra 3 chip, made many sacrifices to hit its $199 price, which led to flaws in the software experience that were only addressed a year after its release.
Yet the tech media praised the Nexus 7's low price (including the 2013 model), further entrenching the image of Android tablets as cheap and low quality in consumers' minds. Apple's products, by contrast, were perceived as higher priced but dependable, with good software support and smooth graphics, because Apple would not cut costs with low-end components. The contrast was stark.
Apple's A7 and A8: 64-bit, with PowerVR Series6 'Rogue' GPUs
In 2013, Apple naturally followed with the iPad Air and its A7 processor, delivering 64-bit performance in a thinner device without sacrificing battery life. The A7's GPU was strong enough that Apple no longer needed a dedicated chip variant for the iPad, which let it concentrate on the 64-bit transition and widen the performance lead of its products.
The A7's release pushed Google to change its low-cost strategy. In 2014, Google's Nexus 9 had only one realistic option (Texas Instruments' OMAP had left the market): Nvidia's Tegra K1. Choosing that processor meant roughly doubling the price to $399, even as bargain Android tablets sold for as little as $50 (with quality to match).
At the same time, Apple introduced a new graphics API, Metal, which many leading developers quickly used to build new applications. Metal lets games and other graphics-intensive applications bypass OpenGL's overhead and work more directly with the GPU in Apple's powerful 64-bit processors.
Apple already held a significant lead in graphics performance, most visibly in mobile gaming, and Metal widened that lead further; the sketch below shows the minimal setup a Metal application performs.
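As a rough illustration (not Apple sample code, and assuming a device and OS that support Metal), this Swift sketch shows the basic handshake before any rendering work: acquire the GPU, create a command queue, and submit a command buffer, with far less driver overhead than OpenGL's global state machine.

```swift
import Metal

// Illustrative sketch only: the minimal setup every Metal app performs before
// encoding actual draw or compute work.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Metal is not supported on this device")
}

// Command queues submit ordered batches of GPU work; a real app would also
// build a pipeline state and encode render or compute commands here.
guard let queue = device.makeCommandQueue(),
      let commandBuffer = queue.makeCommandBuffer() else {
    fatalError("Unable to create a Metal command queue or command buffer")
}

commandBuffer.commit()               // hand the (empty) work to the GPU
commandBuffer.waitUntilCompleted()   // block until the GPU has executed it
print("Rendering on GPU: \(device.name)")
```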
Last year, Blizzard released the iOS version of 'Hearthstone: Heroes of Warcraft'. The game's production director, Jason Chayes, told GameSpot why porting the game to Android was so challenging:
“There are too many screen resolutions, and the performance varies greatly between devices—ensuring the quality of our game’s interface requires considering all of these factors.”
While Apple’s GPU advantage is most evident in gaming, Apple also showcased a series of graphics editing and video editing applications to demonstrate that Metal could not only enhance performance but also enable mobile devices to accomplish many tasks that were previously impossible.

What Lies Ahead for AMD and Nvidia?
Nvidia's latest tablet chip can 'almost' match the performance of Apple's A8X, yet Nvidia currently has virtually no presence in mobile products, which is hardly the position it wanted to be in.
Meanwhile, AMD not only lacks mobile GPUs but holds almost no mobile market share at all. Its position is arguably even worse than Intel's; Intel at least can afford to subsidize hardware makers to use its mobile chips (losing roughly $4 billion a year in the process).
The two most influential GPU makers in the world, AMD and Nvidia, along with microprocessor pioneers Intel and Texas Instruments, have all been largely driven out of the mobile market by Apple, leaving only low-end component makers to compete against Apple's high-end, high-volume iOS devices. That is undoubtedly thought-provoking.
It also means that whatever Apple does next deserves close attention, certainly more than the bluster and posturing of Samsung, Microsoft, Google, and Nvidia. And Qualcomm is not the only company that should be worried; time will prove it.
