Disclaimer: This text is personally translated from “Chip War: The Fight for the World’s Most Critical Technology” and does not represent my personal views, nor is it for any commercial use.
"The competition is fierce"
"Since you wrote that paper, my life has been terrible!" a chip salesperson complained to Chad Anderson, the HP senior manager responsible for deciding which chips met HP's strict standards. For the entire U.S. semiconductor industry, the 1980s were a hellish decade. Silicon Valley thought of itself as the leader of the global technology industry, but after two decades of rapid growth, it faced a crisis of survival: brutal competition from Japan.

On March 25, 1980, when Anderson took the stage at an industry conference held at the historic Mayflower Hotel in Washington, D.C., the audience listened intently, because everyone in the room was trying to sell him chips. Anderson's employer, Hewlett-Packard, founded in the 1930s when Stanford graduates Dave Packard and Bill Hewlett began fixing electronics in a Palo Alto garage, had invented the very concept of the Silicon Valley startup. It was now one of the largest technology companies in the U.S. and one of the biggest buyers of semiconductors.
Anderson's judgment on chips could determine the fate of any semiconductor company, yet Silicon Valley salespeople were never allowed to take him out for drinks or meals. "Sometimes I let them take me out for lunch," he admitted sheepishly. The entire valley knew he was the gatekeeper to almost everyone's most important customer. His job gave him a comprehensive view of the semiconductor industry, including the performance of every company's chips.
In addition to American companies like Intel and Texas Instruments, Japanese companies like Toshiba and NEC were also producing DRAM memory chips, though most people in Silicon Valley did not take them seriously. American chip manufacturers were run by the very people who had invented high technology. They joked that Japan was a "click, click" country, referring to the sound of the cameras Japanese engineers brought to chip conferences so they could better copy American ideas. Even the fact that major American chip manufacturers were embroiled in intellectual-property lawsuits with their Japanese competitors was read as evidence that Silicon Valley remained far ahead.
At HP, however, Anderson could not afford to dismiss Toshiba and NEC: he had tested their chips and found their quality far superior to that of their American rivals. He reported that none of the three Japanese companies had a failure rate above 0.02% during the first 1,000 hours of use. The lowest failure rate among the three American companies was 0.09%, meaning that even the best American-made chips failed four and a half times as often. The worst American company's chips had a failure rate of 0.26%, more than ten times Japan's. American DRAM chips did the same job at the same price but failed far more often. Why would anyone buy them?
Chips were not the only American industry under pressure from high-quality, ultra-efficient Japanese competitors. In the years just after the war, "Made in Japan" had been synonymous with "cheap." But entrepreneurs like Sony's Akio Morita shed that low-price reputation, replacing it with products equal in quality to anything their American competitors made. Morita's transistor radios were the first prominent challenge to America's economic advantage, and their success inspired Morita and his Japanese peers to aim higher. From automobiles to steel, American industries faced fierce competition from Japan.
By the 1980s, consumer electronics had become Japan's signature industry, with Sony leading the way in launching new consumer goods and capturing market share from American competitors. Japanese companies had initially succeeded by replicating American products and producing them with higher quality at lower prices. Some Japanese played up the notion that America excelled at "innovation" while Japan excelled at "implementation." "We don't have a Dr. Noyce or a Dr. Shockley," one Japanese journalist wrote, even though the country had begun to accumulate its own share of Nobel laureates. Prominent Japanese figures continued to downplay their country's scientific achievements, especially when speaking to American audiences. Sony's research director, the renowned physicist Makoto Kikuchi, told an American reporter that Japan had fewer geniuses than the U.S., a country with "outstanding elites." But Kikuchi believed the U.S. also had "a long tail" of people with "below-average intelligence," which, he argued, explained why Japan was better at mass production.
American chip manufacturers insisted Kikuchi was right about America's innovation advantage, even as contradictory data piled up. The best evidence against the claim that the Japanese were "implementers" rather than "innovators" was Kikuchi's own boss, Sony CEO Akio Morita. Morita knew that copying was a recipe for second-tier status and second-tier profits. He drove his engineers not merely to build the best radios and televisions but to imagine entirely new products.
In 1979, just months before Anderson's speech on American chip quality, Sony launched the Walkman, a portable music player that revolutionized the music industry, packing five of the company's cutting-edge integrated circuits into each device. Now teenagers around the world could carry their favorite music in their pockets, powered by integrated circuits pioneered in Silicon Valley but developed in Japan. Sony would sell 385 million units, making the Walkman one of the most popular consumer devices in history. This was pure innovation, and it was made in Japan.

America had supported Japan's postwar transformation into a nation of transistor sellers. The American occupation authorities passed knowledge of the transistor's invention to Japanese physicists, while policymakers in Washington ensured that Japanese companies like Sony could sell easily into the American market. The strategy of turning Japan into a democratic, capitalist nation had worked. Now some Americans were asking whether it had worked too well: the policy of empowering Japanese companies seemed to be undermining America's economic and technological advantages.
Charlie Sporck, an executive who had been burned in effigy by workers while managing a General Electric production line, found Japan's productivity both fascinating and frightening. Sporck had cut his teeth in the chip industry at Fairchild before leaving to run National Semiconductor, by then a large memory chip manufacturer. The ultra-efficient Japanese competition seemed certain to bankrupt him. Sporck had earned a hard-won reputation for squeezing efficiency out of assembly-line workers, but Japan's productivity levels far exceeded anything his workers could achieve.
Sporck sent one of his foremen and a group of assembly-line workers to Japan for several months to visit semiconductor facilities. When they returned to California, Sporck made a film about their experiences. They reported that Japanese workers were "incredibly company-oriented" and that "the foreman placed the company above family."
Japanese bosses never had to worry about being burned in effigy. It was, Sporck declared, a "beautiful story."
"All of our employees saw the intensity of the competition."