Congress and the Biden administration are embracing industrial policy to ensure competitiveness in science and technology. They hope that by ranking industries and technologies based on strategic importance, they can position the United States to extend the long history of the government driving development well into the 21st century.
Historically, this strategy of government direction and funding kept the United States at the vanguard of global technological innovation, launching revolutions in information, health and weapons after 1945.
The roots of how the government can drive the development of science and technology actually stretch back to the 19th century. House Speaker Henry Clay’s “American System” of 1824 proposed transforming a disjointed United States into a networked nation.
At the time of Clay’s two-day, 40-page speech, European economies overshadowed the fledgling American economy, which was hobbled by fragmented and dysfunctional infrastructure. To counter these deficits, the government used import taxes to promote industrialization and develop infrastructure.
The American System achieved its aims, and the government repeated this pattern of investing to catalyze national growth throughout the country’s expansion westward during the rest of the 19th century. The federal government sponsored land grant colleges, railroads and settlements to encourage economic growth and knowledge production throughout the expanding country.
The involvement of the federal government in the economy during World War I convinced a generation of Democratic politicians, reformers and economists that government spending could fuel innovation and economic growth. In the 1930s, they seized on this idea when the Great Depression wracked the United States. They kept pushing government spending and investment to drive innovation as the United States fought World War II in the 1940s.
During the 1930s, government financed bridges, dams and airfields. This was followed by wartime investment in military bases, ports and national laboratories during the 1940s. This New Deal and World War II experience grew the size of government and entrenched a new understanding of government as the engine of growth for infrastructure and technology.
But even after a century of a governmental role in developing local and national infrastructure, it was the Manhattan Project and the first mass production of penicillin in the 1940s that cemented the United States’ place in spearheading science and technology developments in weapons, energy and health. Federal spending and the arrival of immigrants propelled the United States to the forefront of science as never before in the nation’s history.
President Franklin D. Roosevelt’s science adviser, Vannevar Bush, urged the president to champion science as the “Endless Frontier.” Although Bush’s proposal for the creation of the National Science Foundation was fulfilled in 1950, politicians didn’t rush to allocate federal largesse for science and technology spending.
Instead, it took a Cold War surprise to prompt the government to pour money into science and technology.
The Soviet Union’s launch of the Sputnik satellite in 1957 sent shock waves through America, kicking off the Space Race and forcing policymakers into action. Anxieties sparked by Sputnik led to the creation of NASA and the 1958 National Defense Education Act, which subsidized science, technology, engineering and math education to strengthen the technical workforce pipeline. The act pumped $1 billion in taxpayer funds into grants and scholarships for science and technology study, as well as previously unfunded disciplines such as area studies.
Sputnik also motivated President Dwight D. Eisenhower to pour money into research and development (R&D) through innovation agencies like the Defense Advanced Research Projects Agency (DARPA), which invested in long-shot bids on novel inventions. DARPA funded the technologies that created the Internet, the Global Positioning System and virtual assistants like Apple’s Siri that are now woven into daily life.
The funding surge didn’t last: The end of the Space Race and the Vietnam War led to cuts in government backing for R&D.
Starting in the late 1970s — and accelerating in the 1980s — the private sector eclipsed government R&D funding. Business investment climbed as corporate power and globalization picked up steam.
Simultaneously, congressional support for science and technology research came under fire for allegedly wasting taxpayer funds. Rep. Lamar Smith (R-Tex.) targeted the National Science Foundation’s research grants, and science and technology allocations were an easy mark in congressional budgetary battles beginning in the 1980s.
By 2019, federal expenditures had plummeted from a 1964 peak of nearly 70 percent of R&D funding to just under 20 percent. The private sector now commands 70 percent of R&D funding and delivers cutting-edge technologies for the information revolution that has propelled the U.S. economy in recent decades.
Although DARPA paved the way for the information revolution by sponsoring research that created the Internet, the government surrendered its role as the prime mover of innovation. Some government-related institutions such as In-Q-Tel have bridged the gap, but few other government bodies have demonstrated the capacity to sponsor innovation R&D and formulate a new model of public-private partnerships.
One of the central challenges facing government today is realigning public-private interests and reclaiming the government’s role in emerging technologies and the industries of the future. Tensions between technology companies and the federal government have simmered, most notoriously in 2018 when Google ceased working with the Defense Department’s Project Maven. Google bowed to its workers’ anger over Project Maven’s combat applications, a decision to stop working with the military that appeared disconnected from Silicon Valley’s history.
Even though private sector firms in Silicon Valley invented a myth of independence, public-private partnerships created the technologies that are commonplace today. Forging new models of public-private partnerships will be crucial for translating government R&D funding in science and technology into economic gains.
Domestic criticism of the Innovation Act has popped up on both the left and the right. The Democratic Socialists of America condemned it for weaponizing industrial policy against China, and a host of antiwar organizations, along with the Quincy Institute think tank, criticized the bills for feeding Chinese nationalism. From the right, the Wall Street Journal’s editorial board slammed the bills for imitating China’s industrial policy.
But such an industrial technology policy has long been part of cementing the United States’ economic competitiveness and science and technology leadership. Federal funding birthed the public-private partnerships that sparked the information revolution and that will shape future technological and industrial transformations.
The road to passage of a reconciled Innovation Act isn’t certain. Despite Senate Majority Leader Charles E. Schumer’s (D-N.Y.) push and bipartisan support, Democratic leadership and Biden will have to spend fleeting political capital to reconcile and pass a bill. Time is running out.
Not since Sputnik has the United States faced a rival whose technological sophistication could surpass its own. China’s endeavor to control the commanding heights of science and technology innovation compelled Congress to pass competitiveness bills. But without a major Sputnik-level milestone by China, the Innovation Act may surrender precious time and funding in a pitched race that could determine the 21st century’s arc.
The views and opinions of authors expressed herein do not necessarily state or reflect those of the U.S. government or Lawrence Livermore National Security, and shall not be used for advertising or product endorsement purposes.