How the U.S. Became A Science Superpower

Prior to WWII, the U.S. was a distant second in science and engineering. By the time the war was over, U.S. science and engineering had blown past the British and led the world for 85 years.


It happened because two very different people were the science advisors to their nations’ leaders. Each had radically different views on how to use his country’s resources to build advanced weapon systems. Post-war, it meant Britain’s early lead was ephemeral, while the U.S. built the foundation for a science and technology innovation ecosystem that led the world – until now.

The British – Military Weapons Labs
When Winston Churchill became the British prime minister in 1940, he had at his side his science advisor, Professor Frederick Lindemann, his friend of 20 years. Lindemann headed the physics department at Oxford as director of the Clarendon Laboratory. Already at war with Germany, Britain focused its wartime priorities on defense and intelligence technology projects – weapons that used electronics, radar, physics, etc.: a radar-based air defense network called Chain Home, airborne radar on night fighters, and the MAUD Committee, which launched the British nuclear weapons program, code-named Tube Alloys. And their codebreaking organization at Bletchley Park was starting to read secret German messages – the Enigma – using some of the earliest computers ever built.

As early as the mid-1930s, the British, fearing Nazi Germany, developed prototypes of these weapons in their existing military and government research labs. The Telecommunications Research Establishment built early-warning radar, critical to Britain’s survival during the Battle of Britain, and electronic-warfare systems to protect British bombers over Germany. The Admiralty Research Laboratory built sonar and anti-submarine warfare systems. The Royal Aircraft Establishment was developing jet fighters. The labs then contracted with British companies to manufacture the weapons in volume. British government labs viewed their universities as a source of talent, but the universities had no role in weapons development.

Under Churchill, Professor Lindemann influenced which projects received funding and which were sidelined. Lindemann’s WWI experience as a researcher and test pilot on the staff of the Royal Aircraft Factory at Farnborough gave him confidence in the competence of British military research and development labs. His top-down, centralized approach, with weapons development primarily in government research labs, shaped British innovation during WWII – and led to its demise post-war.

The Americans – University Weapons Labs
Unlike Britain, the U.S. lacked a science advisor. It wasn’t until June 1940 that Vannevar Bush, ex-MIT dean of engineering and president of the Carnegie Institution, told President Franklin Roosevelt that World War II would be the first war won or lost on the basis of advanced technology – electronics, radar, physics, etc.

Unlike Lindemann, Bush had a 20-year-long contentious history with the U.S. Navy and a dim view of government-led R&D. Bush contended that the government research labs were slow and second rate. He convinced the President that while the Army and Navy ought to be in charge of making conventional weapons – planes, ships, tanks, etc. — scientists from academia could develop better advanced technology weapons and deliver them faster than Army and Navy research labs. And he argued the only way the scientists could be productive was if they worked in a university setting in civilian-run weapons labs run by university professors.

To the surprise of the Army and Navy Service chiefs, Roosevelt agreed to let Bush build exactly that organization to coordinate and fund all advanced weapons research.

(While Bush had no prior relationship with the President, Roosevelt had been the Assistant Secretary of the Navy during World War I and like Bush had seen first-hand its dysfunction. Over the next four years they worked well together. Unlike Churchill, Roosevelt had little interest in science and accepted Bush’s opinions on the direction of U.S. technology programs, giving Bush sweeping authority.)

In 1941, Bush upped the game by convincing the President that, in addition to research, the development, acquisition, and deployment of these weapons also ought to be done by professors in universities. There they would be tasked with developing military weapons systems and solving military problems to defeat Germany and Japan. (The weapons were then manufactured in volume by U.S. corporations – Western Electric, GE, RCA, DuPont, Monsanto, Kodak, Zenith, Westinghouse, Remington Rand, and Sylvania.) To do this Bush created the Office of Scientific Research and Development (OSR&D).

OSR&D headquarters divided the wartime work into 19 “divisions,” 5 “committees,” and 2 “panels,” each solving a unique part of the military war effort. There were no formal requirements.

Staff at OSR&D worked with their military liaisons to understand the most important military problems, and then each OSR&D division came up with solutions. These efforts spanned an enormous range of tasks – advanced electronics, radar, rockets, sonar, new weapons like the proximity fuse, napalm, and the bazooka, new drugs such as penicillin and cures for malaria, as well as chemical warfare and nuclear weapons.

Each division was run by a professor hand-picked by Bush. And the divisions were located in universities – MIT, Harvard, Johns Hopkins, Caltech, Columbia, and the University of Chicago all ran major weapons systems programs. Nearly 10,000 scientists and engineers – professors and their grad students – received draft deferments to work in these university labs.

(Prior to World War II, science in U.S. universities was primarily funded by companies interested in specific research projects, while funding for basic research came from two non-profits: the Rockefeller Foundation and the Carnegie Institution. In his role as president of the Carnegie Institution, Bush got to know (and fund!) every top university scientist in the U.S. As head of physics at Oxford, Lindemann viewed other academics as competitors.)

Americans – Unlimited Dollars
What changed U.S. universities, and the world forever, was government money. Lots of it. Prior to WWII most advanced technology research in the U.S. was done in corporate innovation labs (GE, AT&T, Dupont, RCA, Westinghouse, NCR, Monsanto, Kodak, IBM, et al.) Universities had no government funding (except for agriculture) for research. Academic research had been funded by non-profits, mostly the Rockefeller and Carnegie foundations and industry. Now, for the first time, U.S. universities were getting more money than they had ever seen. Between 1941 and 1945, OSR&D gave $9 billion (in 2025 dollars) to the top U.S. research universities. This made universities full partners in wartime research, not just talent pools for government projects as was the case in Britain.

The British – Wartime Constraints
Wartime Britain had very different constraints. First, Britain was under daily attack. It was being bombed from the air and blockaded by submarines, so it was logical to focus on a smaller set of high-priority projects to counter those threats. Second, the country was teetering on bankruptcy. It couldn’t afford the broad and deep investments that the U.S. made. (This was illustrated by the abandonment of the British nuclear weapons program once they realized how much it would cost to turn the research into industrial-scale engineering.) This meant that many other areas of innovation – such as early computing and nuclear research – were underfunded compared to their American counterparts.

Post War – Britain
Churchill was voted out of office in 1945. With him went Professor Lindemann and the coordination of British science and engineering. Britain would be without a science advisor until Churchill returned for a second term (1951–55) and brought Lindemann back with him.

The end of the war led to extreme downsizing of the British military, including severe cuts to all the government labs that had developed radar, electronics, computing, etc.

Financially exhausted, post-war Britain faced austerity that limited its ability to invest in large-scale innovation, and there were no post-war plans for government follow-on investments. The differing economic realities of the U.S. and Britain also played a key role in shaping their innovation systems. The United States had an enormous industrial base, abundant capital, and a large domestic market, which enabled large-scale investment in research and development. In Britain, a socialist government came to power. Churchill’s successor, Labour’s Clement Attlee, dissolved the British Empire and nationalized banking, power and light, transport, and iron and steel, all of which reduced competition and slowed technological progress.

While British research institutions like Cambridge and Oxford remained leaders in theoretical science, they struggled to scale and commercialize their breakthroughs. For instance, Alan Turing’s and Tommy Flowers’s pioneering work on computing at Bletchley Park didn’t turn into a thriving British computing industry – unlike in the U.S., where companies like ERA, Univac, NCR, and IBM built on their wartime work.

Without the same level of government support for dual-use technologies or commercialization, and with private capital absent for new businesses, Britain’s post-war innovation ecosystem never took off.

Post War – The U.S.
Meanwhile, in the U.S., universities and companies realized that wartime government funding for research had been an amazing accelerator for science, engineering, and medicine. Everyone, including Congress, agreed that the U.S. government should continue to play a large role in funding it. In 1945, Vannevar Bush published the report “Science, the Endless Frontier,” advocating for government funding of basic research in universities, colleges, and research institutes. Congress argued over how best to organize federal support of science.

By the end of the war, OSR&D funding had taken technologies that had been just research papers, or considered impossible to build at scale, and made them commercially viable – computers, rockets, radar, Teflon, synthetic fibers, nuclear power, etc. Innovation clusters formed around universities like MIT and Harvard, which had received large amounts of OSR&D funding (MIT’s Radiation Lab, or “Rad Lab,” employed 3,500 civilians during WWII and developed and built 100 radar systems deployed in theater), or around professors who had run one of the OSR&D divisions – like Fred Terman at Stanford.

When the war ended, the Atomic Energy Commission spun out of the Manhattan Project in 1946, and the military services took back advanced weapons development. In 1950 Congress set up the National Science Foundation to fund all basic science in the U.S. (except for the life sciences, a role the new National Institutes of Health would assume). Eight years later DARPA and NASA would also form as federal research agencies.

Ironically, Vannevar Bush’s influence would decline even faster than Professor Lindemann’s. When President Roosevelt died in April 1945 and Secretary of War Stimson retired in September 1945, all the knives came out from the military leadership Bush had bypassed in the war. His arguments on how to reorganize OSR&D made more enemies in Congress. By 1948 Bush had retired from government service. He would never again play a role in the U.S. government.

Divergent Legacies
Britain’s focused, centralized model using government research labs was created in a struggle for short-term survival. It achieved brilliant breakthroughs but lacked the scale, integration, and capital needed to dominate in the post-war world.

The U.S. built a decentralized, collaborative ecosystem, one that tightly integrated massive government funding of universities for research and prototypes while private industry built the solutions in volume.

A key component of this U.S. research ecosystem was the genius of the indirect cost reimbursement system. Not only did the U.S. fund researchers in universities by paying their salaries, it also gave universities money for the researchers’ facilities and administration. This was the secret sauce that allowed U.S. universities to build world-class labs for cutting-edge research that were the envy of the world. Scientists flocked to the U.S., causing other countries to complain of a “brain drain.”
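The arithmetic of indirect cost reimbursement can be sketched in a few lines. This is a simplified illustration – the 50% facilities-and-administration (F&A) rate and dollar figure below are hypothetical, since actual rates are individually negotiated between each university and the government:

```python
# Simplified sketch of indirect cost reimbursement.
# The F&A (facilities and administration) rate here is hypothetical;
# real rates are negotiated per institution.
def total_award(direct_costs: float, fa_rate: float) -> float:
    """Return the total grant: direct costs (salaries, equipment, supplies)
    plus the indirect F&A reimbursement paid to the university."""
    return direct_costs + direct_costs * fa_rate

# A grant with $500,000 in direct costs at a hypothetical 50% F&A rate
# delivers an extra $250,000 to the university for labs and overhead.
print(total_award(500_000, 0.50))  # 750000.0
```

The indirect portion is the money that built and maintained the labs and administrative capacity, on top of paying the researchers themselves.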

Today, U.S. universities license 3,000 patents and 3,200 copyrights, and grant 1,600 other licenses, to technology startups and existing companies. Collectively, they spin out over 1,100 science-based startups each year, which lead to countless products and tens of thousands of new jobs. This university/government ecosystem became the blueprint for modern innovation ecosystems in other countries.

Summary
By the end of the war, the U.S. and British innovation systems had produced radically different outcomes. Both systems were shaped by the experience and personalities of their nations’ science advisors.

  • Britain remained a leader in theoretical science and defense technology, but its government’s socialist economic policies led to its failure to commercialize wartime innovations.
  • The U.S. emerged as the global leader in science and technology, with innovations like electronics, microwaves, computing, and nuclear power driving its post-war economic boom.
  • The university-industry-government partnership became the foundation of Silicon Valley, the aerospace sector, and the biotechnology industry.
  • Today, China’s leadership has spent the last three decades investing heavily to surpass the U.S. in science and technology.
  • In 2025, with the abandonment of U.S. government support for university research, the long run of U.S. dominance in science may be over. Others will lead.

15 Responses

  1. New book called Technology Republic speaks directly to your post.

  2. It will be unfortunate for the US.
    CHINA may take over.

  3. Great writeup; thank you!

  4. Once again, your analysis is completely wrong.

    Eric Weinstein has been talking about this for ages – and seeing how he is a theoretical physicist / mathematician who was part of the system (and then forced out), he would know more than most.

    The tl;dr is:

    Universities are basically “hedge funds + indoctrination”. You should ask yourself why each University has billions invested in equities across the world.

    Run by frauds. Harvard President Claudine Gay resigned in January 2024 due to plagiarism accusations against her academic work. Stanford University’s president, Marc Tessier-Lavigne, resigned after an investigation revealed manipulated data in several of his published academic reports.

    DEI. See the case of Stanley Zhong. He is a genius, but got rejected by 16 universities because he is ethnically Asian. Instead, he skipped college and was hired directly by Google. (A lawsuit is underway.)

    It’s a means to generate personal wealth. In 2022, N.P. Narvekar, the CEO of Harvard Management Company (HMC), received a total compensation of $9,602,531. Why is the CEO of Harvard making over 9 million a year?

    Gatekeeping politics leads to wasting our biggest and brightest. Despite decades of research and billions in funding, string theory has produced nothing useful. Yet string theorists such as Edward Witten use their positions of power to purposefully keep out new ideas, in order to drive funding to their work. They also don’t want new ideas to emerge and displace their own.

    This is the problem.

    • I can’t speak to all of these objections, but Harvard Management Company runs Harvard’s *endowment* fund. It’s not Harvard, the university. And it’s a $50 billion endowment fund.

      Glancing through Wikipedia’s list of major hedge funds, the top 10 range from $90 billion to $40 billion. Then, looking up the compensation packages of their managing directors, most of these were paid multiple hundreds of millions to billions of dollars. The *tenth* highest-paid hedge fund manager earned about $800 million.

      So $9 million is quite small in comparison for managing a fund of this size. It’s small peas compared to what you’d earn in the corporate world.

      The actual Harvard President, Alan Garber, earned about $900k for the last year I could find record of (2021), which is about 10% of what you were suggesting.

      I’m a bit skeptical of some of your other claims – like that we’ve spent “billions” on string theory research, or the suggestion that DEI has (on net) been harmful to academia. From my time in academia, while it definitely has some problems that need to be worked through (like an overemphasis on big flashy publications), it’s still the most robust and creative way of offering and then testing new ideas. I’ve been lucky enough to be part of some absolutely excellent teams, and the creativity and hard work that I observed was unparalleled in the corporate world.

      If you’re passionate about science, then generally you work at universities or government science labs. Big advances, like CRISPR, iPS cells, isolation and characterization of graphene, the first exoplanets, the sequencing of the Neanderthal genome, and quantum dots were all pioneered at universities, research institutes, or government science labs. These are still quite productive, and very much pushing forward the bounds of human knowledge.

    • Every now and then one gets to enjoy someone making a complete fool of themselves. The statement below is today’s winner.

      Universities are basically “hedge funds + indoctrination”. You should ask yourself why each University has billions invested in equities across the world.

    • None of this even contradicts the article. Universities can be effective sites of research (in the past and present) and also be misguided or self-serving or blinkered in the ways you’ve described.

      You seem to be suffering from the same condition as Weinstein: the paranoid / contrarian emotional posture collapses the world into a good/evil binary and forces every idea or institution to take a side. The things you’re describing are rightly seen as a minute sideshow to the main event – one that is growing, perhaps, but without a doubt still tiny.

    • If you pick a random university and look at their faculty in STEM fields, you will find there is a very high percentage of Asians. In many departments, first generation Asian immigrants make up a majority of faculty.

  5. In addition to the change of government in the UK, it might be necessary to consider the role of extreme secrecy and compartmentalisation, especially in the case of Bletchley Park. Flowers was required to destroy all documents, prototypes, working drawings &c.

  6. You forgot to mention the Tizard Mission, where the UK gave the US its most sensitive technology, including radar and the jet engine, to secure support.

    https://en.wikipedia.org/wiki/Tizard_Mission

    • Yep. Unfortunately for the British, they didn’t share their nuclear research when they came over on the Tizard Mission. At the time they were ahead of the U.S. They thought nuclear weapons were something they wanted to own themselves. Instead, in the dark about what the British were doing, the U.S. moved forward independently, and by mid-1942 it spun the nuclear weapons research program out of OSR&D into the Manhattan Project. By then the U.S. no longer needed the British. Simultaneously, Lindemann and Churchill realized that Britain didn’t have the industrial bandwidth to build the gaseous diffusion plants needed for their planned U-235 weapons. By the time they approached the Americans (mid-1942) it was too late. The U.S. let a few British scientists work at Los Alamos (one, Klaus Fuchs, was a Soviet spy) and had Canadian facilities make some small contributions. Eventually the U.S. and Britain signed the Quebec Agreement.

  7. If not for the advanced technology of Nazi Germany that the USA stole and continued to steal after the war and into the current day, and the intellectual drain of German scientists over all these years, by choice or otherwise, the USA would not be even remotely as close to key technologies as it portrays itself today.

  8. Great post. It reminds me of the seminal paper on open source “The Cathedral and the Bazaar”

  9. Not going to provide a dissertation response, although this article (which follows your series) – provides useful insights. Like you – have spent the entire career in High-Tech with an emphasis on the Aerospace and Defense target segments – the real stuff – weapons systems and other key areas. In today’s world – given the Ukraine conflict, the challenge of a new range of Superpowers (particularly, China) and other identifiable threats – there is the undisputed question of balance of power shifts (Economic and Military). Europe (overall) appears to be caught off-guard – and is regrouping – the US Monster is coming alive and facing the facts that the methods of modern military warfare have morphed – with drone utilization and other advanced capabilities. Jury is not in – would not count the US out – if the mantra of focus on National Security, protecting the War Fighter and perpetuating our Military superiority prevail (which means fostering continuous innovation, providing the right level of budget/funding allocation, streamlining the process, kicking Bureaucracy in-the-can and assuring that we have “bright lights” to back this up ) – then the picture and future state will change. Last point: If the US has spent more on Strategic Defense over the last 3-Decades than all countries combined – then how have we managed to be in today’s slippery-slope position? Budget/Funding Allocation alone does not yield Leadership – although the public deserves an explanation from the Big Whigs – on cause and effect.

  10. An interesting and insightful analysis. But anglo-centric and thus incomplete.

    Before drawing any generalized conclusions or policy prescriptions from this, I’d like to also look at how this has played out in Germany, France, Russia, Japan, (both pre and post WWII) and post WWII South Korea & China.
