
An engineering history of the Manhattan Project


The Manhattan Project, the US program to build an atomic bomb during WWII, is one of the most famous major government projects in history: a 1999 survey ranked the dropping of the atomic bomb as the top news story of the 20th century. Virtually everyone knows that the project built the bombs that were dropped on Hiroshima and Nagasaki. And most of us probably know that the bomb was built by some of the world’s best physicists, working under Robert Oppenheimer at Los Alamos in New Mexico. But the Manhattan Project was far more than just a science project: building the bombs required an industrial effort of unprecedented scale and complexity. Enormous factory complexes were built using hundreds of millions of dollars worth of never-before-constructed equipment. Scores of new machines, analytical techniques, and methods of working with completely novel substances had to be invented. Materials that had never been produced at all, or had only been produced in tiny amounts, suddenly had to be manufactured in vast quantities.

This massive effort was required in part because of the enormous difficulty in producing fissile material, and in part because of the enormous uncertainty facing the project: it wasn’t known what the best method for manufacturing the fissile material needed for the bomb would be, what the design of the bomb should be, or whether a workable bomb could even be built. Developing the bomb required resolving this uncertainty, and the project needed to rapidly push forward knowledge and capabilities in many fields: not merely in the realm of nuclear chain reactions and atomic physics, but also in areas like precision explosives, metallurgy, welding, chemical separation, and electronics.

Because of the exigencies of war, this work needed to be done extremely rapidly. There wasn’t time to investigate promising approaches sequentially, or wait for more information before picking a particular course. Thus, multiple possible routes to the bomb — different fuels (and different fuel production techniques), different bomb designs, different components like triggers and tampers — were pursued simultaneously. Major commitments, like factories that cost hundreds of millions of dollars, were made before it was known whether they would even be useful. Design work began on the bombs when the nuclear fuel they would use hadn’t been produced in more than microscopic amounts.

Normally when trying to create a new technology, funding constraints and the need for economic returns determine how much time and effort can be spent on development. Efforts to create some new technology will often be small-scale until the surrounding conditions are right — until knowledge has caught up, or the necessary supporting infrastructure exists, or the input materials are cheap enough — and risk can be minimized. But with the Manhattan Project, these constraints didn’t exist. Funding was virtually unlimited in service of ending the war sooner, and the biggest perceived risk was that Germany would beat the US to the bomb. As a result, an extremely robust development effort could be justified, which thoroughly explored virtually every promising path to an atomic weapon (no matter how expensive or uncertain).

Beginnings of the project

The Manhattan Project began in June of 1942, when Colonel James Marshall of the Army Corps of Engineers was directed to create a new engineering district to lead the army’s efforts to develop an atomic weapon. Shortly after, Colonel Leslie Groves (who would soon be promoted to brigadier general) was selected to lead the project. At the time, the official name of the project was “Laboratory for the Development of Substitute Materials” (DSM for short), but Groves felt that this name would attract curiosity, and so a new name was selected based on the location of Marshall’s New York office: the Manhattan Engineer District.

By the time the Manhattan Project officially formed, the US was already at work developing an atomic bomb. Following the discovery of fission in 1938 by Otto Hahn and Fritz Strassmann, physicists began to speculate that a nuclear chain reaction might be possible, and that such a reaction could be used to build a bomb of unprecedented magnitude. In August 1939, Albert Einstein and physicist Leo Szilard sent a letter to President Roosevelt, warning him that a nuclear chain reaction might be used to build an extremely powerful bomb, and urging the US to research atomic energy. Two months later, Roosevelt ordered the creation of an advisory committee on uranium, and by early 1940 US researchers (most notably Enrico Fermi) were working to create a sustained nuclear chain reaction.

In July of 1941, a report from the British MAUD Committee concluded that it was likely feasible to build an atomic bomb. When the report reached the US, Roosevelt authorized an expanded atomic bomb effort in October. Bomb efforts accelerated following Japan’s attack on Pearl Harbor in December of 1941, and in February of 1942 the Metallurgical Laboratory was formed at the University of Chicago to study nuclear chain reactions and the chemistry of the newly-created element plutonium. There, a team working under Enrico Fermi continued its work on nuclear chain reactions, ultimately resulting in Chicago Pile-1, which achieved the world’s first self-sustaining nuclear chain reaction in December of that year.

Early test pile at the University of Chicago, 1942, via Wikipedia.

The path to the bomb
