injection, the rest would live for decades, plagued by a myriad of physical ailments.
About sixteen months after that trip to Italy, the Tribune published the first of a three-day series on the experiment. When the first installment rolled off the press, I sat at my desk expectantly, waiting for calls from outraged readers. I didn’t get one phone call from the public that day, but reporters from the hometowns where the five patients lived began calling to get more information, and a few reporters from Japan contacted me. Still, the story went largely unnoticed by the national media until Department of Energy Secretary Hazel O’Leary, who had been appointed less than a year earlier, officially condemned the experiment at a December 7, 1993, press conference, which she had called to announce the DOE’s new policy of openness and candor. Speaking toward the end of the conference, O’Leary said what she had just learned of the experiment horrified her: “I was appalled and shocked. It gave me an ache in my gut and heart.” 7
Soon after O’Leary’s press conference, President Clinton directed the federal agencies to make public any records dealing with the human radiation experiments. He also appointed the Advisory Committee on Human Radiation Experiments to look into the controversy. As the documents on the plutonium experiment poured from the government warehouses and people used in other experiments came forward, it became apparent that the story was much bigger than anyone had imagined. It turned out that thousands of human radiation studies had been conducted during the Cold War. Almost without exception, the subjects were the poor, the powerless, and the sick—the very people who count most on the government to protect them, Clinton would later point out.
Many of the Manhattan Project doctors who took part in the plutonium injections showed up as advisors or participants in the postwar studies. Although they played a key role in the experiments, they had only supporting parts in the bomb project. They were on a first-name basis with such legendary figures as J. Robert Oppenheimer and Enrico Fermi, but they themselves have remained among history’s obscure players. During the Manhattan Project, their job was to protect the health and safety of workers at a time when little was known about the effects of radiation on healthy people. Fearing a cancer epidemic among the project’s employees, they embarked upon a crash course to learn everything they could about the effects of radiation delivered externally or internally through the ingestion or inhalation of radioactive materials. The bombings of Hiroshima and Nagasaki only intensified the urgency of their research. What did radiation do to human genes, reproductive organs, and fetuses?
With the building of the atomic bomb, an industry equivalent in size to General Motors had been born in the United States. After the war, the lavishly expensive atmospheric testing of atomic bombs began at the Pacific Proving Ground and the Nevada Test Site. Responding to these developments, medical researchers found ever-new areas of inquiry to pursue. In closed-door meetings in Los Alamos and Washington, D.C., they and other scientists investigated such issues as how much radioactive strontium America’s children were collecting in their bones from fallout and how many more bombs could be exploded before the radioactivity would exceed a level that the doctors had deemed safe.
In addition to studies focused specifically on issues relating to the bomb and its fallout products, many other experiments used supposedly harmless amounts of radioactive materials, so-called tracer doses, to investigate questions relating to human metabolism. Officials of the early Atomic Energy Commission, the civilian agency that succeeded the Manhattan Project in 1947, promoted radioisotopes with a missionary-like zeal. Doctors and scientists desperately hoped the splitting of the atom would