Bone Marrow Transplant (BMT) is a medical procedure used to treat a number of different disorders, including blood cancers like leukemia, lymphoma, and myeloma, as well as immune deficiency disorders and radiation exposure. The treatment involves isolating blood stem cells from either the patient or a healthy donor, then infusing these cells into the patient with the goal of renewing the body’s ability to make healthy blood cells. Blood stem cells can be isolated directly from the bone marrow or, in some cases, separated out of the blood. BMT is used to treat over 50,000 patients annually worldwide and is the only effective treatment currently available for some of the disorders listed above. While a source of life and hope today, the story of how BMT became possible is a fascinating and unlikely one.
The origins of BMT lie in the growing presence of radiation in everyday life at the turn of the 20th century, culminating in the deadliest application of this technology, the atomic bomb. Several key fields within biomedical research, including radiobiology, cancer biology, immunology, and stem cell biology, were significantly impacted, accelerated, or even originated as a result of World War II and the accompanying atomic age. Research spurred by this era led to the discovery and optimization of BMT.
Glowing watches, trendy drinks, and X-rays, the first signs of a creeping demon
During the 1910s and 20s, radioactive compounds were becoming increasingly common in the workplace. A higher incidence of leukemia was observed in US radiologists as radioactive techniques and compounds, including X-rays, radium, and radon gas, became more common in modern medicine. Bolstering this correlation, women who worked in factories using radium paint were much more prone to developing oral, bone, and jaw cancers. Radium paint emits a faint amount of light and was used on watch faces and dials to highlight their hands and numbers. Painters would lick their brushes to ensure a fine point and even paint their nails, faces, and teeth to give them that trendy glow. Glowing was trendy in more than just watches in the 20s; companies sold a variety of glowing tonics, medicines, and even radon-infused water for vigor. Now known as radioluminescence, the emitted light was a result of radioactive decay, the process by which an unstable atom becomes more stable by emitting various forms of energy, in this case, light. Unfortunately for these workers and trendy consumers, these first concentrated exposures to radioactive elements provided the earliest recognized warnings of the radiation sickness, cancer, and death now associated with radiation.
In 1927, scientists discovered a link between radiation exposure and genetic mutation, the first piece of the puzzle in determining how radiation wreaked havoc in the human body. A genetic mutation occurs when one base pair within the original DNA sequence is modified, for example, an adenine (A) becoming a guanine (G). Exposure to sources of radiation can cause breaks in DNA, the repair of which occasionally results in a mutation that can contribute to cancer and other disorders. As an example, when exposed to ultraviolet radiation from the sun, the DNA in skin cells can be broken and mutated, which can lead to melanoma, a form of skin cancer. By the end of the 1920s, it had become clear that sources of radiation should be removed from everyday life whenever possible, even in small but continuous doses, like radon naturally found in drinking water and the basements of homes. However, with radiation and nuclear technologies just dawning as powerful tools in manufacturing, energy, and warfare, it was equally clear that a complete separation between modern society and radiation was simply not an option.
From blood cells to bombs, the contrasting discoveries of the Manhattan Project
As research involving radiation progressed, one aspect of the field began to rise above the rest in its potential for both power and destruction. This process was known as nuclear fission, the splitting of an atomic nucleus into smaller pieces. Early research revealed that anyone who could harness this reaction would wield a source of power and destruction never before known to man. However, this reaction also produced radiation at an immense scale, a side effect which propelled BMT from a research technique to a clinical therapy.
The Manhattan Project was the research and development collaboration commissioned by the US government, with assistance from the UK and Canada, during WWII with the intention of creating the first nuclear bomb. The project lasted from 1941 to 1946, employed 130,000 people, and cost the US government $2 billion ($27 billion in 2017), with a large share of these resources going to a variety of research projects. The US feared that its soldiers or citizens might be exposed to high levels of radiation and also wanted to understand how radiation would affect the enemy if the US deployed a nuclear weapon. As a precaution, the US devoted some of the researchers involved in the Manhattan Project to radiobiology, an emerging field investigating how radiation affects the body and how its effects might be treated.
Researchers used a variety of animal tests to evaluate which regions of the body were affected by radiation. After exposure, the animals began to develop blood-related issues, including an inability to produce enough blood cells (aplastic anemia) and cancers of the blood, like leukemia and lymphoma. These experiments showed that the blood system, especially the locations where blood cells originate, including the bone marrow and spleen, was the most susceptible to the effects of radiation. Continued tests showed that shielding key areas, like the spleen or a sufficiently large region of bone marrow such as a single hind leg, would often rescue the shielded subject from the worst symptoms of exposure and allow for a full recovery. This became the foundation for radiation shielding precautions, ranging from heavy-duty industrial and military applications to the lead vest worn during an X-ray at the dentist’s or doctor’s office.
Once researchers knew that radiation damaged the body’s ability to create blood, they began testing ways to restore this ability after exposure. Bone marrow saved from a subject before exposure, or collected from a healthy donor, was infused into the irradiated animal and returned blood production to normal. Transplanted bone marrow was able to repopulate the typical regions of blood cell production, like the spleen and large bones, thanks to a feature of bone marrow cells called “homing”. Cellular signals within the body allow bone marrow cells to exit their native environments and enter the bloodstream, where they circulate briefly before returning to one of the many areas of the body where bone marrow resides. This ensured that even if only a few transplanted cells found their way back into the marrow, they would multiply there and repeat the process, repopulating all the typical marrow regions of the body. Experiments within the Manhattan Project successfully performed BMT in a number of animal models, saving animals that had been exposed to typically lethal levels of whole-body radiation. With the dropping of two atomic bombs by the US at the end of WWII, radiation sickness in humans went from isolated to widespread overnight. Read more about the first use of BMT in humans, its therapeutic potential, and the continued effects of the atomic age on modern medicine in part 2 of this story.