
The Evolution of Blood Banking: From Leeches to Leukoreduction

By Caitlin Raymond, MD/PhD · Mar 22 · 5 min read

Updated: Apr 17



It’s hard to imagine now, but there was a time when the best hope for curing a fever was letting your blood drip into a bowl.


Today, transfusion medicine is a highly regulated, data-driven, life-saving discipline—but it was born from centuries of trial and error, myth, and undue confidence. From ancient physicians armed with leeches to modern labs humming with centrifuges and filters, the journey of blood banking is filled with stories that are every bit as messy, dramatic, and vital as the substance itself.


When Too Much Blood Was the Problem

In 18th-century London, the barber was more than someone who cut your hair. He was the man who would slice a vein in your arm to drain the sickness from your body. You might sit in his chair feverish, pale, and scared, and he’d wrap a cloth tight around your upper arm, pick up a lancet, and open a vein—because for nearly 2,000 years, medicine believed illness came from imbalance.


Hippocrates had laid the groundwork, but it was Galen, a Greek physician practicing in Rome, who refined the humoral theory into something resembling dogma. Four fluids—blood, phlegm, yellow bile, and black bile—needed to be in harmony. Too much blood? That meant fever, aggression, or mania. The cure? Bleed the patient.


But even then, not everyone was convinced. There are records of patients refusing second visits after bloodletting left them faint or worse. In rural villages, families sometimes questioned why their loved ones seemed to worsen after the doctor’s visit. Still, the practice persisted—used for everything from childbirth to cholera—because there was no better alternative. Not yet.


The Wild Experiments of the 1600s

Paris, 1667. Jean-Baptiste Denis, physician to King Louis XIV, had a theory that animal blood might have healing properties. After all, lambs were considered pure, peaceful creatures. Maybe their blood could calm a feverish mind.


His test subject: a 15-year-old boy with recurring fevers. Denis transfused a small amount of lamb’s blood into the boy’s vein. Miraculously, the boy survived. Encouraged, Denis tried again—this time on a laborer named Antoine Mauroy, a man struggling with mental illness.


That did not end well.


Mauroy died, and Denis was accused of murder. Though ultimately cleared, the backlash was swift. By 1670, France and England had banned transfusions entirely. Science would have to wait.


But curiosity never really dies. In 1818, Dr. James Blundell—an English obstetrician disturbed by how many women died of postpartum hemorrhage—tried something new: human-to-human transfusion. He built a device using a syringe, a silver tube, and gravity. And unlike Denis, Blundell understood the importance of giving like with like: only human blood for human patients. Over the following decade, several of his patients—many of them mothers bleeding after childbirth—survived.


For the first time, blood transfusion wasn’t just a wild theory. It was medicine.


The Day Blood Stopped Being Mysterious

In a Viennese lab in 1901, Karl Landsteiner was puzzling over a question that had vexed physicians for decades: Why did some transfusions succeed and others kill?


He and his team began mixing samples of human blood and watching for clumping—an ominous sign of incompatibility. After hundreds of trials, they identified distinct patterns. Landsteiner labeled the groups A, B, and C (which was later renamed O). It was a discovery that would win him the Nobel Prize.


The mystery had been solved: human blood wasn’t all the same. It was immunologically distinct. What had once been a dangerous game of chance could now be predicted and prevented.


Still, it took time for this knowledge to take hold. In 1917, an Army surgeon in World War I—Captain Oswald Robertson—successfully set up a rudimentary blood depot using Landsteiner's principles. It saved lives on the battlefield, and the era of modern transfusion medicine had begun.


From Battlefield to Blood Bank

Before the 20th century, if you needed blood, you needed a donor in the next room—alive and ready to give. There was no such thing as blood storage.


But war, as brutal as it is, has always accelerated innovation.


In 1914, researchers discovered that sodium citrate could keep donated blood from clotting, and within a few years the addition of glucose allowed it to be stored for days. In World War I field hospitals, doctors began collecting and refrigerating blood. The ability to store blood transformed transfusion from emergency improvisation into a system that could be planned, scaled, and standardized.


By World War II, the United States had launched a national blood collection program. Volunteers lined up to donate. Hospitals received glass bottles labeled by blood type and expiration date. Blood had become mobile. It had become bankable.


Cleaner, Safer, Smarter

In a children’s hospital in the early 1980s, a young leukemia patient received a routine platelet transfusion—only to spike a sudden, unexplained fever. It was a familiar story. The care team suspected that white cells in the donor product were to blame, provoking the child’s immune system into a reaction. That moment was one of many that pushed transfusion medicine toward leukoreduction—the removal of white blood cells from blood products to reduce febrile reactions, prevent alloimmunization, and limit the risk of cytomegalovirus (CMV) transmission.


But even as physical reactions were being tamed, invisible threats loomed larger.

The 1980s brought with them a terrifying revelation: pathogens could silently hitchhike in donated blood. Transfusion-transmitted infections (TTIs) became a category of urgent concern. This wasn't a single threat—it was a growing list of agents that could pass undetected from donor to recipient, including HIV, hepatitis B, hepatitis C, and the bacterium that causes syphilis.


The blood supply, once viewed as a miracle, was now seen as vulnerable.


The response was swift and sweeping. Mandatory screening, stricter donor history questionnaires, and advances in nucleic acid testing (NAT) transformed blood safety. The window between infection and detectability, once measured in weeks, shrank to days—or even less. Today, the risk of contracting HIV from a transfusion in the U.S. is estimated at less than 1 in 2 million.


Safety, once a reactive measure, became a proactive science.


And the vigilance hasn't stopped. Blood banks continue to adapt, adding new tests as emerging pathogens threaten to join the TTI list. Each added layer of screening—each filter, barcode, and database—is built on the lessons of the past.


Because in transfusion medicine, trust is everything.


Looking Forward

We’ve come a long way from leeches and lamb’s blood.


Blood banks today are built on the work of pioneers—some brilliant, some reckless, all deeply human. Their stories are woven into every unit we hang on a pole and every life saved by a well-timed transfusion.


And the story isn’t over. Researchers are working on universal donor red cells, synthetic platelets, and pathogen-inactivated plasma. The goal is not just to transfuse—but to transfuse perfectly.


But perfection, like progress, takes time. And as we continue to refine this essential therapy, one thing remains unchanged: the act of giving blood is still an act of hope.
