The text and photograph here are excerpted from a four-volume series of books titled Oncology: Tumors & Treatment, A Photographic History, The Anesthesia Era 1845–1875 by Stanley B. Burns, MD, FACS, and Elizabeth A. Burns. Photograph courtesy of Stanley B. Burns, MD, and The Burns Archive.
For more than 2,500 years, bloodletting was the backbone of medical therapy. To date, it is the longest-running therapeutic tradition known. First practiced in ancient Egypt, its use spread throughout Western civilization. The therapy was still performed in the rural American South into the 1910s.
One theory of the genesis of bloodletting is that it developed out of observation of the menstrual cycle and its apparent restorative effects. Bloodletting was used not only for every disease, including cancer, but also in the treatment of trauma. Even severely wounded soldiers would be bled to syncope. It is amazing that such a drastic procedure, which truly provided little relief, remained in practice for so long.
From Humoral to Germ Theory of Disease
The traditional fathers of Western medicine, Hippocrates through Galen, developed a humoral theory of disease. This theory was firmly established as the foundation of medical practice, and drastic therapies were devised to affect the humors. The belief was that good health depended on the proper balance of the four body humors: blood, phlegm, yellow bile, and black bile. Bleeding, purging, sweating, and vomiting, which all affected the humoral balance, were rigorously used. As blood was the easiest of the humors to manipulate, removing it did, theoretically and in practice, alter the proportion of the humors. Moreover, it produced an immediate, visible effect on the patient's condition.
Until the development of the germ theory of disease in the 1880s and the resultant therapeutic accomplishments, physicians could do little for patients. Almost all therapies were useless if not harmful. Patients did not suffer from diagnosed specific diseases; they suffered from nonspecific ailments defined by their “symptoms,” hence the confusing disease terminology. Terms like dropsy, pleurisy, fits, decay, fever, excitability, and others were used to define and describe an illness. Physicians based their medical therapy not on statistics or case histories, but on their own personal experience gained through treating their own patients.
History of Bloodletting: More Harm Than Good
Bleeding affected all patients and gave physicians some sense of control over the situation. Patients were usually bled while sitting up until they fainted. Babies were bled until their lips turned blue. The effect of bloodletting on a seriously ill, feverish, agitated, or even delirious patient or trauma victim was immediate. The person became calm, cool, and pale. This treatment resulted in cardiovascular collapse, shock, and often death, especially in a sick or debilitated person. Wounded soldiers or trauma victims were also calmer when bled, and they died by the thousands. The public was so convinced of the necessity of bloodletting, even for healthy patients, that physicians routinely performed the procedure. This was especially true in the spring, as it was customary to change the body humors for the season.
One example that illustrates the extent of bloodletting was found at an archeologic dig at a monastery in Scotland, where monks were known to have routinely bled one another. The archeologists found a stratum of blood waste formed by an estimated 300,000 pints.
The 18th century was an era in which physician impression dominated the standard of care. In America, Benjamin Rush, MD (1745–1813), created a deadly therapeutic regimen called “Heroic Therapy.” This was based on his unfortunate impression of the positive effect of drastic bloodletting during Philadelphia’s 1793 yellow fever epidemic. Dr. Rush erroneously believed the body held 25 pounds of blood and that 20 pounds could be safely drained. He routinely bled patients of as much as 75% of their blood volume. Admiration, sparked by the courage and fortitude Dr. Rush exhibited by staying in Philadelphia during the epidemic, treating and bleeding more than 100 patients a day, resulted in a half century of excessive bleeding, purging, vomiting, and blistering treatments used by his followers. In reality, this practice killed thousands.
By the mid-19th century, French statisticians had compiled both recorded patient histories and documentation of treatment results, demonstrating that bloodletting caused more harm than good and was often deadly. In 1860, one astute author wrote: “If the employment of the lancet was abolished altogether, it would perhaps save annually a greater number of lives than in any year the sword has ever destroyed.”
Methods of Bloodletting
Bloodletting was performed by four main methods: 1) phlebotomy, in which a vein was opened with a lancet; 2) phlebotomy by a spring-loaded single-blade cutting device, called a phleam, which provided a set depth of cut; 3) cutting with a multiple-blade, spring-loaded “scarifier,” which made several shallow cuts and was intended to be used with a heated cup that created a vacuum to draw the blood out; and 4) leeches, which were applied in anatomic areas where the other methods could not easily be performed, such as the eyelid, mouth, or cervix. Curiously, medicinal leeches are still used today to withdraw blood from some areas to reduce the swelling common in plastic repairs, transplants, or skin grafts to earlobes or fingers.
Bloodletting is also still used in the treatment of congestive heart failure and pulmonary congestion, and it remains the standard of care in hematologic diseases such as polycythemia vera, in which overproduction of blood cells is the problem. Despite the fact that bloodletting was a common practice, photographs of the therapy in the 19th century are rare, and only a handful are known to exist (all in The Burns Collection). This photograph shows an English practitioner opening a vein using a phleam.