Civil War Medicine

By James B. Atkinson, M.D., Ph.D., Professor, Vanderbilt University School of Medicine

Originally published in the Battlefield Dispatch, Vol. 2, No. 4, Fall 2014.


The American Civil War brought tremendous advances to the practice of medicine. While the medical departments in both the North and South operated dismally at the onset of the war, by 1864 both sides had efficient medical services that met and, in many cases, set the standard of care at that time. That dismal beginning resulted in misconceptions that have carried through to today. The recognition of disease patterns, preventive medicine, the evacuation and triage of injured soldiers during battle, and the treatment of serious wounds all evolved in those few short years. Over 3,000,000 soldiers participated in the War Between the States, and 618,000 died. Two thirds of those deaths were from disease, and one third occurred in battle or from wounds sustained in battle. The Union lost 359,000 men. The Confederacy lost an estimated 258,000, which accounted for 1 in every 4 men of military age. The total mortality was 2% of the U.S. population at that time, equivalent to over 6 million deaths in today's population. What accounted for these large numbers?

Two factors contributed to the loss of life during the Civil War. The first was that it was the first "modern" war, in which high numbers of casualties could be inflicted over a short period of time. With the long range of the rifled musket, soldiers were subjected to fire over a great distance, turning massed assaults into suicidal charges. The soft lead projectiles, weighing around one ounce, flattened and deformed on impact, causing significant damage to soft tissues and bones.

A second factor, which affected the morbidity and mortality of the two thirds of soldiers who died from disease, was that basic medical theory and surgical practice had remained relatively unchanged for hundreds of years. There was no concept of infectious causes of illness, and soldiers lived in large, crowded camps where food and water were contaminated by both men and horses. The work of Louis Pasteur that helped identify bacteria did not occur until the 1860s and 1870s, and Joseph Lister's landmark paper, "On the Antiseptic Principle in the Practice of Surgery," was not published until 1867.

Many doctors in the North and South began their careers without having had the opportunity to observe an operation closely, and when they were thrown into the fray, many operated largely by instinct. Trauma in the mid-1800s consisted of the occasional horseback or industrial accident, and few doctors had ever dealt with gunshot wounds. Prior to the Civil War, surgeons at one of the premier hospitals of the time, the Massachusetts General Hospital, performed fewer than 200 surgical procedures of any kind per year. Yet once the war began, doctors were required to perform large numbers of surgical procedures in a matter of hours. At the beginning of the war, there were 113 surgeons in the U.S. Army, 24 of whom joined the Confederate Army. By 1864, more than 12,000 surgeons had served in the Union Army and about 3,200 in the Confederate Army.

How did Civil War surgeons compare to their contemporaries?

Civil War surgeons compared favorably to those who treated casualties in the Crimean War (1853-56) and the Franco-German War (1870-71). The British had a fatality rate of 28% for 1,027 amputations; Union surgeons had a fatality rate of 26% for over 30,000 amputations. Fatality rates were higher when amputations were closer to the trunk. In every recorded amputation at the hip performed by British surgeons, the patient died. Union doctors succeeded 17% of the time, and the amputation of John Bell Hood's right leg just below the hip at Chickamauga on September 20, 1863, performed by Dr. T.G. Richardson, Chief Medical Officer of the Army of Tennessee, was nothing short of miraculous.

Surgeons who served during the Civil War were generally successful in saving lives, especially considering the state of medicine in the 19th Century. Concepts conceived during the war, such as the organization of hospitals based on underlying illness, ambulance services, mobile hospital units, and triaging patients, are still very much in practice today. Even the basics of performing an amputation have not changed dramatically over the past 150 years, other than the use of sterile techniques and better anesthesia. The pain and suffering brought about by all wars since that time have further advanced medical knowledge and practice, and it is indeed possible to create good out of that evil.

The Myths of Civil War Medicine…

Doctors Were Uneducated Butchers Who Did Amputations Indiscriminately

Military surgeons were all educated, having attended medical school or trained with an established doctor, and they had to pass an examination before serving as military surgeons. The large number of amputations was due to the severe bone and tissue damage inflicted by the Minié ball, which made surgical repair impossible. Contrary to the bad reputation promulgated by the press, Civil War surgeons were actually criticized by European and Canadian observers for performing too few amputations!

Operations Were Done Without Anesthesia

To "bite the bullet" is a myth perpetuated by old movies. Anesthesia was used in 95% of all operations, usually ether or chloroform, or a combination. Ether was first synthesized in the 1500s but it was not until 1842 when a Georgia doctor, Crawford Long, used it as an anesthetic to remove a neck tumor. Chloroform was discovered in 1831 and used as an anesthetic in 1847. Both sides had ample supplies. Doctors dripped the anesthesia onto a cloth over the soldier's face, causing loss of consciousness followed by a stage of excitement. Although the soldier was unconscious and felt no pain, he would thrash or moan .and had to be held down by assistants. Since operations were performed with others waiting their turn, or in open air with passers-by, those observers saw the clamor and assumed the patient was conscious, and this was reflected in· their letters back home.