SOME STORIES IN THE HISTORY OF MEDICINE

Lecture 2: Contagion, Infection, Antisepsis

Adam Blatner, M.D.

(This six-lecture series is part of Senior University Georgetown’s Winter-Spring 2009 session.)
    You can also click on the following links:
         1. The first lecture was an introduction and some history of the microscope, laying the foundation for the other lectures.
 Also, three supplements, which you can click on to link to related webpages:
            A. A very brief overview of the history of medicine. (pre-history to the Renaissance) (just to get you oriented)
            B. Further Overview to modern times.
            C. A further history of microscopy.
 Then: (To be posted as I prepare and give these lectures: )
    (Lecture 2 is here on this present webpage.)   3: The Early History of Immunology;   4: The Discovery of Anesthesia;
    5: Recognizing Nutritional Deficiencies;   6: Hygiene: Sanitation, Hookworm, Dental Floss, & Summary

(First Posted  February 2, 2009)

We'll start with a quote that appeared in an editorial in the 1801 edition of the publication The Medical and Physical Journal (London, Volume 5, page 505):
The most important discoveries, when familiarized to the mind, are contemplated with indifference. Who now wonders at the discovery of America, or the circulation of the blood?   There is, however, a period between the conception of a discovery and its mature birth, fraught with more pangs than war or women know; and there is no light, in which the human mind can be viewed, more interesting than during this anxious period.
Though a bit overblown in the intellectual style then prevalent, this quote does suggest that even then thoughtful people realized the complexities of discovery and the development of knowledge. You may keep this in mind as you follow the rest of the talks.

There are many people involved in the story of the recognition of germs and the fight against infection, but for purposes of time and space, this presentation focuses on just a few of the better-known figures and, even then, just the highlights of their extensive stories. More specifically:
    1. Ignac Semmelweis, working in the 1840s and 1850s in Vienna.
    2. Oliver Wendell Holmes, Sr., working in Boston and its environs at the same time.
    3. Louis Pasteur in France, whose major contributions came between the 1840s and 1890.
    4.  Joseph Lister, in Scotland, then England, working in the later 1850s through the 1880s and beyond.
    5. Robert Koch... and just touching on some other pioneers in bacteriology and aseptic surgery.

How Relatively Recent It All Is

Who among you is a grandparent, or knows someone who is? Good. In fact, four generations can happen in a single lifetime. Now consider that your parents conceivably knew great-grandparents who grew up in a world where an operation was an occasion for near-unbearable agony, and most wounds became infected. The reason there were so many amputations in the Civil War and other wars fought in and before the mid-nineteenth century was that wounds became infected and frequently gangrenous; many soldiers died from generalized infection, and the only thing you could do was amputate the limb well above any gangrene. Yuk.
         So the point is not to take the developments we’re talking about for granted!

The Development of Knowledge

In the last lecture I presented the following spectrum of knowledge, from the unknown to the known:
- What can never be known by the human mind
- What may well be discovered someday
- What only a very few people know---but most don't
- What some think they know, but they're mistaken
- What some partly know, because they have begun to put together the clues correctly
- What some know, but most others don't yet believe them
- What ideas have now been accepted by the major authorities in a given field, but not yet much known by the general population
- What has become familiar knowledge to parts of the population but is still unfamiliar or even a bit shocking to many others
- What has become commonly accepted by almost everyone in a culture
In a way this restates the quote at the beginning: we're dealing with the field of discovery and the dissemination of that knowledge, along with the politics and technical difficulties of moving from what may be discovered to getting it accepted and widely applied.
Breaking With Tradition
There has long been a gradient of status associated with knowledge. Part of this is tied to the technology of writing and the authority that literacy conferred: if it was in a book, it seemed as if it must be true. Ignorance was so pervasive and life so mysterious that anyone confident enough to write something up tended to be believed. (We still need to work a bit to disconnect from this, to question what is in print or on the internet.) Of course, political and religious authority exploited this and made it official; as a result, it was explicitly or implicitly taboo to doubt. The abdication of inquiry in favor of mindless obedience became a virtue. The difference between respect and passive acceptance is still unclear for many. So, in medicine, too, for many years it was assumed that the ancestors, the ancient scholars, really had an in with the truth. I think that during the dark ages preceding the Renaissance people had lost the spirit and techniques for inquiry and exploration, so that reliance on authority was a fall-back coping response.

Anatomy, in this sense, was also part of this tradition, relying primarily on the writings of Galen around 140 CE and his dissections of animals, because dissection of humans was religiously taboo. For the next twelve hundred years and more, his writings were authoritative, and the occasional anatomy, done as a scholarly exercise, was explained to the guests according to the established texts. In the picture to the right, created in the mid-1400s, the "Doctor" reads from a traditional book about what Galen said, while lowly and ignorant technicians hold up whatever the doctor tells them to.

A new age was beginning. The growth of trade in the wake of the Crusades also opened up a trade in manuscripts, and it turned out that many of the ancient texts had been preserved, first in the Byzantine empire and then in the Islamic countries; moreover, there had been advances in chemistry, mathematics, astronomy, and medicine in those regions even as Western Europe was mired in the Dark Ages. The new age was considered a re-birth of learning, and another word for re-birth is Renaissance, a term given a few centuries later by historians.

On one hand, the ancient texts were indeed respected, but there were enough new developments, like printing and the growth of literacy, that along with this came an interest in questioning knowledge that was asserted to be true simply because it was old and established. Nevertheless, to put this into a little perspective, from around the 16th century well into the 19th, this questioning of authority was done by only a relatively few educated people around Europe. The real subversion was a shift in the way people thought about truth: reason was slowly emerging as the criterion for truth, instead of blind acceptance of tradition. It should be noted, though, that the dominant culture remained authority-bound and evocative of a corresponding tendency toward mindless obedience, a habit that continued well into our own lifetimes.
 
In the mid-1500s, Andreas Vesalius and then others did their own dissections and found out that frequently the authorities were mistaken. This was one of the revolutions in the history of medicine.


However, Vesalius was a maverick---he wasn't really typical of the proper way to be a scholar and educated man. Tradition dies hard. The old way still dominated medical training three hundred years later.

 In this caricature by Hogarth, the doctor still wouldn't dream of getting down and getting his hands dirty.
Pathologic Anatomy
Moving the story forward to the 1700s: while anatomy revealed how the human body is constructed, something else was also revealed. Abnormalities of various types were discovered, and gradually it became apparent that these might be correlated with the patient's symptoms and signs before death. Indeed, the idea of doing a dissection to find out why a person died, now called an autopsy, was still new.

Doctors had begun to notice things like the fact that patients who had shown signs of yellow jaundice before they died often had growths in their livers or other disturbances of nearby organs, such as stones in the gallbladder or bile ducts. Or a patient who had stopped producing urine and shown other signs was often found, on autopsy, to have abnormally shrunken kidneys or other distortions of normal anatomy.

While it is recognized today as an "obvious" truth that disease is often associated with specific organs, that idea was quite new back then.

One more notable pioneer in this new development was Giovanni Morgagni, in Italy. For around twenty years in the mid-1700s, when George Washington was still a boy, Morgagni worked on research and writing, and when he published, it shifted medicine away from pre-scientific theorizing and toward building the field on a foundation of actual findings.

This pioneering work of Morgagni's led to a new way of practicing medicine: correlate the signs and symptoms with the patient's pathology on autopsy. Doing autopsies was then the cutting edge (oops, excuse the double meaning there) at the top medical schools of the late 18th and early 19th centuries.
 
Note also that there was a significant gradient between the thinking at the top medical schools, where new ideas were being entertained, and the vast bulk of practice, which was still mired in theories that are now no longer considered valid and indeed are thought of as rather quaint: the doctrine that disease is caused by an imbalance of the four humors, the supposed value of bleeding, sweating, salivating, getting the poisons out, and so forth. Indeed, such theories and practices continued well into the 19th century and, in some folk traditions, even into the 20th century. The point, though, is to realize that autopsies became the cutting edge of science, exciting in the way the internet and the Hubble telescope excite people today. At the big hospitals, the whole idea of a more systematic, up-to-date, modern, scientific approach was the big thing.

Doctors used to be all frippery and never got their hands dirty.

Getting dirty was just for surgeons, who were low status. To the aristocracy and near-aristocracy, doctors were tradesmen; yet within those trades there was a hierarchy, with physicians ranked above surgeons and apothecaries, in the 18th and early 19th centuries. (Actually, the field of surgery rose dramatically in status in the late 19th century!)

So at the turn of the 1800s, the modern physician did get in there and get blood and pus on his coat. It was a badge of honor, a way to say, “I am involved. I am engaged! I am wrestling with science. I am modern, not old-fashioned.”

The problem was that everything was filthy. Fecal contamination was everywhere in the streets; you could smell a city miles before you came into its outskirts; there wasn't much money for fuel to boil water, whether for drinking, much less for bathing. Bathing was just coming back into fashion, because at the beginning of the 18th century hardly anyone bathed with any degree of frequency. Bathing was also a religious issue, a sign of pridefulness, and it was seen as an invitation to sexuality. Whatever; the history of bathing is a whole other thing.


  It should be noted that the movement to clean up the sewage system and to purify the water was also beginning in that era. A newspaper cartoon tried to call people's attention to this: the picture on the left is called "monster soup," referring to what a microscope might see on looking at the Thames river.
There’s a new book in the library—I had them order it—called The Big Necessity—about a topic that many avoid—toilets, and sewers, and sewage—read feces—purification and disposal systems, and how much all this really necessary part of collective life, government public health, has surprisingly frequently been avoided—it’s almost a taboo topic!  And how much this issue really needs to be addressed—and indeed is one of the themes to be mentioned again in the last lecture.

Ignac Semmelweis and the Fight Against Childbed Fever

The brief back-story of the break from tradition beginning in the Renaissance has been mentioned because the emergence of truly useful knowledge must pass through currents of new theories colliding with the weight of old ones. All this is relevant to the effort to discover the causes of infection, and to prevent infection. This episode is set in the 1840s, at what may have been the top medical school in the world at the time: the Allgemeines Krankenhaus (General Hospital) in Vienna. Some of the main explorers in many emerging specialties were on the faculty. Our hero is a not-particularly-rebellious, mainstream physician, a good team player and all that, named Ignac Semmelweis.

He was from Hungary and had a bit of an accent, evoking the sort of assumption that his education was second-rate, the kind of assumption many American physicians entertain, at least subconsciously, about their immigrant colleagues.

Microscopes were still being developed and making new progress. Autopsies were a major technology, and the field of pathology was very avant-garde. Semmelweis had become part of the faculty, teaching medical students, and was concerned about a common and often fatal complication of giving birth: childbed fever, or puerperal fever. Why were the women getting sick? What could be done to prevent this? There were many theories!

Semmelweis noticed something that today, now that we know there are germs, seems obvious: you can't fool around with filthy hands. Well, in fact, doctors did wash, but the washing was perfunctory, and their clothes were often still contaminated with bits of pus after they had been down in the morgue doing autopsies, which, you must remember, was the most modern thing they could or should be doing. These guys thought they were really up-to-date, and there was a pride in being on the cutting edge, not like their old-fashioned models, their parents' generation, who were, technologically speaking, square country bumpkins who didn't think scientifically.

     There were two units, one managed by midwives and one managed by doctors, and the midwives' ward had many fewer cases of childbed fever. So, to make a rather long story short, Semmelweis put two and two together; indeed, his comparison of numbers of cases was one of the first examples of fairly crude epidemiology. Along with some other experiences, this led him to decide that the docs should not just wash, but wash in what we would now consider an antiseptic solution, and when he was able to supervise this, wouldn't you know it, the infection rate went down!
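Just to make the logic of that comparison concrete, here is a minimal illustrative sketch in Python. The counts are hypothetical, not Semmelweis's actual records; the point is only how comparing a rate between the doctors' ward and the midwives' ward, and before versus after the hand-washing rule, supports the conclusion.

# Illustrative only: hypothetical counts, not Semmelweis's actual figures.
def mortality_rate(deaths, births):
    """Deaths from childbed fever per 100 births."""
    return 100.0 * deaths / births

doctors_ward  = mortality_rate(deaths=98, births=1000)   # hypothetical
midwives_ward = mortality_rate(deaths=36, births=1000)   # hypothetical
after_washing = mortality_rate(deaths=13, births=1000)   # hypothetical

print(f"Doctors' ward (before hand-washing): {doctors_ward:.1f}%")
print(f"Midwives' ward:                      {midwives_ward:.1f}%")
print(f"Doctors' ward (after chlorinated-lime washing): {after_washing:.1f}%")

Crude as it is, this is essentially the reasoning: the same kind of count, over the same denominator, compared across wards and across conditions.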

  
A little backtracking, for emotional impact. Semmelweis's first effort to figure it out was to redouble the tradition: examining, thinking, going back and doing more autopsies, going back and examining, and doggone it if the infections didn't just continue. And then a friend of Semmelweis, while doing an autopsy on a patient who had died of childbed fever, nicked himself with a scalpel and died of blood poisoning, massive sepsis. Well, they did an autopsy on this friend, too, and Semmelweis noticed that the inside of the body of this man, who had died from an infection entering through a cut on his finger, showed the same poolings of infected pus here and there, blood clots in the blood vessels, and other findings that were characteristic of the women who had died of childbed fever. We'd now call it a final common pathway of massive blood infection, also called blood poisoning, and the complications to various organs that come from it.

So Semmelweis got the idea that, aha! Childbed fever was caused by doctors transferring what Semmelweis called cadaverous material (who knew from germs back then?) to the patients. Take a moment to consider, then, how he might have felt. He himself had undoubtedly caused the deaths, inadvertently (we might say today, "unintended consequences"), of numerous young women in the prime of their lives, leaving their babies orphans. Often, too, the babies caught the infection while passing through an infected birth canal, and they also died. Semmelweis was certainly overwhelmed with guilt, and this drove him to try to correct or even make up for his errors.

He became a bit of a fanatic on this hand-washing business. He kept statistics, and they supported his unfolding theory. Semmelweis still didn't know about microbes, germs, but he did get the sense of contagion by cadaverous material. Later on he realized that a patient could also get infected from old pus on clothes or sheets (and they didn't do much laundry there), and even from someone with a bad open infection being in the same room. Semmelweis was in charge of a unit and insisted that the young men, the medical students, the nurses, anyone who was to examine a woman before or after giving birth, would have to wash in a chlorinated lime solution. To the extent that he could supervise them, the rate of infection dropped. On other units the rates stayed high; when he had support and the students obeyed, the rate went down; when the medical students took a month off, the rate went down; and when they got back to their bad habits, the rate went up. So he became increasingly convinced and tried to convince others. Some he did, and many he didn't. It's quite a story.

Oliver Wendell Holmes, Senior

Unbeknownst to Semmelweis, there was another doctor, at Harvard, in Boston.


Quite reputable, Dr. Oliver Wendell Holmes, Senior, was having a similar insight and getting similar push-back. Note that there were two famous Holmeses in the 19th century, the father and his son. The son, OWH, Jr., was a famous Supreme Court justice, doing his thing in the late 19th and early 20th centuries. The father, OWH, Sr., was not only a physician but historically became better known as a poet and essayist; that's where I first encountered the name: he wrote several poems, like Old Ironsides, and other things I read in high school English when they talked about American literature. But that was a sideline! Holmes was mainly a physician, and one of his main causes was the campaign to promote hygiene (that's another word for preventive medicine) through cleanliness, hand-washing, and so on.

About the resistance to new ideas, a little psychology here, since that's my specialty: if you mean well and have noble intentions, it's hard to consider that what you do is counter-productive, or has unintended consequences, or is radically foolish. That's been a problem throughout history and has quite recently been played out on the national and international political scene. It's even harder to hear that you might be the problem when you're bustin' your butt to be the best doctor in the world, at the best hospital, using the most modern methods!

Also, both Holmes and Semmelweis were mainstream and mildly established, but on the young side, so their authority wasn't great in terms of seniority; furthermore, there were a variety of plausible alternative explanations, so it is understandable why other physicians might wonder why they were supposed to wash in a somewhat harsh antiseptic solution that was both bothersome and time-consuming. Then, as now, being "busy" generated a subtle sense of entitlement, an excuse for haste and perhaps carelessness.

Moreover, so much of life and equipment was dirty, stinking, filthy. This was a charitable hospital, after all. There wasn't much money or many resources for cleanliness. Nurses back then were paid a pittance (the economic angle) and were far from being the professionals they are today. Indeed, their dirty work ranked them below washer-women. With the exception of the cadre of women recruited by Florence Nightingale, nurses were often ignorant, not infrequently alcoholic, and not above some on-the-side prostitution if it would make some money. The only reason anyone went to the hospital was that things were even more wretched at home, with the kids' demands and other kinds of dirtiness there. Even a little help was better than nothing.

This was an era of urbanization, of people moving from the country to the city, of child labor and sunless cities, and rickets, the vitamin D deficiency that comes with lack of sunlight, was very common. (I'll talk about vitamin and nutritional deficiencies in the 5th lecture.) Bones were distorted, including women's pelvises, which meant all sorts of difficult birth problems. Forceps deliveries were common, and these instruments entered the edge of the opening womb; before sterilization became routine (that would take another fifty years or more, if you can believe it), this was like rubbing dirt into an open wound.

Another thing about wounds, infection, and the theories of infection: if you get a cut, there's a fair chance it'll get infected, but often these infections were by not-terribly-pathogenic germs, like Staphylococcus albus, leading to a kind of white pus. If you can believe it, doctors considered this good pus; they called it laudable pus. It was the kind of infection that was slow to heal, but heal it did. Other germs that got into wounds were bad and led to gangrene. And if these bad germs got into tissues with a rich blood supply, then you got septicemia and, more often than not, died.

Childbed fever, puerperal fever, was one of the bad diseases. An infection in the lining of the uterus, called endometritis, went all over, and that itself slowed down the understanding of the problem. Sometimes it went right into the bloodstream; sometimes the infection leaked into the abdomen, giving peritonitis; sometimes infectious blood clots would form and go to the lungs. The point is that when things went wrong, the condition could take on a variety of mid-to-end-point symptoms and signs, but the high fever, the shaking chills, the pain, all were bad news.

The thing to emphasize is that the best docs, back when your great-great-great-grandmother lived, didn't have a clue. They had fifteen to thirty competing explanations. One of those was contagion, but that itself wasn't at all clear. A big cause, it seemed obvious, was the miasma: the gassy exudates of under-ventilated rooms, the smells of sewage and feces, and so forth. The treatment was fresh air, if any could be found. The problem was that this missed the point, and we'll talk more about it later.

Semmelweis' Difficulties

So back to Semmelweis: It became painfully clear that there was a tight correlation between visits from the doctors and patients dying. Can you imagine the guilt he must have felt when he realized that some of the stuff he had been doing had been making patients worse?

In my (Adam's) personal background, I, too, became aware of possibly having contributed to secondary or nosocomial infections in some of my patients, and felt more than a few pangs myself when I realized, several years (decades, really) after my internship at Los Angeles County Hospital, that we doctors back around 1964 were practicing a level of cleanliness not half tight enough compared with what infection-control standards became in the late 1970s and 1980s. But we were so idealistic, and we really tried! And we were ten times as clean as anything Semmelweis wanted; yet we now know enough to know that even more is needed.

Semmelweis' story is involved, full of political ups and downs. He made some real allies, but he also made many enemies, in contrast to Joseph Lister, about whom we'll hear in a moment. For one thing, he tended to attack the intentions, the honor, if you will, of those who didn't buy into his theory. Remember, microscopes were still being developed, and not a lot of people had seen germs.

An interesting and tragic side point: there were some new microscopists at that Viennese medical center, and Semmelweis could have had the purulent discharge from the vagina, the result of the infection of the uterine walls, inspected and perhaps seen the bacteria. It could have strengthened his case. But he relied on his numbers, and others found other ways to explain them. Semmelweis also failed to write up his findings for many years, and allies had to summarize them, and they didn't present the case optimally.

Another factor: this was now 1848, and the liberal revolution of 1848 swept Europe. Semmelweis chose the liberal side, his bosses were conservative, and as the revolution weakened and the Austrian empire and other authoritarian institutions reasserted their power, Semmelweis was demoted.


In spite of having a number of influential allies, he was too sensitive to defeat, and responded to the pressure by suddenly, without warning, just leaving town, leaving Vienna, and returning to Budapest. The story goes downhill from here: Semmelweis gets mentally weird. The history sounds like tertiary syphilis, a not uncommon problem in that era; you have a fling with a prostitute as a young man and your goose may well be cooked. Or your brain, which deteriorates into a condition known as general paresis, or paralysis of the insane. They didn't know what caused it, and it accounted for maybe a third or more of the mentally ill in hospitals near the end of the century; the history of syphilis is a whole 'nother lecture. But such people do often go through phases of mania, depression, and paranoia, which he did. Semmelweis did finally write up his work, but the writing was contaminated by mania and dementia, full of redundancies, attacks, and self-glorification; in short, no one read it, or those who did found the format such a turn-off that the substance was discounted. This is an important historical lesson. I've known others with good ideas who presented them orally or in writing so poorly that the valid parts couldn't be heard.

Anyway, Semmelweis ended up being hospitalized, and the attendants at that time, around 1865, often beat patients to quiet them. If they were uppity, they beat them more, and from the evidence, Semmelweis died from complications of that violence soon after being hospitalized. Alas, he was only in his mid-40s.

Later on, Semmelweis was recognized as prescient, as right, and was honored posthumously. But to appreciate his work, several other discoveries had to happen, so that the objections to this inconvenient truth for that time could be answered. (Inconvenient, because creating a clean, much less aseptic, context is frankly difficult and expensive! We'll talk more about that later on.)

Louis Pasteur

Now let’s talk about another story that overlaps this. Louis Pasteur in France was just beginning to explore some of the problems that would result in an understanding of the existence and action of microbes, micro-organisms.

Some highlights of Pasteur's life:
      (1) The discovery that organic chemicals could have isomers, forms that were, in a figurative way of speaking, either right- or left-"handed."
      (2) While the process of fermentation had been known for millennia, Pasteur discovered that the ferment consisted of living micro-organisms! As a corollary, fermentation wouldn't work unless the organisms were alive, and they also had to be the right organisms for the job!
      (3) There is no such thing as "spontaneous generation." Life, and also organic decay, requires the action of living organisms (or their eggs). (He demonstrated bacteria in the air, with equipment such as that in the picture above, around 1860.)
      (4) Diseases in silkworms (1865), in wine, and in people could be caused by micro-organisms (i.e., germs).
      (5) Discovered, microscopically (with Sternberg), various bacteria (1877-1880).
      (6) Germs can be weakened so they don't cause disease, and can then be used as a kind of vaccine to help a larger organism resist the disease that would otherwise be caused by that germ (e.g., anthrax, 1881; rabies, 1885).

I’ll talk about the last item---vaccines---in next week’s lecture, about the history of immunology.

 The point to be made here is that item number one gave Pasteur a bit of a reputation, so that he was hired to consult on why the output of a factory that turned sugar beets into alcohol was going sour. From this work he developed pasteurization. The lesson was that the wrong kind of micro-organisms could change the nature of the fermentation from what is desired to what is undesirable.

Part of the delay in this discovery was that before the improvements made to microscopy in the 1870s, this tool hadn't been much applied to medicine. Some of this involved staining technology. Other progress was made in the still-new field of microscopic anatomy called "histology." If you were a scientist then and it didn't occur to you to look specifically for germs, you would not be likely to perceive them or appreciate their meaning. Even if you did see them, without that appreciation you'd just think these dots and dashes were contamination, background stuff.

Another factor slowing the wider acceptance of Pasteur's ideas was the fact that he wasn't even a doctor! He was a chemist, and later on physicians didn't lend his findings a lot of credit for that reason. Ho ho. In fact, many significant discoveries in medicine were anticipated or first made by non-physicians!

When he was in his early 30s, Pasteur was asked to consult about why some factories that converted beet-sugar into alcohol were failing because the resulting material was sour and smelly. We’re talking about the process of fermentation, here.

The Problem of Germs

Just to note, for the record, others had talked about the possibilities of contagion and the communicability of diseases.

There were clues in the behavior of the new disease of syphilis, probably imported from the New World (the Western Hemisphere) by Columbus' sailors, in return for our sending smallpox and other epidemic diseases over there from Europe. Girolamo Fracastoro wrote about the idea of contagion in a treatise that used the term Syphilis for the first time, instead of calling it the French Disease; for the first fifty years after Columbus, folks in France were calling it the Spanish Disease or the Italian Disease. At that time the disease formed big pock marks (it has become milder in the last two or three hundred years), and it was also called the Great Pox, in contrast to the smallpox about which we'll speak next time.

Historical research has found reports from previous centuries suggesting the possibility of something like germs, but people didn't know what to make of these patterns. The microscope hadn't yet developed strong enough resolution to see bacteria, and so forth. So ideas would blossom and then be overlooked.

A related problem, one that had been around for centuries, was the origin of life, and through various experiments around 1859 Pasteur showed that life didn't just happen; there had to be contamination by living organisms. Soon after this, in medicine, in another country in Europe, another researcher demonstrated something similar: cells had been discovered as universal components of plants and animals, but they didn't just crystallize out of the formless background gunk; rather, they were produced by parent cells. All this happened around 1860. Remember, folks didn't hear about each other's results all that quickly, if at all.


 Pasteur continued to explore the idea that micro-organisms could cause trouble, not just by souring wine, but also as a source of disease in animals and people.

 Around 1865 he did research on silkworms. While Pasteur was made into a national hero by 1880, in the years before that he was not infrequently challenged, and this led him to further experiments to make his points.

A couple of interesting points: he had a significant stroke in mid-life and yet kept going, with an arm and a leg significantly weakened. And, as noted, his findings weren't readily accepted by physicians because he was a chemist rather than a physician. By mid-life, Pasteur was also beginning to work toward the establishment of research institutes, to get governmental funds allocated, and the like; in a sense, this was also a pioneering effort, one step removed from the actual biological research. It addressed the infrastructure that I've mentioned!

Joseph Lister

Let’s turn to another major pioneer: Joseph Lister—from whom you get the name Listerine.


Lister was a mainline doctor in Scotland, and the University of Edinburgh was at the time another major center of research and discovery. To the right are photos, first, of him and his wife, Agnes, around 1856. It should be noted that she was of great help to him in his many endeavors! Later, when older and much honored, in the 1890s, he became Lord Lister, given a peerage!

Lister was disturbed by the prevalence and the not infrequently fatal results of infection after surgery and major injury. He didn't know about germs. (Interestingly, he was the son of a man who had made substantial contributions to the improvement of the microscope around 1830.) Still, microscopes and staining technology weren't able to demonstrate germs clearly before around 1870. But around 1865 he did get the idea that there was infection in the air (he had heard of Pasteur's experiments) and instituted the practice of spraying the area of surgery with the antiseptic carbolic acid. Relative cleanliness of the wounds, putting on carbolic-acid-soaked dressings: such efforts were rewarded.

Lister didn’t attack his enemies, but rather built his case, gradually, with statistics, written papers, presentations at professional conferences. He did in time learn about Pasteur’s work and acknowledged that it helped his efforts, and in the long run was recognized and even elevated into the peerage—made a Lord!

So Lister's techniques were implemented. Here are some photos of early surgeries using different instruments for the spray of the carbolic acid:  On the right you can see the concomitant administration of anesthesia.



   This technology gave rise to the manufacture of more refined machines:




Gradual Dissemination of Antiseptic Awareness

It took decades before enough mainline professors accepted and began to implement antiseptic concepts: the awareness of the existence of germs, of how they cause infections, and of how they can be avoided.


Here's a famous painting by the artist Thomas Eakins, titled The Gross Clinic, portraying a leader in the field; but the idea of cleanliness had not yet penetrated even to this American center with a high-ranking medical faculty. This level of non-antisepsis carried a relatively higher rate of secondary infection.

Robert Koch, Louis Pasteur, and Bacteriology

Microscopy, along with staining techniques, finally developed enough to let people see germs. By 1875 the idea of germs was becoming clearer, and further changes in microscopy, the use of the oil-immersion lens, and better staining all made it more possible to see the little bugs. Koch's contributions here were significant, but for our purposes a full description would elaborate this story in some other directions, as fractal branching can do. So, for the purposes of this lecture, we'll stay with what it all had to do with surgery.


Asepsis

If there are germs, why just use anti-sepsis to try to kill 'em in the air and on the ground? The next step, then, was a-sepsis: the idea of keeping germs away from the wound completely, to begin with.


Dr. Ernst von Bergmann in Berlin developed the idea of, first, stronger chemical antiseptic sterilization of instruments (1877), and then, in 1880, steam heat for sterilization. His photo is at left, along with an "autoclave" (though the one shown is a more modern version, from the 1930s).

There was a coming together of several technologies here. Robert Koch (picture to right) was another pioneer of bacteriology. Beginning in the 1870s, among other things, he worked out techniques for assessing how many germs there were on a specimen or swab: ten, a hundred, a million? And this allowed a way of getting feedback on how successful efforts at cleaning things (drapes, sheets, clothes, and especially instruments) were. Suffice it to say, ordinary cleaning doesn't do it. So the next phase went through one step after another in order to make things more and more clean, and, indeed, sterile.
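To give a feel for the kind of feedback such counting provides, here is a minimal illustrative sketch in Python of the dilute-and-count arithmetic behind estimating how many germs are on a swab. The dilution scheme and all the numbers are hypothetical, offered only to show how "ten, a hundred, a million" becomes a measurable answer and how sterilization shows up in that answer.

# Illustrative sketch: estimating germs on a swab from colony counts.
# All numbers are hypothetical; this shows the logic, not Koch's actual protocol.
def estimate_germs(colonies_counted, dilution_factor, volume_plated_ml, total_sample_ml):
    """Estimate total germs in the original sample from one diluted plate."""
    per_ml_diluted = colonies_counted / volume_plated_ml
    per_ml_original = per_ml_diluted * dilution_factor
    return per_ml_original * total_sample_ml

# A swab rinsed into 10 ml of fluid; 1 ml of a 1:1000 dilution grows 42 colonies.
before_cleaning = estimate_germs(colonies_counted=42, dilution_factor=1000,
                                 volume_plated_ml=1.0, total_sample_ml=10.0)

# After steam sterilization, the same procedure grows no colonies at all.
after_cleaning = estimate_germs(colonies_counted=0, dilution_factor=1,
                                volume_plated_ml=1.0, total_sample_ml=10.0)

print(f"Before cleaning: roughly {before_cleaning:,.0f} germs on the instrument")
print(f"After steam sterilization: about {after_cleaning:,.0f} detected")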

Changing the Instruments

Surgical instruments used to be made with wooden handles, and for the fancy specialist, ivory handles. These were further engraved for better handling. But these were also impossible to sterilize, because germs could "hide" in their crevices.

 Another refinement was needed (this need to bring diverse technologies together is part of the larger story), and that was the manufacture of all-metal tools that could be sterilized.


These, from the 1840s, are impossible to sterilize.
These, made in the 1870s, can be autoclaved, that is, steam sterilized.

Around 1880, other aseptic techniques were applied: in addition to sterilization of the instruments, a shift away from wearing street clothes was instituted and spread from Europe to the United States. (Before the 20th century, Western Europe was generally viewed as leading the USA in medical advances.) Sheets, drapes, towels, sponges: all were to be sterilized. The skin was to be meticulously cleansed, and later antiseptics were applied to the skin as well. Routines were instituted for the doctors and surgical nurses to wash carefully too; scrubbing, really, and it was called "scrubbing up."

In the following pictures, some progress can be seen between the 1880s and 1970s:


Surgery at the University of Pennsylvania, 1890s.
 Dr. Billroth in Europe, 1890s.

The open operating rooms were closed. There was a new goal of making them easier to clean meticulously.
Here's an operating room in the 1820s, before any thought of germs and antisepsis. And here's one after 1900, with asepsis kept in mind.


 
Notice that the doctors had begun to wear special, clean suits, white instead of the more modern green that came to be used after the 1920s. The green surgical outfits came to be called "scrubs" because doctors (and nurses) would change into them, including coverings for shoes, and then put gowns over them!
 

Another shift from around 1900 to 1920 was the adoption of rubber gloves and face masks, plus better covering of the hair.



An interesting anecdote about rubber gloves: Halsted, above left, was asked by his chief nurse, Caroline Hampton, what could help with the antiseptic washing solution irritating her hands. Halsted, as chief of surgery, had a mold made of her hands and rubber gloves ordered specially for her. Later, he found the idea was good in general; it extended the whole concept of asepsis another step. By the way, the nurse went on to marry the chief surgeon. Also, at first doctors wore gloves so as not to catch a disease, but then it changed over, so that doctors wouldn't unintentionally give a disease or contaminate a wound.

Moving the story up to the 1970s and beyond, here is a picture of a surgery in which the air itself is being sterilized:

Summary

The key to infection, then, is, of course, germs, about which little was known less than a hundred and fifty years ago. A good deal of attention was given to infection control in the first quarter of the twentieth century, but progress slackened a bit in the antibiotic era of the 1940s through the 1960s. Following that, there gradually came a heightened awareness of antibiotic resistance in an increasing number of germ sub-types!

So consider then a couple of words for your vocabulary:
 
Iatro-genic: Iatro is Greek for physician. I am not a student of the mind, a psych-ologist, but rather a psych-iatrist: an iatros, a physician who has specialized in disorders of the mind. I integrate a good deal of psychology in my learning, but that's not my official field. Then -genic is like genesis: beginning, cause. So an iatro-genic disease is when you're sick because of what the doctor did.

Nosocomial: The root nosocom- comes from the Greek for hospital, and a nosocomial disease is one you catch because others are there with certain germs: the kid in the next bed with a strep throat, that kind of thing. It's not the same as "iatrogenic" in meaning, though the two words in fact overlap a bit.

There’s been a great deal of attention given to nosocomial infections, and much more especially now that increasing numbers of cases of anti-biotic resistant germs are cropping up. So a lot of the increase in medical costs is going into infection control, into making hospitals less likely to spread germs. Now if we can get doctors to wash their hands—really, that continues to be a big problem, and I suggest that you bug your docs by asking them to wash where you can see them wash before you let them touch you. Really wash, too. Assume they’ve just been feeling around on some infected wound just before entering your waiting room. Blame me.

I’m open to questions.