CECIL STRIKER LECTURE
Healing Cincinnati:
A Winding Path Towards “Pursuing Perfection”
May 26, 2010
Let me begin with a comment.
COMMENT
What is now The Henry Winkler Center for the History of the Health Professions rests on a fabulous collection that has been built up over many years. In 1990, Billie Broaddus, the director of what was then called the History of the Health Sciences Library and Museum, asked me to assist in the preparation of a grant application for the National Endowment for the Humanities. The goal was to increase the visibility and use of the collection. The proposed medium was a reorganization of the display space into a more focused interpretive exhibit that would explore the career of Dr. Albert Sabin. I found myself dropped into the middle of a wealth of materials that I previously knew nothing about. Although we were not successful at securing the grant, I hope we made a small contribution to what has become the Winkler Center.
Let me warn you, I am no expert on the history of medicine, a weakness that those who invited me to speak this evening insisted they understood. So what I am going to do is to draw on many elements of research with which I have been engaged over the last 20 years, including columns for the Cincinnati Post, the curation of an exhibit for the Cincinnati Water Works, passages for several books, and the preparation of a manuscript on the history of Bethesda Hospital that never saw the light of day. My goal this evening is to reorganize those disparate elements into a coherent whole that paints an overview that others will clarify, deepen and correct in future Striker lectures. This will not be a systematic history. I am painfully aware of how many topics I am not exploring, and that I do not have all the connective elements worked out.
BEGINNINGS
The popular mindset of Americans is that history is progressive—that the inevitable course of change is towards improvement. But at the time that Cincinnati was founded and growing rapidly, the profession of medicine did not yet share this American view of things. Rather than fixing its gaze on the future with hope, it clung to the past.
Daniel Drake, the recognized leader of the Cincinnati medical community in the first half of the nineteenth century, founded the Medical College of Ohio in 1819. Over the next four decades he welcomed hundreds of students to the study of medicine at this and other colleges he helped found. In his lecture welcoming a new class of students in 1849, Drake reviewed the scope of a medical education and made it clear that the “young gentlemen” (no women here) would have to master many individual sciences, and the “profession of medicine.” In the course of the lecture, he referred to the 2,000-year-old writings of Hippocrates as “the chief treasure of any medical library.”
Most early nineteenth century doctors accepted the Hippocratic view that the human body was composed of four humors—blood, black bile, yellow bile and phlegm. Disease was the result of an imbalance in these humors. Treatment involved various methods—bleeding, purging, the prescription of diuretics, etc.—aimed at restoring the balance.
One widely, but not universally, held view of the origins of disease pointed to miasmas. In 1878, the Ohio Bureau of Labor Statistics reported on the condition of the tenements in Cincinnati. The report pointed a finger at the design of many tenements that placed the common privy at one end of the entrance hallway. As a result, “it was almost impossible to prevent the gaseous exhalations arising from it (the privy) from being disseminated through the entire building, poisoning the atmosphere and causing discomfort, disease and death.” “Gaseous exhalations” was more than just a colorful description of bad odors; these were disease-bearing miasmas.
In addition to tenement privies, miasmas were suspected of lurking over the languid canal that wandered through town, over stagnant pools of water in depressions in the muddy streets, and over the rotting kitchen waste thrown regularly into the streets for the garbage collectors—free-ranging pigs—that ate almost anything, but left fecal matter in their wake which was ground into the mud and dust. Occasionally those droppings were scooped up and deposited safely into the Ohio River, not far from the intake valves of the pumping station for the Cincinnati Water Works on the eastern edge of the riverfront.
This did not mean that the best physicians did not search for new insights within the confines of the existing paradigm. In late 1841, for example, Daniel Drake published the results of several years of field research into the causes of, and the efficacy of various treatments for, what was known as the Trembles or Milk Sickness, which was first observed in the region in 1809. In the course of that research, he evaluated the effectiveness of blood-letting, noting that “the blood was sometimes buffy,” as well as the use of cathartics, opium, blisters, and poultices of scorched oats.
And for medical rather than moral reasons Drake joined the temperance reformers of his day. In public lectures he argued that intemperance caused morning vomiting, excessive secretion of bile in the liver, an offensive discharge of phlegm, and epileptic convulsions, and hastened the death of those with consumption (TB) and asthma. Drake’s suggestion that intemperance predisposed the body to “spontaneous combustion,” however, seems a bit over the top, and, can we say, unfounded.
Preparation to become a physician in the 1840s and ’50s in Drake’s college and its competitors, including those built on competing medical models such as the homeopaths and the eclectics, consisted of attendance at a series of lectures in anatomy, physiology, pathology, therapeutics and operative surgery. Usually a student could complete this course of study in about 18 months.
Just four years after Drake founded the Medical College, he also helped organize the first public hospital in Ohio, the 150-bed Cincinnati Commercial Hospital and Lunatic Asylum. Like other public hospitals of its day, this was not so much a medical facility in the modern sense as a combination poor house, pest house, insane asylum and orphanage which happened to dispense medical treatments to the “inmates,” as they were referred to. And with no controlling theory of the origin of disease, hospitals themselves became incubators of disease and death from what was known as “hospitalism.”
Surgery, the fundamental service that would fuel hospital growth in the twentieth century, was a bleak, brutal business that “changed little and improved less” between the ancient Greeks and the introduction of antiseptic techniques by Dr. Joseph Lister in the late 1860s.
Although medical students regularly joined professors on their rounds at the Cincinnati Hospital and observed procedures performed on the largely indigent patients, the students rarely had the opportunity for hands-on diagnosis or treatment.
Medical education also did not include any hands-on experience in scientific laboratories. Not until Johns Hopkins opened in 1876 did any American medical school have a single scientific laboratory connected to its program. For an aspiring doctor to learn about science, to even be introduced to the use of a microscope, he had to travel to Europe, usually Germany. Medical education also did not normally incorporate the dissection of a cadaver by students, although, ideally, the students would observe their professor perform a dissection. The 1844 Catalogue of the Botanico-Medical College of Ohio boasted that it possessed a “Manikin” by means of which “an amount of knowledge that it once required years to obtain can be communicated in just weeks.”
Just like today, ordinary people in the nineteenth century were hungry for information about disease and cures. But they did not have television to dispense a daily dose of Dr. Oz, or a world wide web with thousands of medical sites. People in the 1850s, ’60s and ’70s did, however, have access to thick books that offered what seemed to be authoritative insights.
Two books that were in circulation in Cincinnati were J.H. Pulte’s Domestic Homeopathic Physician (1851) and John Gunn’s New Domestic Physician or Home Book of Health (1858). These compendiums of maladies, diseases and cures were designed to assist ordinary families. For example, Pulte warned that hanging someone who had drowned upside down would not bring them back to life. Gunn, for his part, recommended that a victim of Scarlet Fever cover their entire body, except for their head of course, with the “fat of bacon.” By the way, Gunn also reassured his readers that they faced no danger from treating Syphilis with mercury.
In the face of severe disease, the medical establishment could observe and document, but could do little to probe the causes or actually combat disease. When Cincinnati was swept by cholera in 1832 and 1849, the most prominent doctors were at a loss. In 1832 a local medical society promptly appointed Drake the Chairman of the Society’s Committee on Epidemic Cholera. Drake reported through the newspapers his observations of the course of the disease and offered suggestions. For example, he recommended that Cincinnatians “avoid intoxication, night air, and unnecessary medicine. Eat only beef, mutton, veal, poultry, eggs, milk, and ‘good ham’ in moderation; keep rooms dry with fires; wear woolen clothes.” If a person felt a “lax or disordered state of the bowels,” he recommended that they immediately “bathe feet, take to bed, put poultice of mush or bitter herbs over bowels, send for physician or take a pill of 10 grains of calomel and 1 of opium.” Doctors had long prescribed a variety of compounds, but had little sense of calibration.
When cholera struck again in 1849, native-born residents were reassured that fortunately, most of the 4,700 deaths were concentrated in the German and Irish immigrant communities, who clearly brought the disease on themselves because of their “inferior civilizations” and their choice to crowd together in the city’s poorest districts.
A COMMENT ABOUT DANIEL DRAKE
In the first half of the nineteenth century no one living in Cincinnati stood out as a more important or more respected leader than Dr. Daniel Drake. His accomplishments in medicine, science and education, coupled with his tireless efforts at institution building and his role as one of early Cincinnati’s most effective boosters, won Drake the nickname the “Benjamin Franklin of the West.” But we shouldn’t let that moniker, or the passage of time, blind us to the man himself.
Drake was brilliant, but difficult. One colleague described him as someone for whom “artful silence was foreign.” He repeatedly had stormy relationships with colleagues, sometimes spilling over into bitter exchanges of letters in the newspapers and occasionally resulting in physical fights. Strains with the faculty of the Medical College of Ohio caused him to leave in favor of teaching positions first at Transylvania College in Lexington and then at Louisville Medical College. He later returned to Cincinnati and helped form the Ohio-Miami Medical College as a rival to the Medical College of Ohio. When that failed, he helped organize a medical school at Cincinnati College as an additional rival institution.
And in case we are tempted to think that a great reformer like Drake certainly must have allied himself with the most important progressive movements of the day, his role in the debate over slavery is a reminder of the complexity of all human beings, including people we think of as geniuses and leaders. As America faced a growing, dangerous division over slavery and race in the 1840s and ’50s, Drake spoke out against the Abolitionists, who he believed were disruptive and threatened to undermine the union. In a series of letters published in 1851, Drake made clear his belief that America did not need the “African population,” which was by nature a “serving people, parasitic to the white man,” and that “according to the instinct, feeling, and opinion of the immense majority of our people, they are, and should be kept, a distinct and subordinate caste.”
GERM THEORY
A flurry of scientific developments beginning in the 1880s provided the foundation of scientific medicine and created the opportunity to transform the way doctors interacted with patients and the role and reputation of hospitals in American society. In 1882 German scientist Robert Koch isolated the tuberculosis bacterium and demonstrated that it caused the dreaded disease. In 1895 the x-ray was discovered, and in 1899 the first miracle drug, aspirin, was introduced.
After two millennia of no real progress, change over the next century came with revolutionary swiftness. But these developments did not instantly, or even quickly, change the practice of the average practitioner who had been trained in the earlier paradigm. In fact, in the early twentieth century, the primary source of improvement in mortality rates in the United States had much more to do with the introduction of sanitary sewer systems and water treatment plants than with the way individual physicians practiced in or beyond the walls of a hospital.
Take, for example, what happened with the Cincinnati Water Works between 1900 and 1910. In a typical year at the turn of the century, Cincinnati recorded 700 cases of typhoid and almost 200 deaths. In 1904 that number shot up to 1,648 cases with 270 deaths, and spiked in 1906 with a staggering 1,940 cases and 239 deaths.
In October of 1907 the Water Works opened the California Filtration Plant. The combination of the plant’s 28 rapid sand filters and the effective use of chlorination instantly and dramatically cut the impact of waterborne diseases in the community. In the twelve months after the plant opened, the city recorded only 234 cases of typhoid and 64 deaths, almost all of which were traced to contaminated milk or the use of unfiltered water drawn from private wells. By 1919, the numbers were down to 40 cases and 11 deaths.
Most diseases did not yield so quickly. Although Koch isolated the bacterium that caused tuberculosis in 1882, not until the introduction of streptomycin in 1944 did doctors have an effective way to treat TB.
It is hard today to understand the scourge of TB, or consumption as it was popularly known. In 1910, typhoid, scarlet fever, measles, diphtheria, croup, whooping cough and smallpox killed 151 Cincinnatians combined. In that same year, 1,025 Cincinnatians died of tuberculosis.
With no effective remedy, the local Anti-Tuberculosis League worked to prevent the spread. A 1911 pamphlet warned of the danger posed by “careless spitters” and encouraged everyone to carry the small paper cups the League distributed for free, which could be disposed of safely at the end of the day.
Although referred to as the “disease of the slums,” consumption struck across all socioeconomic lines. George Ward Nichols and his wife Maria Longworth Nichols were at the pinnacle of the city’s “leisure class.” They led the effort to create the May Festival, build Music Hall and establish the College of Music. But shortly after the birth of their second child in 1872, George Nichols contracted tuberculosis. Out of fear that he would infect his children, Maria isolated him in a specially built extension of their Grandin Road home until he died 14 years later.
For the typical working class victim, private treatment was not financially possible. Victims were collected in special contagion wagons and delivered to the tuberculosis wards of General Hospital. Beginning in 1897, TB patients were moved to what was known as the Branch Hospital for Contagious Disease on Gurley Road in East Price Hill. Over the next 30 years, the hospital expanded to include 14 buildings capable of housing nearly 600 patients. The principal architectural feature of these sanitarium buildings was the open-air porch, where patients could rest during the day and sleep at night, breathing in fresh air all the time. The hospital was renamed in 1945 for its long-time medical director, Dr. Henry Dunham, and closed in 1971.
More dramatic than tuberculosis was the Influenza Pandemic of 1918-19, which killed somewhere between 50 and 75 million people worldwide, including 675,000 Americans. Most of those deaths came in a single twelve-week period at the pandemic’s peak in the fall of 1918. It remains the deadliest plague in human history.
This is a very personal story for me. My father-in-law’s father died from the flu in the fall of 1918, forcing his son to drop out of school midway through his sophomore year of high school and enter the workforce. For my mother-in-law, her earliest clear memory came when she was just over three years old. On Christmas Eve her mother, Ruth Schneider, suddenly fell ill. The last glimpse she ever caught of her mother alive was as she was carried upstairs to her bedroom. Within a matter of days her mother was dead, and she was never allowed to see her again. Her father, an immigrant widower with a three-year-old, quickly remarried, this time to his dead wife’s sister, who, unfortunately, proved to be the cruel stepmother. In many ways the Great Influenza was the most important event in my mother-in-law’s entire 90-year life.
The first case from the second wave of influenza, what was popularly known as the “Spanish Flu,” arrived in greater Cincinnati on September 25, 1918, when Mrs. George Topmiller fell ill after visiting her husband at Camp Lee in Virginia. She was immediately quarantined. Over the next two weeks doctors reported 4,000 new cases in the city. Officials in the City of Fort Thomas imposed a virtual quarantine on the military reservation. Two days later health officials in Cincinnati, Covington and Newport ordered all churches, schools, theaters and public meetings to close. The schools remained closed for nine weeks.
In a sign of true panic, Cincinnati officials ordered saloons shuttered after 6 p.m. On October 10 the police arrested 25 people who violated the anti-spitting ordinance. By October 23 citizens were flushing residential streets with water and city crews were washing downtown streets with “disinfectant.” In an era in which scientists could not yet consistently distinguish between bacteria and viruses, public health officials still believed that fresh air was the best preventative technique and cure, and ordered the windows on trolley cars and in sick rooms to be kept open.
THE RISE OF HOSPITALS
One often overlooked impact of the Influenza Pandemic was the explosion in hospital capacity. The number and size of hospitals had already undergone significant growth in the three decades before the outbreak of the flu. The United States had only 178 hospitals in 1873, including seven in Ohio. By 1909, more than 4,350 dotted the American landscape.
That proliferation of hospitals was reflected in Cincinnati. Before the Civil War, Cincinnati Hospital, the public facility, which dated to 1823, was joined by Jewish Hospital in 1845, St. John’s Hospital (the precursor to Good Samaritan) in 1852, and St. Mary’s Hospital in 1858. In 1888 the Deaconess Hospital and The Christ Hospital were established, and in the next few years almost a dozen other institutions, including Bethesda Hospital, were organized. And between 1890 and 1915 all of these hospitals vacated the Basin in favor of hilltop facilities closer to their patients.
Accompanying, and also fueling, this expansion was a steady reduction in the fear of hospitals. The long-time preference for home births supervised by a midwife gave way to a desire for hospital births, supervised by doctors and trained nurses. In the first ten years after Bethesda Hospital opened its new building, the Maternity Department welcomed 1,000 new babies into the world.
But in the aftermath of World War I and the flu epidemic, General Hospital, The Christ and Children’s all undertook significant expansions. In 1925 the Community Chest published the first local comprehensive hospital survey. The study found that 12 general and 21 special hospitals provided 3,605 beds for residents of Hamilton County and Northern Kentucky. The problem was that at a time when 80 percent was considered full occupancy, Cincinnati hospitals achieved only 68.1 percent. The report pronounced Cincinnati “over hospitalized.”
But a potential response to that problem was emerging in Texas at almost the same moment. In 1929 Baylor Hospital pioneered a not-for-profit health insurance plan for Dallas school teachers. That quickly grew into the early Blue Cross plans. Grants from the Schmidlapp Foundation in 1939 allowed Dr. Otto Geier at Cincinnati Milling Machine and other local industrialists to organize the “Hospital Care Corporation.” Suddenly, as people gained the ability to pay for medical services, the presumed overabundance of hospital capacity disappeared.
ALBERT SABIN
I have purposely drawn this picture of the practice of medicine, and of the still partially developed scaffolding of the medical system in Cincinnati at the end of World War II, as more closely related to the 1880s than to 2010. It is easy to forget how close we still are to the origins of scientific medicine. And it is a necessary backdrop to appreciate the accomplishments of Dr. Albert Sabin.
A few weeks ago Cincinnati marked the 50th anniversary of the launch of the Sabin Sundays campaign. On April 23, 1960, the front page of the Cincinnati Post carried the headline “Drive to Completely Eliminate Polio Here Begins Tomorrow.” Inside, a full-page advertisement declared a week-long “Children’s Crusade” beginning the next day to distribute as many as 20,000 doses of the oral polio vaccine to children under six years old. This was the beginning of an effort to make Cincinnati the first “polio free” city in the U.S., and the nation was watching.
One of the tricky aspects of that effort was to convince local parents that it was important to have their children “sip the Sabin syrup” even if they had already taken a full round of the Salk vaccine shots that had been introduced in 1955. The newspapers calmly explained the differences between the Salk vaccine, which utilized a killed virus, and the Sabin oral vaccine, which contained a live but attenuated virus that produced a herd immunity that multiplied its benefits.
Those explanations did not convince my father. My brother, sisters and I did not line up for the sugar cubes. As a result, when I interviewed Dr. Sabin in his Washington, D.C. apartment for the Cincinnati Medical Heritage Center, 30 years after the first Sabin Sunday, I decided that it was prudent not to reveal my parents’ stance, suspecting that if he found out, he would whip out a dose of his vaccine from the refrigerator and vaccinate me on the spot.
Albert Sabin had a tenacious personality, and even after 40 years I found he had not tired of delineating the differences between the two approaches, and, in the process, dismissing Salk’s contribution.
Though the reports in 1960 were presented in calm, dispassionate tones, every reader knew that behind these cool explanations lay one of the fiercest scientific disputes ever played out on the public stage.
During the early twentieth century, the incidence of polio had increased in the United States at an alarming rate. In the 1920s public health officials reported about 4 cases per 100,000. That rate doubled by 1940-44, and doubled again to 16 per 100,000 by 1945-49. In the early 1950s it soared to 25 per 100,000, peaking during the “Plague Season” of 1952, at 37 per 100,000.
Images of children trapped in wards filled with iron lungs, or hobbled by braces and crutches, struck fear into the hearts of every parent and child as the summertime “polio season” approached. What troubled and confounded many was why the explosion of polio was occurring in developed countries like the United States and Western Europe, and why the average age of the victims was getting older.
In 1947, in the midst of that depressing growth curve, Dr. Albert Sabin, who had established his virology laboratory at Cincinnati Children’s Hospital in 1939, wrote an essay that David Oshinsky, the author of the Pulitzer Prize-winning book Polio: An American Story, describes as “remarkably prescient.” Although no one yet understood the path by which the virus entered the human body or reached the nervous system where it did its damage, Sabin hypothesized that a natural way that most young children traditionally acquired immunity must have been interrupted. He further suggested that the reason might be that rapidly improving standards of sanitation and hygiene prevented small children from being exposed to the fecal matter where they picked up trace amounts of the virus that stimulated the development of antibodies. In other words, all those great new cleaning products being developed by P&G were increasing the incidence of polio.
From early in his career Sabin, along with John Paul and John Enders, was one of the scientific stars of the search for a cure for polio. Salk, on the other hand, who worked at the University of Pittsburgh, was considered by Sabin a “bench scientist” who was best at carrying out the drudge work needed to identify the different strains of polio virus and evaluate different approaches.
In 1951 and ’52, however, Salk developed a process of culturing samples of each of the three types of polio virus and then “cooking” them in a solution of formaldehyde to kill the virus while leaving it capable of stimulating the production of antibodies. When early field tests in June 1952 on residents of two residential homes for retarded children proved successful, things moved quickly. Basil O’Connor, the leader of the National Foundation for Infantile Paralysis for 30 years, was desperate for a breakthrough. In the face of a great deal of resistance from other researchers, O’Connor pushed Salk forward into the public eye. In February 1953, Time Magazine reported “solid good news on the polio front” and ran a photo captioned “Researcher Salk.” The next month Salk appeared on a national CBS radio program called “The Scientist Speaks for Himself.”
Sabin saw this sort of publicity as unseemly and dangerous, pushing a solution forward before it was fully vetted by the scientific community. But massive field trials in 1954 involving 1.3 million children led to widespread distribution of the vaccine in 1955 and the enshrinement of Jonas Salk as the scientist who conquered polio.
Despite those successes, Albert Sabin believed Salk’s killed-virus vaccine was inferior and potentially dangerous, and he pressed on with his own line of research. Despite his steady progress over the next five years, Sabin faced an unusual problem expanding his field testing: Salk’s vaccine had worked.
In 1955, the year the Salk vaccine was first widely distributed, health officials recorded 30,000 cases of polio. That number dropped to 15,000 in 1956, to 7,000 in 1957, and to just over 1,000 in the next few years. Because of this record, the idea of field testing Sabin’s vaccine had little support in the United States. Confronted with this roadblock, Sabin found an unlikely partner, America’s Cold War enemy, the U.S.S.R. In 1959, over 10 million Soviet children swallowed the vaccine wrapped in pieces of candy. The results were so impressive that the Health Ministry decided to vaccinate all 77 million Soviet citizens under 20 years of age. Because of this collaboration with the U.S.S.R., some people in the United States regarded Sabin’s vaccine as the “Communist vaccine,” and therefore suspect for reasons that had nothing to do with science.
In 1960, Sabin was granted the right to begin United States trials and chose Cincinnati as his first city. The way the story is usually told in Cincinnati, this was the beginning of the end of polio, but in fact, the widespread use of the Salk vaccine had already made Cincinnati, and most of the United States, all but polio free. The Hamilton County Health Commissioner wrote the U.S. Surgeon General that he felt “the whole thing is doing harm.” From the Commissioner’s perspective, Sabin had to rekindle fears about a disease that no longer existed while diverting scarce resources away from other pressing health issues. No matter: by 1962 the medical community had swung its support to Sabin’s vaccine.
For the next 30 years Salk and Sabin continued to spar in public and within scientific circles. Salk was enshrined in the public mind (at least outside Cincinnati), as the “Doctor Who Conquered Polio,” but Sabin emerged in the scientific community as the superior scientist.
THE SEARCH FOR PERFECTION
I would like to end by focusing on a report published last August by the Harvard Business School. Harvard made Cincinnati Children’s Hospital Medical Center the focus of one of its famous case studies.
The study documents the hospital’s integration of “Improvement Science,” designed to increase the rate and impact of improvement across the organization. The roots of this program grew out of a vision first articulated in 1993 by Dr. Uma Kotagal, the Senior Vice President of Quality and Transformation. Her vision was bolstered in 1996 by the arrival of James Anderson as President and Lee Carter as Chairman of the Board. Carter summed up his view of the effort as, “We will be the best at getting better.”
The effort began in 1996 by focusing on the way the hospital treated patients with bronchiolitis. Using data and the best studies available, the hospital developed recommendations that ran counter to the well-established practice of local pediatricians.
In 2002 the CCHMC submitted a grant proposal to the Robert Wood Johnson Foundation and the Institute for Healthcare Improvement under the rubric of “Pursuing Perfection.” They won the award and initially focused on the treatment of Cystic Fibrosis.
The approach is committed to evidence-based decision making. That meant that the hospital had to come to grips with the reality that although its staff had believed it to be among the best CF centers in the country, the evidence revealed it was only mediocre. That insight was humbling for the clinicians, who had to have some very serious inter-staff conversations.
The hospital also committed itself to complete transparency with the parents of the young patients. Sharing less than stellar information about the hospital’s track record was worrisome. To the staff’s surprise, however, they discovered that the vast majority of parents appreciated the honesty and were willing to work together to improve treatment outcomes. As one parent, Kim Cook, said, they came away “respecting them on a new level.”
A third commitment embraced by the hospital leadership was inflexibility in setting a target of zero tolerance for serious safety events, such as a death from a medication error. The commitment was to “pursue perfection,” and to change the way people in the hospital think about the services they render.
CONCLUSION
We have traveled a long way.
We started at a time when the best-trained, most conscientious and committed doctors held fast to assumptions and solutions that were honored because they were 2,000 years old, but provided little power in the face of disease and suffering. The best Daniel Drake could do was document the course of the cholera epidemic and offer suggestions that even he knew had little efficacy.
By the early twentieth century, the paradigm for understanding the causes of disease had radically shifted, but in the face of tuberculosis or influenza, and dozens of other maladies, physicians did not have the tools they needed.
In the post-World War II period, Dr. Sabin demonstrated what could be accomplished within the scientific paradigm through posing open-ended questions, systematic research and sheer determination to get it right.
And in 2010, we stand at a place where a hospital can embrace the audacious goal of pursuing perfection.
Remarkable. Dizzying.