Monday, Aug. 09, 1976

The Struggle to Stay Healthy

JOHN H. KNOWLES, M.D.

The following Bicentennial Essay is the eighth in a series that has been appearing periodically, surveying how we have changed in our 200 years.

On the eve of the Revolution, there were 2.5 million people in colonial America. Virginian William Byrd wrote, "It was a Place free from those three great Scourges of Mankind --Priests, Lawyers, and Physicians." Divine aid was considered more important than that of the physician. Only through God's grace could one escape disease or survive its attack. In The Angel of Bethesda, the first general treatise on medicine written in the colonies, Cotton Mather advised in 1724, "Lett us look upon Sin as the Cause of sickness."

Average life expectancy at birth was 34.5 years for men and 36.5 years for women. Fifty percent of deaths occurred in those under ten years of age. Infectious diseases decimated the population. Smallpox and yellow fever were most feared. Tuberculosis, cholera and dysentery, typhoid, diphtheria, measles and mumps were ever present. Malaria was as common in New England as on the Southern plantations. In 1721, almost half the population of Boston caught smallpox, and more than 7% died. Yellow fever wiped out 10% of the population of Philadelphia in 1793.

Scurvy, scrofula and scabies were common among the poor. Bathing was rare: one Quaker lady noted in her diary in 1799 that she withstood a shower bath "better than I expected, not having been wett all over at once, for 28 years past." Body lice were omnipresent, as was the disease they carried--typhus fever. Frequent births and poor obstetrics accounted for the high mortality in mothers; the death rate among black women served by midwives was lower than among whites served by physicians. Mental illness was seen as the work of the devil: the village idiot was either derided or tolerated, while the more violent were shackled and jailed.

There were 3,500 medical practitioners in the colonies when the Revolution began, of whom fewer than 200 held degrees from medical schools. One writer noted that "with a few, honorable exceptions in each city, the practitioners were ignorant, degraded and contemptible." Quacks abounded. In the North, ministers and magistrates doubled as physicians, while in the South, planters and their wives cared for the slaves. Some of these individuals brought status to the profession. The people viewed the medical profession in general, however, with a mixture of fear, contempt and amiable tolerance. There simply was little that doctors could offer, and their cures were sometimes worse than the diseases that afflicted people.

Purging, emetics and bloodletting were common remedies; surgery consisted of "cutting for stone" and amputations. With no anaesthesia, the best surgeons were the ones who could cut, hack and saw most rapidly, aided by the strongest assistants to hold the patient down. Herbs and plants were extensively used in treatment. Governor John Winthrop of Massachusetts Bay prescribed a paste of wood lice, while Cotton Mather--who together with Zabdiel Boylston brought inoculation to the colonies in 1721 to prevent serious cases of smallpox--condemned the use by Boston physicians of "Leaden Bullets," to be swallowed for "that miserable Distemper which they called the Twisting of the Guts." By the early 18th century, there were only two drugs known to be specific: cinchona bark for malaria, and mercury as an antisyphilitic agent. Dr. Benjamin Rush of Philadelphia (one of four physicians to sign the Declaration of Independence) used bloodletting so extensively that even his colleagues marveled at the survival of his patients. Thomas Jefferson said in 1807, "The patient ... sometimes gets well in spite of the medicine."

The apprentice system of medical education held sway. The apprentice might pay the master £100 annually for as long as seven years until he "qualified" to practice on his own. By the mid-18th century, more formal training began to take hold. In 1765, after a tour of medical centers in London, Paris, Padua and Edinburgh, John Morgan persuaded the College of Philadelphia to set up the first American medical school. The prototype of the British voluntary hospital was established with the founding of the Pennsylvania Hospital in 1751, the New York Hospital in 1771 and the Massachusetts General in 1811, moving the care of the sick poor and the teaching of medical students out of the almshouses. With the founding of the first mental hospital, the Virginia "insane asylum" at Williamsburg, shortly before the Revolution, the mentally ill began to be moved from jails and almshouses to state-sponsored, more humane institutions. From the start, however, the great cost of mental illness precluded voluntary efforts to provide such care for people of ordinary means.

The institutionalization of a loosely organized profession grew with the founding of state medical societies, teaching hospitals and medical schools. Largely because of the devastation caused by infectious diseases, local communities were forced to form boards of health, which established quarantine measures and tried to provide for sanitary engineering. Infectious disease was thought to be the result of noxious vapors emanating from decaying animal and vegetable matter. Therefore, in addition to isolating feverish individuals, much of the health boards' time was spent attempting to improve sewage and garbage disposal.

The 19th century in Europe saw the emergence of modern medicine. Vaccination for smallpox was introduced. The stethoscope, clinical thermometer and hypodermic syringe were developed. Morphine and quinine were isolated. Surgical instruments were perfected, antiseptic techniques were developed, and the use of ether as an anaesthetic agent was demonstrated in 1846 at the Massachusetts General Hospital--the single most important contribution of American medicine during the century. Pasteur, Koch, Klebs, Roux and Yersin established the science of bacteriology, and between 1880 and 1900 the microbial origins of numerous diseases were demonstrated. A new interest in nutrition developed.

In 1895 two events took place that would have a profound effect on the progress of American medicine: 1) the discovery of a "new kind of rays" by Roentgen, which led to the development of diagnostic radiology and X-ray therapy; and 2) the development of psychoanalytic psychiatry through the studies of Sigmund Freud. In the same way, the accurate diagnosis of many diseases was virtually impossible before the advent of two major technologies in the early part of the 20th century: 1) the chemistry of blood and bodily fluids, which made easier the study of the body's organ systems; and 2) the use of the X-ray machine and the progressive development of such radiopaque substances as barium and iodine compounds to visualize organ systems. These two advances, together with the expansion of surgery after the introduction of anaesthesia and antiseptic techniques, transformed the hospital. From a passive receptacle for the sick poor, it became a house of hope and an active diagnostic and curative institution for all classes. The use of blood transfusions hastened the transformation.

The new sciences of bacteriology, biostatistics and epidemiology led to development and extensive use of vaccines, pasteurization of milk and measures for the control of disease. These advances led to a marked improvement in public health. So did the development of urban sewage-disposal and water-purification systems, the rapid transportation of fresh food and its storage under refrigeration, state food-control acts and the new concern for women's and child labor, as well as for industrial working conditions. By 1910 average life expectancy at birth had increased to 50 years.

The Progressive Era also profoundly affected health interests. Upton Sinclair's The Jungle, in 1906, exposed abysmal conditions in meat-packing plants. Congress responded by passing the first meat-inspection law. Samuel Hopkins Adams muckraked the patent-medicine industry, and Congress swiftly enacted the Pure Food and Drug Act.

In 1904 there were 160 medical schools with 28,142 students and 5,747 graduates annually. Abraham Flexner, an educator, not a physician, was commissioned by the Carnegie Foundation for the Advancement of Teaching to study the situation. He recommended the closing or reorganization of all substandard proprietary schools. By 1930 there were only 76 schools with a total of 21,597 students and 4,565 graduates annually. Little significant expansion of medical schools occurred for the next 20 years, but the "Flexner revolution" helped make the U.S. the world leader in biomedical science and medical education. From 1901 through 1939, the number of Nobel prizes in medicine totaled 42, of which only eight were awarded to Americans. From 1943 to 1975, Americans won 41 of the 74 prizes awarded.

With expanding knowledge and technology, an inevitable subdivision of labor occurred. The general practitioner faced extinction as medical students entered a wide variety of specialties. Specialization advanced to the point where what happened to the patient all too often depended on who saw him first. "Free market" medicine resulted in a gross geographic and functional maldistribution of doctors. There developed a severe oversupply of specialists in some areas (surgery, where work weeks declined as fees rose) and an undersupply in others (pediatric psychiatry and general practice). The g.p. declined from 64% of the total number of doctors in 1949 to 13% in 1973. Meanwhile, the number of graduates of foreign medical schools practicing in the U.S. increased from 20,575, or 8.6% of the total in 1959, to 69,000, or 20% in 1971.

The increasing use of medical technology, while markedly enhancing accuracy of diagnosis and success of treatment, was accompanied by less time spent with patients. Complaints about the dehumanizing of medical care were increasingly heard. Doctors moved their offices close to the hospital and its technology. By the 1950s the house call had virtually vanished as doctor and patient met in the emergency wards and clinics of urban teaching hospitals or in offices next door.

Acute, curative, technology-dependent medicine reached its apogee in the 1960s--and, as expectations rose, so did the costs. The expense of medical care had reached a critical stage with the Depression of 1929-32, when individuals found it increasingly difficult to pay their medical bills. The private sector in the 1930s developed the Blue Cross-Blue Shield insurance system of prepayment for hospitals and physicians. In the public sector, the Social Security mechanism and general tax revenues were used to pay the costs of the indigent sick, the disabled, the elderly and such special groups as veterans, migrant farmers and American Indians. A variety of amendments to the Social Security Act of 1935 culminated in Medicaid (a federal, state and local program for financing medical-care needs of the indigent sick) and Medicare (compulsory health insurance for the elderly). Today 21 million Americans aged 65 and over have such insurance for hospital and extended-care costs.

The total national expenditure for health in fiscal 1975 was $118.5 billion, which included $46.6 billion for hospital care, $22.1 billion for physician services, $10.6 billion for drugs, $9 billion for nursing-home care, $7.5 billion for dentists' services, $3.5 billion for Government public health activities and $2.8 billion for medical research. Third-party payments (public and private) for medical care increased from 35% in 1950 to nearly 70% in 1975, thus leaving about 30% of the total to direct payments by the beneficiaries--a significant burden. Hospitals, physicians and drugs consumed almost 70% of the total expenditure. Gross overuse of all three has become a major problem.

The consumer movement focused on the skyrocketing costs of medical care, questioning doctors' fees and incomes, their unavailability, and the amount of unnecessary surgery. Mass media joined the assault, along with those largely liberal politicians trying to generate support for national health insurance as the antidote. The American Medical Association was increasingly viewed as a guild, mostly interested in the welfare of its own members. Nonetheless, virtually every poll of attitudes toward different occupations continues to show that the American physician ranks No. 1 and enjoys immense prestige, exceeding that of Senators and Supreme Court Justices. My doctor is great--it's those doctors!

Where do we stand today, and what are our prospects for health beyond 1976? Gone are the scourges of smallpox, yellow fever, tuberculosis, measles and infantile diarrhea. Life expectancy has increased from 47.3 years in 1900 to 72.4 years in 1975. Of the roughly 2 million deaths annually in the U.S., 37.8% are due to heart disease, 19.5% to cancer, 10.2% to strokes, 4.3% to lung disease (pneumonia, bronchitis and emphysema), 5.3% to accidents, 1.9% to diabetes, 1.7% to cirrhosis of the liver, 1.4% to suicide and 1.1% to homicide. But death statistics give only part of the picture. For every successful suicide, eight others (or 200,000 people) may have made the attempt. For every person who dies of cirrhosis--commonly related to alcoholism and malnutrition--at least 200 and probably 300 people can be classified as alcoholics (10 million Americans). For every accidental death, hundreds are injured, some permanently disabled. Twenty-four million Americans, 11 million of whom receive no federal food stamps, live below the federally defined poverty level, a level that does not support an adequate diet. Venereal disease has been increasing annually, with nearly 1 million cases of gonorrhea and syphilis reported last year.

Beyond death and disease statistics, there exists a steadily expanding number of the "worried well" and those with minor illnesses. Has life itself become a disease to be cured in the American culture? Some 80% of the doctor's work consists of treating minor complaints and giving reassurance. Common colds, minor injuries, gastrointestinal upsets, back pain, arthritis and psychoneurotic anxiety states account for the vast majority of visits to clinics and doctors' offices. One out of four people is "emotionally tense" and worried about insomnia, fatigue, too much or too little appetite and ability to cope with modern life. At least 10% of the population suffer from some form of mental illness, and one-seventh of these receive some form of psychiatric care. Meanwhile, the figures for longevity are the highest and for infant mortality the lowest in U.S. history, and the gap continues to narrow. We are doing better but feeling worse.

As a people, Americans have been noted for their self-criticism. I would suggest that we give at least equal time to extolling our virtues and triumphs. Let us look at both sides of the coin:

1) We should be grateful for our medical technology and the countless lives that have been saved because of it. We should gasp at its wild abuse and overuse.

2) We should be grateful for the markedly improved health of most Americans. We should be horrified by the unmet medical and nutritional needs of nearly 25 million poor people.

3) We should applaud the development of health insurance mechanisms that have protected the patient from financial disaster. We can decry the fact that health insurance is a misnomer (it is disease insurance) and that so little effort and emphasis have been placed within the insurance system on the maintenance of health.

4) We can be grateful for the quality of care given in the majority of our 7,000-plus hospitals and 1.5 million beds. We should decry our inability to avoid costly reduplication of services, build more extended-care facilities and low-cost hospitals for the chronically ill, and reduce unnecessary surgery.

5) We can take great satisfaction that so many Americans have found fruitful work in the health system. We should worry about the low number concerned with environmental health research, health education, visiting nursing and prevention programs.

6) We can be proud of the quality and quantity of our health-related educational system: 114 medical schools with some 33,000 full-time faculty, 50,000 students and 14,000 Doctors of Medicine graduated annually. We should decry the unbelievable cost of medical education and the precarious state of financing for schools of public health.

7) We can applaud the activities of the National Institute of Mental Health. We should decry the meager sums of money available for research in mental illness, which represents the nation's primary public health problem.

The next major advances in the health of the American people will result from the assumption of individual responsibility for one's own health. This will require a change in lifestyle for the majority of Americans. The cost of sloth, gluttony, alcoholic overuse, reckless driving, sexual intemperance and smoking is now a national, not an individual responsibility. These abuses are justified on the ground of individual freedom, but one man's freedom in health is another's shackle in taxes and insurance premiums.

In the 17th and 18th centuries, moral and astrological factors were supplanted by theories that attributed disease to mental states, heredity, unknown poisons, environmental factors ("airs and waters"), contagion by mysterious poisons ("miasmata") and infection by animalcules ("germs" described by the early microscopists). With Pasteur's work in the late 19th century, a unitary theory of disease developed as a natural concomitant to the germ theory of disease: single organism, single disease, single cause. With further research, we have come full circle to colonial beliefs. It is now realized that there are multiple causes of disease, involving varying combinations of genetic factors, environmental factors (levels of stress, pollutants, germs and parasites) and behavioral factors (rest, smoking, exercise, diet, alcohol and hygiene).

When all is said and done, death and disease are inevitable, and as we eradicate one scourge, another will take its place. Ethical and moral concerns will have to play an increasing role in guiding us through lives of quality. These concerns will be matched with a typically American hardheaded pragmatism that tells us health care is only one element in the quality-of-life equation and the other elements, which depend on national will and individual responsibility, are equally important, if not more so.
