Heart: A History


  Epidemiology is about the ecology of disease: where and when it is found, or not. In 1854, John Snow, physician to Queen Victoria, performed the world’s first epidemiological study when he investigated a major cholera outbreak in London’s Soho district. Snow was born in the town of York, at the intersection of two rivers contaminated by dung and sewage. His childhood likely sensitized him to a community’s need for clean water. Based on studies nearly ten years before the Soho epidemic, Snow had concluded that cholera was transmitted by “morbid matter,” not foul air, as his colleagues at the London Medical Society believed. He based his theory in part on the fact that workers in slaughterhouses, thought to be a font of cholera, were afflicted no more than the general population. So, when cholera broke out in London in 1854, Snow set his sights on a well. He went to the General Registry Office and mapped the addresses of all the cholera deaths in the Soho district, discovering that most deaths had occurred near a water pump on Broad Street. True to his meticulous nature, Snow also studied Soho residents who did not contract the disease—for example, inmates at a nearby prison that did not use the Broad Street pump, as well as brewery workers whose supervisor, a Mr. Huggins, told Snow that his men drank only water from the brewery’s own well (when they weren’t consuming the malt liquor they produced).

  Though Snow knew nothing of germs, he was nevertheless able to contain the epidemic, which caused 616 deaths, by persuading the board of governors of the local parish to remove the handle on the well’s pump, making it impossible to draw water. Only later, by studying water samples, did London authorities show that the pump was contaminated with sewage from a nearby cesspool, setting off what Snow called “the most terrible outbreak of cholera which ever occurred in this kingdom.” Snow’s investigation saved many lives. Just as important, it showed that an epidemic could be controlled without a precise understanding of its cause.2

  After Snow’s study and the subsequent development of epidemiological techniques, public health authorities in the United States focused their attention on acute infectious diseases like cholera, tuberculosis, and leprosy. Chronic noninfectious ailments—the long-term hard hitters like heart disease—received little attention. But after Roosevelt’s death, Assistant Surgeon General Joseph Mountin, a founder of the Office of Malaria Control in War Areas (later known as the Centers for Disease Control, or CDC), was eager to correct this disparity. As was the case with cholera in the mid-nineteenth century, very little was known about the determinants of heart disease. Could risk factors be identified by studying people who developed the disease, just as Snow had studied the victims of the cholera epidemic?

  The national climate after World War II was favorable for such an investigation. New hospitals were being built, the National Institutes of Health was expanding, and there was increased federal commitment to basic and clinical research. Moreover, a beloved president had just died. In this environment, things moved quickly. By the summer of 1948, the U.S. Public Health Service had already negotiated the basic framework of an epidemiological study of heart disease with the Massachusetts Department of Health. The commonwealth was a natural choice for the project, with top medical schools, such as Harvard, Tufts, and the University of Massachusetts, in and around Boston. The commissioner of health was “warmly enthusiastic” about a pilot study to develop heart-screening tools. With the support of Harvard physicians, the town of Framingham, about twenty miles west of Boston, was chosen as the site.

  Founded in the late seventeenth century, Framingham began as a farming community and a haven for those trying to escape the witch hunts in nearby Salem; it was later home to the nation’s first teachers’ college and first women’s prison. During the Civil War, it was the first town in Massachusetts to establish a volunteer battalion. By the 1940s, Framingham had turned into a middle-class industrial town. Children played with garden hoses on tree-lined streets. The 28,000 townsfolk lived mostly in single-family homes and earned a median family income of about $5,000 per year (about $50,000 today). (There were exceptions, of course, such as James Roosevelt, the president’s son, who owned a large estate on Salem End Road.) Most Framingham residents ate a typical meat-and-potatoes diet. Like the rest of the country, about half of them smoked. Predominantly white and of western European descent, they were believed to be representative of America after World War II.

  The key question at the heart of the Framingham study was this: Can the risk of a heart attack be predicted in a person with no overt heart disease? The plan was to follow approximately five thousand healthy patients between the ages of thirty and fifty-nine for twenty years, until enough of them developed heart disease. Meanwhile, factors associated with the development of the disease would be identified (and later, it was hoped, modified to prevent disease in healthy patients). At the time, the hypothesized factors were “nervous and mental states,” occupation, economic status, and use of stimulants like Benzedrine. Though research linking heart disease with cholesterol had been available for decades (in 1913, researchers in St. Petersburg had demonstrated that feeding rabbits large quantities of cholesterol-rich foods, such as meat and eggs, caused atherosclerotic plaques), this information was not yet widely known to doctors or the American public.

  The initial outlay for the Framingham study was modest: about $94,000, mostly to cover office supplies (including ashtrays for the study researchers who smoked). Mountin, the assistant surgeon general, selected Gilcin Meadors, a young U.S. Public Health Service officer, as the first director. Born in Mississippi, Meadors had graduated from medical school at Tulane only eight years earlier. When he was tapped by Mountin, he was still completing a master’s degree in public health at Johns Hopkins. Besides his lack of experience, Meadors faced many challenges. He had to persuade local physicians, many of them suspicious of the federal government, to cooperate with the U.S. Public Health Service. Moreover, because of the long period required for heart disease to develop in healthy people, nearly half the eligible townsfolk would have to agree to participate, and their attrition rate would have to be vanishingly low.

  The study was announced in a small advertisement in the local newspaper on October 11, 1948. Then Meadors, the young upstart epidemiologist, went into action. Not your typical pocket-protector bureaucrat, Meadors was charming and sociable. He attended town meetings and befriended civic leaders. Flattered by his ambition, a whole network of veterans, lawyers, and housewives sprang up to spread word about the study. Meadors’s recruits knocked on doors, staffed telephone banks, and appeared at churches, parent-teacher organizations, and community groups. Their mission was to help him enroll subjects willing to reveal intimate information to federal officials with no promise of any direct benefit (though Meadors said the study would eventually lead to “recommendations for the modification of personal habits and environment”). Within weeks, Meadors’s staff had filled appointment slots through the spring.

  The first study questionnaires included items about personal and family history, parents’ age at the time of death, habits, mental state, and medication use. Government-appointed doctors peered into subjects’ eyes and palpated livers and lymph nodes. Blood and urine tests were taken; X-rays and electrocardiograms were performed. Though cholesterol testing had been considered before the initiation of the study, it was only added after the research had begun.

  After a year, control of the study shifted to the newly established National Heart Institute. The NHI changed the character of the project, making its methodology more rigorous. Instead of enrolling volunteers, it now randomly selected subjects, eliminating a source of bias. The focus also shifted toward investigating biological rather than “psychosocial” risk factors. Questions about sexual dysfunction, psychiatric problems, emotional stress, income, and social class were discarded. Statisticians at the NHI invented something called multivariate analysis, a method of calculating the relative importance of each of several factors that coexist in the expression of a disease. (In the beginning, Framingham scientists focused on age, serum cholesterol, weight, electrocardiographic abnormalities, red blood cell count, number of cigarettes smoked, and systolic blood pressure.) Therefore, the Framingham study, as it emerged in the 1950s, was “clinically narrow,” as one researcher put it, “with little interest in investigating psychosomatic, constitutional, or sociological determinants of heart disease.” This would turn out to be a major flaw.

  After nearly ten years of closely monitoring approximately fifty-two hundred patients, Framingham researchers published a key paper in 1957 (out of the nearly three thousand produced to date) showing that patients with high blood pressure had a nearly fourfold increase in the incidence of coronary heart disease. A few years later, hypertension was also shown to be a major cause of stroke. Referring to President Roosevelt’s premature death, Framingham scientists commented on the “mounting evidence that many of the commonly accepted beliefs concerning hypertension and its cardiovascular consequences may be in error.” Even Dr. Bruenn, Roosevelt’s cardiologist, wrote, “I have often wondered what turn the subsequent course of history might have taken if the modern methods for the control of hypertension had been available.”

  Later Framingham publications identified additional coronary risk factors, including diabetes and high serum cholesterol. One paper found that in nearly one in five heart attacks, sudden death was the first and only symptom, a discovery that ratified the tremendous fear that millions of Americans were living with. By the early 1960s, a definitive association had also been made between cigarette smoking and heart disease. (Smokers in previous studies hadn’t lived long enough to draw firm conclusions.) This led to the first surgeon general’s report detailing the health hazards of smoking. In 1966, the United States became the first country to require warning labels on cigarette packages. Four years later, primarily because of Framingham, President Nixon signed legislation banning cigarette ads on television and radio, one of the great public health triumphs of the second half of the twentieth century.

  The Framingham study was nearly shut down in the late 1960s for lack of funding. There was no dearth of events—assassinations, riots, civil rights protests, and the Vietnam War—to occupy policy makers, and an epidemiological study in a small town in Massachusetts hardly seemed to warrant much attention. So, Framingham investigators went around the country trying to raise private money. Donors included some unexpected contributors, including the Tobacco Institute and the Oscar Mayer Company, which manufactured luncheon meats. In the end, only after President Nixon’s personal physician, the cardiologist Paul Dudley White, lobbied for the study was federal support revived.

  The Framingham study shifted the focus of medicine from treating cardiovascular disease to preventing it in those at risk. (Indeed, the term “risk factor” was introduced by Framingham researchers in 1961.) In 1998, while I was still in medical school, Framingham researchers published a formula, based on the major independent cardiac risk factors that had been identified—family history, smoking, diabetes, high serum cholesterol, and hypertension—to calculate a patient’s risk of getting heart disease within ten years. (This is the formula I used after my first CT scan showing that I had developed coronary plaque.) Today we know that programs that target such risk factors improve public health. For example, a recent twelve-year study of 20,000 Swedish men showed that almost four out of five heart attacks could be prevented through Framingham-inspired lifestyle changes, such as a healthy diet, moderate alcohol consumption, no smoking, increased physical activity, and maintaining a normal body weight. Men who adopted all five changes were 86 percent less likely to have a heart attack than those who did not. An earlier study of about 88,000 young female nurses found that participants who followed a healthy lifestyle—didn’t smoke, had normal body weight, exercised at least two and a half hours each week, had moderate alcohol consumption, followed a healthy diet, and watched little television—had almost no heart disease after twenty years of follow-up.

  But as important as the Framingham Heart Study has been in advancing our understanding of coronary heart disease, it does not tell the whole story. For example, Framingham risk models do not seem to apply equally to nonwhite ethnic groups. Meadors and the early Framingham investigators recognized the lack of diversity in the study population as a major limitation.3 What of my medical school cadaver or my grandfather? In 1959, the first study showing an increased risk of premature heart disease in Indian males was published in The American Heart Journal. These men had four times the rate of heart disease compared with men living in Framingham, despite having lower rates of hypertension, smoking, and high cholesterol and more often consuming a vegetarian diet. Today in South Asia, a large percentage of heart attacks occur in men with zero or only one Framingham risk factor. Over the past half century, coronary artery disease rates have increased threefold in urban India and twofold in rural India. During that time, the average age at which a first heart attack occurs has increased by ten years in the United States but decreased by about ten years in India. Compared with whites, South Asians have more multivessel coronary artery disease and are more likely to have a more dangerous anterior location of a myocardial infarction. South Asians will soon make up over half of the world’s cardiac patients. What is it about South Asian genetics or environments that leads to so much heart disease? We need a Framingham-type study to answer this question.4

  But there are almost certainly cardiovascular risk factors that Framingham investigators did not identify. Some of these factors are likely in the “psychosocial” domain that Framingham investigators decided to ignore when the study was taken over by the NHI in the early 1950s. For example, consider heart disease in Japanese immigrants. Coronary artery disease is relatively rare in Japan. However, its rate is almost double in Japanese immigrants who settle in Hawaii and triple in those who settle in the mainland United States. Part of the explanation might be that Japanese immigrants adopt unhealthy American habits, like a sedentary lifestyle or a diet rich in processed foods. Still, Framingham risk factors do not fully explain the disparity.

  In the early 1970s, Sir Michael Marmot and his colleagues at the UC Berkeley School of Public Health studied nearly four thousand middle-aged Japanese men living in the San Francisco Bay Area. They found that immigrants who stayed true to their Japanese roots (as evidenced in surveys by their ability to read Japanese, the frequency with which they spoke Japanese, the frequency with which they had Japanese co-workers, and so on) had a much lower prevalence of heart disease than immigrants who were more integrated into their new culture, even when their serum cholesterol and blood pressure matched American levels. “Traditional” Japanese immigrants had coronary disease rates in line with their homeland counterparts. “Westernized” immigrants had a prevalence that was at least three times higher. “Retention of Japanese group relationships is associated with a lower rate of coronary heart disease,” the authors concluded. Acculturation, they declared, is a major risk factor for coronary disease in immigrant populations.

  If cutting traditional cultural ties increases the risk of heart disease, then psychosocial factors must play a role in cardiovascular health. Today we know this to be true in many strata of human society. For example, American blacks in poor urban centers have a much higher prevalence of hypertension and cardiovascular disease than other groups. Some have proposed genetics to be the deciding factor; however, this is an unlikely explanation, because American blacks have hypertension at much higher rates than their West African counterparts. Moreover, hypertension pervades other segments of American society in which poverty and social ills are rampant.

  Peter Sterling, the University of Pennsylvania neurobiologist, has written that hypertension in such communities is a normal response to what he calls “chronic arousal,” or stress. In small preindustrial communities, he writes, people tend to know and trust one another. Generosity is rewarded; cheating tends to be punished. When this milieu is disrupted, as in migration or urbanization, there is often an increased need for vigilance. People become estranged from their neighbors. Communities become diverse and more mistrustful. Physical and social isolation often results. Add in poverty, fragmented families, and joblessness, and you get extremely stress-prone populations. The chronic arousal triggers release of hormones, such as adrenaline and cortisol, that tighten blood vessels and cause retention of salt. These in turn produce long-term changes, such as thickening and stiffening of the arterial walls, that raise the blood pressure the body then works to maintain.

  In Sterling’s formulation, nothing is broken (except perhaps “the system”). The body is responding exactly in the way it should to the chronic fight-or-flight circumstances in which it finds itself. If takotsubo cardiomyopathy proves that acute psychological disruption can damage the heart, Sterling’s theories suggest that chronic, low-level stress may be just as harmful. His theories put psychosocial factors front and center in how we think about and approach heart problems. They show that chronic heart disease, unloosed from a Framingham cage, is inextricably linked to the state of our neighborhoods, jobs, and families. Heart disease, in this conception, is no longer strictly biological; it is cultural and political as well. Improving our social structures and relationships becomes not only a quality-of-life issue but also a public health concern.5

  The harmful cardiovascular effects of chronic arousal apply to traditionally white communities, too. One example is the Whitehall study, also conducted by Marmot, of seventeen thousand male workers in the British civil service. In this study, early death and poor health were found to increase stepwise from the highest to the lowest levels of the civil service hierarchy. Messengers and porters had nearly twice the death rate of higher-ranking administrators, even after accounting for differences in smoking, plasma cholesterol, blood pressure, and alcohol consumption. None of these civil servants were poor, in the usual sense. They all enjoyed clean water, plenty of food, and proper toilet facilities. The main ways they differed were in occupational prestige, job control, and other gradients of the social hierarchy. Marmot and his co-workers concluded that emotional disturbance, arising from financial instability, time pressures, lack of advancement, and a general dearth of autonomy, drives much of the difference in survival. “Both low-grade civil servant and slum dweller lack control over their lives,” Marmot writes. “They do not have the opportunity to lead lives they have reason to value.”