Infectious Madness: The Surprising Science of How We "Catch" Mental Illness

CHAPTER 5

Microbial Culture: Pathogens and the Shaping of Societies

No longer were there individual destinies; only a collective destiny, made of plague and emotions shared by all.

—ALBERT CAMUS, THE PLAGUE

Pathogens dictate more than individuals’ mental health. There is a broader question to ponder: How do microbes influence people’s tendencies to think and act en masse? Microbes shape culture in subtle but powerful ways, and they may trigger everything from exotic mental disorders that affect certain groups to genocide. Understanding the “microbial mind” may even illuminate predilections as subtle as our tastes in wine and perfume.

In 1989, an ophthalmologist approached the middle-aged Cambodian woman in his Long Beach waiting room. She sat silent and unsmiling as she gazed at the distant horizon. Her history read like the script of a horror film: somehow she had endured seeing her husband and son slashed to death in front of her by Pol Pot’s minions, had survived months in a refugee camp, and had finally made her way, with her daughters, to asylum in the United States. Now she was blind, and she was not alone. At least a hundred and fifty women had presented to area ophthalmologists, and as Gretchen Van Boemel of the Doheny Eye Institute researched their cases, she discovered that they shared more than the same mysterious visual problems; they shared the same cruel story.

After seeing their husbands and children killed and being driven from their homes under threat of execution, these newly minted widows had walked hundreds of miles on infrequent morsels and gulps of water in order to save their remaining children’s lives. Those who made it to the fabled safety of refugee camps found cold comfort in the sparse rations and lax security; rapes and muggings were common. Finally, the women reached the haven of America, but as the reality of widowhood, murdered children, isolation, and remembered rapes and assaults set in, a new blow staggered the women who had been torn from their cultures and left to eke out a lonely subsistence on welfare in a foreign land.

They were slipping into darkness.

Something was blinding the Khmer refugees, but ophthalmologists could find no physical reason for their sightlessness.

To professionals, this looked like textbook conversion disorder—the textbook being that of nineteenth-century neurologist Jean-Martin Charcot, who first proposed the conversion of intolerable memories into somatic symptoms that often have their origin in the person’s culture. (Psychiatrists also speak of neodissociation, in which a person loses function but still processes stimuli—in this case, visual stimuli—that influence her behavior, although she is not consciously aware of it.)

Paralysis afflicts people who are conflicted about leaving home, and unexplained blindness strikes women who have witnessed a surfeit of horror. In addition, the slaughter of husbands, homelessness, violence, and exile effectively severed these Cambodian women from their traditional role in Khmer society in which a woman’s sexual virtue, pleasant manner, and serene composure—traits embodied by the female deities immortalized on the temple walls of Angkor Wat—are integral to the family honor. A woman who is widowed, raped, degraded, starved, and driven from her home ceases, in a way, to be a Khmer woman.

Richard Mollica, a Boston psychiatrist who interviewed and studied these women, determined that after witnessing the horrors of genocide, they had willed themselves to become blind. They could bear to see nothing more, even through healthy eyes.

Yet as the women sat quietly in the waiting room, they showed none of the agitation one might expect in the wake of such trauma, now compounded by blindness and the unsettling absence of a clear diagnosis. A hallmark of conversion disorder is la belle indifférence, in which a patient shows an utter lack of concern about her state.1

Scientists are investigating with tools such as functional MRI (fMRI), magnetoencephalography (MEG), SPECT, and transcranial magnetic stimulation (TMS), and recent research shows that specific patterns of brain activity are associated with conversion disorder.2 One theory suggests that conversion is a protective strategy that derives from false body mapping caused by dysfunction in the brain’s hypercomplex circuitry involving the cingulate cortex, insula, thalamus, brainstem nuclei, amygdala, ventromedial prefrontal centers, supplementary motor area, and other key structures. The primary sensory signals of vision, hearing, and touch pass through the thalamus on their way to the cortex, and these striatothalamocortical pathways form part of a feedback loop between the cortex and the basal ganglia, which help govern motor control and motor learning. The motor plan starts in the cortex, is sent to the striatum of the basal ganglia, goes from there to the thalamus, and is relayed back to the cortex. Only then is it sent to the body, hence the cortico-striatothalamocortical pathway. Disruptions anywhere along this pathway, whether due to injury, infection, shock, or other psychosensory input, can cause false body mapping. The affected person loses access to senses such as vision or becomes unable to control parts of her body, as when a conflicted person becomes unable to walk.3

When such conversion symptoms grip many in schools, hospitals, army bases, or other closed communities, it is called mass hysteria.

On a brilliant fall day in October 2011, involuntary twitches and shudders suddenly seized Katie Krautwurst, a healthy, well-adjusted high-school cheerleader in Le Roy, New York. Doctors were baffled, especially when another girl soon exhibited the same symptoms, followed by ten more. By January, a total of nineteen teenage students and one thirty-seven-year-old woman in this small upstate New York town were at the mercy of frequent involuntary movements. These included spasmodic jerking, fainting, and Tourette’s-like twitches and shouts. Many doctors, epidemiologists, and activists, among them Drew Pinsky, better known to TV audiences as Dr. Drew, and Erin Brockovich, descended on Le Roy, a working-class town approximately thirty miles southwest of Rochester. They sought to solve the case of the mysteriously afflicted Le Roy girls, as they were dubbed (despite the fact that a boy and a thirty-seven-year-old woman named Margery Fitzsimmons were also affected). Brockovich spearheaded a search for toxic chemicals, and Pinsky probed the girls’ psyches while the cameras rolled, all to no avail.

An assortment of other epidemiologists and physicians proffered their own theories for newscasts and medical publications. In early 2012 the experts ruled out environmental factors, side effects from drugs and vaccines, trauma, and genetic factors. Pediatric neurologist Rosario Trifiletti from Ramsey, New Jersey, then stepped in to suggest that the tic-ridden individuals were suffering from PANDAS, explaining that infection with Group A streptococci might have caused the girls’ bodies to produce antibodies that injured their nervous systems and led to the Tourette’s-like symptoms. But as Susan Swedo pointed out, the girls’ conditions did not really fit the PANDAS criteria. PANDAS is a rare disorder, which made it unlikely that so many people would be affected within such a short time and in such a limited geographical region. Nor would PANDAS be expected to strike principally girls.

Undaunted, Trifiletti examined the girls and revealed on Dr. Drew’s show that he had found evidence of strep or other PANDAS-associated infection in all nine of the girls he tested. Although he did not know if the levels of antibodies in their blood actually rose—a prime factor in determining disease—he declared that there was enough evidence to start them on antibiotics and anti-inflammatories.

In the end, Swedo was right: no evidence supported a PANDAS diagnosis. Instead, the Le Roy girls were diagnosed with conversion disorder, in which psychological stress causes patients to suffer real bodily symptoms. Epidemiologists concluded that the nation was looking at a case of mass hysteria.

How does psychological stress translate into bodily dysfunction? One theory indicts the amygdala, a region of the brain concerned with fear responses. It is overactive in patients who suffer from conversion disorder, Mark Hallett told the New York Times: “Ordinarily, the amygdala might create psychological distress, but instead, in these cases, it would create an involuntary movement.” But Hallett, a senior investigator at the National Institute of Neurological Disorders and Stroke, added that while the theory was plausible, our knowledge of the mechanisms involved was still “primitive.”4

Culture-bound?

Cambodia is far from the only country racked by displacement and genocide, but the blindness that struck the Khmer women in response to witnessing horrors is a culture-bound mental illness, one that shows why the insights of anthropology are as important as those of psychiatry in understanding mental disorders.

Culture-bound is the term psychiatrists and anthropologists use to describe mental disorders and syndromes whose expression is dependent on cultural factors.5 Some are as dramatic as koro, a powerful panic attack engendered by the strong belief that one’s genitalia are retreating into one’s body. Affected women often believe that their breasts and genitalia are being reabsorbed, and all the afflicted believe that death will ensue. Possession by koro, not to be confused with the infectious kuru of New Guinea Highlanders, is often blamed on a malicious person who has stolen or shrunken the intimate body parts,6 and the belief sometimes spreads, becoming a local obsession. Fifty-six accounts of genital shrinking or theft were reported in West Africa between January 1997 and October 2003, and news media recounted incidents in seven West African countries during 2012 and 2013. Genital-shrinkage anxiety haunts Asia, Europe, and even the United States. In China, it is explained as a reduction of the male yang principle; in West Africa, it is often laid to sorcery. This sounds absurd to contemporary Westerners, but the disorder has not always been a foreign concept; in medieval Europe, it was similarly believed that a man could have his male member stolen by witches.7

Other such disorders include amok, an episode of indiscriminate homicidal rage followed by amnesia of the event. Although it has been appropriated into English as the more benign phrase running amok, the disorder was first described in 1893 by W. Gilmore Ellis, British medical superintendent of the Government Asylum in Singapore, who observed it in Malays. Like koro, amok arises during times of social tension or impending disaster. Pibloktoq, or Arctic hysteria, was first described during Admiral Peary’s visits to Greenland, and a disorder called ataque de nervios was documented by military psychiatrist Fernández-Marina as the Puerto Rican syndrome, although it has recently been found among Hispanic peoples in the United States, including Mexican immigrants. In 1975, the Chinese psychiatrist Hsien Rin first described frigophobia, an excessive fear of becoming cold,8 a disorder also ascribed to an imbalance of male and female elements.

Between 1890 and 1970, many other dramatic mental or behavioral disorders were observed among non-European peoples and classified as culture-bound.9 These ailments were considered exotic, unclassifiable, or unusual by Western psychiatrists who did not always have a good understanding of the cultures they studied, and they labeled the disorders culture-bound because they differed from those in the European and North American patients they were used to treating.

Such ethnocentric classifications reflect the cultural myopia of psychopathology,10 and culture-bound is an inaccurate term because it implies that the behavior occurs in only one culture. Culture-related is a better term, because many of these diseases appear throughout the world, even in the United States. Ataque de nervios—characterized by mental stress and symptoms of nervousness like decreased ability to concentrate, emotional distress, headaches, insomnia, gastric discomfort, vertigo-like sensations, and trembling—was first described as a Puerto Rican disorder, but it is found among Hispanic communities in many U.S. cities. How many cases of North American gun violence could be attributed to amok? Mental disorders appear in different guises in different cultures, and it has sometimes taken time, research, and a shedding of ethnocentrism to recognize this.

As a matter of fact, we in the United States and Europe have had our own seemingly culture-bound diagnoses: nineteenth-century vapors, or fainting spells; shell-shock, the symptoms of which change from war to war; and hysterical paralysis, which is an apt physical metaphor for the distress caused by sharply circumscribed women’s roles. Windigo psychosis derives from a supernatural cannibal figure in Northern Algonquin mythology who can attack humans and transform them into coldhearted cannibals. A Native American seized with fear that he is becoming a Windigo may fantasize about eating others while he is plagued with nausea and unable to tolerate normal food. This may progress to homicide or suicide.

Culture-related mental disorders are not limited to the exotica at which nineteenth-century Westerners marveled; today we recognize that they include variations of familiar diseases like schizophrenia. Far from being exotic and dangerous, what Westerners label as culture-related disorders may actually help sufferers to better navigate life and society with a mental illness. As we shed Western biases in evaluating illnesses, it becomes clearer that culture-related diseases may disguise garden-variety anxiety or depression, or may be the local manifestation of illnesses such as schizophrenia, as in the case of ataque de nervios.11

Anthropologist Janis Jenkins has studied nervios among Mexican American families and observes that it is popularly used to describe a condition that would be diagnosed as schizophrenia in the West. In these cases, nervios softens the clinical picture of schizophrenia in ways that make it easier for the sick person to remain integrated in society and his family. Its recasting of symptoms also offers the mentally ill a less dire prognosis. Nervios is viewed as temporary; one may recover from it. It is also considered a disorder of sensitivity, overreaction, or an exaggerated startle response, not a psychotic derangement where people may be controlled by voices, unable to discern reality. Jenkins explains the important cultural function of this alternative diagnosis:

Use of the term nervios affords a cultural protection not offered by other terms for mental illness, which are considerably more threatening. In their study of schizophrenia among Puerto Ricans, Rogler and Hollingshead (1965) reported that both relatives and the afflicted individual go to great lengths to consider the problem as one of nervios rather than locura (craziness). As we have seen, Mexican-Americans also prefer the term nervios. This was particularly the case when relatives were offered a specific choice between use of this term and that of mental illness.12

Invoking nervios, Jenkins says, also helps cement family support by minimizing differences between the sick person and healthy family members. The term helps the patient as well, implying that his condition is temporary, whereas the words schizophrenic and loco connote a permanent, incurable state.13 In this sense, says Jenkins, the culture-related diagnosis offers schizophrenics a more benign social identity, one that eases their integration into families and society and carries a better prognosis, creating the expectation of recovery. Similarly, a study of 1,031 rural African Americans, a population hit hard by diabetes, found that the patients often referred to their condition as “sugar.” Thirty-one percent of subjects who had answered yes when asked whether they had sugar later answered no to a survey question asking whether they had diabetes. Subjects who believed they had sugar felt their condition was less serious than those who said they had diabetes.14

Anorexia, too, has long fallen under the rubric of a culture-related disorder. Until recently, it was perceived as a disease of middle- and upper-class WASP adolescent girls who, threatened by their incipient sexuality or a distorted body image, developed an obsession with being slim and avoided eating. A better understanding of how to recognize and approach anorexia is critically needed; its mortality rate is as high as 18 percent.15 Anthropologists such as Caroline Giles Banks now recognize that anorexia is far more widespread than most of us think, affecting people from many cultures.

But, explains Banks, cultural rationales differ. In some countries, anorectics are likely to refuse food for religious reasons rather than out of secular ideals of slimness, a practice that harks back to some medieval nuns who took pride in refusing to eat.

In other areas, anorexics report feeling too full or “bloated” to eat.16 In fact, Banks described the cases of two American women from the Minneapolis–Saint Paul area who explained anorexia not in terms of the ideal of thinness but in religious idioms and symbols.17 She reminds us that the United States itself contains many subcultures and that “anorexia nervosa’s designation as a syndrome limited to Western cultures or to those cultures influenced by them may reflect unexamined assumptions on the part of researchers that dieting and secular ideals of slimness are primarily involved in the disorder.”

What does this augur for the PANDAS/PANS theory of infectious anorexia? In cases of anorexia that seem to be founded in religious asceticism or some other cultural basis, can we rule out infection as the cause? As Banks points out, “While these symptoms are related in complex ways to biological dysfunctions caused by starvation and weight loss and may be, in part, unconsciously motivated… the anorectic consciously understands and gives meaning to her symptoms using culturally explicit and objective symbols, beliefs and language.” So while group A streptococcal (GAS) infection may be the physiological substrate for PANDAS anorexia, the affected person may impose a meaning on it that reflects her culture and belief systems.

Infectious mental diseases can also be culture-bound diseases, and anorexia is not the only example: kuru is an incurable disease of the human nervous system, often heralded by arm and leg pain, severe coordination issues, balance problems, difficulty walking, involuntary muscle spasms, tremors, and jerking. It causes rapid mental deterioration including emotional lability—the diseased person might, for example, succumb to deep depression that is abruptly supplanted by inappropriate and uncontrollable laughter. Dementia sets in, rendering the sufferer unable to speak or otherwise communicate, and people with kuru become placid and unresponsive to their surroundings.18 Frequent headaches are common, as are swallowing difficulties that become so severe that the person is eventually unable to feed herself. Kuru is a human analogue of scrapie in sheep and of bovine spongiform encephalopathy, or mad cow disease, in cattle.

Kuru was first diagnosed among New Guinea’s Fore people. Because men appropriated the pigs they hunted, the women and children supplemented their own diets by practicing a form of religious ritual cannibalism that involved eating the bodies, and especially the brains, of recently deceased loved ones.19 Unfortunately, the prions that cause kuru are concentrated in the brain and nervous tissues, so 90 percent of the women in the area and children of both genders contracted the disease, but the adult men were largely spared. The Fore abandoned cannibalism in the 1960s, but the disease has a long incubation period, so as Robert Klitzman, director of Columbia University’s Center for Bioethics, recalled during a 2015 telephone interview, “When I went back in 1997, cases were still appearing among men and women in their late thirties and forties.” Kuru has been diagnosed as much as fifty years after the exposure to the pathogen.

Kuru has been regarded as a culture-related disease affecting the Fore people, despite the fact that quite similar prions cause Creutzfeldt-Jakob disease, or CJD, the clinically related disease that killed famed choreographer George Balanchine, as will be discussed in chapter 7. After years of illness, Balanchine died on April 30, 1983, and when his brain was autopsied, “chemical stains were added to some [slices of tissue] to help detect the pattern of appearance of certain brain cells and abnormalities, particularly the kuru plaques,” reported Lawrence Altman in the New York Times.20 Although Robert Sapolsky of Stanford University points out that the plaques of kuru and CJD are different, the kinship between the two diseases suggests that kuru is infectious despite the fact that it is culture-related.

Culture is important in determining not only which mental disorder exists but whether a mental illness exists at all, because behaviors can’t be evaluated in a vacuum. A woman who eagerly feasts on human brains in the New Guinea Highlands of 1970 is participating in a ritual act that is meaningful and normal within her gender and culture. A woman who insists on ordering human brains at a SoHo McDonald’s is likely to be regarded as having a mental disorder.

Genocide, an infectious madness

I’ve argued above that individuals suffer from mental disorders that may be both infectious and culture-related. But can societies suffer from such disorders as well? On July 9, 2011, the Republic of South Sudan emerged as the world’s newest country, even as its government warred with armed ethnic groups within nine of its ten states. The armed clashes continue, hundreds have died, and tens of thousands of people have been displaced. In 2014, aid workers discovered fresh mass graves in this three-year-old country.

South Sudan has had plenty of company. Just in the past few decades, we’ve witnessed the Bosnian War of the early 1990s, Rwanda’s 1994 ethnic genocide, and cars set afire by disaffected Parisians of African descent. Forty-five years after the Holocaust, neo-Nazi violence against immigrants soared in the wake of German reunification, and Germans have driven ethnic Turks from the country in droves. And the Middle East, of course, has long been synonymous with ethnic warfare.

Then there is the dizzying array of violent racial and ethnic attacks in the multiethnic U.S. “melting pot,” from the slaughter of Native Americans to the kidnapping, torture, rape, murder, and violent revolts that characterized enslavement. This was followed by Italophobia, Hibernophobia (against the Irish), internment of U.S. citizens of Japanese origins, persistent anti-Semitism, racial segregation sanctioned by law, murders of rights workers, and burned churches and synagogues during the civil rights era and beyond. From the Ku Klux Klan to the Symbionese Liberation Army, the United States seems to have been intent on proving Black Panther H. Rap Brown’s maxim: “Violence is as American as cherry pie.”

In the struggle to understand the human penchant for racial and ethnic violence, academics have chased multifactorial social, political, and economic theories, few of which have helped stem ethnic and racial murder. Over the past half-century, some have even resorted to medical explanations. In the wake of racist civil rights–era murders, Harvard Medical School professor Alvin F. Poussaint suggested that the extreme racism that leads to murder and other acts of violence ought to be classified as a mental disorder. But American Psychiatric Association officials rejected his suggestion, arguing that U.S. racial and ethnic violence is so common that it constitutes normative behavior. The APA characterized even extreme racial violence as a “cultural problem,” not a psychiatric one.21

“To continue perceiving extreme racism as normative and not pathologic is to lend it legitimacy,” Poussaint wrote in response, adding:

Clearly, anyone who scapegoats a whole group of people and seeks to eliminate them to resolve his or her internal conflicts meets criteria for a delusional disorder, a major psychiatric illness…. Extreme racist delusions can also occur as a major symptom in other psychotic disorders, such as schizophrenia and bipolar disorder. Persons suffering delusions usually have serious social dysfunction that impairs their ability to work with others and maintain employment.22

The APA invoked culture as an alternative to psychiatry, but the two are not mutually exclusive; in fact, they are inextricably bound. In the biological sphere, culture refers to microbes coddled in an artificial medium where they are carefully tended under conditions favorable to growth. The broadest understanding of culture couldn’t be farther from this definition: a society’s shared beliefs and behavior—including their expression via symbols—that are pointedly not a result of biological inheritance. But both definitions are central to understanding how microbial culture has helped shape human cultures.

In the 1990s, gun-violence expert Dr. David Hemenway, a professor at the Harvard School of Public Health, determined that people living and working near gun owners begin to acquire, or at least to covet, guns themselves, and that children of gun owners grow up to become gun owners, so that gun ownership spreads through a household and community in the same way the flu does, leaving debility in its wake.23 A single gun eventually transforms a community into an armed neighborhood.24 Hemenway’s infection model of gun violence helps explain why the United States has more guns per capita than any other developed nation, and why nearly half of American men own firearms.

Hemenway has long clarified the vision of violence by following data without regard for conventional wisdom. His investigations revealed that a gun kept in the household “for protection” was forty-seven times more likely to kill an occupant of the home than an intruder. He found that whites are more likely to own guns than blacks, Republicans more likely to own them than Democrats, and conservatives most likely of all to own them. He found that the widely recommended gun-safety training programs for owners are associated with poorer storage habits; people who complete these classes are more likely than others to store their guns loaded and outside of lockable storage cabinets. And he found that most gun owners live in the suburbs and exurbs, not in cities.

The infection/contagion model sounds plausible because we can easily see that those who live in violent environments become inured to violence and prove likely to engage in aggression against others, especially outsiders. Whether these foreign elements consist of rival gang members, ATF agents, or members of hated religious or ethnic groups, violence against them becomes more common, just as the APA suggested.

But Hemenway’s thesis also echoes Poussaint’s claim by conceptualizing violence as a sickness, something that infects and destroys a healthy community. A National Academies study considers such violence pathologic and far from normal. Instead, it compares exposure to violence with exposure to HIV, tuberculosis, or cholera. Acts of violence are germs that target the mind rather than the intestines or lungs.

John Laub, a professor of criminology at Northeastern University, proposes a similar biological metaphor. Laub suggests that when children and young adults, whose still-developing brains possess great plasticity, repeatedly experience or witness violence, their neurologic functioning becomes deranged. He told the New York Times that “acts of violence lead to further acts of violence, creating a contagion effect and a sudden jump in crime rates that is hard to explain.”25

A recent report that interrogated racial bias in U.S. imprisonment was headlined “Is Prison Contagious?”

Incarceration in the United States is frequently described as an epidemic, with per capita rates nearly quadrupling in the past 30 years. African-Americans appear to be particularly susceptible: In 2011, they were six times more likely than whites to be incarcerated, making up 38 percent of the 1.6 million Americans behind bars while accounting for only 13 percent of the U.S. population.26

Infection and contagion are not synonyms; infections are caused directly by agents such as bacteria, fungi, or viruses, and contagion refers to the spread of disease from one person to others by close proximity or touch. But some illnesses are both infectious and contagious. For example, the flu is caused by a virus and is spread to others through touch, coughing, and sneezing.

In 2012, a 153-page National Academies report entitled The Contagion of Violence27 summarized research describing similarities between the spread of violence and classic infectious-disease models. The report described acts of racially targeted violence as germs that target not the intestines or lungs but the brain. It documented the tendency of violent acts to cluster, to spread predictably from one place to another, and to mutate from one kind to another, mimicking the spread of a viral or bacterial infection. Just as agents or vectors initiate a specific biological pathway leading to symptoms of disease, the report proposed possible mechanisms that govern the transmission of violence and suggested how the contagion might be interrupted. For example, one contributor, Gary Slutkin, told Wired journalist Brandon Keim that “the density maps of shootings in Kansas City or New York or Detroit look like cholera case maps from Bangladesh.”
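
To make the report’s comparison concrete, here is a minimal sketch, in Python, of the kind of “classic infectious-disease model” such work draws on: a susceptible-infected-recovered, or SIR, simulation. The population size, transmission rate, and recovery rate below are arbitrary illustrative assumptions, not figures from the report, and the sketch shows only the general shape of epidemic spread.

# A minimal SIR (susceptible-infected-recovered) simulation, the textbook template
# for the "classic infectious-disease models" discussed above. All parameter values
# are illustrative assumptions, not data from the report.

def simulate_sir(population=10_000, initially_infected=10,
                 transmission_rate=0.3, recovery_rate=0.1, days=160):
    """Return a list of daily (susceptible, infected, recovered) counts."""
    s = float(population - initially_infected)  # people still at risk
    i = float(initially_infected)               # people currently "infected"
    r = 0.0                                     # people no longer transmitting
    history = []
    for _ in range(days):
        new_infections = transmission_rate * s * i / population
        new_recoveries = recovery_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

if __name__ == "__main__":
    for day, (s, i, r) in enumerate(simulate_sir()):
        if day % 20 == 0:  # print a sample of days
            print(f"day {day:3d}: susceptible={s:7.0f} infected={i:7.0f} recovered={r:7.0f}")

Run with these toy numbers, the simulation traces the familiar epidemic curve: a slow start, a steep rise, a peak, and a decline as the pool of susceptible people is exhausted, the same clustered, self-propagating pattern that researchers such as Slutkin say maps of shootings resemble.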

As the contagion model achieved critical mass, Wired asked, “Is It Time to Treat Violence Like a Contagious Disease?”28 But this is the wrong question. Although scientists like Hemenway and Slutkin are proposing a metaphor of contagion, compelling recent research suggests that ethnic violence is not merely like a contagious disease. Instead, such aggression is the result of real physical, not metaphorical, infections or, more precisely, of our frenzied attempts to heuristically identify the signs of infection and thereby avoid them. We’re not very good at it, and we end up with a lot of collateral damage.

Beyond mental disease

Microbes may shape not only frank disorders, but behaviors that are common to cultures. Whether we are xenophobes or xenophiles, belligerents or pacifists, conservatives or liberals, microbes are, as usual, pulling strings behind the scenes to help make us who we are. Evolutionary psychologist Mark Schaller suggests that microbes are responsible for what he has dubbed “protective prejudice,” a suite of inborn thoughts and behaviors we have evolved in order to recognize and evade potential pathogens. Schaller, a professor of psychology at the University of British Columbia, calls this the “behavioral immune system.”29

The regular immune system usually does a good job of routing invaders, he explains, but its efficiency in preventing disease is limited by the fact that by the time it acts, the microbial invaders have already breached our physical defenses, forcing us to expend energy and time neutralizing and evicting them. While we do so, sickness often prostrates us and even causes mental-health symptoms, however transitory. “If we can use our senses to detect infection risk—and then do something that prevents us from coming into contact with such threats—that holds tremendous advantages,” says Schaller.

A 2010 study by social psychologist Chad R. Mortensen found that subjects who were shown images of sick people were quick to make “avoidant” arm movements in a computer game. They mimed pushing characters away, as if warding off a threat. Another study by his team at the Metropolitan State University of Denver revealed that participants who were shown discomforting images and given other information about infectious diseases rated themselves as less sociable than those in the control group did, essentially finding an excuse to avoid other people—and their germs. In yet another study, people shown illness-related images were more likely to express negative attitudes about foreigners. This unconscious avoidance reaction plays a driving role in ugly prejudices against anyone perceived as different, from those with different skin color to the obese to the disabled.

Worrying about parasitic infection correlates with anti-immigrant attitudes, and such biases are heightened at times when people feel more vulnerable to infection. For example, a study led by Carlos Navarrete of Michigan State University found that women tend to be more xenophobic during the first trimester of pregnancy, when the immune system is suppressed in order to protect the fetus from attack. By contrast, just after someone gets a flu shot, he or she feels protected from disease, and xenophobia decreases.

In an e-mail to the author, Robert Sapolsky noted that the literature also shows how “social conservatives are more concerned with personal hygiene, have lower thresholds for gag reflexes, and are more easily disgusted, than social progressives. And related to that, put people of all sorts of political stripes [in a room], have them fill out a questionnaire about various hot-button issues, and if there’s a foul, smelly garbage can in the corner of the room, people become more socially conservative.”

The legacy of protective prejudice is not all negative; according to evolutionary psychologist Ilan Shrira, author of “Guns, Germs, and Stealing: Exploring the Link Between Infectious Disease and Crime,” “Pathogen threats strengthen in-group affiliation and solidarity (e.g., ethnocentrism, closeness to family), which creates a supportive network should someone in the group become sick.”30

Steven Pinker’s popular book The Better Angels of Our Nature: Why Violence Has Declined seeks to reassure us that mankind has enjoyed a dramatic reduction in violence over the ages. But even if he is right, the killing, rape, and torture of outsiders remains frighteningly common. This fact leads some scientists to ask whether humans might be biologically impelled to shun, drive off, or kill strangers or anyone who appears different. Such musings often hinge on political speculation or tortured data, and they typically involve some theory of a brain irrevocably hardwired by evolutionary forces to persecute outsiders.

This carries the whiff of something repugnant. The supposition that humans are immutably hardwired for xenophobia or frank racism implies that people cannot be held accountable for genocide or xenophobia, or, worse, that these are actual biological imperatives, not only beyond our control but also murkily sanctioned by the wisdom of evolution and the body; by “natural law.”

Overreacting to a wide variety of strangers’ germs and parasites seems at first glance adaptive, because the evolutionary price of infection by a pathogen against which you have no immunity is high. Just ask the millions of Native Americans who succumbed to European colonists’ colds and smallpox, or the hundreds of thousands of nineteenth-century European soldiers who died of unfamiliar tropical diseases in the West African “White Man’s Grave.” Such diseases are bad for you, your community, and your future progeny, so your behavioral immune system decides “better safe than sorry” and impels you to avoid or eliminate strangers who might be carrying unfamiliar bugs.

But just as our species’ humoral immune system frequently overreacts, triggering everything from hay fever to autoimmune disorders, our behavioral immune system also overreacts, attacking unfamiliar people who might be carrying dangerous pathogens. The body’s evolutionary adaptations, and even evolution, often get it wrong and lead us to target groups and individuals who pose no threat.

The reason for such mistakes is that humans, unlike many other animals, are simply unequipped to distinguish infectious individuals from healthy ones with any degree of accuracy. Ants, Caribbean spiny lobsters, and bullfrog tadpoles can “sniff out” and avoid infected individuals that pose harm to their communities. Yale evolutionary biologist David Skelly has shown that healthy tadpoles appear able to smell chemicals associated with sick tadpoles. “When presented with an infected bullfrog tadpole,” Skelly says, “the [healthy] tadpoles moved up to a foot away.” Skelly went on to explain that many prey animals can change their behavior and even their body shapes when they smell predators nearby.31 But humans have no built-in mechanism to differentiate infected people from well ones. Outside the laboratory, there are few reliable clues to pathogens, so we rely on indirect clues that suggest taint.

A person whose skin is riddled with pustules, bumps, or lesions may well be a victim of an infection, but we also tend to shun people whose skin is merely a different color than our own. It’s true that people who travel to or hail from places where unfamiliar microbes live, whose sexual norms could change the sort or number of viruses they (and you) are exposed to, or who practice different kinds or levels of hygiene may be likely to harbor germs that you might acquire if you allow them to hang around. And conversely, they can acquire germs from you.

But in addition to fearing that giving strangers the benefit of the microbial doubt might prove deadly, we also bristle at outsider behaviors that often are wholly unrelated to infection. Speech, dress, foods, cooking methods, and even pets that mark outsiders are taken as shorthand signals that they might be pathogenic threats. According to the protective-prejudice theory, our fear of “the other” owes something to our fear of infection. Because we cannot accurately determine biological threats, the cost of xenophobia may well outweigh the speculative benefits of avoidance.

Studies of countries racked by ethnic warfare provide strong evidence that using an infection model to describe ethnic violence is more than a metaphor. Xenophobia is an efficient incubator of genocide, says Randy Thornhill, who found that disease is the best predictor of ethnic violence rates worldwide, a better predictor than poverty or income inequality. The more disease a country harbors, the more likely ethnic violence is. The official death toll of Rwanda’s most recent ethnic genocide, in 1994, characterized by the mass slaughter of Tutsis by Hutus, hovers between 500,000 and 1,000,000, yet no biological difference between the groups that might pose an infectious threat has been found. The lowest estimated death toll in the 1992–1995 Bosnian War is 104,732. Both of the above figures include only those slaughtered outright, not those who disappeared or were raped, starved, or exiled.32 “If you get high levels of xenophobia,” says Thornhill, “then one group feels so negatively about another group that they want to kill them. So you get more large-scale violence like clan wars in regions with high parasite stress.”33

Thornhill calls this phenomenon—which explains why some societies are more bellicose than others—a “parasite-stress theory of sociality.” He theorizes that where harmful microbes abound, we find xenophobes who embrace ethnocentrism as a strategy for avoiding disease. Intergroup cooperation tends to increase resources, so ethnocentric cultures that erect barriers to such cooperation greatly impoverish their environments, and that, in addition to the naturally impoverishing effects of disease, sabotages economic growth. To acquire needed resources, he says, “they are more likely to resort to violent conflict.”34 Global violence rates correlate with infection more strongly than with any other variable.

The correlation holds true within the United States as well. A 2013 study by Ilan Shrira of Loyola University used data from the Federal Bureau of Investigation’s 2009 Uniform Crime Reports to determine whether infection could be tied to changes in crime rates. Comparing that information with the Centers for Disease Control and Prevention’s National Notifiable Diseases Surveillance System data revealed that rates of stranger homicide rise in areas with rising infection rates, but killings that target family members or acquaintances do not. Such correlations of stranger violence and infection do not prove that infection causes the violence, but they support the theory that a fear of the other leads to violent crime. “Under persistent disease threat,” says Shrira, “xenophobia increases and people constrict social interactions to known in-group members. Though these responses reduce disease transmission, they can generate favorable crime conditions in two ways. First, xenophobia reduces inhibitions against harming and exploiting out-group members. Second, segregation into in-group factions erodes people’s concern for the welfare of their community and weakens the collective ability to prevent crime.”35

So we trade possible disease protection for certain community erosion, war, genocide, and wholesale death. However, despite our impulse to xenophobia, we remain the only species capable of using our intellect to understand and trump biological urges that we recognize as unfair or ultimately harmful. The human behavioral immune system operates on a higher cognitive level than that of any other species, and we should respect it.

But instead, we stick to our fallible prejudice-based method, as if preventing exposure to strange infections boils down to avoiding strangers or, more likely, driving them away. In fact, anyone whose behavior increases the odds of acquiring different microbes risks being ostracized, a fate that can be more deadly than we realize. Social psychologist Kipling D. Williams of Purdue University and his colleague Lisa Zadro found that, lacking resources and no longer enjoying the protection and social sustenance of their group, the shunned “lag behind, become decimated, and eventually die through malnutrition or from attack.” In short, “Animals who are ostracized inevitably face an early death,” and the same is true for people. “Although some humans ostracized by all groups have survived as hermits, the infrequency of such occurrences suggests that for humans also, ostracism threatens survival. And if not a threat to the individual, it is certainly a threat to the continuance of their genetic line.” 36

And we must keep in mind that for outsiders, shunning is at the benign end of the spectrum. In The Nature of Prejudice, Gordon Allport, a founding figure of the psychology of personality, describes a classic five-point scale of increasingly dangerous aggressions toward marginalized groups: verbal hostility, avoidance, active discrimination, physical attacks, and, finally, extermination via lynchings, massacres, and genocide, a progression that fits neatly within descriptions of delusional behavior.37

Fear, an infectious weapon

The fear of infection is a handy genocidal tool. Proselytizers of genocide are quick to inflame and floridly capitalize on such fears. The Third Reich’s propaganda machine manipulated this fear of stranger infection, cloaking its racial hatred of Jews, Poles, and Afro-Germans in the language and imagery of infection. Such non-Aryans, Nazis claimed, threatened an (imaginary) natural order and so sabotaged the nation’s purity and vigor. This purity was habitually couched in terms of biology, as when Rudolf Hess brayed in 1934 that “National Socialism is nothing but applied biology.” More specifically, the Reich invoked the biological concept of infection to achieve Gleichschaltung, or “setting things in order.” This was the “natural” biological order of things, to be achieved by cleansing the state of parasites, the inferior people accused of sapping the health, resources, and vigor of “true Germans.”

One ominous image that adorned propaganda posters embodied the concept of Krankheitserreger, or pathogens, and depicted Jews, Communists, and gays as bacteria, symbolized by small Stars of David, hammer-and-sickle icons, and pink triangles spotlighted in the field of a microscope. Polish Jews who had been forced into ghettos with inadequate space and hygienic services were decried as vectors of infection when the inevitable typhus and cholera epidemics set in:

In the 1940 National Socialist propaganda film Der Ewige Jude (The Eternal Jew), rats teem while the voice-over reports that “where rats appear, they bring annihilation to the land… [rats] spread disease, plague, leprosy, typhus, cholera, dysentery, etc…. just as Jews do among the people.” Hitler not only referred to Jews as “bacilli” but also as “viruses” and “parasites,” and he painted the Jewish population of the Soviet Union as a Pestherd (plague focus). Heinrich Himmler, in a speech to SS officers in Poznań (then the German city of Posen) in 1943, made plain the equation of Jews with bacteria: “In the end,” he declaimed, “as we exterminate the bacillus, we wouldn’t want to become sick with it and die [ourselves].”38


This National Socialist propaganda poster depicts Jews, homosexuals, and others as pathogens and threats to the health of German society.

A year after Hess equated Nazism with biological imperatives, Reichsbauernführer Richard Walther Darré declared, “As a Rhinelander, I demand: sterilization for all mulattoes with whom we were saddled by the black shame on the Rhine.”39 He was speaking of the African colonial soldiers stationed in the Rhineland by France, many of whom had taken German wives and lovers. German Hereditary Health Courts judged the reproductive fitness of most persons on a case-by-case basis, but for black Germans and Afro-German children, visual or verbal evidence of African ancestry was enough to justify immediate secret sterilization in on-site clinics under Special Commission No. 3, which was established by Eugen Fischer in 1937. Frankfurt health office records for June 19, 1937, reveal a chilling example:

The German citizen Josef Feck, born on 26 September 1920 and residing in Mainz is a descendant of the former colonial occupation troops (North Africa) and distinctly displays the corresponding anthropological characteristics. For that reason he is to be sterilized. His mother consents to the sterilization.40

Today the Stormfront site, run by an assortment of virulent racists who admire National Socialism, reproduces Hess’s aphorism and screams “Expel the parasite!” as it makes its case for the extermination of African Americans.41 In the 1994 Rwandan genocide, the Tutsis, like Jews in the 1930s, were dehumanized as cockroaches, rats, and vermin by those who were busily engaged in ethnic cleansing to “exterminate” them, another common strategy for identifying them as vectors of disease.42

Irish journalist Fergal Keane, who witnessed the 1994 genocide, wrote, “Tens of thousands became infected—and I can think of no other word that can describe the condition—by an anti-Tutsi psychosis.”43 Ibrahim Omer of California State University44 determined that “genetic studies suggest that the Hutu and Tutsi of today are hardly distinguishable,” but this finding has done nothing to dampen the demonizing so essential to genocide. Thus, protective prejudice, in its extremes, is far more than a historical concern, especially when it is deployed to stir up ethnic animosities.

Research by scientists like Thornhill reveals that microbes dictate more than crude impulses toward xenophobia and ethnic violence. Pathogens are also responsible for subtler aspects of culture, from social traits to politics. Research in the Journal of Personality and Social Psychology holds that in areas where disease is prevalent, people tend to be less extroverted.

The idea that extroversion and collectivism are national traits has prevailed for more than four decades, bolstered by the work of Dutch social psychologist Geert Hofstede. In the 1970s, Hofstede investigated cultural differences in sixty-four countries that were home to national subsidiaries of IBM, where he once worked. To aid his research, Hofstede, now a professor emeritus at the University of Maastricht, devised a model of cultural dimensions, a scale that measures, among other things, national characteristics of individualism or collectivism—in other words, whether people think of themselves as individuals primarily responsible for their own advancement, or as members of a social institution like a family, workplace, or society. Using this rubric, Thornhill found that nations that are heavily plagued by infectious disease, such as Colombia and Somalia, tend to favor collectivism over individualism. The United States has ranked highest in the world on the Hofstede scale for individualism, but within our culturally heterogeneous nation, collectivist areas stand out dramatically. Louisiana, South Carolina, and Alabama share high rates of infectious disease and a strong culture of collectivism marked by religiosity and an emphasis on clan ties. “You need a social network of reliable people in your group who will help you through the onslaught of disease,” Thornhill told Psychology Today in explaining his findings. “That’s the only health insurance that human evolutionary ancestors had.”45

The individualism embraced by most citizens of the United States is not an inherently superior aspect of culture, nor is it better for mental health; in fact, nearly two decades of WHO studies indicate that schizophrenics living in the collectivist nations of the developing world enjoy a better prognosis, which argues against the virtues of individualism, at least as regards schizophrenia. But collectivism is associated with the particular risk factor of infection-mediated violence. Afghanistan also has both high disease rates and a collectivist worldview marked by xenophobia and clannishness. It shares yet another social characteristic with similarly infectious areas: its people are philopatric, from the Greek words philo (“love”) and patra (“country”), a term scientists use for animals, including humans, who do not leave their birthplace.

The apparent logic of protective prejudice is hobbled by the fact that we cannot accurately determine infection threats; this means that the cost in genocide and warfare may eclipse the medical benefits of pathogen avoidance. Far from being slaves to our fear and disgust, we can apply reason to develop better ways of taming infectious threats, real and imagined, from strangers.

On an individual level, each of us can learn to overcome disgust, just as physicians and nurses quickly learn to do. We can learn to discard false fears of infection, as people did in the AIDS pandemic once they learned that while sex with an HIV-infected person without adequate precautions was risky, it was perfectly safe to work alongside, share a meal with, or hug someone with AIDS. Until these lessons were learned, the shunning, exile, “social death,” job discrimination, and violence against the HIV-infected were open and frequent. In other words, the “hardwired” human biases are in fact as adaptable as microbes are, and as amenable to change. In addition, on a community or even a global level, the cost of prevention and treatment can be lower than the costs of wars, genocides, and bias-fueled violence. As disease rates plummet in response to this more reasoned approach to exposure risk, the rate of biases toward strangers should plummet too.

Not every microbial tweaking of human behavior and desires is pathological or weighty. Evidence is emerging that bacteria and viruses can fine-tune our appetites in a lighter vein as well.

Cat got your tongue?

There’s no accounting for tastes, the cliché declares, but Stanford neuroscientist Patrick House might disagree. His work suggests that the subtle cultural influences of microbes may inform your tastes in wine, scent, and the gourmet Arabica in your coffee cup. Despite the self-congratulatory air of gustatory discussions that invoke le goût friand (and the heavy purse) of the gourmet, we may owe at least some of our refined tastes to a zoonotic infection.

What, for example, do Chanel No. 5, $350-a-pound coffee, and the elegant sauvignon blancs we crave have in common with jaywalkers, seductresses, and schizophrenics?

Not to put too fine a point on it: cat pee.

Consider that pricey java. Throughout the Indonesian archipelago, sharp-eyed promoters have underwritten extensive industrialized farming that capitalizes on an addiction once reserved for the very rich. Within endless rows of battery cages, Asian palm civet cats (Paradoxurus hermaphroditus) are force-fed one of their favorite foods, the coffee cherry. A day and a half later, workers reverently collect the “black gold” that these catlike creatures deposit on trays installed beneath their cages. This culinary trophy is destined for the cups of the rich around the world.

Despite the aureate euphemism, this harvest looks exactly like what it is: pinkie-size logs of coffee beans bound by dark excrement. Once rinsed, aged, and roasted, these beans yield a gourmet brew that you won’t find chalked up on any Starbucks menu. This is kopi luwak (the Indonesian words for “coffee” and “civet”), and it fetches $30 to $65 a cup, or as much as $350 a pound—about a quarter of the price of gold but as eagerly sought after.

Fool’s gold, say some. Aficionados insist that the beans yield a brew that is “richer, sweeter, and smokier than any other bean in the world,” thanks to their sojourn through the civet’s digestive tract. This sublimity, they explain, results from the luwak’s discernment, as it selects only the finest coffee cherries, and also from the fermentation in its digestive apparatus, during which proteolytic enzymes free up amino acids that impart that irresistibly distinctive quality to the final brew.

But professional cuppers, those elite noses of the coffee world, often disagree. Many describe the taste as thin or nondescript and dismiss kopi luwak as gustatory bling driven by trend, not taste. A few critics add that the brew’s quality has plummeted, pointing out that its superiority has long been ascribed to the free-roaming civet cat’s talent for choosing only the finest coffee cherries for its dinner, while today’s farms exercise no such discretion. Still other connoisseurs flatly dismiss the taste as tainted, moldy, and frankly fecal. In 1995, such professional skepticism earned kopi distributor J. Martinez and Company of Atlanta the loudest of critical raspberries, the Ig Nobel Prize.

All of which has done nothing to tame the cravings of devotees.

Moreover, snobbery alone can no longer explain the attraction, because kopi distributors now flirt with the mainstream. A down-to-earth marketer with thirty-three thousand likes on Facebook has promoted its wares to everyday folk, sans gourmet pretensions, as “cat’s ass coffee.” Its ads crow, “That’s some good shit.”46 Now middle-class devotees join in the praise of the beans’ alluring aroma as they acknowledge their luwak addictions. Taste, of course, relies heavily on scent, especially in aromatic fare like coffee.

These addictions may be more than metaphorical. What separates kopi luwak aficionados from its detractors may go beyond a slatternly palate and a slavish adherence to foodie fashion. Instead, an infection by the unicellular parasite Toxoplasma gondii may drive an irresistible attraction to the feline aroma in the beans.

The parasitology of desire

This book has presented the evidence for T. gondii’s causal ties to schizophrenia and suicide, but more than mental illness is laid at toxoplasma’s door. A few hundred miles from the kopi luwak farms, Ajai Vyas of Nanyang Technological University in Singapore found evidence that toxoplasma manipulates its hosts sexually when it causes infected male rats to produce extra testosterone, enhancing their attractiveness to females. When they mate, males spread the parasite to their partners.

By increasing testosterone, toxoplasma also dampens fear responses, and infected rats may lose concern for their safety when they pick up the scent of a cat. At Stanford University, the group of Robert Sapolsky, professor of biology and neurology, found that infection transformed the responses of brain regions governing both fear and sexual attraction to cat odors, and that, “somehow, this damn parasite knows how to make cat urine smell sexually arousing to rodents, and they go and check it out. Totally amazing.”47

Although most of us long for escape upon entering a home that’s redolent of felines, the power of T. gondii to make the scent of cat urine attractive may explain the appeal of kopi luwak: the 50 percent of the world’s population that is already infected may be drawn to the feline scent in the beans. Although the high heat of roasting coffee beans should kill T. gondii, workers who sort and handle the roasted beans with ungloved hands and an indifferent approach to hygiene may ensure that the infection moves on to its previously uninfected human consumers.

And a connoisseur is born.

Some cat lovers go right to the source to revel in their pets’ perfume, confessing on their websites that they cannot stop smelling their pets’ fur. Some go so far as to specify the attractive scent of their cats’ rear ends. Of course, only a small minority of cat owners seem to fall into the latter category. But why would anyone?

Czech scientists gave us a clue when they distributed towels imbued with the scents of various mammals, including cats, dogs, and horses, and then asked subjects to rate the smells for pleasantness. Most men who rated cat urine pleasant tested positive for toxoplasma. Just as the parasite evokes fatal attraction in an infected mouse, it can awaken irresistible desire in infected people because the same pathways and the same neurotransmitters, such as dopamine, govern the behavior of both humans and rodents.

The parasite transforms everyday behavior and, according to the research of Sapolsky and others, people’s personalities. Once infected, the formerly cautious, light-averse mice swagger fearlessly into dangerous feline territory, and infected humans, even formerly cautious ones, tend to become thrill-seekers.

Unlike household rodents, First World urbanites have few feline predators to fear, but they do face hazardous traffic, and for scientists, roadways provide the behavioral litmus test. Four large Czech and Turkish studies have found that the infected consistently take unnecessary chances on the road, both as pedestrians and behind the wheel. Infected drivers are two and a half times more likely than others to have traffic accidents.48

Eastern European researchers have found even more subtle personality changes: Infected men tend to be introverted and suspicious as well as oblivious to other people’s opinions of them, which makes them indifferent dressers who are inclined to solitude. This would not seem to bode well for the parasite’s future, as reticent loners are generally unlikely to engage in the sort of intimate social activities, like sex, that facilitate its spread. However, these men also have elevated testosterone levels, and women who are shown their photos rate them as more masculine than uninfected men. Why should this be so? Infection may well change the men’s appearance because T. gondii affects their grooming behavior and their dress. For example, a man who stops shaving daily and sports stubble might be perceived as more masculine, male, or attractive. He may eschew his usual suit for more casual body-conscious gear like T-shirts or sweaters.

Decades of human studies also reveal a pronounced gender disparity. Unlike their male counterparts, infected women are less wary, more outgoing, and more interested in attracting others than uninfected women are. Scientists theorize that these traits, coupled with the characteristic recklessness associated with toxoplasma, make such women likely to be more sexually active than the norm.

Scent of a woman

Beneath the alluring apparel of the fashionable smolders perfume. No one has studied which scent T. gondii–infected women prefer, but ever since King Solomon imported civets from Africa in the tenth century BC, the ordure-like musk excreted from their perianal glands has provided an irresistibly discordant note to haute florals.

Although $2,000-a-liter civet musk is strongly repellent, minuscule quantities have bestowed a warm complexity to fragrances like Joy and Shalimar and to the rose, jasmine, and iris-root combination of Chanel No. 5 when used to stabilize the scents. Aphrodisiac claims are also common, although they are devoid of proof. Citing animal-cruelty concerns, Chanel stopped incorporating civet in 1998 and now opts to chemically reproduce the aroma in its laboratories, but the real thing remains a popular ingredient elsewhere. Some audacious perfumers even boast of rolling it about on their tongues in their quest to concoct a perfect scent.

Other renditions of civet are kinder to the palate. Sauvignon blanc is darkly complemented by a grace note of feline musk, this time in the form of 3-mercapto-3-methylbutan-1-ol, or MMB, which arises as the grapes ferment. This chemical is the twin of a pheromone in cat urine, and this knowledge of the wine-and-pheromone kinship enhances rather than detracts from the wines’ popularity, as new oenophilic monikers proudly proclaim the cat-pee connection.

On January 22, 2014, for example, Jessica Yadegaran evaluated Cat’s Pee on a Gooseberry Bush, a 2008 New Zealand sauvignon, in the San Jose Mercury News. She proclaimed, “Cat themed wines have become a huge success, exceeding all sales expectations! It might not sound positive, but ‘cat pee’ is usually a favorable term used to describe the aromas in sauvignon blanc.”

That same day, the Week enthused, “You’d think a Sauvignon Blanc characterized as smelling like cat pee would be awful. You’d be wrong.” The unnamed author went on to note that it was doubtful that many had actually tasted cat pee; “they’re really referring to a certain funky tanginess.”49 Neil Ellis Sincerely Sauvignon Blanc 2006, a South African wine, was heralded with “One recalls Sancerre and its characteristic gooseberry (often affectionately, or derisively, referred to as cat’s pee). It is crisp and herbaceous, with mineral notes: a well-made wine that would be magic with salads.”

No one has investigated the infection status of people who’ve made these sauvignon blancs “crazy popular,” at least not yet, but my money is on their having the parasite.


