Fact, Fad, or Fiction? A Brief History of Early Allergy Science

This guest post was written by Theresa MacPhail—assistant professor in the Science, Technology, and Society Program at Stevens Institute of Technology. 

“Many physicians think that idiosyncrasies to foods are imaginary.” – Albert Rowe, MD (1951)

Two years ago, my 63-year-old aunt developed hives. Large red wheals covered her entire body, and the slightest pressure to her skin—including wearing clothes—caused her pain. Over the course of her life, she had coped with eczema and the occasional rash, but this was new. This was different.

Her doctor sent her to a dermatologist, who—dumbfounded—sent her back to her doctor. After many medical appointments, blood tests, and rounds of steroids, an allergy specialist asked her to undertake an elimination diet, cutting out several foods. My aunt’s hives immediately cleared, and it was only after she introduced wheat back into her diet that the hives resurfaced. Her diagnosis: a wheat allergy.

My aunt’s experience is an all-too-common tale of food allergy diagnosis: routine misdiagnosis, common misconceptions, and a general lack of understanding within the broader medical community. What is it about food allergies that makes this story so familiar? Why are food allergies and intolerances so difficult to diagnose and treat? It turns out that our troubles with allergy diagnosis have a long and complicated history.

Rose Colds & Sea Anemones: Early Allergy Science

We begin in 1819, when the physician John Bostock presented the first clinical description of hay fever, or summer catarrh, to the medical community. By the mid-1800s, doctors had begun diagnosing patients with “summer” or “rose” colds (which we now call hay fever or seasonal allergies). In 1902, researchers discovered they could produce an anaphylactic response in animals (by injecting toxin from sea anemones into dogs) and began experimenting with allergic reactions in the laboratory. At the time, these anaphylactic responses to sea anemones were not considered allergic reactions or “allergies.” That link would be made later.

Hay fever and seasonal allergies were relatively easy for clinicians to diagnose with skin tests and to treat with desensitization techniques. Desensitization, or allergen immunotherapy, in its early form involved converting allergens into a serum or vaccine and injecting it into the patient. Leonard Noon and John Freeman pioneered allergen immunotherapy in 1911, and the technique is still used to treat seasonal allergies today.

Until the early 20th century, food allergy remained something of a nebulous concept: widely recognized, but not yet proven. In 1912, Oscar Menderson Schloss, an American pediatrician, breathed legitimacy into food allergy diagnosis and proved its existence by developing a skin scratch test with which he correctly diagnosed an egg sensitivity. While this was seen as a breakthrough in allergy detection, skin scratch tests did not produce consistent results, as many patients with obvious clinical allergies didn’t react to them at all.

A leading difficulty with allergy diagnosis (food and seasonal), both past and present, has been distinguishing allergy symptoms from the bevy of other ailments they mimic. Food allergy reactions are also highly idiosyncratic, meaning that no two patients with an egg or wheat sensitivity will necessarily react to the same degree or in the same fashion. Famed allergy specialist Warren T. Vaughan argued that the greatest difficulty in understanding and studying food allergy was this inconsistency of response to different exposure levels among individuals. By 1931, after years of practice, Vaughan still couldn’t find logical patterns in his patients’ allergy symptoms. He had no explanation for why two patients reacted differently to equal doses of an allergen, concluding that “allergy to food is always an individual affair.”

By the late 1930s, physicians began realizing that chronic food allergies were far more prevalent among the general population than previously imagined. In some cases, food allergies were considered responsible for patient migraines, hives, intestinal troubles, bladder pain, and asthma. Guy Laroche and Charles Richet, two prominent French allergists at the time, argued that older physicians had failed to properly label these events as “alimentary anaphylaxis,” classifying them instead as medical anomalies. For Laroche and Richet, rigorous tracking of patient diet and symptoms proved their hypothesis: physicians were failing to recognize anaphylactic episodes as the result of an allergic response to food. This was a breakthrough.

A Fad is Born & Modern Trends

Because allergy diagnoses relied heavily on patient input and were poorly understood, many doctors dismissed allergies as a response to emotional stress or neurosis. Doctors believed that these patients, the majority of whom were women, overplayed their symptoms to garner attention or sympathy. Allergy became a “grab bag” diagnosis, especially in the hands of general practitioners. As diagnoses surged, Samuel Fineberg warned that the glut of allergy research, then only a few decades old, had led clinicians to dismiss allergies as just a trend. One prominent allergist observed that older generations regarded food allergy “as a passing fad.” Many today still view food allergies and intolerances as fads, although this is changing.

And while perceptions are evolving, allergy treatments have mostly remained stagnant. Between the confirmation of the first food allergy in 1912 and the late 1960s, avoidance was the only prescription for food allergy patients. In 1935, food allergy specialist Dr. Albert Rowe argued that mild allergies couldn’t be diagnosed with skin tests alone and insisted that elimination diets were a superior diagnostic tool. He created a guide for physicians and patients that remained widely used among allergists from the late 1930s into the 1980s. Rowe counseled that food allergy should not be dismissed as “mere fancy” but taken as medical fact, and he helped shift the perception of food allergies in the medical community.

As this history shows, food allergy treatments haven’t changed much. Desensitization for seasonal allergies has been around since the early 1900s, and food allergy desensitization (oral immunotherapy), while far more recent, builds on the same concept. With oral immunotherapy, the patient ingests small amounts of the allergenic food in gradually increasing doses. It’s not widely practiced at present and is offered by only a select number of allergists nationwide.

We can still see the echoes of this history in current debates over food allergy versus food sensitivity designations. Take gluten, for example. While wheat allergy and the autoimmune disorder celiac disease are accepted medical conditions, gluten sensitivity is still debated by researchers and the public alike.

There is still much we don’t understand about food allergies and intolerances, but increasing research in this space holds promise for solving these medical mysteries. Fact, fad, or fiction? As history has shown, only through scientific advancements and research will facts eclipse fad and fiction.   

Part Two: Food Allergies Today

Stay tuned for part two of this story as we discuss the modern world of food allergy—epinephrine auto-injectors entering the market, the staggering increase in food allergy diagnosis, the LEAP study, and oral immunotherapy.