Egyptian Mummies: From Medicine to Paint

For hundreds of years Europeans used ground up Egyptian mummies as medicine and paint pigment.

The Arabic word mūmiyā (which later became “mummia”) was the name for the black, sticky asphalt that seeps from the ground. It was used as a sealant, an adhesive, and a medicine around the ancient world. Pliny the Elder and others wrote about the medicinal uses of mummia, which became something of a cure-all for a range of ailments.

Unfortunately, mummia the petroleum product resembled another black substance: a byproduct of the Egyptian embalming process. As a result “mummia” came to mean both the petroleum product AND the product of Egyptian mummification, and then, through further confusion, an entire mummified body. This is how we got the word “mummy”. Unfortunately this chain of mistakes also led to hundreds of years of cannibalism.

Cannibal Medicine

Since the petroleum-based mummia was used both externally as a salve and internally as a medicine, the Egyptian mummy version of mummia came to be used in the same ways. The 11th century physician Constantinus Africanus even described mummia as a “spice” found in the sepulchers of the dead. Soon the human version replaced the petroleum version, and people began to crumble & grind human corpses for medicine.

With the Crusades, Europeans learned of mummia and its medicinal possibilities. European demand for Egyptian mummies rose significantly, and by the 15th-16th centuries there was a thriving trade: thousands of bodies were exhumed and shipped to Europe to be turned into medicines. In 1586 the English merchant John Sanderson shipped 600 pounds of mummies to London to sell at various apothecaries. The trade was fueled in part by orientalism: the belief that Egyptian mummies held some sort of exotic ancient knowledge or power.

Europeans would consume portions of Egyptian corpses for general pain, ulcers, inflammation, epilepsy, cough, difficult labor, etc. None of it worked, or when it did, the mummy wasn’t the active ingredient. The practice was so common that Shakespeare included mummy as an ingredient in the witches’ potion in Macbeth. Demand was so high that by the 17th century some mummy dealers were producing counterfeits: newly deceased people, animals, or prisoners (purposefully starved & executed) were put through a process to simulate ancient Egyptian mummies.

After a few hundred years of medicinal cannibalism, Europeans began to doubt the practice’s efficacy (and ethics). The 16th century herbalist Leonhard Fuchs felt foreign mummies were acceptable but local ones were wrong. While doubts arose during the 16th century Renaissance, it took until the 18th century Age of Enlightenment for the practice to fall out of fashion. As the consumption of mummies slowly ended, Egyptian mummies took on a new role: paint pigment.

The Egyptian Widow by Lourens Alma Tadema is an 1872 painting of Egyptian life potentially painted using mummy brown paint.
Liberty Leading the People by Eugène Delacroix is another painting that’s theorized to contain mummy brown.

Mummy Brown

Around the end of the 16th century artists began using ground up Egyptian mummies (mixed with other materials) to produce mummy brown, a shade of brown pigment. Apothecaries that were grinding mummies for medicine began grinding them for paint as well. As a paint it was good for shadows, flesh tones, and glazing. Benjamin West, Martin Drolling, Lawrence Alma-Tadema, Edward Burne-Jones, Eugène Delacroix, and other artists all painted with mummy brown.

It wasn’t until the 19th century that mummy brown began to fall out of favor. That said, as recently as 1926 C Roberson & Co. still sold mummy brown made from ground up Egyptian corpses. As mummy brown died out, so too did hundreds of years of large-scale desecration of deceased Egyptians for medicine and paint.

Zombies: Sadder Than You Think

The concept of Haitian zombies was used as a threat to keep slaves working.

Before Haiti was an independent country it was the French colony of Saint-Domingue, which produced sugar, coffee, cotton, and other goods. The French brought more than a million West African people to the colony as slaves, more than any other colony in the Caribbean. Slavery in Saint-Domingue was particularly brutal: most people were poorly fed, they worked 12 hour days, pregnant slaves frequently didn’t live long enough to give birth, and torture was common. Life expectancy was about 3-6 years, with about half of the enslaved people of Saint-Domingue dying within their first few years of arriving.

The brutal conditions of Saint-Domingue left the enslaved people hoping that, in death, their souls would return home to West Africa.

Haitian Vodou & Zombies

The Code Noir was a 1685 decree that outlined how slavery was to be conducted in the French empire. Among other things it stated that slaves were prohibited from practicing African religions and instead were forcibly baptized into Catholicism. What resulted was Haitian Vodou, a religious blend of West African beliefs (practiced in secret) given a veneer of Catholicism.

Part of this belief system was the idea that, upon dying, you would return to lan guinée (i.e. Guinea, or West Africa). Their idea of heaven was to escape the slavery of Saint-Domingue and simply go home. Feeling the allure of going home, some people decided to escape slavery on their own terms. As such, suicide was very common in Saint-Domingue.

Initially suicide was seen as a viable way of getting to lan guinée, but at some point (oral tradition is murky on when and how) that changed: suicide became prohibited, and the punishment for committing it was that you’d be a slave forever. You’d become a zombie. The zombies of Haitian Vodou are not the shambling brain-eating zombies of Western pop culture. The Haitian zombie was someone whose soul had been captured, denied entry to lan guinée, and turned into an undead field hand with no chance of escape. Plantation slave-drivers used this to their advantage, threatening slaves that if they killed themselves they would be turned into zombies, working forever under the control of a bokor (sorcerer). Unlike today, what was feared was the threat of becoming a zombie, not the zombies themselves.

1932’s White Zombie was the first zombie movie. It used some Haitian Vodou beliefs but took significant artistic license.

White Zombie

Over time the zombie concept evolved and changed. The sensationalistic 1929 William Seabrook travel book The Magic Island introduced voodoo and zombies to mainstream Western culture. It inspired the 1932 film White Zombie, the first zombie movie, starring Bela Lugosi as the villainous Murder Legendre (a bit on the nose), a bokor enslaving people as zombies to serve as his henchmen and to work in his sugarcane mill. White Zombie used Haitian Vodou ideas but with a lot of artistic license. Later zombie stories dropped the Saint-Domingue threat of eternal slavery, then dropped the bokor master commanding the zombies. Aside from being mindless undead creatures, the zombies of today bear little resemblance to their sadder, more terrifying origins.

Added info: following the Haitian revolution of 1791–1804, the 1883 Haitian Criminal Code outlawed the practice of turning someone into a zombie.

Cabinet of Curiosities

Before museums existed, people had cabinets/rooms to display their collected treasures.

There was a time when museums did not exist. The role of collecting, preserving, and displaying the art, artifacts, and wonders of the world belonged largely to individuals. As far back as the 4th century BCE Greeks were collecting exotic treasures from the East. More than just trading in commodities, the Greeks collected the art and textiles of these faraway cultures. The Roman emperor Augustus decorated his homes not just with art but with rare objects and bones of giant animals. Over the centuries, as cultures explored & traded with increasingly distant lands, the trends in what was collectible grew & changed. By the 16th and 17th centuries wealthy European collectors had amassed enough objects that they created special cabinets and/or rooms to show off their collections. They created cabinets of curiosities.

Ole Worm’s Museum Wormianum is one of the most famous cabinets of curiosities.
Ferrante Imperato’s Dell’Historia Naturale is another famous Wunderkabinett.

Wunderkabinett

From the German for art (kunst) or marvels (wunder) and cabinet (kabinett) or room (kammer), these cabinets & rooms were places where Renaissance scholars, merchants, royalty, and others could store their collections. Collecting was very fashionable in 17th century Europe, and these cabinets were spaces dedicated to displaying all manner of objects. Like the contemporaneous maps of the world, some of these spaces were designed for show while others were more utilitarian.

A collection of cabinets and rooms displaying all manner of curiosities.

Some collectors had thousands of specimens. The objects in these cabinets were thoughtfully categorized and organized, each piece contributing to the larger whole. Collecting was a way to bring order to the world, to exert some level of control over something that is uncontrollable. What was stored & displayed in these cabinets depended on the collector, but broad categories of objects included:

  • Fine art
  • Applied art (scientific instruments, anthropological objects, etc.)
  • Natural materials (fossils, shells, rocks, etc.)
  • Historical objects

These categories, as well as these collections, served as the precursors to our modern museums. The Amerbach Cabinet was a collection of art, books, coins, etc. assembled by various members of the Amerbach family. It was eventually co-purchased by the city of Basel & the University of Basel and became the Kunstmuseum Basel in 1661, the first public museum in the world. Francesco I de’ Medici had his studiolo, a 26 x 10 foot room of curiosities that is part of the Palazzo Vecchio in Florence. Other Medici possessions served as the start of the Uffizi Gallery. Elias Ashmole, who amassed his fortune & collection through sometimes questionable means, gifted his collection to the University of Oxford, where it became the Ashmolean Museum in 1683.

Throughout the 18th century an increasing number of private collections were converted into public museums, some of which still exist today but all of which helped define what museums have become.

Added info: In 1784 Charles Willson Peale’s collection became the Philadelphia Museum, the United States’ first museum (and also the first to display a mastodon skeleton).

The Necronomicon

The most famous magical book of occult knowledge that sounds real, but isn’t.

Possibly the most famous book that doesn’t exist, the Necronomicon is a fictional book of dark magic invented by weird fiction / horror author H.P. Lovecraft. First mentioned in 1924’s The Hound, the Necronomicon is part of Lovecraft’s Cthulhu Mythos, a dark collection of cosmic horror, ghouls, interdimensional monsters, and unspeakable evil all set in an uncaring, indifferent universe. The best interpretation of the name “necronomicon” is “book considering (or classifying) the dead”. Supposedly written in 738 CE by Abdul Alhazred (who was later eaten alive by an invisible monster in broad daylight), the Necronomicon is a dark book of forbidden knowledge, and most Lovecraft characters who read it come to horrible ends.

Lovecraft felt that to produce terror a story had to be “… devised with the care and verisimilitude of an actual hoax.” As such the Necronomicon is very much treated as if it were a real book. Lovecraft enjoyed making his fictional world seem believable. For example, in a list of real books he would throw in a few real-sounding fake ones (such as the Necronomicon), blurring the line between reality and fiction. Similarly he wrote that copies of the Necronomicon were held by 5 world institutions: the British Museum, Harvard, the Bibliothèque nationale de France, the University of Buenos Aires, and Miskatonic University … which is a fictional school set in the equally fictional city of Arkham, Massachusetts. Again, mixing a fictional creation into a list of real places makes the fake seem real.

H.P. Lovecraft’s Necronomicon can be found in a host of movies, books, comics, and more.

Crawling Chaos

Part of the appeal of the Necronomicon (beyond the spooky name) is that, like all good suspenseful horror, Lovecraft gives the reader just enough details to understand the idea of the Necronomicon but the exact contents (or even a good physical description of the book) are left open to your imagination. This vagueness also kept the door open for future expansion of ideas. Soon other authors began to include the Necronomicon in their work, and so it spread.

Today the Necronomicon has gone beyond the works of Lovecraft & his friends and has appeared in countless other projects. It’s in books, movies, cartoons, comics, video games, music, etc, each with its own take on exactly what the Necronomicon is, but it’s always a book of dark magic. It’s in The Evil Dead series, it’s in an episode of The Real Ghostbusters, Mr. Burns mentions it at a meeting of Republicans in The Simpsons, it’s the name of a German thrash metal band, Michael Crichton and Stephen King have both referenced it, etc. The book of the dead lives on, spreading its tentacles across dark fiction. Cthulhu fhtagn.

Added info: The fictional Arkham Asylum in the DC Universe, where many of Batman’s foes are frequently locked away, was named after the fictional Lovecraft town of Arkham, Massachusetts.

Mr. Burns has Bob Dole read from the Necronomicon.

In the cleverly titled episode The Collect Call of Cathulhu, the Ghostbusters discuss the Necronomicon going on display at the New York Public Library.

Vampires & Arithmomania

According to folklore, vampires have an obsessive compulsion to count.

The idea of an undead creature murdering and/or consuming the living is found in a host of cultures around the world. Some of these monsters are cleverly cunning while others are mindless killing machines, but the general vampiric themes are shared. Our modern idea of vampires is largely based on the 1897 Bram Stoker novel Dracula, which in turn took ideas from Romanian folklore.

The Final Countdown

One curious component of vampiric folklore, from Slavic through Greek cultures, is the vampire’s obsessive compulsive need to count. Vampires were said to have arithmomania: a compulsion to count things and actions. People used this to their advantage by scattering seeds, salt, grains of rice, or whatever else they had that was tiny & numerous on the floors of their houses. An intruding vampire would have to count each seed or grain, giving the homeowner time to escape or, if it took the vampire long enough, for the sun to rise and vanquish the undead intruder. Similarly it was believed vampires would count all of the holes in a fishing net, leading to nets sometimes being hung by the entrances of homes. It was also tradition to spread seeds or grain on the cemetery grave of a possible vampire so that, upon rising, they would be kept busy counting through the night and stay away from the living.

Strangely, this obsession with counting wasn’t limited to vampires. In parts of Italy it was believed that witches had a similar affliction. On the Eve of St. John’s Day you could defend yourself from a witch by giving her a red carnation: she would have to count the petals, giving you time to escape. In America some felt witches had to count the holes in sieves, leading some people to hang sieves by their doors.

I Love to Count

Ultimately this compulsion to count is the joke behind Count von Count on Sesame Street. He’s a vampire who loves to count and teaches children numbers. Like the Slavic vampires of folklore he is driven to count anything he sees. It’s a joke hidden in plain sight.

In the X-Files episode “Bad Blood” a drugged Mulder defends himself against a vampire by throwing a bag of sunflower seeds on the floor.

“Pumpkin” Spice

The autumnal flavor designed to resemble the spices in freshly baked pumpkin pie (it doesn’t contain any actual pumpkin).

Pumpkin spice does not contain pumpkin. It’s a blend of cinnamon, ginger, allspice, nutmeg, and clove used to spice up pumpkin pies, and this mix (or variations of it) goes back as far as colonial America. Unlike the spice blend you buy in the store, however, most commercially produced pumpkin spice flavored products don’t contain these spices. Commercial pumpkin spice flavor uses chemicals that simulate the spices, replicating the taste of a freshly baked pumpkin pie.

One reason a synthetic flavor is used, in a latte for example, is that using the actual spices makes it taste a bit more like Indian masala chai than pumpkin pie. The synthesized flavor has been engineered to taste like the spices after they have been transformed by the pie baking process. Other reasons for using a synthetic flavor are reliability (the flavor is the same every time) and cost (synthetic flavoring is a lot cheaper than actual spices).

He who controls the spice controls the universe

The craze for all things pumpkin spice began in 2003 with the limited release of Starbucks’ new seasonal specialty drink, the Pumpkin Spice Latte (PSL). With the success of their winter themed Peppermint Mocha and Eggnog Latte, Starbucks wanted an autumnal offering. Inspired by the flavors of freshly baked pumpkin pie, the marketing team chose the name Pumpkin Spice Latte despite the drink not containing any actual pumpkin.

From big brands to small, just a few of the pumpkin spice products available for your autumnal seasonal needs.

In 2004 the drink was offered nationwide and became the most popular seasonal Starbucks beverage, generating an estimated $1.4 billion in sales as of 2017. It also started the flavor trend of all things getting a limited edition pumpkin spice variety. You can find candles, lip balm, cereal, soap, SPAM, chocolate candy, air fresheners, beer, and more all with pumpkin spice flavors.

Added info: Starting in 2015 the Starbucks PSL now contains some amount of pumpkin, but the flavor of the drink is still created using a pumpkin spice flavoring. Also, despite the autumnal seasonality of the drink, the PSL is on the Starbucks Secret Menu and you can buy it all year round.

Gemütlichkeit

The German concept of belonging & happiness that English doesn’t have a word for.

Sitting in a tent at Oktoberfest one song that will be played again and again is Ein Prosit. It only has four words in the lyrics, it takes less than 30 seconds to sing, and after singing it the band leader directs everyone to drink. The lyrics are:

GERMAN

Ein Prosit, ein Prosit
Der Gemütlichkeit

ENGLISH

A toast, a toast
To Gemütlichkeit

What exactly are we toasting? What is Gemütlichkeit?

Gemütlichkeit is the good feeling of being with friends enjoying the simple things in life.

Good Feeling

Gemütlichkeit (roughly: ge-mut-lee-kite) is a German word that has no direct translation in English. It’s a feeling of happy belonging, sort of like cozy, but unlike cozy it’s felt in the company of others. Gemütlichkeit can’t be felt alone. It’s the good feeling you get wandering a Christmas market with your family, it’s a summer BBQ in a friend’s backyard, and of course it’s gathering together at a beer garden. Gemütlichkeit is a state of mind. It’s the enjoyment of simple pleasures shared with others.

Part of gemütlichkeit’s meaning comes from its origins. In the early 19th century Biedermeier period, industrialization helped create a new German middle class. This growing population used their newfound money & free time to embrace a quieter, simpler life. Feeling secure and happy with friends & family was more important than politics. This was also around the start of Oktoberfest, which began as a wedding festival but became an annual tradition in 1811. Gemütlichkeit and Oktoberfest go well together because, as people gather for good food, beer, and fun, they’re celebrating the simple things in life with others.

The legendary Franzl Lang sings Ein Prosit, a toast to gemütlichkeit.

Rednecks & Hillbillies

The terms redneck and hillbilly both come from rebellious 17th century Scottish Protestants.

Rednecks

In the 17th century, King Charles I pushed for greater religious uniformity across the British Isles. Scottish Presbyterians disapproved, as these reforms were increasingly Catholic in style & organization. In 1638 thousands of Scots signed the National Covenant (sometimes using their own blood as ink), signifying their preference for a Presbyterian Church of Scotland and their refusal to accept the reforms made by Charles. Going one step further, some of these “Covenanters” took to wearing red cloth on their necks as an outward sign of their resistance. These dissenting Scottish religious rebels were the original “red necks”.

Looking closely at The Signing of the National Covenant in Greyfriars Kirkyard, Edinburgh by William Allan you can see the man signing the Covenant at the center is having his blood drawn by a dagger for him to use as ink.

Hillbillies

Political and religious tension continued around the British Isles throughout the late 17th century, leading to the 1688 Glorious Revolution. On one side of this revolution was the Catholic King James II and those who supported a strong monarchy; on the other were Protestants & Parliamentarians. Afraid of a Catholic dynasty and that James would leave the throne to his Catholic son James Francis Edward, seven influential English nobles invited the Protestant Dutch Prince William of Orange to invade England and take the throne.

Around the same time, Scottish Presbyterian leader Richard Cameron was preaching a message of rebellion against the English. Being a religious nonconformist, Cameron took to being a field preacher and spread his radical message outdoors away from Scottish towns. His followers (the Cameronians) were given the nickname “hillmen” due to their outdoor religious gatherings.

William of Orange invaded England with ease and successfully took the throne, supported by Scottish Protestants. Scots living in Ulster at the time fought against the Jacobite supporters of King James. William of Orange was nicknamed “King Billy” and his Ulster Scots Protestant supporters were nicknamed “Billy boys”. Eventually these two Scottish Protestant rebel nicknames, “hillmen” and “Billy boys”, were combined to form “hillbilly boys” and then just “hillbilly”.

Ulster Scot supporters of William of Orange became known as “Billy Boys” which, when combined with the Scottish Cameronian nickname of “hillmen”, eventually became “hillbilly”.

American Rednecks & Hillbillies

Despite their successful support for William, many Scots were still oppressed for being Presbyterian and for being Scottish. Searching for greater religious & personal freedom, they began to emigrate in larger numbers from Ulster to the British colonies in North America. An estimated 200,000 Ulster Scots (aka Scotch-Irish) emigrated to the American colonies between 1717 and 1775. Settling up and down the East Coast and throughout Appalachia, these Scottish Protestants brought with them their religion, their rebelliousness, and their nicknames.

Over the centuries the meanings of both “redneck” and “hillbilly” have changed. During the “Redneck War” of 1920–21, “redneck” was used to label the unionizing coal miners (many of whom were Scotch-Irish) who wore red bandanas in solidarity. The term has also been used to describe early 20th century southern Democrats, as well as more literally to describe poor farmers with sunburnt necks. Hillbilly also took on a more literal meaning, describing the people who settled the rural hilly areas of Appalachia and the Ozarks. Today both terms are generally used as derogatory slurs for poor rural whites.

Sunglasses

Humans have been making devices to shield their eyes from the sun for thousands of years. Today one company dominates the market.

Living around the Arctic, where bright sunlight reflects off the ice & snow, the indigenous peoples of North America & Greenland developed the earliest sunglasses. These 4,000 year old proto-sunglasses were carved from a variety of materials and featured very thin slits, allowing the wearer to see while blocking excessive sunlight to protect their eyes. The idea has been recreated many times in a variety of styles from the 1930s to the present.

The traditional “sunglasses” of the Arctic have been reinvented many times over the years.

The Venetians, who had been making clear corrective eyeglasses since the 13th century, were among the first to produce sunglasses with glass. In the 18th century the glass makers of Murano produced green-tinted eyeglasses (as well as what resemble handheld mirrors but with transparent green glass) through which wealthy Venetians could look across the water while protecting their eyes from reflected light.

Venetians used green glass to protect their eyes while soldiers in the American Civil War used a variety of colors.

By the 19th century it was not uncommon for soldiers on both sides of the American Civil War to wear colored spectacles of blue, gray, or green to protect their eyes while marching in the sun. But sunglasses were still primarily utilitarian; they didn’t become a fashionable part of mainstream culture until the 20th century.

20th Century Sunglasses

In the early 20th century Sam Foster had a plastics company that primarily sold women’s hair accessories, but as women’s hairstyles grew shorter (reducing the need for hair accessories) he had to find a new product to sell. In 1929 he began selling inexpensive plastic sunglasses to beachgoers for 10 cents a pair on the Atlantic City boardwalk. This was the beginning of the Foster Grant eyewear company. Foster Grant sunglasses became the shades of Hollywood celebrities, which helped make sunglasses about style as well as function.

In 1929 Bausch & Lomb, who were already making optical equipment for the military, began work for the U.S. Army Air Corps to develop sunglasses that wouldn’t fog up and would reduce glare for pilots. This gave us the iconic “Ray-Ban Aviator” sunglasses. Aviator sunglasses were also the start of Ray-Ban eyewear company, which began as the civilian division of Bausch & Lomb. Ray-Ban would go on to make another iconic model of sunglasses, the Wayfarer, in 1956.

(Side fact: Roy Orbison stumbled into his signature style accidentally while on tour with the Beatles in 1963. He left his regular glasses on the plane and had to wear his Wayfarer sunglasses on stage, and thus his iconic look was born.)

Sunglasses became about function & fashion in the 20th century. Also, Tom Cruise movies helped popularize two of Ray-Ban’s most famous models.

Luxottica

Today the sunglasses market is dominated by Luxottica, an Italian juggernaut that is the largest eyewear company in the world. They’re the company actually making the sunglasses of luxury brands such as Chanel, Prada, Ralph Lauren, Versace, etc. Luxottica’s dominance is due in large part to their vertically integrated control over the eyewear industry. They own major retail chains such as LensCrafters, Target Optical, Pearle Vision, and Sunglass Hut. They own major eyewear brands including Oakley and Ray-Ban, and they manufacture the eyewear for all of the above. They even own EyeMed, the second largest vision insurance company in America. You could go from getting a vision prescription, to selecting a pair of glasses, to buying them at a retail store, and pay Luxottica at every step of the way.

Luxottica’s control over the market is why eyewear prices have gone up and not down. The proliferation of brands & stores competing for sales isn’t as competitive as it seems since Luxottica is behind many of them. In Luxottica owned stores 89% of the products available are made by Luxottica. Most of these glasses are the same quality, just different styles. Because of Luxottica, frames that cost maybe $15 to produce can be sold for hundreds of dollars. As of 2019 Luxottica controlled around 40% of the eyewear market.

60 Minutes’s 2012 report on eyeglass juggernaut Luxottica.

Added info: Beyond just blocking excessive bright light, good sunglasses block most ultraviolet (UV) light from damaging your eyes. Darker glasses don’t necessarily block more UV light, so it’s worth buying reputable sunglasses that have been engineered & certified to offer UV protection. It’s better to wear no sunglasses at all than ones that don’t block UV light, because your pupils widen behind the shade of junk sunglasses, letting in more UV rays.

MSG (Safe to Eat)

Reports that MSG is dangerous stem from one anecdotal letter and years of racism.

Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc). Usually when it’s mentioned people are referring to the synthesized food additive version which is added to meals to bring out their umami flavors. It’s been a commercially produced food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. The cause of this fear goes back to one article.

Chinese Restaurant Syndrome

The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled Chinese-Restaurant Syndrome by Dr. Robert Ho Man Kwok on his observations of eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the answer. This single letter was the beginning of decades of mistrust in MSG.

The ideas of MSG side-effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient”, and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but not on MSG in general. If MSG truly caused adverse reactions, any food containing it should have drawn the same caution; instead Chinese food in particular was singled out and maligned. Lots of processed Western foods contain MSG, and lots of plants naturally contain significant levels of it, yet Doritos and shiitake mushrooms never got singled out quite like Chinese food did.

Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.

Safe to Eat

There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to be sensitive to MSG were given it without their knowledge and experienced no ill effects; the studies were unable to reproduce any of the claimed side-effects.

MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their use of MSG just to satisfy public opinion, to the detriment of the food and their customers’ taste buds.