“Pumpkin” Spice

The autumnal flavor designed to resemble the spices in freshly baked pumpkin pie (but containing no actual pumpkin).

To start, pumpkin spice does not contain pumpkin. Rather, it is a spice blend of cinnamon, ginger, allspice, nutmeg, & clove that is used as an ingredient in pumpkin pies. This spice mix (or variations of it) goes back to Colonial America. Today though, instead of real spices, most commercially produced foods use pumpkin spice flavoring made from chemicals that simulate these spices and replicate the taste of a freshly baked pumpkin pie.

One reason a synthetic flavoring is used (in lattes for example) is that using the actual spices makes a latte taste a bit more like Indian masala tea (chai) than like pumpkin pie. Synthesized pumpkin spice flavoring has been engineered to taste like the spices after they have been transformed by the pie-baking process. Other reasons for using a synthetic flavor are reliability (the flavor is the same every time) and cost (synthetic flavoring is a lot cheaper than actual spices).

He who controls the spice controls the universe

The craze for all things pumpkin spice began in 2003 with the limited release of Starbucks’ specialty seasonal drink, the Pumpkin Spice Latte (PSL). Building on the success of their winter-themed Peppermint Mocha and Eggnog Latte, Starbucks wanted an autumnal offering. Inspired by the flavors of freshly baked pumpkin pie, they created the Pumpkin Spice Latte.

From big brands to small, these are just a few of the pumpkin spice products available for your autumnal seasonal needs.

In 2004 the drink was offered nationwide and became the most popular seasonal Starbucks beverage, generating an estimated $1.4 billion in sales as of 2017. The PSL started a flavor trend of all things getting a limited-edition pumpkin spice variety. Candles, lip balm, cereal, soap, SPAM, candy, air fresheners, and more all have seasonal pumpkin spice variations.

Added info: Starting in 2015 the Starbucks PSL recipe was changed to actually contain a small amount of pumpkin. The flavor of the drink, however, is still created using pumpkin spice flavoring. Also, despite the autumnal seasonality of the drink, the PSL is on the Starbucks Secret Menu and you can buy it all year round.

Pumpkin beers are far older than pumpkin spice lattes (and probably older than pumpkin spice). Pumpkins were used as a malt substitute in Colonial America to make beer, but the seasonal pumpkin beers we enjoy today weren’t invented until the 1980s.

MSG (Safe to Eat)

Reports that MSG is dangerous stem from one anecdotal letter and years of racism.

Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc.). Usually when it’s mentioned people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It’s been a commercially produced food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. The cause of this fear goes back to a single letter.

Chinese Restaurant Syndrome

The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled “Chinese-Restaurant Syndrome” by Dr. Robert Ho Man Kwok on his observations after eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the culprit. This single letter was the beginning of decades of mistrust in MSG.

The idea of MSG side effects and “Chinese Restaurant Syndrome” has largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient”, and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but no regulations on MSG in general. If the supposed adverse reactions to MSG were real, consumers should be wary of any food item containing MSG, yet Chinese food in particular was singled out and maligned. Lots of processed Western foods contain MSG, and lots of plants naturally contain significant levels of glutamate, yet Doritos and shiitake mushrooms never got singled out quite like Chinese food did.

Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.

Safe to Eat

There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to be sensitive to MSG were given MSG without being told and had no ill effects. These tests were unable to reproduce any of the side effects attributed to MSG.

MSG, like any food additive, is safe in moderation; too much of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their use of MSG just to satisfy public opinion, to the detriment of the food and of customers’ taste buds.

Pineapples as Status Symbols

Because of their rarity, pineapples became decorative elements and status symbols in Europe.

The pineapple is native to South America, though across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus, who in 1493 brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of its scarcity, owning one quickly became an exotic status symbol of the ultra wealthy.

Pineapples were in high demand but supply was low, so enterprising individuals set out to grow pineapples in Europe. The tropical conditions pineapples require made growing them in Europe a challenge. It took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple-growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today’s dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. These pineapples would be rented from person to person until the final person paid to eat it, unless it had rotted by then. A further down-market option was pineapple jam, which could be shipped from Central/South America.

Because of their popularity, pineapples became a decorative element in a host of artistic mediums.

Pineapple Art

The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led pineapples to become international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments as finials on staircases, at the tops of columns, on New England gateposts, above front doors, as fountains, and as furniture accents. Christopher Wren placed gilded copper pineapples on St. Paul’s Cathedral in London, and the centerpiece of the Dunmore Pineapple folly in Scotland is a massive pineapple.

Added info: Any association of pineapple with Hawaii comes after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.

American Lobsters: From Trash to Treasure

North American lobsters were originally a poor person’s food.

While an expensive luxury today, the American lobster (aka Homarus americanus, the Maine lobster, Canadian lobster, northern lobster, etc.) was once a food of last resort. Native American tribes of the northeastern coastal regions would use lobsters as fertilizer, fish bait, and, when necessary, food. European colonists also viewed lobsters as inferior last-resort bottom feeders. These “cockroaches of the sea” were relegated to food for the poor, servants, prisoners, and slaves, and sometimes even feed for livestock.

The turnaround for lobsters began in the 19th century with two new industries: canning and railroads. As canning became a viable way to preserve & ship food, lobster meat became a cheap export from the New England area. Lobster meat was sent via rail to locations all around North America. This was followed by tourists visiting New England along some of the same rail lines. These tourists were finally able to taste fresh lobster meat instead of canned, and lobster’s popularity grew. By the 1880s demand for lobster (especially fresh lobster, which must be shipped alive), combined with a decrease in supply, led to higher prices. This helped establish lobster as the expensive delicacy we think of today.

Expensive?

Like any commodity, lobster is subject to price fluctuations. While lobster typically maintains its cultural status as an expensive delicacy, this doesn’t always reflect its real cost. For example, the overabundance of lobsters around 2009 sent the wholesale price of lobster from around $6 a pound to half that – but it would have been hard to notice. Restaurants don’t typically reduce their prices because an ingredient has suddenly become cheaper.

However, when lobster is less expensive it does appear in unexpected places. Around 2012 gastropubs included lobster in dishes such as macaroni & cheese, fast food chains put lobster on their menus, and a Walgreens in downtown Boston even sold live lobsters – all things you don’t usually see when a commodity is expensive. Today, even in years when lobster is abundant and the cost is low, it is still thought of as a luxury item.

the Shamrock Shake & Uncle O’Grimacey

The McDonald’s Shamrock Shake helped pay for the first Ronald McDonald House.

In 1970 McDonald’s introduced their lemon/lime-flavored green Saint Patrick’s Day Shake. It eventually changed flavor and name to become the mint-flavored, and alliteratively titled, Shamrock Shake. Like the autumnal artificial scarcity of pumpkin spice, the Shamrock Shake is only available around Saint Patrick’s Day in the February through March timeframe (except in Philadelphia, where it has two seasons).

McDonald’s founder Ray Kroc, Mayor Frank Rizzo, members of the Eagles organization, and Fred Hill & his daughter all attended the opening of the first Ronald McDonald House on October 15, 1974.

Philadelphia’s two seasons of Shamrock Shakes go back to the role the city played in creating the first ever Ronald McDonald House. In 1969 Kim Hill, daughter of Philadelphia Eagle Fred Hill, was diagnosed with leukemia. By 1973 the Hills and members of the Eagles organization had started the Eagles Fly for Leukemia charity, which helped pay for the new oncology wing at the Children’s Hospital of Philadelphia (CHOP).

After seeing parents camped out in the hallways and waiting rooms while their children received treatments, the charity went a step further in 1974 and purchased an old seven-bedroom house at 4032 Spruce Street, not far from the hospital. The house would be a place for visiting families to stay free of charge while their children received treatment – a “home away from home”. To help pay for this, the Eagles partnered with local McDonald’s restaurants, asking them to donate money from their next promotional food item, which just happened to be the Shamrock Shake. The Eagles asked for 25 cents per shake, but McDonald’s executives offered to donate all of the proceeds in exchange for the naming rights to the house. Eagles general manager Jimmy Murray said “… for that money, they could name it the Hamburglar House.” From this, the first ever Ronald McDonald House was established in Philadelphia in 1974. Today there are more than 375 Ronald McDonald House programs around the world which, by providing what would otherwise be more than 2.5 million overnight hotel stays, save families around $930 million each year.

Uncle O’Grimacey

As positive as the Shamrock Shake’s impact has been, there have been some missteps. To help promote the Shamrock Shake, McDonald’s introduced a new mascot character, Uncle O’Grimacey, in 1975. The Irish uncle of the purple mascot Grimace, Uncle O’Grimacey (complete with his kelly green hat, shamrock-patterned vest, and shillelagh) would travel from Ireland each year bringing Shamrock Shakes to McDonaldland. Uncle O’Grimacey was quietly phased out of McDonald’s marketing after a few years, due in part to an alleged incident in Philadelphia in 1978 where the person portraying him made statements in support of the IRA and said that British soldiers were better dead than alive.

Casual racism isn’t just relegated to the semi-distant past, however. In 2017 McDonald’s ran an ad promoting the Shamrock Shake. Unfortunately it featured a man wearing a tartan Tam o’ shanter playing the shake like a set of bagpipes (which would be Scottish) while standing in front of Stonehenge (which is in England). McDonald’s stopped running the ad and apologized, saying they are “… strongly supportive of Ireland and respectful of its culture”. Begosh and Begorrah.

Uncle O’Grimacey bringing Shamrock Shakes to McDonaldland.

Acquired Overbites

We have slight overbites because we use forks & knives.

Before people ate with knives & forks they used their hands. In the absence of utensils people had to clench and rip their food with their teeth. While crude, the result of all this pulling was that people’s top and bottom rows of teeth lined up edge-to-edge. The introduction of utensils changed that.

The use of a knife & fork meant that a person no longer had to use their teeth to pull at their food; they could cut their food on their plate first. As a result, anyone who used utensils (including us today) developed an overbite, where the top teeth hang out a bit in front of the bottom teeth.

Looking at skulls we can see this effect across time, cultures, and social classes. Aristocrats could usually afford knives & forks before the peasantry, so they developed overbites before anyone else. For example, the more affluent members of 18th-century Western European society developed overbites before the people who couldn’t afford silverware. Further, the overbite took longer to develop among American colonists, who were poorer than their countrymen back home in Europe.

In China the overbite developed as far back as 800 CE. Instead of knives & forks, the Chinese aristocracy used chopsticks, but to eat meat with chopsticks the meat had to be pre-chopped as part of the meal preparation. As a result they didn’t have to pull at their food either, and they developed overbites centuries before Europeans.

QI discusses the development of overbites and how it pertains to Richard III.

Coffee: Sun-Grown or Shade-Grown?

Coffee plants want to be grown in the shade, which is better for the flavor and the environment.

Coffee plants thrive in the warm (but not too warm) areas between the Tropics of Cancer and Capricorn, a region nicknamed the “coffee belt” (although, with global warming, the area in which coffee plants can grow is shrinking). These evergreen plants grow to be about 12 ft tall and, while they like the warmth, they also like the shade. Coffee plants naturally grow best underneath the canopy of trees. Traditionally people would collect the coffee berries from plants growing wild around the forest and then process the seeds to make coffee. Enter industrialization.

Sun-tolerant coffee plants grown in efficient rows for sun-grown coffee. Photo by Shade Grown Coffee film.

Here Comes The Sun

From the 1970s to the early 1990s coffee producers were encouraged by the US Agency for International Development (USAID) to “upgrade” their processes and switch from shade-grown production to sun cultivation. Sun-tolerant plants had been engineered to better handle direct sunlight. With sun cultivation you can grow coffee plants at greater density, harvest beans more efficiently through mechanization, produce higher yields, and make more money. This isn’t without costs.

One of the first steps for sun-grown coffee is deforestation (which contributes to global warming). Without trees there are no fallen leaves serving as mulch to keep weeds down. Leaves also biodegrade, adding nutrients to the soil. This means sun-cultivated coffee requires more herbicides and fertilizers than shade-grown coffee. Further, where there are fewer trees there are fewer birds, and without as many birds to eat the insects, you need more pesticides. All of this means more chemicals on the plants and in the soil.

Made in the Shade

While still incorporating trees and other vegetation, modern shade-grown coffee farms can arrange their coffee plants more efficiently than traditional practices did. Even though this usually means lower yields and longer harvest times compared to sun-grown coffee, shade-grown coffee sells at a premium, which can compensate producers for these factors.

The trees of shade-grown coffee farms serve as homes to hundreds of bird species. In Peru, for example, the coffee plants of sun-grown coffee farms are home to around 61 bird species. This is in stark contrast to the trees of Peruvian shade-grown coffee farms, which are home to 243 bird species. The Smithsonian Migratory Bird Center has said that “shade-grown coffee production is the next best thing to a natural forest.”

As for the coffee itself, shade-grown coffee plants produce denser beans that develop more natural sugars, which makes for better-tasting coffee. Sun-grown coffee speeds up the growing process, which is good for maximizing efficiency, but it also creates higher acidity, resulting in a more bitter taste.

People in Ethiopia, sitting in the shade, processing shade-grown coffee.

So shade-grown coffee tastes better, requires fewer chemicals, helps hundreds of bird species, and helps fight global warming. Next time you’re buying coffee, spend the extra few cents for shade-grown.

Added info: Coffee beans frequently come with little logos attesting to various positive attributes of how the coffee was produced. The certification that best represents the environmental benefits of shade-grown coffee is the Bird-Friendly label from the Smithsonian Migratory Bird Center.

Bird-Friendly is widely considered the gold standard in coffee certification, as it means the coffee is organic, shade-grown, and helps the local ecosystem. That said, given the various benchmarks that must be achieved, it’s hard to become certified as Bird-Friendly, which means it’s hard to come by Bird-Friendly coffee.

Herbs & Spices (and Salt)

Herbs come from the leaves of a plant, while spices come from any other part of a plant (and salt is a mineral).

In cooking they are frequently used together to flavor a meal, and their names get used interchangeably, but herbs and spices are not the same. In short:

HERBS
Herbs are seasonings made from the leaves of a plant

SPICES
Spices are seasonings made from any part of a plant other than the leaves (roots, stalks, bark, seeds, and sometimes even the fruit)

Herbs

In greater detail, an herb typically comes from a smaller deciduous plant without a woody, bark-covered stem. There are exceptions of course: lavender, sage, and rosemary aren’t deciduous and never lose their leaves. Either way, the key is that the green leaves become herbs when they are used in cooking, medicine, teas, cosmetics, etc.

Spices

Spices can come from every part of a plant except the leaves. For example cinnamon comes from bark, ginger comes from roots, pepper comes from seeds, and chili powder comes from the pulverized fruit of the chili pepper. Saffron, one of the most expensive spices in the world at around $10,000 a pound, comes from the hand-picked stigmas & styles of the Crocus sativus flower.

Allspice, despite a common misconception, is not a blend of spices but a single spice that comes from the berry of the Pimenta dioica tree. Its name comes from the fact that it tastes like a combination of cinnamon, nutmeg, and clove.

Herb & Spice Plants

There are a few plants that produce both an herb and a spice. The leaves of Coriandrum sativum produce the herb cilantro while the seeds become the spice coriander. Similarly, the dill plant produces dill the herb from its leaves and dill the spice from its seeds.

Salt

The odd-one-out among food seasonings, salt is a mineral and is the only one that doesn’t come from a plant (although salt is present in plants). There is a lot to say about salt, but in short it’s been used as a preservative and a seasoning for thousands of years.

Norwegian Salmon Sushi

Japanese sushi didn’t contain salmon until a deal with Norway in 1992.

Like many of the oldest things in Japan, sushi originally came from China. Its earliest form was fish stored in fermented rice. The rice was used as a preservative and wasn’t eaten. Through a series of culinary improvements over the centuries the dish eventually became raw fish served with rice (to be eaten, not thrown out), which is how we know it today.

Norwegian salmon

Salmon was not usually one of the fish used to make sushi. Pacific salmon tend to carry parasites, making them unsafe to eat raw; they need to be cooked. Enter the Norwegians. Bjorn Eirik Olsen was part of a delegation to Japan in 1985 trying to sell Norwegian salmon to the Japanese. Norway had begun farming salmon in the 1970s and by the 1980s had a surplus of fish it needed to find buyers for. At the same time Japan had overfished its waters and was looking to diversify its supply of fish.

Selling salmon to the Japanese public for use in sushi was a difficult proposition because Japanese salmon wasn’t safe to eat raw. A marketing campaign couldn’t say that Norwegian salmon was parasite-free, since that would only make people think of parasites, which wouldn’t help sales. It took a few years, but by 1992 Olsen got Japanese frozen food producer Nichirei to purchase 5,000 tons of salmon at a heavily discounted price, on the condition that they sell it in grocery stores as raw salmon specifically for sushi. They also labeled the parasite-free Atlantic Norwegian salmon as ‘sāmon’ instead of ‘sake’, the Japanese word for salmon, to help differentiate the two types. This was followed by a marketing campaign in which chefs on Japanese TV demonstrated how to use salmon. It was a success.

In the years that followed, salmon’s popularity took off. Salmon sushi started in cheap sushi restaurants but eventually spread to restaurants of all levels in Japan and around the world.

Animal Names vs Meat Names

In English we have different names for animals vs those same animals as food because of the Norman conquest of England in 1066 CE.

From the 5th century until the 11th century England was ruled by the Anglo-Saxons. The Anglo-Saxons were descendants of Germanic tribes, which is why, if we look along the language family tree, we can see that English is related to a host of Germanic languages. The early English language of the Anglo-Saxons took a turn, however, in 1066 CE when the Normans invaded and conquered the country.

The Normans were a French-speaking people from Normandy, the northwestern area of France. After crossing the channel and conquering England, they became the ruling class. This led to a trilingual system where:

  • Latin was the language of the Church
  • Anglo-Saxon English was the language of the common people
  • Norman French was the language of the nobility, courts, and government administration

A portion of the Bayeux Tapestry documenting the Anglo-Saxon defeat by the Normans at the Battle of Hastings in 1066 CE.

What’s For Dinner?

Anglo-Saxons became the working-class hunters and farmers of England and, as they were the ones tending to the animals, they called the animals by their English names. The Norman rulers, however, more frequently encountered these animals when they were served on a plate, and in this culinary context called them by their French names.

Over the centuries this practice of using two different names was adopted into Middle English which then evolved into our Modern English. This linguistic duality, where a living animal is called one name in English while also being called by a different French name as food, has continued through to the present.

English animal vs French meat dual names include:

  • cow vs beef
  • calf vs veal
  • pig vs pork
  • sheep vs mutton
  • deer vs venison (although originally the meat of any hunted animal was called “venison”)
  • snail vs escargot

Interestingly, we use the word “chicken” for both the animal and the meat. This is likely because chicken was one of the few meats everyone could afford and, since the common people were raising and eating them, their practice of using the English name in both contexts carried on.

Also, the French word for “fish” is “poisson”, which is too close to the word “poison”. It’s thought that this linguistic similarity, and the danger of confusing the two, is why we kept the English word for both the animal and the meat. We also tend to use species names such as “salmon” or “flounder”, avoiding “fish” and “poisson” altogether.

Added info: This linguistic duality of English vs French origins is found in a host of other examples beyond food: deadly vs morbid, job vs profession, cookie vs biscuit, smell vs odor, calling vs vocation, etc.

Belvoir Castle, whose name means “beautiful view” in French, is a Norman castle in central England. The Normans pronounced it the French way, “bell-vwah”, but the local Anglo-Saxons had difficulty saying this and called it “Beaver Castle” instead, a practice that continues to this day.