Mistakes Happen (Sometimes Intentionally)

Nothing is perfect and we should embrace mistakes and imperfections.

Mistaken Mistakes

Persian carpets (aka Iranian carpets) come in a wide variety of designs and sizes, but they frequently contain repeating symmetrical patterns. One alleged feature of handmade Persian carpets is an intentional mistake in the design pattern (not in the construction). This “Persian flaw” serves as a reminder that only Allah is perfect. The flaw would be something small, noticed only by the keenest of observers. It’s also been said that the Amish have a similar practice, including an intentional flaw (a “humility block”) in their quilts as a reminder that only God is perfect … but it isn’t true.

Lancaster curator Wendell Zercher has quoted Amish quilt makers as saying “… no one has to remind us that we’re not perfect.” As for Persian flaws, most accounts of the idea come from Western sources, and it is probably an example of orientalism. While both are nice stories that probably help sell imperfect rugs & quilts, we have little to no evidence to support them. If anything, intentionally making just one mistake out of humility would prove the opposite: it brags that you have the ability to make a perfect creation (but choose not to).

Actual “Mistakes”

There are, however, some cultures that really do include intentional imperfections in their work. Women in the Punjab region of India & Pakistan create phulkari shawls with intricate designs. In these designs they sometimes include “mistakes”: momentary changes in the overall design pattern. These changes mark important events that happened during the creation of the shawl (births, weddings, deaths, etc). Sometimes the symmetrical pattern is disrupted as spiritual protection from the evil eye.

On the left is a phulkari shawl with intentional changes to the pattern. To the right is a Navajo weaving featuring a “spirit line”.

Some Navajo also include imperfections in their weavings for spiritual reasons. The ch’ihónít’i (aka the “spirit line” or the “weaver’s path”) is a single line leading out of the middle of a design to the edge of the weaving. The spirit line is thought to give the weaver’s soul a way to exit the weaving so as to not get trapped in the design.

Embrace Imperfections

Of course, if you accept that nothing is perfect, then you have no need to add imperfections because everything is already imperfect. The Japanese concept of wabi-sabi is the Zen view that everything is imperfect, impermanent, and vulnerable. Unlike Western design ideals, which frequently strive for idealized perfection, wabi-sabi celebrates the imperfections that make everything (and everyone) unique.

Kintsugi-repaired ceramics, using gold & lacquer to feature (rather than hide) the imperfections.

Building off of wabi-sabi, kintsugi is the practice of repairing broken pottery with valuable metals & lacquer in a way that, rather than trying to seamlessly hide the repaired cracks, highlights them. Kintsugi honors the history of the object and celebrates its imperfections. Nothing lasts forever, and we should recognize the beauty of imperfect vessels.

A crash course on the Japanese concept of wabi-sabi.

Ugly Fruits & Vegetables

In the West this embrace of the imperfect has recently manifested itself in ugly fruits & vegetables. Imperfect-looking produce has traditionally gone unsold and makes up 40% of total food waste. Producers throw away food because they don’t think retailers will want it (it doesn’t meet “quality standards”), and then retail stores throw away the unsold odd-looking food that customers won’t buy. This is all despite the fact that the taste and nutritional content of this “ugly” food may be identical to “normal”-looking produce.

The European Union declared 2014 the European Year Against Food Waste. The French supermarket chain Intermarché began its “Inglorious Fruits and Vegetables” marketing campaign, which celebrated ugly-looking produce, gave it its own section in the store, and sold it at a discount. It proved so successful that other stores began their own campaigns as customers began to accept the wabi-sabi nature of produce.

The Intermarché marketing campaign to help reduce food waste was a huge success.

the First Thanksgiving Menu

Lacking key ingredients, the menu at the first Thanksgiving of 1621 was a bit different from the traditional turkey dinner of today.

In the fall of 1621 the English Pilgrims and the Wampanoag came together in Massachusetts for what has subsequently become a much-mythologized 3-day harvest festival. Having survived a winter in which half of their fellow pilgrims died, having had their supplies fortified by the Wampanoag, and having completed a successful summer growing season, the pilgrims had a lot to be thankful for. What they ate as they gave thanks is debatable.

Definitely on the Menu

One food that was definitely served was venison. Massasoit, the leader of the Wampanoag, had 5 deer brought to the event. Another meat on the menu was “wild fowl,” but exactly what kind of birds these were is unknown. It’s possible that there was turkey at the first Thanksgiving, but it was more likely goose or duck (or a combination). Other regional bird options at the time would have been swan and passenger pigeon.

Also definitely present was corn. The Wampanoag, who used the Three Sisters method of farming, had taught the pilgrims how to grow corn. The pilgrims’ successful crop of flint corn (aka “Indian corn”) was cooked into a porridge, made into a bread, and/or cooked with beans.

Maybe on the Menu

Given that the Plymouth Colony was by the water, it’s very likely that seafood was also served. Eels, clams, mussels, cod, bass, and/or lobsters could well have been part of the meal. It’s worth noting though that, unlike today, lobster was considered a food of last resort.

There were certainly vegetables & fruits on the menu, but which ones (other than corn) was never specified. Chestnuts, walnuts, beans, onions, carrots, cabbage, pumpkins, and various squashes were all grown in the area. Blueberries, plums, grapes, and raspberries were also grown in the area and could have been present. While cranberries might have been served, cranberry sauce definitely was not, since the colonists lacked the necessary sugar (and cranberry sauce wouldn’t be invented for another 50 years).

Not on the Menu

Even though pumpkins may have been present, pumpkin pie definitely was not. The pilgrims had neither the butter nor the flour necessary to make pumpkin pie – they didn’t even have an oven in 1621. Something pumpkin pie-esque that may have been prepared is a spiced pumpkin soup/custard cooked directly inside a pumpkin which was roasted on hot ashes.

There was no stuffing because, again, the colonists lacked the necessary flour. There were also no potatoes (mashed or otherwise). Potatoes came from South America and, while they had made their way to Europe by the late 16th century via the Spanish, they had yet to make their way to New England. There also weren’t any forks on the table since they too hadn’t made their way to North America yet (but on the upside nobody present had an overbite).

A historical reenactment of how to cook some of the foods present at the first Thanksgiving.

“Pumpkin” Spice

The autumnal flavor designed to resemble the spices in freshly baked pumpkin pie (but containing no actual pumpkin).

Pumpkin spice does not contain pumpkin. It’s a blend of cinnamon, ginger, allspice, nutmeg, and clove used to spice up pumpkin pies. This spice mix (or variations of it) goes back as far as colonial America. Unlike the spice blend you buy in the store, however, most commercially produced pumpkin spice flavored products don’t contain these spices at all. Commercial pumpkin spice flavoring uses chemicals to simulate the spices, replicating the taste of a freshly baked pumpkin pie.

One reason a synthetic flavor is used, in a latte for example, is that using the actual spices makes it taste a bit more like Indian masala tea (chai tea) instead of pumpkin pie. This synthesized flavor has been engineered to taste like the spices after they have been transformed by the pie baking process. Other reasons for using a synthetic flavor are reliability (the flavor is the same every time) and cost (synthetic flavoring is a lot cheaper than using actual spices).

He who controls the spice controls the universe

The craze for all things pumpkin spice began in 2003 with the limited release of Starbucks’ newest seasonal specialty drink, the Pumpkin Spice Latte (PSL). Following the success of their winter-themed Peppermint Mocha and Eggnog Latte, Starbucks wanted an autumnal offering. Inspired by the flavors of freshly baked pumpkin pie, the marketing team chose the name Pumpkin Spice Latte despite the drink not containing any actual pumpkin.

From big brands to small, just a few of the pumpkin spice products available for your autumnal seasonal needs.

In 2004 the drink was offered nationwide and became the most popular seasonal Starbucks beverage, generating an estimated $1.4 billion in sales as of 2017. It also started the flavor trend of all things getting a limited edition pumpkin spice variety. You can find candles, lip balm, cereal, soap, SPAM, chocolate candy, air fresheners, beer, and more all with pumpkin spice flavors.

Added info: Since 2015 the Starbucks PSL has contained some amount of pumpkin, but the flavor of the drink is still created using a pumpkin spice flavoring. Also, despite the autumnal seasonality of the drink, the PSL is on the Starbucks Secret Menu and you can buy it all year round.

MSG (Safe to Eat)

Reports that MSG is dangerous stem from one anecdotal letter and years of racism.

Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc). Usually when it’s mentioned, people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It’s been a commercially produced food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. The cause of this fear goes back to one letter.

Chinese Restaurant Syndrome

The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled Chinese-Restaurant Syndrome by Dr. Robert Ho Man Kwok on his observations of eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the answer. This single letter was the beginning of decades of mistrust in MSG.

The ideas of MSG side effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient,” and/or a general lack of knowledge have led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but no regulations on MSG in general. If the supposed adverse reactions to MSG were real, consumers should have been wary of any food item containing MSG, yet Chinese food in particular got singled out and maligned. Lots of processed Western foods contain MSG and lots of plants naturally contain significant levels of MSG, yet Doritos and shiitake mushrooms didn’t get singled out quite like Chinese food did.

Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.

Safe to Eat

There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to have a sensitivity to MSG were given MSG without knowing it and had no ill effects. These tests were unable to reproduce any of the side effects claimed about MSG.

MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and the customers’ taste buds.

Pineapples as Status Symbols

Because of their rarity, pineapples became decorative elements and status symbols in Europe.

The pineapple is native to South America, but across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus in 1493, who brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of its scarcity, owning one quickly became an exotic status symbol of the ultra wealthy.

Pineapples were in high demand but supply was low, so enterprising individuals set out to grow pineapples in Europe. The tropical conditions pineapples require made growing them in Europe a challenge. It took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple-growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today’s dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. These pineapples would be rented from person to person until the final person paid to eat it, unless it had rotted by then. A further down-market option was pineapple jam, which could be shipped from Central/South America.

Because of their popularity, pineapples became a decorative element in a host of artistic mediums.

Pineapple Art

The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led pineapples to become international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments everywhere: as finials on staircases, at the tops of columns, on New England gateposts, above front doors, as fountains, as furniture accents, as the gilded copper pineapples Christopher Wren placed on St. Paul’s Cathedral in London, as the massive pineapple at the center of the Dunmore Pineapple folly in Scotland, and more.

Added info: Any association of pineapples with Hawaii comes from after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.

American Lobsters: From Trash to Treasure

North American lobsters were originally a poor person’s food.

While an expensive luxury today, the American lobster (aka Homarus americanus, the Maine lobster, Canadian lobster, northern lobster, etc) was once a food of last resort. Native American tribes of the northeastern coastal regions used lobsters as fertilizer, fish bait, and, when necessary, food. European colonists also viewed lobsters as inferior last-resort bottom feeders. These “cockroaches of the sea” were relegated to being food for the poor, for servants, prisoners, and slaves, and sometimes even feed for livestock.

The turnaround for lobsters began in the 19th century with two new industries: canning and railroads. As canning became a viable way to preserve & ship food, lobster meat became a cheap export from the New England area. Lobster meat was sent via rail to locations all around North America. Tourists then began visiting New England along some of the same rail lines. These tourists were finally able to taste fresh lobster meat instead of canned, and lobster’s popularity grew. By the 1880s demand for lobster (especially fresh lobster, which must be shipped alive), combined with a decrease in supply, led to higher prices. This helped establish lobster as the expensive delicacy we think of today.

Expensive?

Like any commodity, lobster is subject to price fluctuations. While lobster typically maintains its cultural status as an expensive delicacy, this doesn’t always reflect the real cost. For example, the overabundance of lobsters around 2009 sent the wholesale price of lobster from around $6 a pound to half that – but it would have been hard to notice. Restaurants don’t typically reduce their prices because an ingredient has suddenly become cheaper.

However, when lobster is less expensive it does appear in unexpected places. Around 2012 gastropubs included lobster in dishes such as macaroni & cheese, fast food chains put lobster on their menus, and a Walgreens in downtown Boston even sold live lobsters – all things you don’t usually see when a commodity is expensive. Today, even in years when lobster is abundant and the cost is low, it is still thought of as a luxury item.

the Shamrock Shake & Uncle O’Grimacey

The McDonald’s Shamrock Shake helped pay for the first Ronald McDonald House.

In 1970 McDonald’s introduced their lemon/lime flavored green Saint Patrick’s Day Shake. It eventually changed flavors and names to become the mint flavored, and alliteratively titled, Shamrock Shake. Like the autumnal artificial scarcity of pumpkin spice, the Shamrock Shake is only available around Saint Patrick’s Day, in the February through March timeframe (except in Philadelphia, where it has two seasons).

McDonald’s founder Ray Kroc, Mayor Frank Rizzo, members of the Eagles organization, and Fred Hill & his daughter all attended the opening of the first Ronald McDonald House, October 15, 1974.

Philadelphia’s two seasons of Shamrock Shakes go back to the role the city played in creating the first ever Ronald McDonald House. In 1969 Kim Hill, daughter of Philadelphia Eagle Fred Hill, was diagnosed with leukemia. By 1973 the Hills and members of the Eagles organization had started the Eagles Fly for Leukemia charity, which helped pay for the new oncology wing at the Children’s Hospital of Philadelphia (CHOP).

After seeing parents camped out in the hallways and waiting rooms while their children received treatments, the charity went a step further in 1974 and purchased an old seven-bedroom house at 4032 Spruce St. not far from the hospital. The house would be a place for visiting families to stay free of charge while their children received treatment – a “home away from home.” To help pay for this, the Eagles partnered with the local McDonald’s restaurants, asking them to donate money from their next promotional food item, which just happened to be the Shamrock Shake. The Eagles asked for 25 cents per shake, but McDonald’s executives asked if they could have the naming rights to the house if they donated all of the proceeds. Eagles general manager Jimmy Murray said “… for that money, they could name it the Hamburglar House.” From this, the first ever Ronald McDonald House was established in Philadelphia in 1974. Today there are more than 375 Ronald McDonald House programs around the world which, by replacing what would otherwise be more than 2.5 million overnight hotel stays, save families around $930 million each year.

Uncle O’Grimacey

As positive as the Shamrock Shake’s impact has been, there have been some missteps. To help promote the Shamrock Shake, McDonald’s introduced a new mascot character, Uncle O’Grimacey, in 1975. The Irish uncle of the purple mascot Grimace, Uncle O’Grimacey (complete with his kelly green hat, shamrock-patterned vest, and shillelagh) would travel from Ireland each year bringing Shamrock Shakes to McDonaldland. Uncle O’Grimacey was quietly phased out of McDonald’s marketing after a few years, due in part to an alleged incident in Philadelphia in 1978 in which the person portraying him made statements supporting the IRA and saying that British soldiers were better dead than alive.

Casual racism isn’t just relegated to the distant past, however. In 2017 McDonald’s ran an ad promoting the Shamrock Shake. Unfortunately it featured a man wearing a tartan Tam o’ shanter playing the shake like a set of bagpipes (which would be Scottish) while standing in front of Stonehenge (which is in England). McDonald’s pulled the ad and apologized, saying they are “… strongly supportive of Ireland and respectful of its culture”. Begosh and Begorrah.

Uncle O’Grimacey bringing Shamrock Shakes to McDonaldland.

Acquired Overbites

We have slight overbites because we use forks & knives.

Before people ate with forks & knives they used their hands. In the absence of utensils people had to clench and rip their food with their teeth. While crude, all this pulling meant that people’s top and bottom rows of teeth lined up edge-to-edge. The introduction of utensils changed that.

The use of a fork & knife meant that a person no longer had to use their teeth to pull at their food, they could cut their food on their plate first. As a result anyone who used utensils (including us today) developed an overbite. Most of us have slight overbites where our top teeth hang out a bit in front of our bottom teeth, which comes from using utensils.

This effect can be traced across time, across cultures, and across social classes. Aristocratic classes could usually afford knives & forks before the peasantry, so they developed overbites before anyone else. For example, the more affluent members of 18th century Western European society developed overbites before the people who couldn’t afford silverware. Even more interesting, the overbite took longer to develop in the American colonies, whose residents were poorer than their countrymen back home in Europe.

In China the overbite developed as far back as 800 CE. Instead of knives & forks the Chinese aristocracy used chopsticks, but to eat meat with chopsticks the meat had to be pre-chopped as part of the meal preparation. As a result they didn’t have to pull at their food either, and developed overbites centuries before Europeans.

QI discusses the development of overbites, including as it pertains to Richard III.

Coffee: Sun-Grown or Shade-Grown?

Coffee plants prefer growing in the shade, which is better for the flavor and the environment.

Coffee plants thrive in the warm (but not too warm) areas between the Tropics of Cancer and Capricorn, a region nicknamed the “coffee belt” (although, as global warming continues, the areas in which coffee plants can grow are shrinking). These evergreen plants grow to be about 12ft tall and, while they like the warmth, they also like the shade. Coffee plants naturally grow best underneath the canopy of trees. Traditionally people would collect the coffee berries from plants growing wild around the forest and then process the seeds to make coffee. Enter industrialization.

Sun-tolerant coffee plants grown in efficient rows for sun-grown coffee. Photo by Shade Grown Coffee film.

Here Comes The Sun

From the 1970s to the early 1990s coffee producers were encouraged by the US Agency for International Development (USAID) to “upgrade” their processes and switch from shade-grown production to sun cultivation. Sun-tolerant plants had been engineered to better handle direct sunlight. With sun-grown cultivation you can grow coffee plants at greater density, harvest beans more efficiently through mechanization, produce higher yields, and make more money. This isn’t without costs.

One of the first steps for sun-grown coffee is deforestation (which increases global warming). Without trees there are no fallen leaves to serve as mulch and keep weeds down, or to biodegrade and add nutrients to the soil. This means sun-cultivated coffee requires more herbicides and fertilizers than shade-grown coffee. Further, where there are fewer trees there are fewer birds, and without as many birds to eat the insects, you need more pesticides. All of this means more chemicals on the plants and in the soil.

Made in the Shade

While still incorporating trees and other vegetation, modern shade-grown coffee farms can arrange their coffee plants more efficiently than traditional wild-harvesting practices allowed. Even though this usually means lower yields and longer harvest times compared to sun-grown coffee, shade-grown coffee sells at a premium, which can compensate producers for these factors.

The trees of shade-grown coffee farms serve as homes to hundreds of bird species. In Peru, for example, the plants of sun-grown coffee farms are home to around 61 bird species. This is in stark contrast to the trees of Peruvian shade-grown coffee farms, which are home to 243 bird species. The Smithsonian Migratory Bird Center has said that “shade-grown coffee production is the next best thing to a natural forest.”

As for the coffee itself, shade-grown coffee plants produce denser beans with more developed natural sugars, which makes for better tasting coffee. Sun-grown coffee speeds up the growing process, which is good for maximizing efficiency, but it also creates higher acidity, resulting in a more bitter taste.

People in Ethiopia, sitting in the shade, processing shade-grown coffee.

So shade-grown coffee tastes better, requires fewer chemicals, helps hundreds of bird species, and helps fight global warming. Next time you’re buying coffee, spend the extra few cents for shade-grown.

Added info: Coffee beans frequently come with little logos attesting to the various positive ways in which the coffee was produced. The certification that best represents the environmental benefits of shade-grown coffee is the Bird-Friendly label from the Smithsonian Migratory Bird Center.

Bird-Friendly is widely considered the gold standard in coffee certification, as it means the coffee is organic, shade-grown, and supportive of the local ecosystem. That said, given the various benchmarks that must be achieved, it’s hard to become certified as Bird-Friendly, which means Bird-Friendly coffee can be hard to come by.

Herbs & Spices (and Salt)

Herbs come from the leaves of a plant, spices are from any other part of a plant (and salt is a mineral).

In cooking they are frequently used together to flavor a meal, and their names get used interchangeably, but herbs and spices are not the same. In short:

HERBS
Herbs are seasonings made from the leaves of a plant

SPICES
Spices are seasonings made from any part of a plant other than the leaves (roots, stalks, bark, seeds, and sometimes even the fruit)

Herbs

In greater detail, an herb typically comes from a smaller deciduous plant without a bark-covered stem. There are exceptions of course, as lavender, sage, and rosemary aren’t deciduous and never lose their leaves. Either way, the key is that the green leaves become herbs when they are used in cooking, medicine, teas, cosmetics, etc.

Spices

Spices can come from everything but the leaves of a plant. For example, cinnamon comes from bark, ginger comes from roots, pepper comes from seeds, and chili powder comes from the pulverized fruit of the chili pepper. Saffron, one of the most expensive spices in the world at around $10,000 a pound, comes from the hand-picked stigmas & styles of the Crocus sativus flower.

Allspice, despite a common misconception, is not a blend of spices but just one spice, which comes from the berry of the Pimenta dioica tree. Its name comes from the fact that it tastes like a combination of cinnamon, nutmeg, and clove.

Herb & Spice Plants

There are a few plants that produce both an herb and a spice. The leaves of Coriandrum sativum produce the herb cilantro while the seeds become the spice coriander. Similarly, the dill plant produces dill the herb from its leaves and dill the spice from its seeds.

Salt

The food-seasoning odd one out, salt is a mineral and is the only common seasoning that does not come from a plant (although salt is present in plants). There is a lot to say about salt, but in short it’s been used as a preservative and a seasoning for thousands of years.