Monster Cereals

The seasonal Halloween cereal treat.

Autumn brings seasonal foods – pumpkin spice lattes, pumpkin beer, apple cider, and monster cereals. Created in 1971 by General Mills, monster cereals are a line of sugary cereals with cartoon mascots based on classic Universal Monsters.

Cereal mascots as we know them began in 1933 with the Rice Krispies characters Snap, Crackle, and Pop. After WWII, with the rise of televised commercials, cereal mascots became animated, increasing the competition for attention & dollars. In the late 1960s General Mills had new flavoring ideas and was in search of a marketing strategy that would appeal to kids. They had chocolate and strawberry flavorings that, when added to milk, made the milk taste like chocolate or strawberry. What they needed were mascots to sell the cereal.

Laura Levine of the ad agency Dancer Fitzgerald Sample, which had been hired to help sell these new cereals, had the inspired idea of using monsters. Introduced in 1971, Count Chocula and Franken Berry were the first monster cereal creations, based on the classic horror monsters Dracula and Frankenstein (or “Frankenstein’s monster” if we’re being pedantic). In 1973 the ghost Boo Berry (whose animated voice was loosely modeled on actor Peter Lorre) was added to the lineup, selling a blueberry-flavored cereal.

Various monsters have been a part of monster cereals over the years.

Count Chocula, Franken Berry, and Boo Berry have been the reliable trio of monster cereal flavors since the 1970s, but other flavors have been tried. Frute Brute and Fruity Yummy Mummy have both been launched, discontinued, relaunched, and discontinued again over the years, with their flavoring reconfigured at different times. In 2021, to celebrate the 50th anniversary of monster cereals, General Mills began selling “Monster Mash” boxes, which are a mix of the monster cereal flavors. In 2023 the zombie-themed, caramel apple-flavored Carmella Creeper was released as the latest new monster cereal. Finally, monster cereals used to be available year-round, but in 2010 they became a seasonal product, rising from the darkness every autumn Halloween season (though you can search for them all year round).

Frankenstool

One curious side effect of the Franken Berry strawberry flavoring was discovered in 1972 when a 12-year-old boy was admitted to the hospital with pink/red poop. Documented by the University of Maryland Medical School, the condition of “Franken Berry stool” was publicized in the paper “Benign red pigmentation of stool resulting from food coloring in a new breakfast cereal (the Franken Berry stool).” Later it was discovered that Boo Berry would turn poop green.

The formulation of both cereals was eventually changed to avoid this.

A deep dive into Monster Cereals.

Pink Doughnut Boxes

Pink doughnut boxes exist because of Cambodian immigrants.

In movies and TV shows doughnut boxes are frequently pink. This is in part because many of the doughnut boxes in the Los Angeles area are pink. These pink boxes are a subtle hint that, if you spot them in a story set in New York City or elsewhere, it was actually shot in LA. These doughnut boxes are pink because of Cambodian immigrant Ted Ngoy and his doughnut shop empire.

In 1975 Ngoy and his family fled the Khmer Rouge on the last flight out of Phnom Penh and immigrated to America. He started life over as a janitor in a Lutheran church, but while working a second job as a gas station attendant he noticed how popular the nearby doughnut shop was. Ngoy enrolled in Winchell’s training program, learning the ins & outs of running a doughnut shop.

Taking what he learned, Ngoy started his own doughnut shop, Christy’s Donuts, in La Habra in 1977. Eventually this sole shop begat others, and Ted & his wife came to own over 50 locations in Southern California. Along the way he would sponsor other Cambodian immigrants, setting them up for business in his doughnut shops. But it was during the scrappy early days that he came upon the idea for pink boxes.

The Pulp Fiction character Marsellus Wallace carries a pink box of doughnuts across the street just before his day gets much, much worse.

Pretty in Pink

Supposedly Ngoy wanted red boxes, as red is the color of luck for Chinese-Cambodians. White, on the other hand, is the color of mourning & death. The closest his box vendor Westco had were leftover pink boxes, which sold for a few cents less than white boxes. For the price and the symbolism(ish), pink became the color of boxes for Ngoy’s shops.

Soon Ngoy’s competitors were using pink boxes as well. In 2003 these boxes inspired Kenneth “Cat Daddy” Pogson’s box design for his new company, Voodoo Doughnut, in Portland, Oregon, which has some of the most famous pink doughnut boxes around.

Today you still see pink doughnut boxes around LA. They’re so connected to Southeast Asian immigrants that they became a canvas for Cambodian American artists in 2022. As of 2020 it’s estimated that around 80% of the independent doughnut shops in California are owned by Cambodian-Americans, many of whom credit “Uncle Ted” for getting them started.

Added info: similar to how Ted Ngoy’s influence helped Cambodian immigrants dominate the LA doughnut scene, Tippi Hedren is credited with helping Vietnamese immigrants dominate the nail salon industry.

Also, the highs & lows of Ted Ngoy’s life story are enough to fill multiple lifetimes. You can learn more about him in the 2020 documentary The Donut King.

The wild ride of the highs & lows of Ted Ngoy’s life as The Donut King.

Sunday Morning reports on the Cambodian history of California doughnut shops.

Breakfast, Lunch, & Dinner (Supper, & Tea)

The names and details of our daily meals are relatively recent creations.

Breakfast

As the name suggests, breakfast is the first meal of the day, the meal where you “break your fast” (the fast of not eating overnight while you sleep). That said, this first meal of the day wasn’t always first thing in the morning like it is today. Up through the early Middle Ages people would rise and go without eating until after they had worked for several hours.

Further complicating things, this late-morning first meal of the day was, before being called breakfast, called dinner. From the Old French “disner” meaning “to break one’s fast,” the first meal of the day only became “breakfast” in the 15th century. This early meal would be bread, maybe some cheese, and some alcohol (alcohol being safer to drink than water).

Dinner and Supper

As breakfast became breakfast, dinner moved from the 1st time slot to the 2nd. You would eat a small meal upon waking (breakfast), eat a large meal in the late morning to give you energy for the rest of your work (dinner), and then eat a small meal in the evening. The small meal at the end of the day was supper, from the French “souper.” It was typically a soup that you supped, slow-cooked throughout the day to be ready in the evening.

But dinner wasn’t done moving, and it shifted again from the 2nd time slot to the 3rd, replacing supper as the last meal of the day. This change wasn’t all at once. The shift was due to several factors, not least of which was the changing nature of how people worked. When people worked out of their homes, or farmed the fields near their homes, it was easier to prepare & eat a large meal in the middle of the day. Through the Industrial Revolution work moved to factories & offices, and it became impractical to have a large meal in the middle of the day. Dinner continued to be the biggest meal of the day, but it moved to the end of the day when people returned from work.

That said, while “dinner” is the term most people use for the big meal at the end of the day, some people (particularly those of agricultural backgrounds) still call this meal supper. Generally speaking though, “dinner” and “supper” are seen as synonymous terms for the same meal. As such the Last Supper could have been the Last Dinner.

Lunch

With breakfast at the start of the day and dinner now the last meal of the day, this left a time slot open in the middle of the day. Lunch is essentially what you get if dinner and supper switched places and supper changed its name. Starting in the 18th century lunch became a small midday meal, increasing in popularity as more and more people had their dinner at the end of the day.

Tea Time

So what is tea/tea time? After the Portuguese Catherine of Braganza introduced tea to England in the 17th century, it eventually became a staple of British life. Tea as a meal took two forms: Afternoon Tea and High Tea. Confusingly, afternoon tea is the classier of the two.

Tea was originally had after the large midday meal of dinner, as tea was believed to assist digestion. As dinner moved to the end of the day, tea time was created as a way to hold people over between lunch and dinner while still having tea after midday. Afternoon tea, as the name suggests, was served in the afternoon. It was a light meal of tea served with cucumber sandwiches, scones, cakes, and other elegant snack foods – it was the tea time of the upper class (because who else had the time to break for fancy foods in the midafternoon?). High tea, on the other hand, was the meal of the working class. Working people couldn’t take a break midafternoon, so they had their tea with heartier snacks after they came home in the evening but before their supper (or dinner).

As dinner replaced supper as the final meal of the day, some people in Britain and other Commonwealth countries merged dinner and high tea, calling this evening meal “tea.”

Brain Freeze

The short headache triggered by cold food and/or drinks touching the inside of your mouth.

To start, brain freeze (aka “ice cream headache” or “cold-stimulus headache”) only affects about 30-50% of the population. Most people can eat ice cream and drink extra cold drinks without any fear of reprisal from their nervous system.

Brain freeze occurs when the roof of your mouth or the back of your throat suddenly comes into contact with cold food, cold drinks, or even cold air. The trigeminal nerve in your head reacts to the cold by telling the arteries connected to the meninges (the membranes surrounding your brain) to contract to conserve warmth (much like how our bodies react to cold in general). The body then sends more warm blood up to the head, telling those same arteries to expand. This quick succession of vasoconstriction and vasodilation triggers pain receptors along the trigeminal nerve, which creates the pain you feel behind the eyes or forehead during a brain freeze.

A lot of nerve

While we all have a trigeminal nerve, its varying sensitivity may explain why not everyone gets brain freeze. For example, 37% of Americans may get brain freeze but only around 15% of Danish adults do. Further, 93% of people who get migraines are also susceptible to brain freeze.

the Myth of 8 Glasses of Water

You don’t need to drink 8 glasses of water a day.

In short: you only need to drink water when you’re thirsty. For millions of years humans and our human ancestors survived using thirst as an indicator that it’s time for more water. It wasn’t until the 20th century that the idea of drinking 8 glasses of water a day began.

We all need water to live, but glasses of plain water aren’t our only source. Coffee, tea, juice, soft drinks, fruits, vegetables, etc. all contain water. Depending on your diet you can get around 20% of the water you need just from food. And because coffee, milk, juice, tea, etc. are mostly water, you’re probably already getting all the water you need each day without having to drink 8 more glasses of it.

… But Maybe You Do Need More Water

Daily water consumption is about maintaining balance: you need to replace the water you lose. If you live in a hot climate, or you’re sweating from exercise, you lose water faster than someone sitting still in a temperate climate. As such you need to replace water faster than normal which means drinking more water.

Also, should you be lost on a hike somewhere, you should ration sweat, not water. Try to limit your physical exertion and sweat less, but drink when you need to. A common mistake is thinking you should ration your water; while it’s true you don’t want to waste a limited resource, if you’re thirsty you should drink. Your water isn’t doing you any good sitting inside a bottle.

Water, water everywhere

On the flip side, it’s possible to drink too much water. Exercise-associated hyponatremia occurs when you’re engaged in an endurance activity such as running a marathon, you sweat out water and sodium, but then you only drink water. In drinking plain water you replenish your lost water but not your sodium. The result is low blood-sodium levels. This imbalance can cause poor nerve communication, which leads to poor muscle control, poor performance, etc. Athletes with hyponatremia can feel nauseous, develop muscle cramps, and become confused, leading some to think they’re dehydrated and drink even more water (making the situation worse).

Hyponatremia is becoming more prevalent in sports as an increasing number of novice athletes participate in long-distance endurance activities. For example in the 2002 Boston Marathon 13% of runners were found to have hyponatremia from drinking too much water. Athletes need to replenish their sodium levels along with their water. Part of the solution (pardon the pun) is to drink sports beverages that contain electrolytes (which are salts and can replenish sodium levels). This is why sports drinks boast about having electrolytes.

So, if you’re thirsty, drink some water and if you’re engaged in an endurance sport remember to get some electrolytes along with your water.

Added info: to bust another myth, consuming caffeinated beverages won’t dehydrate you. While excessive caffeine has a number of downsides, drinking coffee or tea is an acceptable way to hydrate.

Adam Ruins Everything dives into the myth of 8 glasses of water a day.

Mistakes Happen (Sometimes Intentionally)

Nothing is perfect and we should embrace mistakes and imperfections.

Mistaken Mistakes

Persian carpets (aka Iranian carpets) come in a diversity of designs and sizes, but they frequently contain repeating symmetrical patterns. One alleged feature of handmade Persian carpets is a mistake in the design pattern (not in the construction) included intentionally. This “Persian flaw” supposedly serves as a reminder that only Allah is perfect. The flaw would be something small, noticed only by the keenest of observers. It’s also been said that the Amish have a similar practice, that they include an intentional flaw (a “humility block”) in their quilts as a reminder that only God is perfect … but it isn’t true.

Lancaster curator Wendell Zercher has quoted Amish quilt makers as saying “… no one has to remind us that we’re not perfect.” As for Persian flaws, most accounts of the idea come from Western sources, and it is probably an example of orientalism. While both of these are nice stories that probably help to sell imperfect rugs & quilts, we have little to no evidence to support them. If anything, to intentionally make just one mistake out of humility would prove the opposite, bragging that you have the ability to make a perfect creation (but choose not to).

Actual “Mistakes”

There are, however, some cultures that really do include intentional imperfections in their work. Women in the Punjab region spanning India & Pakistan create phulkari shawls with intricate designs. In these designs they sometimes include “mistakes,” which are momentary changes in the overall design pattern. These changes are included to mark important events during the creation of the shawl (births, weddings, deaths, etc.). Sometimes the symmetrical pattern is disrupted as spiritual protection from the evil eye.

On the left is a phulkari shawl with intentional changes to the pattern. To the right is a Navajo weaving featuring a “spirit line”.

Some Navajo also include imperfections in their weavings for spiritual reasons. The ch’ihónít’i (aka the “spirit line” or the “weaver’s path”) is a single line leading out of the middle of a design to the edge of the weaving. The spirit line is thought to give the weaver’s soul a way to exit the weaving so as not to get trapped in the design.

Embrace Imperfections

Of course, if you accept that nothing is perfect, then you have no need to add imperfections because everything is already imperfect. The Japanese concept of wabi-sabi is the Zen view that everything is imperfect, impermanent, and vulnerable. Unlike Western design ideas, which frequently strive for idealized perfection, wabi-sabi celebrates the imperfections that make everything (and everyone) unique.

Kintsugi repaired ceramics, using gold & lacquer to feature (rather than hide) the imperfections.

Building off of wabi-sabi, kintsugi is the practice of repairing broken pottery with valuable metals & lacquer in a way that, rather than seamlessly hiding the repaired cracks, highlights them. Kintsugi honors the history of the object and celebrates its imperfections. Nothing lasts forever and we should recognize the beauty of imperfect vessels.

A crash course on the Japanese concept of wabi-sabi.

Ugly Fruits & Vegetables

In the West this embrace of the imperfect has recently manifested itself in ugly fruits & vegetables. Imperfect-looking produce has traditionally gone unsold and makes up 40% of total food waste. Producers throw away food because they don’t think retailers will want it (it doesn’t meet “quality standards”), and then retail stores throw away the unsold odd-looking food that customers won’t buy. This is all despite the fact that the taste and nutritional content of this “ugly” food may be identical to “normal”-looking produce.

The European Union declared 2014 the European Year Against Food Waste. That year the French supermarket chain Intermarché began its “Inglorious Fruits and Vegetables” marketing campaign celebrating ugly-looking produce: the foods got their own section in the store and were sold at a discount. It proved so successful that other stores & delivery services, such as Imperfect Foods, started to do likewise as consumers began to accept the wabi-sabi nature of produce.

The Intermarché marketing campaign to help reduce food waste was a huge success.

the First Thanksgiving Menu

Lacking key ingredients, the menu at the first Thanksgiving of 1621 was a bit different than the traditional turkey dinner of today.

In the fall of 1621 the English Pilgrims and the Wampanoag came together in Massachusetts for what has subsequently become a much-mythologized 3-day harvest festival. The pilgrims had a lot to be thankful for: that they were still alive following the deaths of half their fellow pilgrims the previous winter, that they had their supplies fortified by the Wampanoag, and that they had completed a successful summer growing season. What they ate as they gave thanks is debatable.

Definitely on the Menu

One food that was definitely served was venison. Massasoit, the leader of the Wampanoag, had 5 deer brought to the event. Another meat on the menu was “wild fowl”, but exactly what kind of birds these were is unknown. It’s possible that there was turkey at the first Thanksgiving but more likely it was goose or duck (or a combination). Other regional bird options at the time would have been swan and passenger pigeon.

Also definitely present was corn. The Wampanoag, who used the Three Sisters method of farming, had taught the pilgrims how to grow corn. The pilgrims had grown a successful crop of flint corn (aka “Indian corn”), which was cooked into a porridge, baked into a bread, and/or mixed with beans.

Maybe on the Menu

Given that the Plymouth Colony was by the water, it’s very likely that seafood was also served. Eels, clams, mussels, cod, bass, and/or lobsters were very likely a part of the meal. It’s worth noting though that, unlike today, lobster was considered a food of last resort.

There were certainly vegetables & fruits on the menu, but which ones (other than corn) was never specified. Chestnuts, walnuts, beans, onions, carrots, cabbage, pumpkins, and various squashes were all grown in the area. Blueberries, plums, grapes, and raspberries also grew in the area and could have been present. While cranberries might have been served, cranberry sauce definitely was not, since the colonists lacked the necessary sugar (and cranberry sauce wouldn’t be invented for another 50 years).

Not on the Menu

Even though pumpkins may have been present, pumpkin pie definitely was not. The pilgrims had neither the butter nor the flour necessary to make pumpkin pie – they didn’t even have an oven in 1621. Something pumpkin pie-esque that may have been prepared is a spiced pumpkin soup/custard cooked directly inside a pumpkin which was roasted on hot ashes.

There was no stuffing because, again, the colonists lacked the necessary flour. There were also no potatoes (mashed or otherwise). Potatoes came from South America and, while they had made their way to Europe by the late 16th century via the Spanish, they had yet to make their way to New England. There also weren’t any forks on the table since they too hadn’t made their way to North America yet (but on the upside nobody present had an overbite).

A historical reenactment of how to cook some of the foods present at the first Thanksgiving.

“Pumpkin” Spice

The autumnal flavor designed to resemble the spices in freshly baked pumpkin pie (but containing no actual pumpkin).

Pumpkin spice does not contain pumpkin. It’s a blend of cinnamon, ginger, allspice, nutmeg, and clove used to spice up pumpkin pies. This spice mix (or variations of it) goes back as far as colonial America. Unlike the spice blend you buy in the store, however, the pumpkin spice in most commercially produced products doesn’t contain these spices. Commercial pumpkin spice flavoring uses chemicals to simulate the spices, replicating the taste of a freshly baked pumpkin pie.

One reason a synthetic flavor is used, in lattes for example, is that using the actual spices makes the drink taste a bit more like Indian masala chai instead of pumpkin pie. Synthesized pumpkin spice flavoring has been engineered to taste like the spices after they have been transformed by the pie-baking process. Other reasons for using a synthetic flavor are reliability (the flavor is the same every time) and cost (synthetic flavoring is a lot cheaper than using actual spices).

He who controls the spice controls the universe

The craze for all things pumpkin spice began in 2003 with the limited release of Starbucks’ latest specialty seasonal drink, the Pumpkin Spice Latte (PSL). With the success of their winter themed Peppermint Mocha and Eggnog Latte, Starbucks wanted an autumnal offering. Inspired by the flavors of freshly baked pumpkin pie the marketing team chose the name Pumpkin Spice Latte.

From big brands to small, just a few of the pumpkin spice products available for your autumnal seasonal needs.

In 2004 the drink was offered nationwide and became the most popular seasonal Starbucks beverage, generating an estimated $1.4 billion in sales as of 2017. It also started the flavor trend of all things getting a limited edition pumpkin spice variety. You can find candles, lip balm, cereal, soap, SPAM, chocolate candy, air fresheners, beer, and more all with pumpkin spice flavors.

Added info: since 2015 the Starbucks PSL has contained some amount of pumpkin, but the flavor of the drink is still created using a pumpkin spice flavoring. Also, despite the autumnal seasonality of the drink, the PSL is on the Starbucks Secret Menu and you can buy it all year round.

MSG (Safe to Eat)

Reports that MSG is dangerous stem from one anecdotal letter and years of racism.

Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc.). Usually when it’s mentioned, people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It’s been a commercially produced food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. The cause of this fear goes back to one letter.

Chinese Restaurant Syndrome

The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled “Chinese-Restaurant Syndrome” by Dr. Robert Ho Man Kwok about his observations after eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the answer. This single letter was the beginning of decades of mistrust of MSG.

The ideas of MSG side effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient,” and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but no regulations on MSG in general. If the supposed adverse reactions to MSG were real, consumers should have been wary of any food containing MSG, yet Chinese food in particular got singled out and maligned. Lots of processed Western foods contain MSG, lots of plants naturally contain significant levels of MSG, and yet Doritos and shiitake mushrooms didn’t get singled out quite like Chinese food did.

Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.

Safe to Eat

There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to have a sensitivity to MSG were given MSG without knowing it and had no ill effects. These tests were unable to reproduce any of the side effects attributed to MSG.

MSG, like any food additive, is safe in moderation. Too much of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and the customers’ taste buds.

Pineapples as Status Symbols

Because of their rarity pineapples became European decorative elements and status symbols.

The pineapple is native to South America, but across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus in 1493, who brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of their scarcity, owning one quickly became an exotic status symbol of the ultra wealthy.

Pineapples were in high demand but supply was low, so enterprising individuals set out to grow pineapples in Europe. The tropical conditions pineapples require make growing them in Europe a challenge. It took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple-growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today’s dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. These pineapples would be rented from person to person until the final person paid to eat it (unless it had rotted by then). A further down-market option was pineapple jam, which could be shipped from Central/South America.

Because of their popularity, pineapples became a decorative element in a host of artistic mediums.

Pineapple Art

The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led to pineapples becoming international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments as finials on staircases, at the tops of columns, on New England gateposts, above front doors, as fountains, and as furniture accents; Christopher Wren placed gilded copper pineapples on St. Paul’s Cathedral in London, and the centerpiece of the Dunmore Pineapple folly in Scotland is a massive pineapple.

Added info: any association of pineapple with Hawaii comes after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.