The short headache triggered by cold food and/or drinks touching the inside of your mouth.
To start, brain freeze (aka “ice cream headache” or “cold-stimulus headache”) only affects about 30-50% of the population. Most people can eat ice cream and drink extra cold drinks without any fear of reprisal from their nervous system.
Brain freeze occurs when the roof of your mouth or the back of your throat suddenly comes into contact with cold food, cold drinks, or even cold air. The trigeminal nerve in your head reacts to the cold by telling the arteries connected to the meninges (the membranes surrounding your brain) to contract to conserve warmth (much like how our bodies react to cold in general). The body then sends more warm blood up to the head, telling those same arteries to expand. This quick succession of vasoconstriction and vasodilation triggers pain receptors along the trigeminal nerve, creating the pain you feel behind the eyes or forehead during a brain freeze.
A lot of nerve
While we all have a trigeminal nerve, its varying sensitivity may explain why not everyone gets brain freeze. For example, 37% of Americans may get brain freeze but only around 15% of Danish adults do. Further, 93% of people who get migraines are also susceptible to brain freeze.
In short: you only need to drink water when you’re thirsty. For millions of years, humans and our ancestors survived using thirst as an indicator that it’s time for more water. It wasn’t until the 20th century that the idea of drinking 8 glasses of water a day began.
We all need water to live, but plain drinking water isn’t our only source. Coffee, tea, juice, soft drinks, fruits, vegetables, etc. all contain water. Depending on your diet you can get around 20% of the water you need from food alone. And because coffee, milk, juice, tea, etc. are mostly water, you’re probably already getting all the water you need each day without having to drink 8 more glasses of it.
… But Maybe You Do Need More Water
Daily water consumption is about maintaining balance: you need to replace the water you lose. If you live in a hot climate, or you’re sweating from exercise, you lose water faster than someone sitting still in a temperate climate. As such, you need to replace water faster than normal, which means drinking more water.
Also, should you be lost on a hike somewhere, you should ration sweat, not water. Try to limit your physical exertion and sweat less, but drink when you need to. A common misconception is that you should strictly ration your water; while you don’t want to waste a limited resource, if you’re thirsty you should drink. Your water isn’t doing you any good sitting inside a bottle.
Water, water everywhere
On the flip side, it’s possible to drink too much water. Exercise-associated hyponatremia occurs when you’re engaged in an endurance activity such as running a marathon: you sweat out both water and sodium, but then drink only water. You replenish your lost water but not your sodium, and the result is low blood-sodium levels. This imbalance can impair nerve communication, which leads to poor muscle control, poor performance, etc. Athletes with hyponatremia can feel nauseated, develop muscle cramps, and become confused, leading some to think they’re dehydrated and drink even more water (making the situation worse).
Hyponatremia is becoming more prevalent in sports as an increasing number of novice athletes participate in long-distance endurance activities. For example, in the 2002 Boston Marathon 13% of runners were found to have hyponatremia from drinking too much water. Athletes need to replenish their sodium levels along with their water. Part of the solution (pardon the pun) is to drink sports beverages that contain electrolytes (salts that can replenish sodium levels), which is why sports drinks boast about having them.
So, if you’re thirsty, drink some water, and if you’re engaged in an endurance sport, remember to get some electrolytes along with your water.
Added info: to bust another myth, consuming caffeinated beverages won’t dehydrate you. While excessive caffeine has a number of downsides, drinking coffee or tea is an acceptable way to hydrate.
Adam Ruins Everything dives into the myth of 8 glasses of water a day.
Nothing is perfect and we should embrace mistakes and imperfections.
Mistaken Mistakes
Persian carpets (aka Iranian carpets) come in a diversity of designs and sizes, but they frequently contain repeating symmetrical patterns. One alleged feature in handmade Persian carpets is a mistake in the design pattern (not in the construction) included intentionally. This “Persian flaw” serves as a reminder that only Allah is perfect. The flaw would be something small, noticed only by the keenest of observers. It’s also been said that the Amish have a similar practice of including an intentional flaw (a “humility block”) in their quilts as a reminder that only God is perfect … but it isn’t true.
Lancaster curator Wendell Zercher has quoted Amish quilt makers as saying “… no one has to remind us that we’re not perfect.” As for Persian flaws, most accounts of the idea come from Western sources, and it is probably an example of orientalism. While both are nice stories that probably help to sell imperfect rugs & quilts, we have little to no evidence to support them. If anything, intentionally making just one mistake out of humility would prove the opposite: it brags that you have the ability to make a perfect creation (but choose not to).
Actual “Mistakes”
There are, however, some cultures that really do include intentional imperfections in their work. Women in the Punjab region spanning India & Pakistan create phulkari shawls with intricate designs. In these designs they sometimes include “mistakes”: momentary changes in the overall pattern. These changes are included to mark important events during the creation of the shawl (births, weddings, deaths, etc.). Sometimes the symmetrical pattern is disrupted as spiritual protection from the evil eye.
On the left is a phulkari shawl with intentional changes to the pattern. To the right is a Navajo weaving featuring a “spirit line”.
Some Navajo also include imperfections in their weavings for spiritual reasons. The ch’ihónít’i (aka the “spirit line” or the “weaver’s path”) is a single line leading out of the middle of a design to the edge of the weaving. The spirit line is thought to give the weaver’s soul a way to exit the weaving so as not to get trapped in the design.
Embrace Imperfections
Of course, if you accept that nothing is perfect, then you have no need to add imperfections: everything is already imperfect. The Japanese concept of wabi-sabi is the Zen view that everything is imperfect, impermanent, and vulnerable. Unlike Western design ideals, which frequently strive for idealized perfection, wabi-sabi celebrates the imperfections that make everything (and everyone) unique.
Kintsugi-repaired ceramics, using gold & lacquer to feature (rather than hide) the imperfections.
Building off of wabi-sabi, kintsugi is the practice of repairing broken pottery with lacquer & bits of valuable metal that, rather than seamlessly hiding the repaired cracks, highlight them. Kintsugi honors the history of the object and celebrates its imperfections. Nothing lasts forever, and we should recognize the beauty of imperfect vessels.
A crash course on the Japanese concept of wabi-sabi.
Ugly Fruits & Vegetables
In the West this embrace of the imperfect has recently manifested itself in ugly fruits & vegetables. Imperfect-looking produce has traditionally gone unsold and makes up 40% of total food waste. Producers throw away food because they don’t think retailers will want it (it doesn’t meet “quality standards”), and then retail stores throw away the unsold odd-looking food that customers won’t buy. This is all despite the fact that the taste and nutritional content of this “ugly” food may be identical to “normal”-looking produce.
The European Union declared 2014 the European Year Against Food Waste. The French supermarket chain Intermarché began their “Inglorious Fruits and Vegetables” marketing campaign celebrating ugly-looking produce: they gave the foods their own section in the store and sold them at a discount. It proved so successful that other stores & delivery services, such as Imperfect Foods, started to do likewise as consumers began to accept the wabi-sabi nature of produce.
The Intermarché marketing campaign to help reduce food waste was a huge success.
Lacking key ingredients, the menu at the first Thanksgiving of 1621 was a bit different from the traditional turkey dinner of today.
In the fall of 1621 the English Pilgrims and the Wampanoag came together in Massachusetts for what has subsequently become a much-mythologized 3-day harvest festival. The pilgrims had a lot to be thankful for: they were still alive following the deaths of half their fellow pilgrims the previous winter, their supplies had been fortified by the Wampanoag, and they had completed a successful summer growing season. What they ate as they gave thanks is debatable.
Definitely on the Menu
One food that was definitely served was venison. Massasoit, the leader of the Wampanoag, had 5 deer brought to the event. Another meat on the menu was “wild fowl”, but exactly what kind of birds these were is unknown. It’s possible that there was turkey at the first Thanksgiving, but more likely it was goose or duck (or a combination). Other regional bird options at the time would have been swan and passenger pigeon.
Also definitely present was corn. The Wampanoag, who used the Three Sisters method of farming, had taught the pilgrims how to grow corn. The pilgrims had grown a successful crop of flint corn (aka “Indian corn”), which was cooked into a porridge, baked into bread, or cooked with beans.
Maybe on the Menu
Given that the Plymouth Colony was by the water, it’s very likely that seafood was also served. Eels, clams, mussels, cod, bass, and/or lobsters were very likely part of the meal. It’s worth noting though that, unlike today, lobster was considered a food of last resort.
There were certainly vegetables & fruits on the menu, but which ones (other than corn) was never recorded. Chestnuts, walnuts, beans, onions, carrots, cabbage, pumpkins, and various squashes were all grown in the area. Blueberries, plums, grapes, and raspberries were also grown in the area and could have been present. While cranberries might have been served, cranberry sauce definitely was not: the colonists lacked the necessary sugar, and cranberry sauce wouldn’t be invented for another 50 years.
Not on the Menu
Even though pumpkins may have been present, pumpkin pie definitely was not. The pilgrims had neither the butter nor the flour necessary to make pumpkin pie – they didn’t even have an oven in 1621. Something pumpkin pie-esque that may have been prepared is a spiced pumpkin soup/custard cooked directly inside a pumpkin roasted on hot ashes.
There was no stuffing because, again, the colonists lacked the necessary flour. There were also no potatoes (mashed or otherwise). Potatoes come from South America and, while the Spanish had brought them to Europe by the late 16th century, they had yet to reach New England. There also weren’t any forks on the table, since forks too hadn’t made their way to North America yet (but on the upside, nobody present had an overbite).
A historical reenactment of how to cook some of the foods present at the first Thanksgiving.
The autumnal flavor designed to resemble the spices in freshly baked pumpkin pie (but containing no actual pumpkin).
Pumpkin spice does not contain pumpkin. It’s a blend of cinnamon, ginger, allspice, nutmeg, and clove used to spice up pumpkin pies. This spice mix (or variations of it) goes back as far as colonial America. Unlike the spice blend you buy in the store, however, the pumpkin spice used in most commercially produced products doesn’t contain these spices. Commercial pumpkin spice flavoring uses synthetic chemicals to simulate the spices, replicating the taste of a freshly baked pumpkin pie.
One reason a synthetic flavor is used, in lattes for example, is that using the actual spices makes the drink taste a bit more like Indian masala tea (chai) than pumpkin pie. Synthesized pumpkin spice flavoring has been engineered to taste like the spices after they have been transformed by the pie-baking process. Other reasons for using a synthetic flavor are reliability (the flavor is the same every time) and cost (synthetic flavoring is a lot cheaper than actual spices).
He who controls the spice controls the universe
The craze for all things pumpkin spice began in 2003 with the limited release of Starbucks’ new specialty seasonal drink, the Pumpkin Spice Latte (PSL). With the success of their winter-themed Peppermint Mocha and Eggnog Latte, Starbucks wanted an autumnal offering. Inspired by the flavors of freshly baked pumpkin pie, the marketing team chose the name Pumpkin Spice Latte.
From big brands to small, just a few of the pumpkin spice products available for your autumnal seasonal needs.
In 2004 the drink was offered nationwide and became the most popular seasonal Starbucks beverage, generating an estimated $1.4 billion in sales as of 2017. It also started the trend of all things getting a limited-edition pumpkin spice variety: you can find candles, lip balm, cereal, soap, SPAM, chocolate candy, air fresheners, beer, and more, all in pumpkin spice.
Added info: since 2015 the Starbucks PSL has contained some amount of actual pumpkin, but the flavor of the drink is still created using a pumpkin spice flavoring. Also, despite the autumnal seasonality of the drink, the PSL is on the Starbucks Secret Menu and can be ordered all year round.
Reports that MSG is dangerous stem from one anecdotal letter and years of racism.
Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc.). Usually when it’s mentioned, people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It’s been a commercially produced food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. This fear goes back to one letter.
Chinese Restaurant Syndrome
The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled Chinese-Restaurant Syndrome by Dr. Robert Ho Man Kwok on his observations after eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the likely culprit. This single letter was the beginning of decades of mistrust of MSG.
The ideas of MSG side-effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient”, and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but no regulations on MSG in general. If the supposed adverse reactions to MSG were real, consumers should have been wary of any food containing MSG, yet Chinese food in particular was singled out and maligned. Lots of processed Western foods contain MSG, and lots of plants naturally contain significant levels of glutamate, yet Doritos and shiitake mushrooms didn’t get singled out quite like Chinese food did.
Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.
Safe to Eat
There is no established connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to be sensitive to MSG were given it without their knowledge and had no ill effects. These tests were unable to reproduce any of the side-effects attributed to MSG.
MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and their customers’ taste buds.
Because of their rarity, pineapples became European decorative elements and status symbols.
The pineapple is native to South America, but across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus, who in 1493 brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of its scarcity, owning one quickly became an exotic status symbol of the ultra-wealthy.
Pineapples were in high demand but supply was low, so enterprising individuals set out to grow pineapples in Europe. The tropical conditions pineapples require made growing them in Europe a challenge: it took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple-growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today’s dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. These pineapples would be rented from person to person until the final person paid to eat it – unless it had rotted by then. A further down-market option was pineapple jam, which could be shipped from Central/South America.
Because of their popularity, pineapples became a decorative element in a host of artistic mediums.
Pineapple Art
The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led pineapples to become international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments as finials on staircases, at the tops of columns, on New England gateposts, above front doors, as fountains, and as furniture accents. Christopher Wren placed gilded copper pineapples on St. Paul’s Cathedral in London, and the centerpiece of the Dunmore Pineapple folly in Scotland is a massive pineapple.
Added info: any association of pineapple with Hawaii dates from after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.
North American lobsters were originally a poor person’s food.
While an expensive luxury today, the American lobster (aka Homarus americanus, the Maine lobster, Canadian lobster, northern lobster, etc.) was once a food of last resort. Native American tribes of the northeastern coastal regions used lobsters as fertilizer, fish bait, and, when necessary, food. European colonists also viewed lobsters as inferior last-resort bottom feeders. These “cockroaches of the sea” were relegated to food for the poor, servants, prisoners, and slaves, and sometimes even feed for livestock.
The turnaround for lobsters began in the 19th century with two new industries: canning and railroads. As canning became a viable way to preserve & ship food, lobster meat became a cheap export from the New England area, sent via rail to locations all around North America. Tourists followed, visiting New England along some of the same rail lines, and once they could finally taste fresh lobster meat instead of canned, lobster’s popularity grew. By the 1880s demand for lobster (especially fresh lobster, which must be shipped alive), combined with a decrease in supply, led to higher prices. This helped establish lobster as the expensive delicacy we think of today.
Expensive?
Like any commodity, lobster is subject to price fluctuations. While lobster typically maintains its cultural status as an expensive delicacy, this doesn’t always reflect the real cost. For example, the overabundance of lobsters around 2009 sent the wholesale price from around $6 a pound to half that – but it would have been hard to notice, since restaurants don’t typically reduce their prices just because an ingredient has suddenly become cheaper.
However, when lobster is less expensive it does appear in unexpected places. Around 2012 gastropubs included lobster in dishes such as macaroni & cheese, fast food chains put lobster on their menus, and a Walgreens in downtown Boston even sold live lobsters – all things you don’t usually see when a commodity is expensive. Today, even in years when lobster is abundant and the cost is low, it is still thought of as a luxury item.
The McDonald’s Shamrock Shake helped pay for the first Ronald McDonald House.
In 1970 McDonald’s introduced their lemon/lime-flavored green Saint Patrick’s Day Shake. It eventually changed flavors and names to become the mint-flavored, and alliteratively titled, Shamrock Shake. Like the autumnal artificial scarcity of pumpkin spice, the Shamrock Shake is only available around Saint Patrick’s Day in the February-through-March timeframe (except in Philadelphia, where it has two seasons).
McDonald’s founder Ray Kroc, Mayor Frank Rizzo, members of the Eagles organization, Fred Hill & his daughter all attended the opening of the first Ronald McDonald House, October 15, 1974.
Philadelphia’s two seasons of Shamrock Shakes go back to the role the city played in creating the first ever Ronald McDonald House. In 1969 Kim Hill, daughter of Philadelphia Eagle Fred Hill, was diagnosed with leukemia. By 1973 the Hills and members of the Eagles organization had started the Eagles Fly for Leukemia charity, which helped pay for the new oncology wing at the Children’s Hospital of Philadelphia (CHOP).
After seeing parents camped out in hallways and waiting rooms while their children received treatments, the charity went a step further in 1974 and purchased an old seven-bedroom house at 4032 Spruce Street, not far from the hospital. The house would be a place for visiting families to stay free of charge while their children received treatment – a “home away from home”. To help pay for this, the Eagles partnered with local McDonald’s restaurants, asking them to donate money from their next promotional food item, which just happened to be the Shamrock Shake. The Eagles asked for 25 cents per shake, but McDonald’s executives countered: could they have the naming rights to the house if they donated all of the proceeds? Eagles general manager Jimmy Murray said “… for that money, they could name it the Hamburglar House.” From this, the first ever Ronald McDonald House was established in Philadelphia in 1974. Today there are more than 375 Ronald McDonald House programs around the world which, by replacing what would otherwise be more than 2.5 million overnight hotel stays, save families around $930 million each year.
Uncle O’Grimacey
As positive as the Shamrock Shake’s impact has been, there have been some missteps. To help promote the Shamrock Shake, McDonald’s introduced a new mascot character, Uncle O’Grimacey, in 1975. The Irish uncle of the purple mascot Grimace, Uncle O’Grimacey (complete with his kelly green hat, shamrock-patterned vest, and shillelagh) would travel from Ireland each year bringing Shamrock Shakes to McDonaldland. He was quietly phased out of McDonald’s marketing after a few years, due in part to an alleged 1978 incident in Philadelphia where the person portraying him made statements supporting the IRA and saying that British soldiers were better dead than alive.
Casual racism isn’t just relegated to the semi-distant past, however. In 2017 McDonald’s ran an ad promoting the Shamrock Shake. Unfortunately, it featured a man wearing a tartan Tam o’ shanter playing the shake like a set of bagpipes (which would be Scottish) while standing in front of Stonehenge (which is in England). McDonald’s pulled the ad and apologized, saying they are “… strongly supportive of Ireland and respectful of its culture”. Begosh and Begorrah.
Uncle O’Grimacey bringing Shamrock Shakes to McDonaldland.
We have slight overbites because we use forks & knives.
Before people ate with knives & forks they used their hands. In the absence of utensils, people had to clench their food in their teeth and rip it. Crude as it was, all this pulling meant that people’s top and bottom rows of teeth lined up edge to edge. The introduction of utensils changed that.
The use of a knife & fork meant that a person no longer had to use their teeth to pull at their food; they could cut it on their plate first. As a result, anyone who used utensils (including us today) developed a slight overbite, with the top teeth hanging a bit in front of the bottom teeth.
Looking at skulls, we can see this effect across time, cultures, and social classes. Aristocrats could usually afford knives & forks before the peasantry, so they developed overbites before anyone else. For example, the more affluent members of 18th century Western European society developed overbites before the people who couldn’t afford silverware. Further, the overbite took longer to develop in the American colonies, where people were poorer than their countrymen back home in Europe.
In China the overbite developed as far back as 800 CE. Instead of knives & forks the Chinese aristocracy used chopsticks, but to eat meat with chopsticks the meat had to be pre-chopped as part of the meal preparation. As a result they didn’t have to pull at their food either and developed overbites centuries before Europeans.
QI discusses the development of overbites as it pertains to Richard III.