The German concept of belonging & happiness that English doesn’t have a word for.
Sitting in a tent at Oktoberfest, you'll hear one song played again and again: Ein Prosit. Its lyrics contain only four words, it takes less than 30 seconds to sing, and after singing it the band leader directs everyone to drink. The lyrics are:
Ein Prosit, ein Prosit / Der Gemütlichkeit
A toast, a toast / To Gemütlichkeit
What exactly are we toasting? What is Gemütlichkeit?
Gemütlichkeit (roughly: ge-mut-lee-kite) is a German word with no direct English translation. It's a feeling of happy belonging, something like coziness, but unlike coziness it's felt in the company of others. Gemütlichkeit can't be felt alone. It's the good feeling you get wandering a Christmas market with your family, it's a summer BBQ in a friend's back yard, and of course it's gathering together at a beer garden. Gemütlichkeit is a state of mind: the enjoyment of simple pleasures shared with others.
Part of gemütlichkeit's meaning comes from its origins. In the early 19th century Biedermeier period, industrialization helped create a new German middle class. This growing population used its newfound money & free time to embrace a quieter, simpler life. Feeling secure and happy with friends & family was more important than politics. This was also around the start of Oktoberfest, which began as a wedding festival in 1810 but turned into an annual tradition in 1811. Gemütlichkeit and Oktoberfest go well together because, as people gather for good food, beer, and fun, they're celebrating the simple things in life with others.
The terms redneck and hillbilly both come from rebellious 17th century Scottish Protestants.
In the 17th century, King Charles I pushed for greater religious uniformity across the British Isles. Scottish Presbyterians disapproved, as these reforms were increasingly Catholic in style & organization. In 1638 thousands of Scots signed the National Covenant (sometimes using their own blood as ink), signifying their preference for a Presbyterian Church of Scotland and their refusal to accept the reforms made by Charles. Going one step further, some of these “Covenanters” took to wearing red cloth on their necks as an outward sign of their resistance. These dissenting Scottish religious rebels were the original “red necks”.
Political and religious tension continued around the British Isles throughout the late 17th century, leading to the Glorious Revolution of 1688. On one side of this revolution was the Catholic King James II and those who supported a strong monarchy; on the other were Protestants & Parliamentarians. Afraid of a Catholic dynasty, and that James would leave the throne to his Catholic son James Francis Edward, seven influential English nobles invited the Protestant Dutch Prince William of Orange to invade England and take the throne.
Around the same time, the Scottish Presbyterian leader Richard Cameron was preaching a message of rebellion against the English. Being a religious nonconformist, Cameron took to field preaching, spreading his radical message outdoors away from Scottish towns. His followers (the Cameronians) were nicknamed “hillmen” for their outdoor religious gatherings.
When William of Orange invaded England and successfully took the throne, he was supported by Scottish Protestants. The Scots living in Ulster at the time fought against the Jacobite supporters of King James. William of Orange was nicknamed “King Billy” and his Ulster Scots Protestant supporters were nicknamed “Billy boys”. Eventually these two Scottish Protestant rebel nicknames of “hillmen” and “Billy boys” were combined to form “hillbilly boys” and then just “hillbilly”.
American Rednecks & Hillbillies
Despite their successful support for William, many Scots were still oppressed for being Presbyterian and for being Scottish. Searching for greater religious & personal freedom, they began to emigrate in larger numbers from Ulster to the British colonies in North America. An estimated 200,000 Ulster Scots (aka Scotch-Irish) emigrated to the American colonies between 1717 and 1775. Settling up and down the East coast and throughout Appalachia, these Scottish Protestants brought with them their religion, their rebelliousness, and their nicknames.
Over the centuries the meanings of both “redneck” and “hillbilly” have changed. During the “Redneck War” of 1920-21, “redneck” was used to label the unionizing coal miners (many of whom were Scotch-Irish) who wore red bandanas in solidarity. The term has also been used to describe early 20th century southern Democrats, as well as, more literally, poor farmers with sunburnt necks. “Hillbilly” also took on a more literal interpretation, describing the people who settled the rural hilly areas of Appalachia and the Ozarks. Today both terms are generally used as derogatory slurs for poor rural whites.
Humans have been making devices to shield their eyes from the sun for thousands of years. Today one company dominates the market.
Living around the Arctic, where bright sunlight reflects off the ice & snow, the indigenous peoples of North America & Greenland developed the earliest sunglasses. These 4,000-year-old proto-sunglasses were carved from a variety of materials and featured very thin slits that let the wearer see while blocking excessive sunlight. This idea has been recreated many times in a variety of styles from the 1930s to the present.
The Venetians, who had been making clear corrective eyeglasses since the 13th century, were among the first to produce sunglasses with glass. In the 18th century the glass makers of Murano produced green-tinted eyeglasses (as well as devices resembling handheld mirrors but with transparent green glass) through which wealthy Venetians could look across the water while protecting their eyes from reflected light.
By the mid-19th century it was not uncommon for soldiers on both sides of the American Civil War to wear colored spectacles of blue, gray, or green to protect their eyes while marching in the sun. But sunglasses were still primarily utilitarian; they didn't become a fashionable part of mainstream culture until the 20th century.
20th Century Sunglasses
In the early 20th century Sam Foster had a plastics company that primarily sold women's hair accessories, but as women's hair styles trended shorter (negating the need for so many hair accessories), he had to find a new product to sell. In 1929 he began selling inexpensive plastic sunglasses to beachgoers for 10 cents a pair on the Atlantic City boardwalk. This was the beginning of the Foster Grant eyewear company. Foster Grant sunglasses became the shades of Hollywood celebrities, which helped make sunglasses about style as well as function, not just eye protection.
In 1929 Bausch & Lomb, who were already making optical equipment for the military, began work for the U.S. Army Air Corps to develop sunglasses that wouldn't fog up and would reduce glare for pilots. This gave us the iconic “Ray-Ban Aviator” sunglasses. Aviators were also the start of the Ray-Ban eyewear company, which began as the civilian division of Bausch & Lomb. Ray-Ban would go on to make another iconic model of sunglasses, the Wayfarer, in 1956.
Today the sunglasses market is dominated by Luxottica, an Italian juggernaut that is the largest eyewear company in the world. They're the company actually making the sunglasses of luxury brands such as Chanel, Prada, Ralph Lauren, Versace, etc. Luxottica's dominance is due in large part to their vertically integrated control over the eyewear industry. They own major retail chains such as LensCrafters, Target Optical, Pearle Vision, and Sunglass Hut. They own major eyewear brands including Oakley and Ray-Ban, and they manufacture the eyewear for all of the above. They even own EyeMed, the second largest vision insurance company in America. You could go from getting a vision prescription, to selecting a pair of glasses, to buying them at a retail store, and pay Luxottica at every step of the way.
Luxottica's control over the market is why eyewear prices have gone up and not down. The proliferation of brands & stores competing for sales isn't as competitive as it seems, since Luxottica is behind many of them. In Luxottica-owned stores, 89% of the products available are made by Luxottica. Most of these glasses are the same quality, just different styles. Because of Luxottica, frames that cost maybe $15 to produce can be sold for hundreds of dollars. As of 2019 Luxottica controlled around 40% of the eyewear market.
Added info: Beyond blocking excessive bright light, good sunglasses block most ultraviolet (UV) light from damaging your eyes. Darker glasses don't necessarily block more UV light, so it's worth buying reputable sunglasses that have been engineered & certified to offer UV protection. It's better to wear no sunglasses at all than ones that don't block UV light: your pupils widen in the shade of junk sunglasses, letting in more UV rays.
Reports that MSG is dangerous stem from one anecdotal letter and years of racism.
Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc). Usually when it's mentioned, people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It's been commercially produced as a food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it's dangerous. The cause of this fear goes back to one letter.
Chinese Restaurant Syndrome
The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled Chinese-Restaurant Syndrome by Dr. Robert Ho Man Kwok on his observations after eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the likely answer. This single letter was the beginning of decades of mistrust of MSG.
The ideas of MSG side effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient”, and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants, but not on MSG in general. If the supposed adverse reactions to MSG were real, they would warrant caution about any food containing MSG; instead, Chinese food in particular got singled out and maligned. Lots of processed western foods contain MSG, and lots of plants naturally contain significant levels of glutamate, yet Doritos and shiitake mushrooms never got singled out quite like Chinese food did.
Safe to Eat
There is no established connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to be sensitive to MSG were given MSG without knowing it and had no ill effects. These tests were unable to reproduce any of the claimed side effects.
MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and their customers' taste buds.
Because of their rarity, pineapples became European decorative elements and status symbols.
The pineapple is native to South America, but across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus, who in 1493 brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of its scarcity, owning one quickly became an exotic status symbol of the ultra wealthy.
Pineapples were in high demand but low supply, so enterprising individuals set out to grow them in Europe. The tropical conditions pineapples require make growing them in Europe a challenge; it took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today's dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. A pineapple would be rented from person to person until the final person paid to eat it, unless it had rotted by then. A further down-market option was pineapple jam, which could be shipped from Central/South America.
The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led pineapples to become international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments as finials on staircases, at the tops of columns, on New England gateposts, above front doors, as fountains, and as furniture accents. Christopher Wren placed gilded copper pineapples on St. Paul's Cathedral in London, and the centerpiece of the Dunmore Pineapple folly in Scotland is a massive carved pineapple.
Added info: Any association of pineapple with Hawaii comes after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.
The saggy skin in front of a cat's hind legs is an evolutionary feature, not a sign of an out-of-shape cat.
The primordial pouch is the formal name for the saggy belly that cats have. Some breeds have a more prominent pouch than others, but from big cats to house cats they all have it. It's a loose flap of skin & fat that is the result of evolution, not of spaying or neutering (again, big cats such as lions and tigers have it too and they aren't spayed or neutered).
One use for this extra belly skin is abdominal protection from the rear-paw kicks of other cats. Another benefit is greater flexibility, especially when cats sprint or jump. It's also theorized that when early cats found a large meal, this skin stretched to let the belly expand and hold more food.
Just because you started something doesn’t mean you have to finish it. Sometimes quitting is a good thing.
The Sunk-Cost Fallacy is when, because you have invested time, effort, money, etc. into something, you feel you can't quit. The cost of the thing makes you continue because you think that stopping would waste all of that investment. In reality, however, if something isn't worth it anymore, you should quit.
Humans are strongly loss averse: losing something hurts almost twice as much as gaining the same thing feels good. We're naturally protective of the things we have and we focus more on what we may lose than on what we may gain. This manifests itself when it's time to move house, have a yard sale, or generally clean up – people can have a difficult time parting with possessions. Similarly, walking out of a bad movie, turning around to ask for directions when driving around lost, or ending a relationship are all hard to do, partially because we are invested in them and we don't want that investment to have been a waste. We don't want to look foolish for having invested poorly, so we double down and continue with things we aren't enjoying anymore just to save face. But by continuing forward no matter what, we only increase our investment costs and the damage done by staying the course.
Sunk costs are the investments we've made that can never come back – they're in the past, and they're irrelevant when considering our future paths. Past costs look backwards, but your future choices look forward. For example, just because you've paid for a ticket to a concert doesn't mean you have to go. If you're feeling sick, then maybe don't go. The money you paid for the ticket is gone either way, so all you have to consider now is: do I feel like going to this concert?
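A minimal sketch of that concert decision in Python (the numbers here are hypothetical, purely for illustration) shows the rule: only future costs and benefits enter the comparison, and the ticket price appears in neither option.

```python
# Hypothetical values for the concert decision, in arbitrary "utility" units.
ticket_price = 80          # already paid -- a sunk cost

# Option A: go to the concert while feeling sick
enjoyment_while_sick = 20  # how good the show will feel
hassle_of_going = 30       # travel, fatigue, feeling worse tomorrow

# Option B: stay home and rest
value_of_resting = 25

go = enjoyment_while_sick - hassle_of_going  # -10
stay = value_of_resting                      # 25

# Note: ticket_price appears in neither formula. Whether you go or stay,
# that money is gone, so it can't change which option is better.
print("go" if go > stay else "stay home")    # -> stay home
```

The sunk-cost fallacy, in these terms, is adding `ticket_price` to the "stay home" side as a penalty, when in fact it is paid under both options.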
When evaluating potential courses of action, consider what is best for your future and don't dwell too much on the past. The sunk costs of your past can't be recouped, and sometimes it's worth quitting something and turning in a new direction.
The whodunit murder mystery trope that the butler is the culprit goes back to one book, The Door.
The first known instance of the butler being the guilty party in a whodunit is the 1893 Sherlock Holmes story The Adventure of the Musgrave Ritual, in which Brunton the butler tries to locate & steal a hidden treasure (spoiler). The next known instance was 1921's The Strange Case of Mr Challoner by Herbert Jenkins, but being published at the dawn of the Golden Age of Mysteries, the work got lost in the shuffle and nobody really took notice (of the butler or the story). It wasn't until 1930's The Door by Mary Roberts Rinehart that the trope really took off.
Mary Roberts Rinehart was the “American Agatha Christie”, an enormously popular best selling author of the Golden Age of Mysteries. When her sons launched a new publishing company she wanted to give them a successful novel to produce, so she quickly wrote The Door and made the butler the murderer. Also, as an example of a false memory / Mandela Effect: while the butler did it, nobody ever says “the butler did it” in the book.
It was around this time, however, that the critic and writer S. S. Van Dine wrote the article Twenty Rules for Writing Detective Stories, in which one of his rules was that “A servant must not be chosen by the author as the culprit.” The success of The Door, combined with this turning literary tide against making a servant the villain, quickly made “the butler did it” both a popular plot device and a cliché joke. It began to pop up in other detective stories, it was satirized, and today it lives on as a trope of early 20th century whodunit stories.
Added info: Mary Rinehart was the victim of a real-life murder attempt. Her chef, Blas Reyes, was angered over not being promoted to the position of butler, which Rinehart filled with an external hire. On June 21, 1947, Reyes couldn't take his frustration anymore: he walked into the library where Rinehart was, pulled out a gun, and from five feet away he fired … or tried to fire. The bullets were so old they didn't go off. Rinehart ran for the kitchen door, and what followed was a chase through the house, with Reyes picking up kitchen knives along the way. Eventually he was subdued by other members of the household staff and turned over to the police.
Also (far less dramatic): the duties of a butler vary greatly by household, but a butler is typically the head of the dining room, wine cellar, and pantry. They are not usually an all-around assistant, though they can be depending on the employer.
Calendars based on the cycles of the moon have a shorter year than solar calendars. How that time discrepancy is dealt with depends on the culture.
Ancient cultures typically had two options for creating calendars: solar or lunar. Solar calendars track time based on the movement of the sun in the sky; it takes 365.24 days for the Earth to travel around the sun and make up a year. Lunar calendars, however, are based on the phases of the moon, which restart every 29.5 days – twelve lunar months add up to only 354.37 solar days. This leaves an 11 day discrepancy between lunar and solar calendars.
Intercalation of “Lunar” Calendars
To account for this 11 day difference, some cultures engage in a practice known as “intercalation”: the adding of extra days/weeks/months to synchronize a calendar with the 365.24 day solar year. Many lunar calendars are, in reality, lunisolar calendars, as they intercalate extra time to keep their lunar year roughly aligned to the solar year.
A variety of cultures use lunisolar calendars, especially in East Asia. For example, the traditional Chinese calendar is based on lunar cycles but adds a 13th month every few years. This is why Chinese New Year doesn't have a fixed date (on our calendar). Intercalation keeps the lunar New Year from straying too far, holding it between late January and late February.
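A quick back-of-the-envelope check (a sketch using the round figures above) shows why “every few years” works out: 19 solar years contain almost exactly 235 lunar months, which is 7 more than 19 twelve-month lunar years, so a lunisolar calendar needs roughly 7 leap months every 19 years – about one every 2.7 years.

```python
solar_year = 365.24   # days per solar year
lunar_month = 29.53   # days per cycle of lunar phases

# How many lunar months fit in 19 solar years?
months = 19 * solar_year / lunar_month
print(round(months, 1))            # 235.0

# 19 plain lunar years hold only 19 * 12 = 228 months,
# so ~7 leap (13th) months are needed every 19 years.
leap_months = round(months) - 19 * 12
print(leap_months)                 # 7
print(round(19 / leap_months, 1))  # ~2.7 years between leap months
```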
The alternative to adding time is to do nothing about the 11 day discrepancy, which has the cumulative effect of pushing holidays further and further around the calendar. This hands-off approach can put spring holidays in the fall, winter months in the summer, etc. The Islamic calendar (the Hijri calendar) operates this way, which explains why Muslim religious holidays move around our solar-based calendar so much. It takes about 33 years for a holiday on a purely lunar calendar to come back around to its original position.
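The 33 year figure falls out of the same arithmetic (again a sketch with the round numbers from above): the lunar year runs about 10.9 days short, so a holiday drifts roughly 11 days earlier each solar year and takes about 33 years to lap the whole calendar.

```python
solar_year = 365.24   # days
lunar_year = 354.37   # days in 12 lunar months

drift_per_year = solar_year - lunar_year    # days the holiday moves each year
years_to_lap = solar_year / drift_per_year  # full trip around the calendar

print(round(drift_per_year, 2))  # 10.87
print(round(years_to_lap, 1))    # 33.6 -- roughly 33 years
```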
It's not just lunisolar calendars that intercalate time. Our calendar year is 365 days, but it takes the Earth 365.24 days to travel around the sun. We intercalate time in our Gregorian calendar to account for the extra 0.24 days per year by adding a leap day every 4 years to even things out.
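In code, the Gregorian leap rule looks like the sketch below. (Because 0.24 is slightly less than a quarter of a day, the full rule also skips most century years, bringing the average year to 365.2425 days.)

```python
def is_leap_year(year: int) -> bool:
    # Add a day every 4th year, but skip century years
    # unless they're divisible by 400 (2000 yes, 1900 no).
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print([y for y in (1900, 2000, 2023, 2024, 2100) if is_leap_year(y)])
# -> [2000, 2024]
```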
Added info: The oldest known calendars are a group of carvings from around 32,000 BCE created by the Aurignacian people. Found on antlers, bones, and cave walls in France, these carvings use crescents, dots, and lines to diagram the cycle of the moon. These early lunar calendars document that, for tens of thousands of years, humans have tracked the passage of time by looking to the skies.
Despite its depiction in sci-fi movies, the asteroid belt is mostly empty space and easy to fly through.
Between Mars and Jupiter lies the asteroid belt (aka the “main asteroid belt”, as there are other areas with asteroids in our Solar System). Within this belt there are millions to billions of asteroids made up of rock and metals. Some are tiny particles, but the largest, Ceres, is 580 miles in diameter. Large or small, they're hurtling through space at speeds up to 40,000 mph, so if one flew into a spacecraft it could be disastrous. Fortunately this isn't really a problem.
Unlike asteroid belts in sci-fi movies, our main asteroid belt is not an obstacle course. Most of the asteroid belt is empty space. The four largest asteroids alone make up more than half the total mass of the entire belt, and if you combined all of the asteroids together the result would still be smaller than our moon. The average distance between asteroids is around 600,000 miles. According to Alan Stern of the Southwest Research Institute, “… if you want to come close enough to an asteroid to make detailed studies of it, you have to aim for one.” The odds of a spacecraft hitting one are less than 1 in a billion. It's easier to fly through the asteroid belt than it is to actually hit an asteroid.
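As a rough sanity check on that 600,000 mile figure (a sketch only – the belt geometry and asteroid count below are ballpark assumptions, not figures from this article), dividing an approximate belt volume evenly among a few tens of millions of sizeable asteroids gives a mean spacing in the same range:

```python
import math

AU_MILES = 92.96e6   # one astronomical unit in miles

# Assumed belt geometry: a flattened ring from ~2.2 to ~3.2 AU
# out from the Sun, roughly 0.5 AU thick.
r_inner, r_outer, thickness = 2.2, 3.2, 0.5            # in AU
volume = math.pi * (r_outer**2 - r_inner**2) * thickness  # ~8.5 cubic AU

# Assumed population: a few tens of millions of asteroids large
# enough to matter to a spacecraft (estimates vary widely).
n_asteroids = 25e6

# Give each asteroid an equal cube of space; the cube's edge
# approximates the mean distance to its neighbors.
mean_spacing_au = (volume / n_asteroids) ** (1 / 3)
print(f"{mean_spacing_au * AU_MILES:,.0f} miles")
# -> ~650,000 miles, the same order as the 600,000 quoted above
```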