Labyrinths & Mazes

Labyrinths are made for contemplation while mazes are made for confusion.

The terms “labyrinth” and “maze” are used fairly interchangeably, but they’re quite different. A labyrinth is a single unicursal path without choices – you keep walking forward and it leads you to the center and back out again. A maze is the opposite: a multicursal puzzle filled with choices of where you could go. Mazes are designed to get you lost; labyrinths make it impossible to get lost.

the labyrinth and the Minotaur
Despite being famous for living in a labyrinth, the Minotaur actually lived in a maze.

One way out

Perhaps the most famous labyrinth is that of the Minotaur in Greek legend. The part-man, part-bull Minotaur was said to live in a labyrinth designed by Daedalus. While the myth says “labyrinth” and contemporary illustrations showed the Minotaur at the center of one, it was actually a maze. Reading the story, the structure was cleverly designed to confuse (and trap) those who entered, which is in line with a maze rather than a labyrinth.

the two basic forms of labyrinths
The two basic frameworks for labyrinths: Cretan/Classical and Four-Axis/Medieval.

Typical labyrinths wind back and forth from the outside to the center and then back out again, following a single path. There are twists & turns but no choices – you simply keep walking forward. Labyrinths have two popular frameworks: the Cretan/Classical style and the four-axis/Medieval design. The four-axis design was created during the Middle Ages and was popularized in the cathedral floors of northern France. Chartres Cathedral is the most famous example of a four-axis labyrinth, a design which has been copied around the world (Chatham, Massachusetts has an outdoor copy, Grace Cathedral in San Francisco has one, etc.).

Adding to the confusion between labyrinths and mazes are turf mazes … which are actually labyrinths. In Northern Europe and the British Isles, turf mazes are outdoor labyrinths made of short-cropped grass and sometimes stones. Their designs are similar to those found in Medieval cathedrals, and they were likewise made to be walked.

Walking a labyrinth is a meditative process of quiet introspection.

Walk the path

As to the purpose of labyrinths, there isn’t a single definitive answer. Some say they were easier alternatives to making religious pilgrimages to holy sites: instead of traveling to a distant land you could pray as you walked the path of a labyrinth close to home. Labyrinths in this context were spiritual paths to God. Some labyrinths served as entertainment for children – the labyrinth of Reims Cathedral, though designed for spiritual reasons, was removed in 1779 because the priests felt children were having too much fun on it during church services. Fishermen of Sweden believed that turf mazes could trap evil spirits, freeing the men to have only good luck on their trips out to sea. The late 20th century saw a resurgence in labyrinth popularity, which took on an additional New Age spiritual purpose.

Beyond the spiritual, labyrinths can have physical & psychological benefits. Typically found in quiet semi-secluded settings, labyrinths can help calm the mind through mindful meditation. During the pandemic they were a free outdoor resource for people looking to recenter themselves. Walking a labyrinth can trigger the relaxation response which has the benefits of reducing blood pressure and lowering stress levels.

boxwood-hedge maze at the Governor’s Palace at Colonial Williamsburg
A boxwood hedge maze at the Governor’s Palace at Colonial Williamsburg, VA.

Land of confusion

From the calm mindfulness of labyrinths we turn to the chaos of mazes. Mazes are puzzles. Unlike labyrinths, where the correct path is always in front of you, mazes offer many alternate directions. Labyrinths are freedom from choices, while mazes are nothing but choices (most of which are wrong).

As evidenced in the story of the Minotaur, mazes have existed for a very long time. As labyrinths grew in popularity in the Middle Ages, so too did mazes. Even the word “maze” dates from the Middle Ages, meaning “delusion, bewilderment, confusion of thought”.

Hedge mazes were constructed/grown on European palace grounds as a fun novelty of the rich. The oldest surviving hedge maze in England is the six-foot-high Hampton Court Palace maze, planted between 1689 and 1695.

Corn mazes (or “maize mazes” as they are known in Britain, and variations of “maize labyrinths” in most other European languages) started in the early 1990s. The first corn maze was designed by famed maze creator Adrian Fisher and commissioned by former Disney producer Don Frantz in Annville, Pennsylvania in 1993 (it was “Cornelius, the Cobasaurus”, a 3-acre dinosaur maze). Today farmers use GPS and drones to aid in the creation of corn mazes, which generate considerable income. Treinen Farm in Lodi, Wisconsin estimates that it brings in 90% of its income from autumnal agrotourism (the corn maze, pumpkin patch, hayrides, etc.).

Mazes have been popular for centuries and are a large part of pop culture.

I was lost but now I am found

The confusion of mazes can be frustrating, but it can also be rewarding. Since the late 19th century mazes have been used in science experiments to study animal psychology and the process of learning, and thereby how they may apply to humans. In 1882 John Lubbock wrote about how various insects could navigate simple mazes. The iconic idea of rats in mazes began with Willard Small who, in 1901, documented his experiments of placing rats in mazes and observing their behavior. Small used the Hampton Court Palace maze as the inspiration for his rat maze.

Modern cities are typically laid out on rectangular grid systems, making navigation fairly easy. Older cities are a different story: they grew more organically and don’t typically follow a structured grid. The Greek town of Mykonos, however, is purposefully not built on a grid – it’s said to have been intentionally laid out to be confusing for invading pirates, using the confusion of mazes as a defensive tactic.

Artificial Intelligence also owes a debt to mazes. Bringing the legend of the Minotaur and rats in mazes together, mathematician Claude Shannon created “Theseus”, an electronic mouse designed to solve mazes. In 1950 Shannon constructed a rearrangeable maze wired with circuits. Placed in the maze, the mouse would advance, encounter obstacles, and relay the information to the computer. The computer in turn would learn about the maze and tell Theseus which way to go. Theseus was the first artificial learning device in the world and one of the first experiments in artificial intelligence.
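Shannon’s machine did its learning with relay circuits under the maze floor, but the general idea – explore once, remember the maze, then run it perfectly – is easy to sketch in software. Here’s a loose modern illustration (not Shannon’s actual method) using breadth-first search, with a small hypothetical grid where 0 is an open passage and 1 is a wall:

```python
# A toy illustration of machine maze-solving: explore the maze once
# (breadth-first search), remember how each cell was reached, then
# replay the learned route from start to goal.
from collections import deque

def learn_maze(maze, start, goal):
    """Explore the maze and return the shortest route as a list of cells."""
    rows, cols = len(maze), len(maze[0])
    prev = {start: None}           # cell -> the cell we came from
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
               and maze[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    # Walk backwards from the goal to reconstruct the learned route.
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = prev[cell]
    return list(reversed(path))

# 0 = open passage, 1 = wall (a made-up example maze)
maze = [
    [0, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
route = learn_maze(maze, start=(0, 0), goal=(3, 3))
print(route)  # → [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (3, 2), (3, 3)]
```

Like Theseus on a second run, once the route is stored the “mouse” never takes a wrong turn again.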

Claude Shannon demonstrates Theseus, the maze-solving electronic mouse that laid the foundation for modern artificial intelligence.

On and on

The enduring appeal of labyrinths and mazes is their mystery. The mystery of the self and the mystery of possibility. A maze is a puzzle to solve, in a labyrinth the puzzle to solve is yourself.

Added info: the etymology of the word “clue” is tied (as it were) to the story of the Minotaur. A “clew” was a ball of thread, like the one Ariadne gave to Theseus to help him find his way in the labyrinth of the Minotaur. Over time the spelling and meaning changed to the “clue” we use today.

Maze master Adrian Fisher talks mazes.

Claude Shannon demonstrates “Theseus”, the first artificial learning device which set the foundation for modern AI.


Earworms

When a fragment of a song repeats over & over in your mind.

Earworms (aka Involuntary Musical Imagery) are pieces of music that ceaselessly repeat in your mind until something finally breaks the cycle, ending the loop. Almost everyone experiences earworms. Sometimes a song gets stuck in your head after you recently heard it but other times it can be triggered by a memory (such as seeing a product and remembering an old commercial jingle).

The types of songs that get stuck in our heads tend to be faster simpler melodies that have some unique/catchy element that make them stand out from other songs. They also tend to be (but are not always) songs you like, particular to your musical tastes, and are songs you listen to more often. Another quality that makes a song a strong candidate for an earworm, which is also a quality that our brains like, is repetition. Typically when a song gets stuck in your head it’s not the whole song but instead is just a catchy fragment of a song that can seamlessly repeat over & over. Related to the Zeigarnik effect and how our brains hold on to unfinished tasks, a song fragment will remain in our brains, unfinished, looping over & over until we are able to complete the song (or until we get distracted). As such one way to stop an earworm is to listen to the entire song. Like nudging the needle on a record player that is skipping over and over, listening to the entirety of the song can help break the loop and bring a sense of closure.

Other potential cures for earworms, beyond listening to the song in its entirety, are:
• Listen to “cure” songs (not the band, although …). Listening to other songs can distract/free your mind from the loop it is in.
• Do something else. Since we aren’t as good at multitasking as we think we are, putting your conscious thoughts towards some other task can end the earworm.
• Chew gum. The act of chewing uses some of the same regions of the brain as speech and, since most earworms are songs with lyrics, chewing can help distract your brain from the looping lyrics of the earworm.

Added info: as for literal worms or bugs in your ears, it happens but it’s not common. Cockroaches (who are not adept at walking backwards), spiders, and flies seem to be the most common types of insects/arachnids that accidentally find their way into human ears. These creatures typically don’t want to be there but may end up getting stuck which is bad for everyone involved.

Finally the term “earworm” comes from the earwig insect which was thought to wiggle its way into your ears (which, thankfully, doesn’t happen).

TED-ed explores earworms.

Staircase Wit

Having the perfect comeback … after the fact.

L’esprit de l’escalier, or “staircase wit”, is when you think of the perfect thing to say … but it’s too late. The French name for this phenomenon comes from thinking of the perfect retort on your way down the stairs after leaving the conversation/argument. It’s a common enough experience that the phenomenon has a name. Thinking of what you should have said, after the fact, happens to everyone.

Staircase wit touches on counterfactual thinking, where we imagine alternate scenarios for events that have already happened. Deliberating on how things could have played out can lead to arguing with ourselves, replaying a discussion in our minds as we try to come up with the best response (witty or otherwise).

In the heat of the moment

When our ideas are challenged we can become flustered and emotional. It can be difficult to think straight, let alone to be witty, when you’re uncomfortable. Fear and anxiety can cause us to focus on a single line of thinking (depth-first processing) which, if you are trying to be witty, makes it more difficult to formulate a creative response.

When you’re in a good mood you’re more open to new ideas and are more creative (breadth-first processing). Wit requires creativity, confidence, and timing. Staying relaxed can help you be witty in the moment … and not after the fact on the staircase.

Added info: Oscar Wilde, a master of wit, still has a lot to teach us on the art of turning a phrase. Browse the internet or pick up a collection of his more memorable quotations for inspiration. If all else fails you can quote Wilde since, “Quotation is a serviceable substitute for wit.” – W. Somerset Maugham (a quote often misattributed to Wilde).

L’esprit de l’escalier is the basis for the Seinfeld episode “The Comeback” in season 8. George realizes the “perfect” comeback after being insulted in a meeting, only to screw things up again later.

“The jerk store called” … the Seinfeld episode “The Comeback” is based on the idea of staircase wit.

the Imp of the Perverse

The urge to do the wrong thing at the worst possible time.

The imp of the perverse is the phenomenon where you have the urge to do the wrong thing at the worst possible time. The name comes from the 1845 Edgar Allan Poe story of the same name which is part essay part short story. Poe lays out his theory that humans sometimes have a destructive drive that works against their own best interests. It then goes on to be a short story of murder (as Poe stories tend to do).

What if I …

Like the imps of folklore, or maybe a cartoon devil on your shoulder, we each have a mischievous side that tells us to do something wrong just because we can. It can be benign things such as the urge to shout in a quiet concert hall, or maybe to throw a coin off the top of a building, to jump out and scare someone, or maybe to tip over a carefully arranged stack of cans in the grocery store. Sometimes the ideas are more dangerous such as the urge to drive your car off the road, to push someone off a train platform, etc. We don’t do it, but that little thought pops up sometimes.

A special kind of imp of the perverse is the French concept of “L’appel du vide” or “the call of the void”, where you stand at the edge of a precipice and think “I could just jump right over the edge” but then quickly back away. Studies of L’appel du vide (aka High Place Phenomenon) suggest that it isn’t suicidal – quite the opposite. Researchers believe that this is your brain warning you to be careful. It’s driven by a desire to continue living rather than the other way around.

Choices and ideas

It’s not fully understood why we have these thoughts, why the imp of the perverse pops up from time to time. One possible explanation is that we like to have options even if we know we would never choose some of them. Just knowing that we could do or say something is satisfying enough without actually doing or saying it. Another possibility is that these thoughts are part of an internal rebellious drive, part of what psychoanalyst Otto Rank called our “counterwill”, where we oppose feeling confined or controlled and so we try and assert our own individuality. It could be an internal way of feeling like an individual by thinking counter to what is expected & acceptable.

That said, wild ideas can be useful. They might be long shots, but occasionally one of these ideas is the kind of out-of-the-box thinking that’s necessary for innovation. Sure, most of these ideas are “hold my beer” bad, but a few come along that might just be crazy enough to work. Purely rebellious ideas like dropping a coin off a building aren’t going to do much to change the world, but thinking outside of the norm is where big innovative ideas come from.

Thinking these thoughts is normal. Our prefrontal cortex, which is involved in impulse control, helps us follow social norms and not follow through on these ideas. Still, every now and then the imp of the perverse manages to serve up something helpful.

Giving in to the imp of the perverse, Jerry sets up Elaine for disaster.

the Bystander Effect

In larger groups people become less likely to help. When people are waiting for someone to do something, maybe you’re the person who should do something.

The bystander effect is the psychological theory that the more people who are present, the less likely anyone is to help a person in need. Alone you would probably help, but in a crowd you just expect someone else to do something. We regularly hear stories in the news, or have examples in our own lives, of situations that could have been avoided if someone in the crowd had acted – people who knew someone was dangerous but never told the authorities, people who witness harassment at work but never speak up, the driver broken down on the side of the highway whom everyone drives past, etc.

Part of why this happens is a “diffusion of responsibility” where members of a crowd feel less responsible to take action. “There are so many people here I bet someone else has already called an ambulance” or “someone else is probably more qualified to help”. Of course, if everyone assumes someone else will take action then nobody does.

Another reason this happens is social influence. People look around and take their cues from how others are behaving. We’re social creatures and most of us don’t like to go against the crowd. We try and fit in by doing what other people are doing. If a crowd of people seem unconcerned by something, and they continue going about their day as usual, you are less likely to go against the crowd and take action.

Less Likely To Help (… Some Conditions Apply)

While it is true that the larger the crowd the less likely someone is to assist, there are some caveats. For example: while we take our cues from how others around us are behaving, and if nobody is helping we are less likely to help, the opposite is also true. If other people are lending a hand then we’re actually more likely to help.

People are also more likely to help when a situation is a clear emergency. Ambiguous situations that aren’t life-threatening aren’t as likely to get assistance as an obvious emergency. Also someone who is trained to assist in an emergency is more likely to intervene. For example a medical professional who regularly helps people is more likely to provide assistance even if the rest of the crowd won’t. We’re also more likely to lend assistance to people who we perceive as part of our in-group, our “uchi” (people wearing the jersey of a team we support, people with political bumper stickers we agree with, etc).

I Need Help

If you find yourself in an emergency and there is a crowd of people, there are things you can do to improve your chances of getting help. The first thing is to make it clear you need help. Remove any ambiguity by clearly stating you need help. Singling people out also improves your chances. Make eye contact with individuals, ask them for help, tell them what you need. Directly appealing to individuals improves your chances of receiving help.

As for being a bystander, remember that you are someone. Instead of waiting for someone else to take action maybe you’re the very someone who should take action. If you were the only person around how would you behave? If you begin to help you increase the chances that other people will join in and help too, canceling out the bystander effect.

Added info: while many examples of the bystander effect exist, the most famous example is the 1964 murder of Kitty Genovese in New York City. Multiple people heard and even saw her being attacked but failed to take action until it was too late. There is a very good Stuff You Should Know episode about this case as well as a documentary.

the Nirvana Fallacy

Sometimes an imperfect solution is better than waiting for a perfect one.

The nirvana fallacy (aka the “perfect solution fallacy”) is when you compare an imperfect option to an idealized perfect option. Basically it’s when you dislike/reject an option just because it isn’t perfect. Rather than weighing the merits of realistic (albeit flawed) options, you pit realistic options up against unrealistic perfect options.

Something is better than nothing

In the world of COVID we see this with wearing masks. When you board a plane or enter a restaurant you have to wear a mask, you can remove your mask to eat or drink, but then you have to put your mask back on. This leads some to think, “Well why even bother wearing the mask if we’re just going to take it off?” – but this is fallacious. The perfect solution would be to stay at home or to wear your mask all the time, but this is unrealistic. Even though temporarily removing your mask is flawed, wearing your mask at all is still better than never wearing one. The imperfect solution is better than not attempting a solution just because it isn’t perfect (and, spoiler: the perfect solution doesn’t exist and never will).

The nirvana fallacy frequently finds its way into public policy debates. When some policy doesn’t fully solve a problem its political opponents will attack it for its flaws. However, no realistic solution will ever go far enough to satisfy all critics. Good governance is choosing the best possible available solution knowing that all options will be flawed.

When weighing your options don’t reject an option just because it isn’t perfect — all options will be imperfect. By holding out for the perfect option you can do more harm than good. Doing something is frequently better than doing nothing at all.

“Perfect is the enemy of good.” – Voltaire


Astrology

Astrology, the idea that the stars are influencing your life, is completely fake.

Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using this information to predict the future or explain human personalities, is not logical (but understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos so it’s understandable that people would look to astrology (like how people fall for conspiracy theories).

While there are different kinds of astrology, the shared basis is that they use a complex series of real astronomical calculations combined with made-up traits assigned to different constellations/alignments/times to “gain insights” into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200-100 BCE (which is itself based on the much older Babylonian astrology). It’s from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.

Despite millennia of study & measurements, nobody is any closer to explaining why astrology is supposedly real.


That said, astrology is completely fake. It’s pseudoscience, superstition, hooey. To start, there’s no reason a distant configuration of stars which looks vaguely like a crab or a bull would have any relationship with the events on Earth. But even if there was some kind of relationship there would need to be a force connecting us to these heavenly bodies, affecting us here on Earth. Science hasn’t found or been able to measure any kind of force at work. Neither gravity nor electromagnetism work like this. Maybe there is some unknown other force, that remains strong yet undetectable, interacting with us from distant stars trillions of miles away which has yet to be discovered.

Another problem is that astrological assessments/predictions should be at least consistent, if not accurate. In 1985 scientist Shawn Carlson conducted a double-blind experiment in which astrologers tried to match personality test results to natal charts (essentially participants’ zodiac profiles). If personality types are immutably governed by the stars, matching a natal chart to a participant’s corresponding personality type should be easy. It was apparently not easy, as the astrologers performed about the same as pure chance. Worse, participants performed poorly at even identifying their own personality profiles.

Maybe astrology succeeds despite the human failings of astrologers. Time twins, people born at the same time on the same day sometimes even in the same hospital, should have similar personalities. Unfortunately there is no correlation at all. Even without astrologers being involved astrology is inconsistent.

Part of the blame for astrology lies with its adherents who believe astrology is real. Paranormal skeptic James Randi conducted an exercise where he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi gave the same horoscope to everyone in the class. What the students in Randi’s experiment fell for was the Barnum effect.

Barnum Effect

The Barnum effect (aka the Forer effect) is found in fortune telling and astrology where an assessment/reading seems to be about you but in reality can apply to almost anyone. These are statements that have been carefully worded to be specific and yet universal. For example, one might say that …

“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”

In fact these statements are part of what psychologist Bertram Forer gave to his test subjects as part of his 1948 study. When assessing the accuracy of these statements, participants in Forer’s experiment gave an average rating of 4.3 out of 5 (5 being the most accurate). It turns out every student was given the exact same statements. Horoscopes and other astrological readings frequently use the Barnum effect to seem specific to you but in reality can apply to almost anyone.

Confirmation Bias

Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn’t. When someone has an emotional desire for a certain outcome they can respond more favorably towards the evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it’s not.

Other contributing factors are that people who believe in astrology tend to be of lower intelligence, and more narcissistic, than non-believers. A potential “self-centered worldview” (along with a shaky understanding of science) could be influencing factors leading people to believe in astrology.

Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”

Added bonus: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters. She advised the president on when to deliver speeches, when to have presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.

The Sunk-Cost Fallacy

Just because you started something doesn’t mean you have to finish it. Sometimes quitting is a good thing.

The Sunk-Cost Fallacy is where, because you have invested time / effort / money etc. into something, you feel you can’t quit. The cost of the thing makes you continue because you think that stopping would be a waste of all that time / effort / money etc. In reality however, if something isn’t worth it anymore, you should quit.

Loss Aversion

Humans are strongly loss averse. Losing something hurts more than gaining something by almost two to one. We’re naturally protective of the things we have and we focus more on what we may lose than what we may gain. This manifests itself when it’s time to move house, have a yard sale, or generally clean up – people can have a difficult time parting with possessions. Similarly, walking out of a bad movie, turning around and asking for directions when driving around lost, or ending a relationship are all hard to do, partially because we are invested in them and we don’t want that investment to have been a waste. We don’t want to look foolish for having invested poorly so we double-down and continue with things we aren’t enjoying anymore to save face. By continuing forward no matter what, we only increase our investment and the damage of staying the course.

Sunk-costs are the investments we’ve made that can never come back – they’re in the past. They’re also irrelevant in considering our future paths. Past costs are looking backwards but your future choices are looking forward. For example, just because you’ve paid for a ticket to a concert doesn’t mean you have to go. If you’re feeling sick then maybe don’t go. The money you paid for the ticket is gone so all you have to consider now is: do I feel like going to this concert?

When evaluating potential courses of action, consider what is best for your future and don’t think too much about the past. The sunk-costs of your past can’t be recouped and sometimes it’s worth quitting something and turning in a new direction.

Uchi & Soto

The concept of in-groups and out-groups that shapes Japanese culture at all levels.

Uchi & soto is the Japanese cultural concept that people can be (and are) sorted into one of two groups: your in-group (uchi) or your out-group (soto). Is the person you are interacting with part of your inner circle? Which group someone is in dictates how you should behave.

Uchi (内) means “inside” – it’s the familiar, the home, the groups you belong to. Soto (外) means “outside” – it’s the unknown, strangers, foreigners, the groups you aren’t a part of. People in your family, or your coworkers, can be thought of as part of your inner circle, your uchi. Non-family members, however, or your boss, can be considered soto. To add more complexity, these categorizations are fluid. While your boss is ordinarily considered soto, if the two of you are meeting with a customer then you’re unified in representing the company, and so your boss is now considered uchi while the customer is soto. When you get back to the office, however, your boss goes back to being soto.

Shifting categories

People are constantly moving between social circles based on the situation, creating a shifting web of relationships. The status of who you are interacting with, whether they are uchi or soto, influences how you behave. Soto people are shown respect and honor. This is done using keigo (“respectful language”), sometimes gifts are given, and as you honor soto people you humble yourself and members of your uchi. Foreign tourists are very much soto and as such will probably receive very polite honorable treatment.

To some degree, however, this honoring comes with tatemae (建前, “a façade”). A person’s true feelings, their honne (本音), are reserved only for members of their uchi. So a tourist may receive great service, but really getting to know people can be difficult.

We can see uchi & soto played out in architecture as well. Traditional home design has a wall surrounding the property. These walls serve more as mental barriers than physical ones. The walls form a line of demarcation between the uchi and the soto. Where the uchi and soto meet in the house is the genkan which is the entryway where you remove your outside shoes before putting on your inside slippers – physical separations to match the mental separations.

Nobita from Japan explains the concept of uchi & soto.

Inoculation Hair Styles & Early Adopters

Early adopters of Parisian fashion helped make smallpox inoculations popular.

Inoculation is when you purposefully give someone an “antigenic substance” (a substance that triggers an immune response) to generate antibodies and help develop immunity to a particular disease. Around 1500 CE the Chinese developed a practice of inhaling a powder made from ground-up smallpox crusts. By ingesting a less harmful version of the disease their immune systems could learn to fight the real thing. The Ethiopians and the Turks had a similar, though distinct, practice: they would make a small incision in the arm and place a piece of smallpox pustule inside, with the same goal of triggering an immune response and hopefully developing immunity.

Lady Mary Wortley Montagu of England saw the Turkish method while her husband was ambassador to the Ottoman Empire. She brought the technique to Western Europe and had her daughter inoculated in 1721. Despite evidence of success, westerners were skeptical of smallpox inoculations. When the Turkish procedure was done incorrectly the patient could get full-blown smallpox which has a fatality rate around 30% (or higher in children). Inoculations were an especially difficult sell in France, until smallpox killed King Louis XV and 10 of his courtiers in 1774.

Elaborate gravity defying pouf hair styles were all the rage in 18th century France.

Inoculation Hairdo

After the death of Louis XV, the nineteen-year-old Louis XVI was suddenly very motivated to get inoculated (additionally encouraged by his wife, Marie Antoinette, who had previously been inoculated back home in Austria). Soon others in the French royal court chose to follow suit. The royal court getting inoculated helped make the procedure more acceptable, but what really helped was Marie Antoinette’s hair.

To celebrate the king’s inoculation, Antoinette had a special gravity-defying pouf hair style constructed: the pouf à l’inoculation. The inoculation pouf featured a rising sun representing the king, an olive tree representing peace, and the Rod of Asclepius representing medicine. Soon other women wanted the same trendy hair style as the queen, and as the pouf à l’inoculation became popular around Paris so too did smallpox inoculations. An inoculation is a fairly invisible procedure, but a spectacular hair style was a walking billboard announcing that you had been successfully inoculated.

Early Adopters

In his 1962 book Diffusion of Innovations, Dr. Everett Rogers theorizes how and why innovative ideas/products are adopted (or rejected). After the initial stage where innovators introduce a new product, the early adopters evaluate if it’s worthwhile. Sometimes called “lighthouse customers”, early adopters serve as messengers & guides, communicating the values of a new product to others. While members of each stage of the innovation adoption lifecycle require their own marketing strategy, a key to the early majority adopting a new product is the approval of the early adopters. Once early adopters give the thumbs up, the early majority accept the new product and success is all but inevitable.

The work of Dr. Everett Rogers theorized how new ideas & products are adopted (or rejected) by society. Without the approval of Early Adopters the majority will never accept it.

The queen’s hairstyle influenced the royal courtiers, who influenced the bourgeoisie, who in turn influenced the population at large. Smallpox inoculation was an unknown, scary, and seemingly counter-intuitive procedure, but it was made fashionable (desirable even) through early adopters celebrating it. By making medicine a cool status symbol people everywhere wanted it.

Added info: While it’s fairly well known that Marie Antoinette never said “Let them eat cake”, and that “cake” in this case meant a form of bread, she was still unfairly vilified. Overall she seems to have been a decent queen (as monarchs go), but she did live a wildly extravagant lifestyle which certainly made her seem detached from the struggles of the common people.