The urge to do the wrong thing at the worst possible time.
The imp of the perverse is the phenomenon where you have the urge to do the wrong thing at the worst possible time. The name comes from the 1845 Edgar Allan Poe story of the same name, which is part essay, part short story. Poe lays out his theory that humans sometimes have a destructive drive that works against their best interests, and the piece then turns into a tale of murder (as Poe stories tend to do).
What if I …
Like the imps of folklore, or maybe the cartoon devil on your shoulder, we each have a mischievous side that tells us to do something wrong just because we can. It can be something benign like the urge to shout in a quiet concert hall, to throw a coin off the top of a building, to jump out and scare someone, or to tip over a carefully arranged stack of cans in the grocery store. Sometimes the ideas are more dangerous, like thinking about driving your car off the road or pushing someone. We don’t act on them, but those little thoughts pop up sometimes.
A special kind of imp of the perverse is the French concept of “L’appel du vide” or “the call of the void”, where you stand at the edge of a precipice, think “I could just jump right over the edge”, and then quickly back away. Studies of L’appel du vide (aka the high place phenomenon) suggest that it isn’t suicidal – quite the opposite. Researchers believe this is your brain warning you to be careful; it’s driven by a desire to continue living, not the other way around.
Choices and ideas
It’s not fully understood why we have these thoughts, why the imp of the perverse pops up from time to time. One possible explanation is that we like to have options even if we know we would never choose some of them. Just knowing that we could do/say something is satisfying enough without actually doing/saying it. Another possibility is that these thoughts are part of an internal rebellious drive, part of what psychoanalyst Otto Rank called our “counterwill”, where we oppose feeling confined/controlled and so try to assert our own individuality. It could be an internal way of feeling like an individual by thinking counter to what is expected & acceptable.
That said, some of these wild ideas are useful. They might be long shots, but occasionally one of them is the kind of out-of-the-box thinking that’s necessary for innovation. Sure, most of these ideas are “hold my beer” bad, but a few come along that might just be crazy enough to work. A purely rebellious idea like dropping a coin off the top of a tall building isn’t going to do much, but thinking outside of the norm is where big innovative ideas come from.
Thinking these thoughts is normal. Our prefrontal cortex, which is involved in impulse control, helps us follow social norms and not follow through on these ideas. Still, every now and then the imp of the perverse manages to serve up something helpful.
In larger groups people become less likely to help. When everyone is waiting for someone to do something, maybe you’re the someone who should act.
The bystander effect is a psychological theory that the more people who are present, the less likely any one of them is to help a person in need. Alone you would probably help, but in a crowd you expect someone else to do something. We regularly hear stories in the news, or have examples in our own lives, of situations that could have been avoided if someone in the crowd had acted – people who knew someone was dangerous but never told the authorities, people who witness harassment at work but never speak up, the driver broken down on the side of the highway whom everyone drives past, etc.
Part of why this happens is a “diffusion of responsibility” where members of a crowd feel less responsible to take action. “There are so many people here I bet someone else has already called an ambulance” or “someone else is probably more qualified to help”. Of course, if everyone assumes someone else will take action then nobody does.
Another reason this happens is social influence. People look around and take their cues from how others are behaving. We’re social creatures and most of us don’t like to go against the crowd; we try to fit in by doing what other people are doing. If a crowd of people seems unconcerned by something, and they continue going about their day as usual, you are less likely to go against the crowd and take action.
Less Likely To Help (… Some Conditions Apply)
While it is true that the larger the crowd the less likely someone is to assist, there are some caveats. We take our cues from how others around us are behaving: if nobody is helping, we are less likely to help, but the opposite is also true. If other people are lending a hand then we’re actually more likely to help.
People are also more likely to help when a situation is a clear emergency. Ambiguous situations that aren’t life-threatening are less likely to get assistance than an obvious emergency. Someone who is trained to assist in an emergency is also more likely to intervene. For example, a medical professional who regularly helps people is more likely to provide assistance even if the rest of the crowd won’t. We’re also more likely to lend assistance to people we perceive as part of our in-group, our “uchi” (people wearing the jersey of a team we support, people with political bumper stickers we agree with, etc).
I Need Help
If you find yourself in an emergency surrounded by a crowd, there are things you can do to improve your chances of getting help. First, make it clear you need help: remove any ambiguity by stating it plainly. Singling people out also helps. Make eye contact with individuals, ask them for help, and tell them exactly what you need. Direct appeals to individuals are far more effective than a general plea to the crowd.
As for being a bystander, remember that you are someone. Instead of waiting for someone else to take action maybe you’re the very someone who should take action. If you were the only person around how would you behave? If you begin to help you increase the chances that other people will join in and help too, canceling out the bystander effect.
Added info: while many examples of the bystander effect exist, the most famous is the 1964 murder of Kitty Genovese in New York City, the case that launched research into the phenomenon. Multiple people heard and even saw her being attacked but failed to take action until it was too late. There is a very good Stuff You Should Know episode about this case as well as a documentary.
Sometimes an imperfect solution is better than waiting for a perfect one.
The nirvana fallacy (aka the “perfect solution fallacy”) is when you compare an imperfect option to an idealized perfect one. Basically, it’s when you dislike/reject an option just because it isn’t perfect. Rather than weighing the merits of realistic (albeit flawed) options, you pit them against unrealistic perfect options.
Something is better than nothing
In the world of COVID we see this with wearing masks. When you board a plane or enter a restaurant you have to wear a mask; you can remove it to eat or drink, but then you have to put it back on. This leads some to think “Well, why even bother wearing the mask if we’re just going to take it off?”, but this is fallacious. The perfect solution would be to stay at home or to wear your mask all the time, but this is unrealistic. Even though temporarily removing your mask is flawed, wearing it at all is still better than never wearing one. An imperfect solution beats not attempting a solution just because it isn’t perfect (which, spoiler: a perfect solution doesn’t exist and never will).
The nirvana fallacy frequently finds its way into public policy debates. When a policy doesn’t fully solve a problem, its political opponents will attack it for its flaws. However, no realistic solution will ever go far enough to satisfy all critics. Good governance is choosing the best available solution knowing that all options will be flawed.
When weighing your options don’t reject an option just because it isn’t perfect — all options will be imperfect. By holding out for the perfect option you can do more harm than good. Doing something is frequently better than doing nothing at all.
Astrology, the idea that the stars are influencing your life, is completely fake.
Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using it to predict the future or explain human personalities is not logical (but understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos, so it’s understandable that people would look to astrology (much as people fall for conspiracy theories).
While there are different kinds of astrology, the shared basis is that they combine a complex series of real astronomical calculations with made-up traits assigned to different constellations/alignments/times to “gain insights” into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200–100 BCE (which is itself based on the much older Babylonian astrology). It’s from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.
That said, astrology is completely fake. It’s pseudoscience, superstition, hooey. To start, there’s no reason a distant configuration of stars that looks vaguely like a crab or a bull would have any relationship with events on Earth. But even if there were some kind of relationship, there would need to be a force connecting us to these heavenly bodies and affecting us here on Earth. Science hasn’t found or been able to measure any such force at work. Neither gravity nor electromagnetism works like this. Perhaps some yet-undiscovered force, somehow strong yet undetectable, reaches us from stars trillions of miles away – but there is no evidence for it.
Another problem is that astrological assessments/predictions should be at least consistent, if not accurate. In 1985 scientist Shawn Carlson conducted a double-blind experiment in which astrologers tried to match personality test results to natal charts. If personality types are immutably governed by the stars, matching a natal chart to a participant’s corresponding personality type should be easy. It was apparently not easy, as the astrologers performed about the same as pure chance. Worse, participants performed poorly at even picking out their own personality profiles.
Maybe astrology succeeds despite the human failings of astrologers. Time twins – people born at the same time on the same day, sometimes even in the same hospital – should have similar personalities. Unfortunately for astrology, studies have found no correlation at all. Even without astrologers involved, astrology is inconsistent.
Part of the blame for astrology lies with its adherents. Paranormal skeptic James Randi conducted an exercise in which he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi had given the same horoscope to everyone in the class. What the students fell for was the Barnum effect.
The Barnum effect (aka the Forer effect) is found in fortune telling and astrology, where an assessment/reading seems to be about you but in reality could apply to almost anyone. These are statements carefully worded to feel specific and yet be universal. For example, one might say that …
“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”
In fact these statements are part of what psychologist Bertram Forer gave to his test subjects in his 1948 study. When assessing the accuracy of these statements, participants in Forer’s experiment gave an average rating of 4.3 out of 5 (5 being the most accurate). It turns out every student was given the exact same statements. Horoscopes and other astrological readings frequently use the Barnum effect to seem specific to you while in reality applying to almost anyone.
Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn’t. When someone has an emotional desire for a certain outcome they respond more favorably to evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it’s not.
Other contributing factors: studies have found that people who believe in astrology tend to score lower on measures of intelligence, and higher on narcissism, than non-believers. A “self-centered worldview” (along with a shaky understanding of science) could be among the factors leading people to believe in astrology.
Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”
Added bonus: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters. She advised the president on when to deliver speeches, when to have presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.
Just because you started something doesn’t mean you have to finish it. Sometimes quitting is a good thing.
The sunk-cost fallacy is where, because you have invested time / effort / money etc. into something, you feel you can’t quit. The cost makes you continue because you think that stopping would be a waste of all that time / effort / money. In reality, however, if something isn’t worth it anymore, you should quit.
Humans are strongly loss averse: losing something hurts almost twice as much as gaining something feels good. We’re naturally protective of the things we have and we focus more on what we may lose than what we may gain. This manifests itself when it’s time to move house, have a yard sale, or generally clean up – people can have a difficult time parting with possessions. Similarly, walking out of a bad movie, stopping to ask for directions when driving around lost, or ending a relationship are all hard to do, partially because we are invested in them and we don’t want that investment to have been a waste. We don’t want to look foolish for having invested poorly, so we double down and continue with things we aren’t enjoying anymore to save face. But by continuing forward no matter what, we only increase our investment and the damage done by staying the course.
Sunk costs are the investments we’ve made that can never come back – they’re in the past. They’re also irrelevant when considering our future paths: past costs look backwards while your choices look forward. For example, just because you’ve paid for a ticket to a concert doesn’t mean you have to go. If you’re feeling sick then maybe don’t go. The money you paid for the ticket is gone either way, so all you have to consider now is: do I feel like going to this concert?
When evaluating potential courses of action, consider what is best for your future and don’t think too much about the past. The sunk-costs of your past can’t be recouped and sometimes it’s worth quitting something and turning in a new direction.
The concept of in-groups and out-groups that shapes Japanese culture at all levels.
Uchi & soto is the Japanese cultural concept that the people you interact with can be (and are) sorted into one of two groups: your in-group (uchi) or your out-group (soto). Which group someone is in dictates how you should behave toward them.
Uchi (内) means “inside” – it’s the familiar, the home, the groups you belong to. Soto (外) means “outside” – it’s the unknown, strangers, foreigners, the groups you aren’t a part of. People in your family, or your coworkers, can be thought of as part of your inner circle, your uchi. Non-family members, or your boss, can be considered soto. To add more complexity, these categorizations are fluid. While your boss is ordinarily considered soto, if the two of you are meeting with a customer then you’re unified in representing the company, so your boss is now considered uchi while the customer is soto. When you get back to the office, however, your boss goes back to being soto.
People are constantly moving between social circles based on the situation, creating a shifting web of relationships. Whether the person you are interacting with is uchi or soto influences how you behave. Soto people are shown respect and honor: this is done using keigo (“respectful language”), sometimes gifts are given, and in honoring soto people you humble yourself and the members of your uchi. Foreign tourists are very much soto and as such will probably receive very polite, honorable treatment.
To some degree, however, this honoring comes with tatemae (建前, “a façade”). A person’s true feelings, their honne (本音), are reserved only for members of their uchi. So a tourist may receive great service, but really getting to know people can be difficult.
We can see uchi & soto played out in architecture as well. Traditional home design has a wall surrounding the property. These walls serve more as mental barriers than physical ones, forming a line of demarcation between the uchi and the soto. Where the uchi and soto meet in the house is the genkan, the entryway where you remove your outside shoes before putting on your inside slippers – physical separations to match the mental ones.
Early adopters of Parisian fashion helped make smallpox inoculations popular.
Inoculation is when you purposefully give someone an “antigenic substance” (a substance that triggers an immune response) to generate antibodies and help develop immunity to a particular disease. Around 1500 CE the Chinese developed a practice of inhaling a powder made from ground-up smallpox scabs. By taking in a weakened form of the disease, the immune system could learn to fight the real thing. The Ethiopians and the Turks had a similar practice: they would make a small incision in the arm and place a piece of smallpox pustule inside, with the same goal of triggering an immune response and hopefully developing immunity.
Lady Mary Wortley Montagu of England saw the Turkish method while her husband was ambassador to the Ottoman Empire. She brought the technique to Western Europe and had her daughter inoculated in 1721. Despite evidence of success, Westerners were skeptical of smallpox inoculations: done incorrectly, the procedure could give the patient full-blown smallpox, which has a fatality rate around 30% (higher in children). Inoculations were an especially difficult sell in France, until smallpox killed King Louis XV and 10 of his courtiers in 1774.
After the death of Louis XV, the nineteen-year-old Louis XVI was suddenly very motivated to get inoculated (additionally encouraged by his wife, Marie Antoinette, who had been inoculated back home in Austria). Soon others in the French royal court chose to follow suit. The royal court getting inoculated helped make the procedure more acceptable, but what really helped was Marie Antoinette’s hair.
To celebrate the king’s inoculation, Antoinette had a special gravity-defying pouf hairstyle constructed: the pouf à l’inoculation. The inoculation pouf featured a rising sun representing the king, an olive tree representing peace, and the rod of Asclepius representing medicine. Soon other women wanted the same trendy hairstyle as the queen, and as the pouf à l’inoculation became popular around Paris so too did smallpox inoculations. An inoculation is a fairly invisible procedure, but a spectacular hairstyle was a walking billboard celebrating that you had been successfully inoculated.
In his 1962 book Diffusion of Innovations, Dr. Everett Rogers theorizes how and why innovative ideas/products are adopted (or rejected). After the initial stage where innovators introduce a new product, the early adopters evaluate if it’s worthwhile. Sometimes called “lighthouse customers”, early adopters serve as messengers & guides, communicating the values of a new product to others. While members of each stage of the innovation adoption lifecycle require their own marketing strategy, a key to the early majority adopting a new product is the approval of the early adopters. Once early adopters give the thumbs up, the early majority accept the new product and success is all but inevitable.
The queen’s hairstyle influenced the royal courtiers, who influenced the bourgeoisie, who in turn influenced the population at large. Smallpox inoculation was an unknown, scary, and seemingly counter-intuitive procedure, but it was made fashionable (desirable even) through early adopters celebrating it. By making medicine a cool status symbol people everywhere wanted it.
Added info: While it’s fairly well known that Marie Antoinette never said “Let them eat cake”, and that “cake” in this case meant a form of bread, she was still unfairly vilified. Overall she seems to have been a decent queen (as monarchs go), but she did live a wildly extravagant lifestyle which certainly made her seem detached from the struggles of the common people.
The Barkley Marathons is an ultramarathon that is “set up for you to fail.”
For runners who find the traditional marathon distance of 26.2 miles not challenging enough, there is the ultramarathon. An ultramarathon is any race beyond 26.2 miles. Some are a set distance while others are a set time with runners going as many miles as they can within that time.
While all ultras are grueling, some are particularly noteworthy. The Badwater 135 is a 135-mile race from the lowest point in California to the base of the highest: from Death Valley to the trailhead of Mt. Whitney. The Marathon des Sables (the Marathon of the Sands) is 150 miles of running through the Moroccan portion of the Sahara Desert, where runners have to carry their own food & water. Part of the entrance fee also covers the repatriation of your corpse should you die. While there is no real ranking of the most difficult ultras, one that makes every list is the Barkley Marathons.
Set in the rugged hills of eastern Tennessee, the Barkley Marathons is an annual race where 35 to 40 runners attempt to run 100+ miles in less than 60 hours. The course is 5 laps through the woods of Frozen Head State Park, up and down the hills of mostly unmarked trails. There is no electronic tracking and participants are not allowed any GPS devices, leaving runners to wayfind by map & compass. To prove you’ve completed each full lap, you find books hidden at designated places in the woods and tear out the page corresponding to your bib number. Because of the many hills, the total cumulative elevation gain is around 54,000 feet – roughly two Mount Everests – in 3 days.
The Barkley Marathons is widely considered one of the hardest races in the world. Most people who start never finish. The temperature changes, the distance, the lack of sleep (the race runs day & night), and the terrain (the hills, the thorns, the uneven ground) all work against you. Since the race was founded in 1986 by Gary “Lazarus Lake” Cantrell, more than half of its runnings have ended with no one completing the course. As of 2021 the full race has only been completed 18 times, by 15 runners – around a 1.3% completion rate.
The idea for the race came from the 1977 escape of James Earl Ray from Brushy Mountain State Penitentiary (located beside Frozen Head State Park). In 55 hours, Ray made it only 8 miles from the prison because of the terrain. Cantrell felt that in 55 hours he should have been able to make it 100 miles, and so began the Barkley Marathons.
How and why would you do this?
The registration process to enter the Barkley is a secret. There is no website. Entrants pay a $1.60 entry fee and write an essay on why they should be allowed to participate. First-time participants are also required to bring a license plate, which Cantrell strings together with the others and hangs like a curtain at the starting area. From repeat participants, Cantrell requests a new article of clothing that he needs (flannel shirts, socks, etc.). Each year one person is allowed to participate whom Cantrell knows will almost certainly fail: the “human sacrifice.” This person is given bib number 1.
Why would someone do this? As with running a regular 26.2-mile marathon, or any sort of endurance challenge, participants want to know what they are capable of. For most people winning isn’t the goal (or even an option); you’re in competition with yourself more than with the other runners. People want to see, when really put to the test, what they can accomplish. The Barkley Marathons sits at the edge of impossibility, giving participants the rare chance to learn about themselves and see what they’re made of.
“If you’re going to face a real challenge it has to be a real challenge. You can’t accomplish anything without the possibility of failure.”
GARY “LAZARUS LAKE” CANTRELL, Barkley Marathons founder
As part of a Guinness marketing effort in the early 1990s, thousands of Irish pubs around the world have been built using standardized design templates.
Recognized around the world, the Irish pub is one of the most well-known Irish cultural exports – and where there’s an Irish pub there’s usually Guinness. In the 1980s Guinness began to notice the relationship between new Irish pubs and regional increases in Guinness beer sales: as new pubs opened, Guinness sales went up. If Guinness could help create more Irish pubs, they could also increase their own revenue.
Ahead of the 1990 World Cup in Italy, Guinness sales representatives traveled around Italy meeting with potential Italian business partners with the goal of opening Irish pubs. Their pitch was built around revenue generation and how Irish pubs have a more profitable beverage-to-food ratio than most other bars. From January to June of 1990, 58 Irish pubs opened in Italy, welcoming Irish soccer fans and drinkers of all kinds. The critical factor in generating revenue, however, was that these pubs needed to appear authentic – enter the “pub in a box”.
Pub in a Box
Successful Irish pubs outside of Ireland have the look & feel of the real thing. As part of their expansion effort Guinness assembled a team to analyze, quantify, & document the seemingly ineffable essence of the Irish pub. The Irish Pub Concept helped determine the critical success factors to operating an Irish pub. Chief among these factors is visual authenticity.
Founded in 1990, the Irish Pub Company of Dublin was one of the first companies to offer “authentic” Irish pubs for export. Instead of doing all of the work yourself they’ll take your dimensions and design, manufacture, and ship all of the necessary materials to you. Do you want the rural Irish pub style or the Victorian? Maybe you want the general “Celtic” style. They offer a variety of prepackaged pub types that come complete with all the knickknacks for the walls. To date they have designed & shipped over 2,000 pubs to more than 50 countries.
The Irish Pub Co. isn’t alone: Ól Irish Pubs and GGD Global also offer to design & ship you a “pub in a box”. This Disney-ized packaging of Irish culture is not without criticism. For one, it raises questions of authenticity. It’s true these are pubs that have been designed & manufactured in Ireland. However, it’s difficult to claim authenticity when your pub has a fake Irish country store as part of the decor. Instead of organically collecting meaningful mementos for your bar, these superficial design packages ship all the rusty farm equipment, dusty old bottles, and framed photos of strangers you need to give the illusion of authenticity. Why spend years cultivating a unique local flavor when you can just throw up a portrait of Michael Collins or the Molly Maguires?
An additional criticism is of Guinness itself for helping bring these “pub in a box” bars into existence. Established Irish bars were expected to keep serving Guinness beer while the Guinness company was busy creating additional local competition. Beginning in the early ‘90s, some bars boycotted Guinness and stopped serving it. McGillin’s Olde Ale House of Philadelphia still does not serve Guinness as a result of the “pub in a box” fallout.
Better than nothing
To many customers, the ambiance these cookie-cutter bars generate is all that matters – the question of authenticity never crosses their minds. The theatrical set dressing used by these bars creates a fun environment. Even among those who recognize the dubious credibility of these establishments, some feel that having a “pub in a box” Irish bar is better than having none at all.
As America has helped transform St. Patrick’s Day into an all-out extravaganza, Irish pubs (authentic or otherwise) are increasingly patronized not only by the diaspora but by people of all backgrounds. The pub offers people of all stripes an environment that is hard to find anywhere else. The long tradition of the pub serving as a gathering place for the local community can still be carried out by these “pub in a box” bars … just don’t scrutinize the bric-à-brac too closely.
Added info: If you’re interested in standardized / templated restaurant experiences, you may also be interested in learning about how the Thai government’s culinary diplomacy has successfully spread Thai restaurants around the world.
A man’s face, a rabbit – different cultures see different things because humans are hardwired to look for patterns.
Over thousands of years of evolution our brains have become hardwired to find patterns. Spotting tiger stripes in the tall grass, for example, is a pretty valuable ability. We use pattern recognition for defense, for finding information, for recognizing friendly faces, etc. Because our brains are constantly searching for patterns, we’re bound to get it wrong sometimes and find meaning where there is none.
Pareidolia is when we perceive something meaningful where there is really nothing. This can be auditory, such as “hearing” a word in what’s really just random sound or white noise, but most of the time pareidolia is visual. We “see” animals in clouds, we “see” butterflies in Rorschach inkblot tests. What we “see” most, however, is faces. Facial pareidolia is when we see faces in things such as electrical outlets, the fronts of cars, the burnt patterns of grilled cheese sandwiches, or the surface of the moon.
Who is on the Moon?
The surface of the moon is marked by impact craters from asteroids as well as large dark plains of ancient solidified lava (the lunar maria). In the same way ancient humans connected the stars to create constellations, people have looked at these lunar markings and “seen” a variety of things.
The Man in the Moon
A European tradition going back at least to the 14th century finds the whole body of a man carrying sticks on the surface of the moon. While stories vary, he’s said to be a man caught gathering sticks on a Sunday. As punishment for breaking the Sabbath he was banished to the moon. The Haida of the Pacific Northwest of North America see this shape as a boy (instead of a man) who had been gathering firewood by moonlight. The boy insulted the moon and was similarly banished there as punishment.
Other traditions see just the face of a man and not the whole body. Some say the man is Cain from the Bible, also sent to the moon as punishment. Talmudic folk tradition says this person is Jacob.
Jack & Jill
The nursery rhyme of Jack & Jill is based on the Scandinavian myth of Hjuki and Bila. The two children were said to be carrying a pail of water when the moon god Mani carried them to the moon (where they can be seen carrying their pail).
The Woman in the Moon
Sometimes the figure carrying sticks on the moon is said to be a woman (a witch, of course). In the southern hemisphere, however, where the moon appears upside down (depending on your cultural point of view), the Māori of New Zealand see a different shape as a woman. Rona was carrying water at night but tripped when there was insufficient moonlight to light her way. Hurt and angry, she cursed the moon. The moon heard her insults and (like the punishment in the Haida legend) she’s now on the surface of the moon along with her water jug.
The Samoans say this woman is Sina, who thought the moon looked like a giant breadfruit and asked it to come down so her child could have a bite. The moon, insulted by this, took Sina, the tools she was working with, and her child up to the moon.
A Pair of Hands
In some Hindu traditions the hands of Astangi Mata are seen on the surface of the moon.
Name of Ali
In Islam, where there is a history of aniconism and not depicting sentient beings in art, there is a Shiʿite tradition of seeing the name of Ali (the son-in-law of Muhammad) written on the surface of the moon.
A Rabbit
In India the Buddhist Jātaka tales tell the story of a rabbit that sacrifices itself by jumping into a fire. The rabbit is saved and placed on the surface of the moon. In China the rabbit Yutu is seen on the moon preparing the elixir of life in a bowl. The Japanese also see a rabbit with a bowl, but instead of a magical elixir it’s preparing rice cakes.
Mesoamerican groups also see a rabbit on the moon. As one story goes, Tecciztecatl (the moon) was hit in the face with a rabbit, the imprint of which remains on the moon.
A Toad
The Selish people tell of a wolf who was romantically pursuing a toad in the moonlight. Just before being caught by the wolf, the toad leaped so high that she landed on the moon. Another toad on the moon is a variation of the Chinese rabbit story: after the rabbit prepared the magical elixir for the moon goddess Chang’e, the goddess drank it and was transformed into a toad.
A Frog
The Kimbundu people of Angola have the story of a prince who was permitted to marry only the daughter of the moon. Only a frog knew the way to the moon, so he served as messenger between the Earth and the moon. Now the frog can be seen on the moon.
For thousands of years, humans in cultures around the world have looked up at the moon and, through creativity and pareidolia, seen a variety of things. Cultures have explained these figures with creation myths or moral lessons, giving us the stories we know today.
Bonus: One of the most famous versions of the man in the moon is seen in the 1902 Georges Méliès film Le Voyage dans la Lune.