Mistakes Happen (Sometimes Intentionally)

Nothing is perfect and we should embrace mistakes and imperfections.

Mistaken Mistakes

Persian carpets (aka Iranian carpets) come in a diversity of designs and sizes, but they frequently contain repeating symmetrical patterns. One alleged feature in handmade Persian carpets is a mistake in the design pattern (not in the construction) included intentionally. This “Persian flaw” serves as a reminder that only Allah is perfect. The flaw would be something small only noticed by the keenest of observers. It’s also been said that the Amish have a similar practice, that they include an intentional flaw (a “humility block”) in their quilts as a reminder that only God is perfect … but it isn’t true.

Lancaster curator Wendell Zercher has quoted Amish quilt makers as saying “… no one has to remind us that we’re not perfect.” As for Persian flaws, most accounts of the idea come from Western sources, and it is probably an example of orientalism. While both are nice stories that probably help to sell imperfect rugs & quilts, we have little to no evidence to support them. If anything, intentionally making just one mistake out of humility would prove the opposite: a boast that you could make a perfect creation but chose not to.

Actual “Mistakes”

There are, however, some cultures that really do include intentional imperfections in their work. Women in the Punjab region of India & Pakistan create Phulkari shawls with intricate designs. In these designs they sometimes include “mistakes”: momentary changes in the overall design pattern. These changes are included to mark important events during the creation of the shawl (births, weddings, deaths, etc.). Sometimes the symmetrical pattern is disrupted as spiritual protection from the evil eye.

On the left is a phulkari shawl with intentional changes to the pattern. To the right is a Navajo weaving featuring a “spirit line”.

Some Navajo also include imperfections in their weavings for spiritual reasons. The ch’ihónít’i (aka the “spirit line” or the “weaver’s path”) is a single line leading out of the middle of a design to the edge of the weaving. The spirit line is thought to give the weaver’s soul a way to exit the weaving so as to not get trapped in the design.

Embrace Imperfections

Of course if you accept that nothing is perfect then you have no need to add imperfections because everything is imperfect. The Japanese concept of wabi-sabi is the Zen view that everything is imperfect, impermanent, vulnerable. Unlike Western design ideas which frequently strive for idealized perfection, wabi-sabi celebrates the imperfections that make everything (and everyone) unique.

Kintsugi-repaired ceramics, using gold & lacquer to feature (rather than hide) the imperfections.

Building off of wabi-sabi, kintsugi is the practice of repairing broken pottery with lacquer and bits of valuable metals that, rather than seamlessly hiding the repaired cracks, highlight them. Kintsugi honors the history of the object and celebrates its imperfections. Nothing lasts forever and we should recognize the beauty of imperfect vessels.

A crash course on the Japanese concept of wabi-sabi.

Ugly Fruits & Vegetables

In the West this embrace of the imperfect has recently manifested itself in ugly fruits & vegetables. Imperfect-looking produce has traditionally gone unsold and makes up 40% of total food waste. Producers throw away food because they don’t think retailers will want it (it doesn’t meet “quality standards”), and then retail stores throw away the unsold odd-looking food that customers won’t buy. This is all despite the fact that the taste and nutritional content of this “ugly” food may be identical to “normal” looking produce.

The European Union declared 2014 the European Year Against Food Waste. The French supermarket chain Intermarché began its “Inglorious Fruits and Vegetables” marketing campaign that celebrated ugly-looking produce, gave the foods their own section in the store, and sold them at a discount. It proved so successful that other stores & delivery services, such as Imperfect Foods, started to do likewise as consumers began to accept the wabi-sabi nature of produce.

The Intermarché marketing campaign to help reduce food waste was a huge success.

the First Thanksgiving Menu

Lacking key ingredients, the menu at the first Thanksgiving of 1621 was a bit different than the traditional turkey dinner of today.

In the fall of 1621 the English Pilgrims and the Wampanoag came together in Massachusetts for what has subsequently become a much-mythologized, 3-day harvest festival. The pilgrims had a lot to be thankful for: that they were still alive following the deaths of half their fellow pilgrims the previous winter, that their supplies had been fortified by the Wampanoag, and that they had completed a successful summer growing season. What they ate as they gave thanks is debatable.

Definitely on the Menu

One food that was definitely served was venison. Massasoit, the leader of the Wampanoag, had 5 deer brought to the event. Another meat on the menu was “wild fowl”, but exactly what kind of birds these were is unknown. It’s possible that there was turkey at the first Thanksgiving but more likely it was goose or duck (or a combination). Other regional bird options at the time would have been swan and passenger pigeon.

Also definitely present was corn. The Wampanoag, who used the Three Sisters method of farming, had taught the pilgrims how to grow corn. The pilgrims had grown a successful crop of flint corn (aka “Indian corn”), which was cooked into porridge or bread, or cooked with beans.

Maybe on the Menu

Given that the Plymouth Colony was by the water it’s very likely that seafood was also served. Eels, clams, mussels, cod, bass, and/or lobsters were very likely a part of the meal. It’s worth noting though that, unlike today, lobster was considered a food of last resort.

There were certainly vegetables & fruits on the menu but, other than corn, which ones were never specified. Chestnuts, walnuts, beans, onions, carrots, cabbage, pumpkins, and various squashes were all grown in the area. Blueberries, plums, grapes, and raspberries were also grown in the area and could have been present. While cranberries might have been served, cranberry sauce definitely was not, since the colonists lacked the necessary sugar (and cranberry sauce wouldn’t be invented for another 50 years).

Not on the Menu

Even though pumpkins may have been present, pumpkin pie definitely was not. The pilgrims had neither the butter nor the flour necessary to make pumpkin pie – they didn’t even have an oven in 1621. Something pumpkin pie-esque that may have been prepared is a spiced pumpkin soup/custard cooked directly inside a pumpkin which was roasted on hot ashes.

There was no stuffing because, again, the colonists lacked the necessary flour. There were also no potatoes (mashed or otherwise). Potatoes came from South America and, while they had made their way to Europe by the late 16th century via the Spanish, they had yet to make their way to New England. There also weren’t any forks on the table since they too hadn’t made their way to North America yet (but on the upside nobody present had an overbite).

A historical reenactment of how to cook some of the foods present at the first Thanksgiving.

the Clear Craze & Prison Electronics

The design fad that had a practical application in prison.

In the late 1980s a trend for clear products took hold – clear electronics, clear drinks. Clear beverages and clear beauty products were pitched as more “pure” than their traditional counterparts because they were free of artificial colors. Health conscious consumers began to associate clear with clean. There is of course no correlation between clarity and health but the trend for clear beverages took off regardless.

Learning from the failure of New Coke (that you don’t mess with your best-selling product), various brands created new clear products, leaving their bestsellers untouched, in an attempt to appeal to underserved demographics. In 1992 Pepsi released Crystal Pepsi, a clear soda without preservatives or caffeine. In response, Coca-Cola released Tab Clear, which was created solely to sabotage Crystal Pepsi: it was marketed as a diet soda in the hope that consumers would assume Crystal Pepsi was also a diet soda, and the confusion worked. Both clear sodas were retired by 1994 (although Crystal Pepsi has come back from time to time in limited releases).

The Clear Craze gave us a variety of clear drinks – some more popular than others.

In 1993 Gillette released a series of clear antiperspirants. In the same year Coors Brewing Company released the clear alcoholic beverage Zima. Building off of the 1980s popularity of wine coolers, Zima was a new alternative to beer: a lemon-lime drink that was produced for a surprisingly long time (until 2008 in the United States) and was even more popular in Japan. Part of Zima’s problem in the US was that it was more popular with women than men, which (for some consumers) made it seem like a drink for women, cutting out a large part of the potential male customer base.

A 1993 Zima commercial, complete with a peppering of Z’s substituting for S’s.

In 1993 the Miller Brewing Company chose a few American cities for a limited release of Miller Clear. Through intense filtering they removed the color (and, according to some critics, the flavor) from a lager to make a clear beer that was less heavy than traditional beer and had greater “drinkability.” It never left the limited release stage.

A 1993 Miller Clear commercial demonstrating its radical style while proclaiming that it’s “The first regular beer without all the heaviness.”
The Clear Craze was even more popular in electronics.

Electronics also became transparent, allowing you to see the internal workings of the device. The trend for transparent/translucent electronics lasted into the late 1990s, much longer than the craze for transparent beverages.

In 1983 Swatch released the celebrated Jelly Fish watch, which had a transparent body allowing you to see the gears (a version of which you can still buy today). A variety of brands made clear telephones, some of which would flash when a call was coming in. As part of the Play It Loud! series, in 1995 you could buy a clear Nintendo Game Boy (the clear model’s color was called “X-Ray”). The first iMac series, which ran from 1998 to 2003, featured clear / colored translucent bodies. While clear products were a fun novelty that faded out around the late 1990s, they are still very much alive in one particular market: prisons.

Clear Prison Electronics

Behind bars, transparent goods allow prison security to easily inspect for contraband. Depending on the prison system there are different rules & requirements for goods in prison. Some rules aid in the search for contraband (such as requiring clear plastic) while others prevent objects from being turned into weapons (such as using silicone instead of hard plastic). Prison music players & TVs frequently have a lower maximum volume so as to not annoy one’s neighbors, forcing the listener to be very close to the speaker. Some players have the speakers removed altogether and only provide a headphone jack (to be used with clear headphones).

Prison electronics has continued the Clear Craze, but for practical reasons rather than stylish ones.

While the cassette was a popular music format in the 1980s, and has seen a resurgence in recent years, it never completely went away. Cassettes have been in steady use in prisons because they’re harder to turn into weapons than CDs, and clear cassettes in particular can be quickly inspected for contraband. Thanks to the US’s prison population (the largest and highest per capita in the world, something America is definitely #1 at), prisons kept the cassette industry alive long after the format fell out of favor with the mainstream public. Even with the introduction of mp3 players into prisons, cassettes remain popular as they are easier to share/trade than digital files.

Bob Barker and Keefe Group are just two examples of companies that specialize in clear products intended for correctional facilities.

Today

You can still find transparent/translucent products today (in and out of prisons). Coca-Cola Clear is a clear version of Coke (but with additional lemon flavoring) introduced to Japan in 2018. You can still find different video game consoles and/or controllers with special clear/translucent editions. Swatch still makes several different clear watches. While clear beer never caught on, and Zima hasn’t really come back strong, the hard seltzer craze of 2019 introduced a plethora of profitable clear alternatives to beer.

In 2024 Zima is still available in Japan.

Added info: If you’re interested in owning second-hand clear prison electronics (for the novelty and certainly not for their quality), you can find various options on eBay. Here’s a selection of clear prison televisions. Urban Outfitters has also gotten in on the game of selling clear electronics originally designed for prison, such as this cassette player.

Mat Taylor of Techmoan has a fantastic introduction to prison technology.

The SNL commercial for Crystal Gravy was a parody of the Crystal Pepsi commercial.

Hookworm

The parasite responsible for giving American southerners a bad reputation.

For centuries American southerners were maligned as lazy, slow, shiftless, and dumb. Southerners had “the germ of laziness.” There was just something different about southerners that made them seem less than their northern counterparts. As it turned out there was something different about them, but it had nothing to do with genetics or social conditioning. That something was hookworm.

Hookworm

Hookworm, and specifically the New World hookworm Necator americanus, is a parasitic worm that arrived in America by way of the slave trade in the 17th century. In the larval stage hookworms live in warm, wet, shady soil where they wait to encounter a human. A person walking barefoot outdoors may make contact with a hookworm, at which point it can penetrate the skin and crawl into the foot. From there it travels through the circulatory system to the lungs where it triggers a dry cough. The human host then unknowingly coughs up the worm only to swallow it down to the small intestine, which is where the worm wanted to be the entire time. The worm then lives around 1-2 years (or longer) attached to the wall of the intestine, sucking blood; a female worm can lay up to 30,000 eggs per day. Eventually these fertilized eggs are pooped out in a poorly built outhouse or in some bushes, contaminating the soil to start the process again. It’s disgusting.

Because hookworms thrive in warm humid environments they do particularly well in the southern climate of the United States. The area from southeastern Texas to West Virginia became nicknamed the “hookworm belt”. For poor southerners who couldn’t afford shoes and didn’t have indoor plumbing it was almost impossible to avoid hookworm. By 1910 it’s believed that around 40% of the people living in the South were infected, collectively hosting millions of worms.

Putting their gross lifecycle aside, the problem with hookworms is that they steal your blood. One worm alone won’t do much damage, but being infested by multiple worms on a continual basis over years/decades has a severely damaging cumulative effect. By consuming your blood hookworms can cause an iron deficiency. People with hookworms become tired, lose weight, and have little strength to do anything. Pregnant women are at risk for anemia and a greater chance of dying in childbirth. Infected children can suffer irreversible developmental problems including stunted growth and intellectual disabilities. All of this matches the unfair characterization of southerners as slow rednecks.

A nurse brings hookworm medicine to a rural Alabama family, 1939.
A doctor and a nurse examine for hookworm in an Alabama school, 1939.

The cumulative effect

In 1902 zoologist Charles W. Stiles discovered that hookworms were endemic to the southern US. In 1909 John D. Rockefeller got involved by funding the creation of the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease. They campaigned across the south to educate, test, and help treat hookworm. Students in small country schoolhouses would submit stool samples to their teachers to be tested – some schools even required students be screened for hookworm. People would go to health clinics on the weekends to learn more. An estimated 7.5 million southerners had hookworms. While the Rockefeller Commission helped treat the problem, what greatly reduced hookworm was the urbanization of the South, which enabled more people to afford shoes and sanitary indoor plumbing.

The barefoot Texas boy on the right has hookworm, 1939.

Beyond the health consequences, the socioeconomic impact of hookworm is also destructive. The US regions with hookworm had lower rates of literacy and school attendance than areas without it. A 1926 study of Alabama children showed that the more worms a child had, the lower their IQ. Even today, children with chronic hookworm can face up to 40% lower wages as adults. General productivity is measurably lower as a result of hookworm. The southern regions that were worst infected with hookworms saw the greatest income expansion after treatment, but unfortunately centuries of infection had a cumulative effect. Eight of the ten poorest US states are still southern states.

Hookworm in the US is typically thought of as a problem of the past but it is still very much a problem of the present. Given the severe income inequality in the US, hookworm is thriving in communities living below the poverty line. Hookworm lives in, and reinforces, poverty; 85% of the world lives on less than $30 a day and 10% of the world is currently living in extreme poverty. Around the world an estimated 477 million people are currently living with hookworms inside them.

Astrology

Astrology, the idea that the stars are influencing your life, is completely fake.

Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using this information to predict the future or explain human personalities is not logical (but it is understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos, so it’s understandable that people would look to astrology (much as people fall for conspiracy theories).

While there are different kinds of astrology, the shared basis is that they combine complex series of real astronomical calculations with made-up traits assigned to different constellations/alignments/times to “gain insights” into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200-100 BCE (which itself is based on the much older Babylonian astrology). It’s from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.

Despite millennia of study & measurements, nobody is any closer to explaining why astrology is supposedly real.

Bunk

That said, astrology is completely fake. It’s pseudoscience, superstition, hooey. To start, there’s no reason a distant configuration of stars which looks vaguely like a crab or a bull would have any relationship with events on Earth. But even if there were some kind of relationship, there would need to be a force connecting us to these heavenly bodies, affecting us here on Earth. Science hasn’t found or been able to measure any such force at work. Neither gravity nor electromagnetism works like this. Maybe there is some other force, strong yet undetectable, interacting with us from distant stars trillions of miles away, but it has yet to be discovered.

Another problem is that astrological assessments/predictions should be at least consistent if not accurate. In 1985 scientist Shawn Carlson conducted a double-blind experiment with astrologers to match personality test results to natal charts (essentially their zodiac symbols). If personality types are immutably governed by the stars, matching a zodiac sign to a participant’s corresponding personality type should be easy. It was apparently not easy, as astrologers performed about the same as pure chance. Worse, the astrologer participants performed poorly in even finding their own personality profiles.

Maybe astrology succeeds despite the human failings of astrologers. Time twins, people born at the same time on the same day, sometimes even in the same hospital, should have similar personalities. Unfortunately there is no correlation at all. Even without astrologers being involved, astrology is inconsistent.

Part of the blame for astrology’s staying power lies with its adherents. Paranormal skeptic James Randi conducted an exercise where he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi gave the same horoscope to everyone in the class. What the students in Randi’s experiment fell for was the Barnum effect.

Barnum Effect

The Barnum effect (aka the Forer effect) is found in fortune telling and astrology, where an assessment/reading seems to be about you but in reality could apply to almost anyone. These are statements that have been carefully worded to seem specific and yet be universal. For example, one might say that …

“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”

These statements are in fact from the personality sketch psychologist Bertram Forer gave his test subjects in a 1948 study. When assessing the accuracy of the statements, participants in Forer’s experiment gave an average rating of 4.3 out of 5 (5 being the most accurate), even though every student had been given the exact same sketch. Horoscopes and other astrological readings frequently rely on the Barnum effect in the same way: they seem specific to you but could apply to almost anyone.

Confirmation Bias

Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn’t. When someone has an emotional desire for a certain outcome they respond more favorably to evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it’s not.

Other contributing factors are that people who believe in astrology tend to be of lower intelligence, and more narcissistic, than non-believers. A potential “self-centered worldview” (along with a shaky understanding of science) could be among the factors leading people to believe in astrology.

Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”

Added bonus: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters. She advised the president on when to deliver speeches, when to have presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.

Waterphone

The haunting, crying instrument that you’ve heard in thriller / horror movies.

Invented around 1968 by Richard Waters, the waterphone is an atonal musical instrument which has vertical metal rods of different lengths attached to a metal resonator pan/bowl. Inside the resonator is a bit of water, similar to a water drum, so when the vertical rods are played (with a mallet or more frequently with a bow) the resonator can echo and bend the sounds. The effect ranges from spacey to creepy.

Given its haunting sound the waterphone has been used to create tension and uneasiness in a variety of TV shows, theatrical productions, and movies. You can hear it in ALIENS, Poltergeist, Star Trek: The Motion Picture, etc. It’s used for jump scares, such as in The Matrix when Neo’s new cellphone rings. It’s been used in multiple productions of A Midsummer Night’s Dream. Mickey Hart of the Grateful Dead and Tom Waits have both used the waterphone, as have a host of contemporary classical composers.

Waterphones aren’t all scary though. Composer and activist Jim Nollman played the waterphone to orcas on Playing Music With Animals.

A demonstration of the waterphone.

Egyptian Mummies: From Medicine to Paint

For hundreds of years Europeans used ground up Egyptian mummies as medicine and paint pigment.

The Arabic word mūmiyā (which later became “mummia”) was the name for the black, sticky asphalt material that came out of the ground and was used as a sealant, an adhesive, and a medicine around the ancient world. Pliny the Elder and others wrote about the medicinal uses for mummia, which became a bit of a cure-all for a range of ailments.

Unfortunately mummia the petroleum product looked like another black substance that was a byproduct of the Egyptian embalming process. As such the word “mummia” came to mean both the petroleum product AND the byproduct of Egyptian mummification, and was then even further confused as meaning an entire mummified body. This is how we got the word “mummy”. This series of mistakes also led to hundreds of years of cannibalism.

Cannibal Medicine

Since the petroleum-based mummia was used both externally as a salve and ingested internally, the Egyptian mummy version of mummia came to be used in the same ways. The 11th century physician Constantinus Africanus even described mummia as a “spice” found in the sepulchers of the dead. Soon the human version replaced the petroleum version and people began to crumble & grind human corpses for medicine.

With the Crusades, Europeans learned of mummia and its medicinal possibilities. This significantly increased European demand for Egyptian mummies and by the 15th-16th centuries there was a thriving trade in mummies. Thousands of bodies were being exhumed and shipped to Europe to be turned into medicines. In 1586 English merchant John Sanderson shipped 600 pounds of mummies to London to sell at various apothecaries. This was fueled in part by orientalism, the belief that Egyptian mummies held some sort of exotic ancient knowledge or power.

Europeans would consume portions of Egyptian corpses for help with general pain, ulcers, inflammation, epilepsy, cough, difficult labor, etc. – none of which worked, or if it worked it wasn’t the mummy that was the active ingredient. The practice was so common that Shakespeare included mummy as an ingredient in the witches’ potion in Macbeth. Demand was so high that by the 17th century some mummy dealers were producing counterfeit mummies: newly deceased people, animals, or prisoners who had been purposefully starved & executed were put through a process to simulate ancient Egyptian mummies.

After a few hundred years of medicinal cannibalism Europeans began to express doubt as to the practice’s efficacy (and ethics). The 16th century herbalist Leonhard Fuchs felt foreign mummies were acceptable but local ones were wrong. While doubts arose during the Renaissance in the 16th century, it took until the 18th century age of Enlightenment for the practice to fall out of fashion. As the consumption of mummies slowly ended, Egyptian mummies took on a new role: paint pigment.

The Egyptian Widow by Lourens Alma Tadema is an 1872 painting of Egyptian life potentially painted using mummy brown paint.
Liberty Leading the People by Eugène Delacroix is another painting that’s theorized to contain mummy brown.

Mummy Brown

Around the end of the 16th century artists began using ground up Egyptian mummies (mixed with other materials) to produce mummy brown, a shade of brown pigment. Apothecaries that were grinding up mummies for medicine began to grind them up for paint as well. As a paint it was good for shadows, flesh tones, and glazing. Artists Benjamin West, Martin Drolling, Lawrence Alma-Tadema, Edward Burne-Jones, Eugène Delacroix, and others all painted with mummy brown.

It wasn’t until the 19th century that mummy brown began to fall out of favor. That said, as recently as 1926 C. Roberson & Co. still sold mummy brown made with ground-up Egyptian corpses. As mummy brown died out, so too did hundreds of years of large-scale desecration of deceased Egyptians for use as medicines and paints.

Added info: in another example of “what’s old is new again”, in 2024 some Sierra Leoneans were digging up dead bodies for their bones to make the highly addictive street drug kush.

Zombies: Sadder Than You Think

The concept of Haitian zombies was used as a threat to keep slaves working.

Before Haiti was an independent country it was the French colony of Saint-Domingue, which produced sugar, coffee, cotton, and other goods. The French brought more than a million West African people to the colony as slaves, more than any other colony in the Caribbean. Slavery in Saint-Domingue was particularly brutal – most people were poorly fed, they worked 12-hour days, pregnant slaves frequently didn’t live long enough to have babies, and torture was common. Life expectancy was about 3-6 years, with about half of the enslaved people of Saint-Domingue dying within the first few years of arriving.

The brutal conditions of Saint-Domingue left the enslaved people hoping that, in death, their souls would return home to West Africa.

Haitian Vodou & Zombies

The Code Noir was a 1685 decree that outlined how slavery was to be conducted in the French empire. Among other things it stated that slaves were prohibited from practicing African religions and instead were forcibly baptized into Catholicism. What resulted was Haitian Vodou, a religious blend of West African beliefs (practiced in secret) given a veneer of Catholicism.

Part of this belief system was the idea that, upon dying, you would return to lan guinée (i.e. Guinea, or West Africa). Their idea of heaven was to escape the slavery of Saint-Domingue and simply go home. Feeling the allure of going home, some people decided to escape slavery on their own terms. As such, suicide was very common in Saint-Domingue.

Initially suicide was seen as a viable way of getting to lan guinée, but at some point (oral tradition is murky on when/how) that changed: suicide became prohibited and the punishment was that you’d be a slave forever – you’d become a zombie. The zombies of Haitian Vodou are not the Western pop culture shambling brain-eating zombies. The Haitian zombie was someone whose soul had been captured, denied entry to lan guinée, and turned into an undead field hand with no chance of escape. Plantation slave-drivers used this to their advantage, threatening slaves that if they killed themselves they would be turned into zombies to work forever under the control of a bokor/sorcerer. Unlike today, what was feared was the threat of becoming a zombie, not the zombies themselves.

1932’s White Zombie was the first zombie movie. It used some Haitian Vodou beliefs but took significant artistic license.

White Zombie

Over time the zombie concept evolved and changed. The sensationalistic 1929 William Seabrook travel book The Magic Island introduced voodoo and zombies to mainstream Western culture. This inspired the 1932 film White Zombie, which was the first zombie movie. White Zombie stars Bela Lugosi as the villainous Murder Legendre (a bit on the nose), a bokor enslaving people as zombies to be his henchmen and to work in his sugarcane mill. White Zombie used Haitian Vodou ideas but with a lot of artistic license. Later zombie stories dropped the Saint-Domingue threat of eternal slavery, then they dropped the bokor master commanding the zombies. Aside from being mindless undead creatures, the zombies of today have little resemblance to their sadder, more terrifying origins.

Added info: following the Haitian revolution of 1791–1804, the 1883 Haitian Criminal Code outlawed the practice of turning someone into a zombie.

Cabinet of Curiosities

Before museums existed, people had cabinets/rooms to display their collected treasures.

There was a time when museums did not exist. The role of collecting, preserving, and displaying the art, artifacts, and wonders of the world belonged largely to individuals. As far back as the 4th century BCE Greeks were collecting exotic treasures from the East. More than just trading in commodities, the Greeks collected the art and textiles of these faraway cultures. Roman emperor Augustus decorated his homes not just with art but with rare objects and the bones of giant animals. Over the centuries, as cultures explored & traded with increasingly distant lands, the trends in what was collectible grew & changed. By the 16th and 17th centuries wealthy European collectors had amassed enough objects that they created special cabinets and/or rooms to show off their collections. They created cabinets of curiosities.

Ole Worm’s Museum Wormianum is one of the most famous cabinets of curiosities.
Ferrante Imperato’s Dell’Historia Naturale is another famous wunderkabinett.

Wunderkabinett

From the German for art (kunst) or marvels (wunder) and cabinet (kabinett) or room (kammer), these cabinets & rooms were places where Renaissance scholars, merchants, royalty, and others could store their collections. Collecting was very fashionable in 17th century Europe and these cabinets were spaces dedicated to displaying all manner of objects. Like the contemporaneous maps of the world, some of these spaces were designed for show while others were more utilitarian.

A collection of cabinets and rooms displaying all manner of curiosities.

Some collectors had thousands of specimens. The objects in these cabinets were thoughtfully categorized and organized, each piece contributing to the larger whole. Collecting was a way to bring order to the world, to exert some level of control over something that is uncontrollable. What was stored & displayed in these cabinets depended on the collector, but broad categories of objects included:

  • Fine art
  • Applied art (scientific instruments, anthropological objects, etc.)
  • Natural materials (fossils, shells, rocks, etc.)
  • Historical objects

These categories, as well as these collections, served as the precursors to our modern museums. The Amerbach Cabinet was a collection of art, books, coins, etc. that was assembled by various members of the Amerbach family. It was eventually co-purchased by the city of Basel & the University of Basel and became the Kunstmuseum Basel in 1661, the first public museum in the world. Francesco I de’ Medici had his studiolo, a 26 x 10 foot room of curiosities that is part of the Palazzo Vecchio in Florence. Other Medici possessions served as the start of the Uffizi Gallery. Elias Ashmole, who amassed his fortune & collection through sometimes questionable means, gifted his collection to the University of Oxford, where it became the Ashmolean Museum in 1683.

Throughout the 18th century an increasing number of private collections were converted into public museums, some of which still exist today but all of which helped define what museums have become.

Added info: In 1784 Charles Willson Peale’s collection became the Philadelphia Museum, which was the United States’ first museum (and also the first to display a mastodon skeleton).

The Necronomicon

The most famous magical book of occult knowledge that sounds real, but isn’t.

Possibly the most famous book that doesn’t exist, the Necronomicon is a fictional book of dark magic invented by weird fiction / horror author H.P. Lovecraft. First mentioned in 1924’s The Hound, the Necronomicon is part of Lovecraft’s Cthulhu Mythos, a dark collection of cosmic horror, ghouls, interdimensional monsters, and unspeakable evil all set in an uncaring, indifferent universe. The best interpretation of the name “necronomicon” is “book considering (or classifying) the dead”. Supposedly written in 738 CE by Abdul Alhazred (who was later eaten alive by an invisible monster in broad daylight), the Necronomicon is a dark book of forbidden knowledge and most Lovecraft characters who read it come to horrible ends.

Lovecraft felt that to produce terror a story had to be “… devised with the care and verisimilitude of an actual hoax.” As such the Necronomicon is very much treated as if it were a real book. Lovecraft enjoyed making his fictional world seem believable. For example, in a list of real books he would throw in a few real-sounding fake ones (such as the Necronomicon), blurring the line between reality and fiction. Similarly he wrote that copies of the Necronomicon were held by 5 world institutions: the British Museum, Harvard, the Bibliothèque nationale de France, the University of Buenos Aires, and Miskatonic University … which is a fictional school set in the equally fictional city of Arkham, Massachusetts. Again, including a fictional creation in a list of real places makes something fake seem real.

H.P. Lovecraft’s Necronomicon can be found in a host of movies, books, comics, and more.

Crawling Chaos

Part of the appeal of the Necronomicon (beyond the spooky name) is that, like all good suspenseful horror, Lovecraft gives the reader just enough details to understand the idea of the Necronomicon but the exact contents (or even a good physical description of the book) are left open to your imagination. This vagueness also kept the door open for future expansion of ideas. Soon other authors began to include the Necronomicon in their work, and so it spread.

Today the Necronomicon has gone beyond the works of Lovecraft & his friends and has appeared in countless other projects. It’s in books, movies, cartoons, comics, video games, music, etc., each with their own take on exactly what the Necronomicon is, but it’s always a book of dark magic. It’s in The Evil Dead series, it’s in an episode of The Real Ghostbusters, Mr. Burns mentions it at a meeting of Republicans in The Simpsons, it’s the name of a German thrash metal band, it’s the name of H.R. Giger’s first collection of artwork, Michael Crichton and Stephen King have both referenced it, etc. The book of the dead lives on, spreading its tentacles across dark fiction. Cthulhu fhtagn.

Added info: The fictional Arkham Asylum in the DC Universe, where many of Batman’s foes are frequently locked away, was named after the fictional Lovecraft town of Arkham, Massachusetts.

Mr. Burns has Bob Dole read from the Necronomicon.

In the cleverly titled episode The Collect Call of Cathulhu, the Ghostbusters learn that the Necronomicon will be on display at the New York Public Library.