The Human Cannonball

The most dangerous act in the circus.

William Hunt (aka “The Great Farini”) was a well-known Canadian tightrope walker & daredevil. He crossed Niagara Falls on a tightrope multiple times, performing different tricks. He eventually became an inventor & a (manipulative) talent manager as a safer way to make a living. In 1876 he invented a spring-loaded platform that could launch a person 30ft. He further developed this idea into the first human-firing “cannon”.

14-year-old Rossa Richter, aka. “Zazel”, the world’s first human cannonball.

The Cannon

The cannons used in the human cannonball act are not true cannons: no gunpowder is used to propel the performer. Some are spring-loaded, but many use compressed air, which pushes a small platform up the barrel, firing the person out while the platform stays hidden inside. Sometimes a small explosion takes place outside the cannon to create smoke for dramatic effect, but there is no gunpowder used inside the cannon.

On April 10, 1877, 14-year-old Rossa Richter (aka “Zazel”) became the first human cannonball with a performance at the Royal Aquarium in London, flying out of Hunt’s new cannon invention. She was chosen because of her size and her circus experience. The Royal Aquarium was chosen because its management was looking to increase profits, and it worked. The human cannonball act soon became integral to circus performances as it could bring in thousands of paying spectators.

Zazel gained world-wide fame, but little money, as the human cannonball.

The Dangers of Being the Cannonball

The upsides of being a human cannonball are usually a fun stage name and that you only have to “work” for about 5 seconds a day. Unfortunately, the dangers are obvious and quite real. Today’s cannons can apply 3,000 to 6,000 pounds of force on the performer as they accelerate from 0 to 70+ mph into the air. This can put enormous G-force pressure on a performer, sometimes up to 9 times normal gravity. All of this is damaging enough to the human body, but the greatest danger is the landing. After reaching heights up to 75ft in the air and coming down sometimes 200ft away from where they started, human cannonballs have to land in the target area – there’s no other option. Some use nets, some use airbags, but to miss the target is as devastating as you might imagine.
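
The G-force figure can be roughly sanity-checked with basic kinematics. This sketch assumes a 70 mph launch speed and a 6 m acceleration stroke inside the barrel; the stroke length is an invented round number for illustration, not a measurement from any real cannon.

```python
# Rough sanity check of the "up to 9 times normal gravity" figure.
# Assumption (not from the article): the performer reaches full speed
# over a 6 m acceleration stroke inside the barrel.
MPH_TO_MS = 0.44704   # miles per hour -> meters per second
G = 9.81              # standard gravity, m/s^2

launch_speed = 70 * MPH_TO_MS   # ~31.3 m/s
barrel_stroke = 6.0             # m, assumed

# Constant acceleration: v^2 = 2 * a * d  =>  a = v^2 / (2 * d)
accel = launch_speed ** 2 / (2 * barrel_stroke)
print(f"{accel / G:.1f} g")     # on the order of 8 g
```

A shorter barrel or a higher launch speed pushes the number even higher, consistent with the “up to 9 times normal gravity” claim.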

Anton Barker (aka “The Human Rocket”, aka “Capt. George Wernesch”) incorporated a trick where he was inside a shell which he would break out of in mid-air. On March 29, 1937 he was set to travel 84 feet but only went 64, crashing into the ground and injuring his spine. Mary Connors wanted to break a record by being shot from one side of the River Avon and landing on the other. On August 24, 1974 she failed to make it to the other side and ended up in the river. To make matters worse, the rescue boat then capsized, so she and the rescue team had to be rescued.

On January 8, 1987 human cannonball Elvin Bale (aka “The Human Space Shuttle”) knew something was wrong the instant he was in the air. He tried to adjust for it in mid-air but overshot the airbag by a few feet, landing on the ground feet first. He broke his ankles, his knees, and his back in two places.

The Zacchini family performed for 70 years as human cannonballs.

The multi-generational Zacchini family produced numerous human cannonballs, performing for 70 years. Mario Zacchini ended his career as a human cannonball after he flew over a Ferris wheel at the 1939–40 World’s Fair in New York but landed wrong, breaking part of his spine, his shoulder, and some ribs. On February 7, 1970 Emmanuel Zacchini and his wife Linda collided after being fired from a double-barreled human cannon. He fractured his spine while she broke her neck.

The Zacchini family performing different versions of their human cannonball act.

The Most Dangerous Act

Breaking bones is bad enough, but fatalities are very common. One of the most cited human cannonball statistics comes from British historian A.H. Coxe. He estimates that of the 50+ people who have been human cannonballs, 30 have died while performing their act (almost all of whom missed their landing). That’s a fatality rate of around 60%, making the human cannonball the most dangerous act in the circus.

Added info: Being a human cannonball is different from being a human catching a cannonball. Frank “Cannonball” Richards was a carnival performer famous in pop culture for taking a cannonball shot directly to the stomach.

Similar to the cannons used for human cannonballs, Frank Richards used a spring-loaded cannon instead of a real one fired by gunpowder. This slowed down the speed and force of the cannonball considerably … but he still took a 104lb cannonball to the stomach twice a day for years.

Frank “Cannonball” Richards taking a cannonball shot to the stomach.

The Simpsons parody of Frank “Cannonball” Richards taking a cannonball to the stomach.

Groundhog Day

Groundhog Day is halfway between the winter solstice & the spring equinox, and has its roots in Candlemas, which in turn has even older pagan origins.

Every February 2nd since 1887, people have gathered in the small Pennsylvania town of Punxsutawney for Groundhog Day, the day Punxsutawney Phil (the groundhog) predicts whether there will be six more weeks of winter or an early spring. This idea of marking the transition between winter and spring existed long before Groundhog Day.

Groundhog Day sits halfway between the winter solstice and the spring equinox.

Before Groundhog Day

February 2nd sits halfway between the winter solstice and the spring equinox. The ancient Celts of Europe marked this solar event with a festival which, after the Celts made it to the British Isles, became the Imbolc festival. Imbolc, one of the four “fire festivals”, began at sundown on February 1st and ended at the following sundown on February 2nd. In Ireland it evolved to honor the pagan goddess Brigid. When Christianity took over Ireland, Brigid the pagan goddess became Brigid the Catholic saint, whose feast day (conveniently) was also on February 1st. The church Christianized the following day as well and made February 2nd Candlemas, the day commemorating the presentation of Jesus at the Temple.
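
The halfway claim checks out with a quick date calculation (using approximate 2023–24 dates; the exact astronomical moments shift by a day or so from year to year):

```python
from datetime import date

# Approximate dates for the 2023-24 winter season (they vary slightly by year).
winter_solstice = date(2023, 12, 21)
spring_equinox = date(2024, 3, 19)

midpoint = winter_solstice + (spring_equinox - winter_solstice) / 2
print(midpoint)  # 2024-02-03, within a day of Groundhog Day
```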

Candlemas has its own customs. People take their candles to the church to be blessed, a reminder that Jesus is the light of the world. For some, Candlemas marks the end of the Christmas season and is the date they take down their Christmas decorations. German-speaking areas of Europe also marked Candlemas as “Badger Day”, a folkloric day when a badger would help predict the weather. If a badger was seen in the sun on February 2nd there would be a “second winter”, i.e. four more weeks of winter.

Punxsutawney

The tradition emigrated from Germany to North America, where the groundhog was substituted for the badger (since badgers aren’t that common where these immigrants settled in the eastern United States – especially Pennsylvania). Similarly, in parts of Europe where badgers weren’t common, other regional animals such as foxes and bears had been used.

The small western Pennsylvanian town of Punxsutawney has the most famous observation of this tradition, creating what has become the modern-day Groundhog Day. The first “official” event was in 1887, when six more weeks of winter were predicted. Groundhog Day is presided over by a group of men in top hats & tuxes dubbed the “Inner Circle.” This amusing secret-ish society originally began as members of the Punxsutawney Elks Lodge, where the groundhog was not only used for weather prediction but was also served as food at the lodge.

Phil the groundhog wasn’t a named element of the ritual until 1961. As the tradition goes, Phil is the same groundhog year after year and is the only groundhog to have ever predicted the weather for Groundhog Day, on account of the “magical elixir” he drinks every year, which adds 7 more years to his life and has kept him alive this long. Otherwise, a normal groundhog has a life expectancy of about 3 years (or up to 14 in captivity). Phil’s popularity as the prognosticator of prognosticators, the seer of seers, has led to many imitators.

Punxsutawney Phil tends to pick “6 more weeks of winter” and has debatable accuracy.

Track Record

Much is made of Phil’s prognostication track record. As of 2021, he has:
• Seen his shadow / 6 more weeks of winter: 105 times (84%)
• No shadow / early spring: 20 times (16%)
• No record of his prediction: 10 times
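
For clarity on the arithmetic, the percentages above are taken over the 125 recorded predictions; the 10 no-record years are left out of the denominator:

```python
shadow = 105     # "6 more weeks of winter" predictions
no_shadow = 20   # "early spring" predictions
no_record = 10   # years with no recorded prediction

recorded = shadow + no_shadow  # the no-record years are excluded
print(f"winter: {shadow / recorded:.0%}")     # winter: 84%
print(f"spring: {no_shadow / recorded:.0%}")  # spring: 16%
```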

Stormfax has said that Phil has a 39% accuracy rate in predicting the weather, but the Inner Circle has said that Phil is 100% accurate: any “wrong” predictions must have been Inner Circle error in interpreting Phil’s prediction.

Added info: the 1993 movie Groundhog Day was filmed not in Punxsutawney but in Woodstock, Illinois. The movie was so popular that Woodstock started hosting its own Groundhog Day festival. Meanwhile, the popularity of the film took the Punxsutawney Groundhog Day event from attracting around 2,000 visitors to bringing in tens of thousands each year, with the 2020 celebration bringing in an estimated 40,000 people (about 8 times the town’s population).

Groundhog Day and Punxsutawney Phil are so strongly associated with the state of Pennsylvania that in 2004 the state created Gus the groundhog to be the mascot of the Pennsylvania lottery, the second most famous groundhog riding the coattails of Phil.

the 1954 Eldorado Bullet Wheel

Sammy Davis Jr. lost his eye on the steering wheel of a 1954 Cadillac Eldorado.

The Cadillac Eldorado (named for the mythical tribal chief / city of gold) began production in 1953. It was decorated with aeronautically inspired fins and conical “bullets”, as was the style at the time. The “Dagmar bumper” was the chrome front bumper that had two decorative bullet projections, named for the buxom American actress Dagmar. Included in this ‘50s bullet styling was a hard bullet shape at the center of the steering wheel, nicknamed “the bullet wheel”. The car had no seat belts.

The Eldorado’s “Dagmar bumper”, named for the buxom figure of American actress Dagmar
The “bullet wheel” of the 1954 Cadillac Eldorado had a hard “bullet” at the center of the steering wheel, similar to the styling found elsewhere on the car.

Sammy Davis Jr.’s career as a song & dance man started when he was a child in the 1930s. In the early 1950s his career was on the rise and he was performing in the clubs of Las Vegas while also working on projects down in LA. On November 18, 1954 Davis and his valet Charles Head left the New Frontier Hotel & Casino in Las Vegas in Davis’s Eldorado, driving through the night to reach Studio City in LA the next morning.

Helen Boss was a widow from Akron, Ohio who liked to live as a snowbird, traveling to LA in the winters to avoid the cold of Ohio. She was traveling down Route 66, not far from San Bernardino, around 7:00am on November 19th when she missed her turn. Instead of turning the car around she simply put it in reverse and backed up toward the fork in the road where she went wrong. At the same time Sammy Davis Jr. was driving down the same road and, before he realized the car in his lane was moving backwards, slammed directly into the back of Boss’s car.

The Accident

The resulting accident sent people flying. Charles Head, who had been sleeping in the backseat, was launched into the front seat, where he broke his jaw. Helen and her friend broke bones when they were thrown into the backseat of their car. The V-8 engine of Davis’s car was pushed backwards into the dashboard as Davis was sent forward, his head colliding with the steering wheel. He hit his head hard enough that the bullet portion of the wheel dislodged his left eye.

The accident was a front-page story around the country. This brush with death, combined with a visit by a rabbi chaplain, led Davis to convert to Judaism. In the hospital Davis’s damaged eye was removed by doctors. He wore an eye patch for the next few months. His debut album, Starring Sammy Davis Jr., was released the following year and the album cover features Davis wearing an eye patch. Eventually he switched to a glass eye. Later in life Davis would say “I’m a one-eyed Negro who’s Jewish.”

Davis initially wore an eye patch but eventually switched to a glass eye.

Form Follows Function

In the words of architect Louis Sullivan, “Form follows function”. The bullet wheel was a costly lesson that the style of the steering wheel (its form) mattered less than its purpose (its function). Looking cool was less important than being useful & safe. After Davis’s accident the Eldorado’s bullet wheel was discontinued and replaced with a safer design.

the Fallacy of Relative Privation

Just because someone else has it worse than you doesn’t mean you don’t have problems.

The Fallacy of Relative Privation is a faulty way of thinking where someone dismisses a problem because there are worse problems in the world. For example “Oh you think you have a bad headache? Well some people live with migraines for days at a time.” The idea of this kind of statement is that you should feel better, comforted by the knowledge that the situation could be worse. Unfortunately, knowing that someone has a worse headache won’t improve the condition of your headache. A more severe problem doesn’t negate a less severe problem.

This fallacy also goes the other direction. When judging people who are more affluent someone might say “What do they have to complain about? They’re rich & famous.” Just because someone is better-off than you doesn’t mean they don’t have problems. The idea of “First World problems” touches on this. While the day-to-day problems of a wealthier society are not as significant as the problems of a poorer country, wealthy people / societies still have problems.

If we follow the fallacy of relative privation to its logical conclusion, only the person with the absolute worst problem(s) could ever have any right to complain about anything. Obviously this is wrong. Therefore when considering problems, your own or the problems of others, remember that all problems are problems regardless of severity or whose they are.

Herbs & Spices (and Salt)

Herbs come from the leaves of a plant, spices are from any other part of a plant (and salt is a mineral).

In cooking they are frequently used together to flavor a meal, and their names get used interchangeably, but herbs and spices are not the same. In short:

HERBS
Herbs are seasonings made from the leaves of a plant

SPICES
Spices are seasonings made from any part of a plant other than the leaves (roots, stalks, bark, seeds, and sometimes even the fruit)

Herbs

In greater detail, an herb typically comes from a smaller deciduous plant without a bark-covered stem. There are exceptions of course: lavender, sage, and rosemary aren’t deciduous and never lose their leaves. Either way, the key is that the green leaves become herbs when they are used in cooking, medicine, teas, cosmetics, etc.

Spices

Spices can come from any part of a plant except the leaves. For example cinnamon comes from bark, ginger comes from roots, pepper comes from seeds, and chili powder comes from the pulverized fruit of the chili pepper. Saffron, one of the most expensive spices in the world at around $10,000 a pound, comes from the hand-picked stigmas & styles of the Crocus sativus flower.

Allspice, despite a misconception, is not a blend of spices but is just one spice which comes from the berry of the Pimenta dioica tree. Its name comes from the fact that it tastes like a combination of cinnamon, nutmeg, and clove.

Herb & Spice Plants

There are a few plants that produce both an herb and a spice. The leaves of Coriandrum sativum produce the herb cilantro while the seeds become the spice coriander. Similarly the dill plant produces dill the herb from its leaves, and dill the spice from its seeds.

Salt

The food seasoning odd-one-out, salt is a mineral and the only food seasoning that doesn’t come from a plant (although salt is present in plants). There is a lot to say about salt, but in short it’s been used as a preservative and a seasoning for thousands of years.

Norwegian Salmon Sushi

Japanese sushi didn’t contain salmon until a deal with Norway in 1992.

Like many of the oldest things in Japan, sushi originally came from China. Its earliest form was fish stored in fermented rice. The rice was used as a preservative and wasn’t eaten. Through a series of culinary improvements over the centuries the dish eventually became raw fish served with rice (to be eaten, not thrown out), which is how we know it today.

Norwegian salmon

Salmon was not traditionally one of the fish used to make sushi. Pacific salmon tend to have parasites, making them unsafe to eat raw; they need to be cooked. Enter the Norwegians. Bjorn Eirik Olsen was part of a delegation to Japan in 1985 trying to sell Norwegian salmon to the Japanese. Norway had begun farming salmon in the 1970s and by the 1980s had an excess of fish it needed to find buyers for. At the same time Japan had overfished its waters and was looking to diversify its supply of fish.

Selling salmon to the Japanese public for use in sushi was a difficult proposition because Japanese salmon wasn’t safe to eat raw. A marketing campaign couldn’t say that Norwegian salmon was parasite-free since that would only make people think of parasites, which wouldn’t help sales. It took a few years, but by 1992 Olsen got Japanese frozen food producer Nichirei to purchase 5,000 tons of salmon at a heavily discounted price, on the condition that they sell it in grocery stores as raw salmon specifically for sushi. They also labeled their parasite-free Atlantic Norwegian salmon as ‘sāmon’ instead of the Japanese word for salmon, ‘sake’, to help differentiate the two types. This was followed by a marketing campaign in which chefs on Japanese TV demonstrated using salmon. It was a success.

In the years that followed salmon’s popularity took off. Salmon sushi started in the cheap sushi restaurants but eventually spread to restaurants of all levels in Japan and around the world.

Survivorship Bias

The data you don’t see is just as important as the data you do.

Survivorship bias is when you aren’t working with all of the information needed to make a complete analysis. We tend to focus on the information we have and mistakenly forget to consider the information we don’t have.

Missing Data

During WWII Hungarian mathematician Abraham Wald helped the US military determine where to add reinforcing armor on bomber planes. If you reinforce the whole plane it becomes too heavy, so you want to add weight only where absolutely needed. The military collected data from returning planes on where they had taken damage (from bullets, shrapnel, etc). From this they created scatter plots on plane diagrams showing where the damage tended to be. The initial military analysis was to reinforce the heaviest-hit areas, but Wald realized this was survivorship bias.

The military was only accounting for the planes that made it back and wasn’t accounting for the planes that were shot down and never returned. The areas where a plane could get shot but still return must not need additional armor to fly. Therefore the areas on returning planes with no damage (the cockpit, the engines, etc) must be the places needing reinforcement, since the planes that never returned were probably hit in those places. The military had worked with the data they had, but they forgot to account for the planes that never made it back, the data that was missing.
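
Wald’s insight lends itself to a toy simulation (every number below is invented purely for illustration): hits land evenly across a plane’s sections, but hits to the engine or cockpit are far more likely to down the plane. Tallying damage only on the planes that return makes the deadliest sections look like the safest ones.

```python
import random

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
# Chance that a single hit to this section downs the plane (invented numbers).
LETHALITY = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.05, "wings": 0.1, "tail": 0.1}

observed = {s: 0 for s in SECTIONS}  # damage tallied on planes that made it back
actual = {s: 0 for s in SECTIONS}    # damage across all planes, returned or not

for _ in range(10_000):
    hits = [random.choice(SECTIONS) for _ in range(random.randint(1, 5))]
    survived = all(random.random() > LETHALITY[hit] for hit in hits)
    for hit in hits:
        actual[hit] += 1
        if survived:
            observed[hit] += 1

# In reality the hits are spread evenly, but the survivors' data badly
# under-counts the engine and cockpit, the sections that most need armor.
print("all planes:", actual)
print("survivors: ", observed)
```

Reinforcing only where the survivors show damage (the fuselage and wings here) would armor exactly the wrong places.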

The Value of Failure

We tend to over-appreciate success stories and under-appreciate failures. Success stories are easy to find while failures are usually ignored or lost to time. Survivorship bias comes up frequently in think-pieces about successful people, businesses, investments, etc. The focus tends to be on the “winners” but rarely on the “losers.”

While successful people can give advice on what to do, people who failed can give advice on what not to do (which is just as valuable). Successful people giving advice is only one part of the data; it’s survivorship bias because we’re only hearing from the ones who “made it” and not the ones who didn’t. You hear how Steve Jobs, Mark Zuckerberg, and Bill Gates successfully went from college dropouts to billionaires, but there aren’t many stories about the majority of college dropouts who don’t become billionaires.

In the Red

In the world of business we mostly hear from the businesses that are successfully still around, and not from the ones that closed. Most new businesses, around 90%, will fail, but we rarely get advice from them after they do. Instead we hear inspirational stories about the very small percentage of scrappy, incredibly lucky startups that went from operating in garages to being juggernauts, such as Amazon.

Investments are similar. Funds that are losers are only allowed to lose for so long. When an investment company closes a fund the fund ceases to exist and no longer drags down the company’s overall performance. By removing/hiding the failures you can get a false overall sense of positive performance. This also means that the funds available to invest in are either proven winners or brand new funds, never any long-time losers.
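
The fund example can be sketched numerically (all figures invented): generate random fund returns, quietly close anything below a cutoff, and compare the advertised average of the survivors with the true average of every fund.

```python
import random

random.seed(1)

# 100 hypothetical funds with random annual returns: mean 0%, stdev 10%.
returns = [random.gauss(0.0, 0.10) for _ in range(100)]

# The firm closes any fund returning worse than -5%...
surviving = [r for r in returns if r > -0.05]

# ...then advertises the average of the funds still open.
avg_all = sum(returns) / len(returns)
avg_surviving = sum(surviving) / len(surviving)

print(f"average of all funds:       {avg_all:+.1%}")
print(f"average of surviving funds: {avg_surviving:+.1%}")
```

The surviving-funds average is higher purely because the losers were removed from the books, not because the firm picked better investments.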

Here Today, Gone Tomorrow

We can see survivorship bias in architecture and anthropology. The best-built and/or most appreciated buildings of the past continue to stand, while the weak or unwanted buildings are brought down. This can lead to a false sense that all buildings of the past were stronger or more beautiful than today’s, but there were plenty of weak and/or ugly buildings in the past, just like today. Ancient cultures had lots of buildings that were torn down or fell down over time, but we talk about the pyramids of the world because a pyramid is a great shape for arranging stone so it won’t fall over, and they’re still standing today (no aliens needed).

When studying ancient cultures, it’s easy to account for cultures that built permanent structures with durable materials that have survived to be studied. But cultures that utilized temporary structures, moveable structures, or buildings made from biodegradable materials are harder to document. We have to rely on other clues to understand these groups and account for them in history.

When I Was Younger

Survivorship bias can also apply to things that are more subjective. It’s easier to remember the good art than all the bad art that got thrown away. People claim that music / TV / movies were better in some previous time period than today, but they frequently forget all of the bad music / TV / movies of that previous time period. It’s a survivorship-biased, rose-tinted view of the past.

Today’s music is made up of new songs, both good and bad. When playing songs from the past, however, radio stations / channels tend to only play the successful hit songs and skip the bad songs, adding to the survivorship bias. This means it’s easier to remember the hit songs of the past that are still played than it is to remember the songs you never wanted to listen to. Even when you’re choosing the music to listen to, you tend to pick the songs you want to hear and skip all of the songs you don’t. Sure, Nirvana’s generational anthem Smells Like Teen Spirit came out in 1991, but so did Tom Cochrane’s Life Is A Highway (a song which is straight-up trash).

In analyzing a situation, thinking about the secrets of success, or flashing-back to a past that never really existed, remember to factor in all of the data you are forgetting.

Typhoid Mary

How one asymptomatic woman spread typhoid to dozens of people and raised a host of bioethical questions.

Mary Mallon was born in Cookstown, County Tyrone in Ireland in 1869. She emigrated to New York City when she was 15 and worked her way up through the servant ranks to the highly respectable position of cook. Over the years she ran the kitchens & cooked for various families around the city. In the summer of 1906 she was the cook for the Warren family (Charles Warren, banker to the Vanderbilts) as they vacationed in a rental house in the very upscale Oyster Bay, Long Island.

Over the course of that summer, 6 members of the household got sick with typhoid. No one else in Oyster Bay contracted the disease, a disease typically associated with the poor. Concerned for the reputation of the rental house, the owner knew the source of the typhoid had to be found or it would be difficult to ever rent the house again. George Soper, a freelance civil engineer, was hired to find the source of the typhoid and he traced it back to the Warren family’s former cook, Mary Mallon.

Tenement housing in New York provided ideal conditions for the spread of diseases including typhoid.

Typhoid

Typhoid fever is caused by a form of Salmonella (a bacterium) that can spread through tainted water or food that has come into contact with fecal matter. You find it in places with poor hygiene and poor sanitation, which is why it’s generally associated with the poor.

New York City in the early 20th century was a much dirtier place than today. The population of the city was doubling every decade. The tenement housing of Manhattan’s Lower East Side was an overcrowded jungle of people, and it was common for a family of 10 to live in a 325 square foot apartment. Add to the mix the 150,000 – 200,000 horses of the city, each of which created about 25 pounds of manure a day, and it all led to filthy conditions that were ideal for typhoid and other bacterial diseases.

Mary, seen in the first bed, during her first quarantine at North Brother Island.

Forced Quarantine

Soper tracked down Mary and documented a trail of typhoid in her wake. Over 10 years Mary had worked for 8 different New York families; 6 of those families contracted typhoid and 1 person died. Despite this evidence Mary was adamant that she never had typhoid and never felt sick. She was partially right.

It turned out that she was a “healthy carrier” of typhoid, someone who had the disease but never really felt sick. She was asymptomatic and went about her life unaware that she even had the disease, let alone that she was spreading it to other people (not unlike asymptomatic carriers of COVID-19).

Eventually she was forced against her will into quarantine by the New York City Health Department. In 1907 she was sent to North Brother Island in the East River which was being used as a quarantine center for people sick with infectious diseases. She remained there for 3 years, during which time her story of forced quarantine made it into the papers where she was dubbed “Typhoid Mary”.

In 1910 she was released from quarantine on the condition that she never work as a cook again, since she had most likely transmitted typhoid through the food she prepared. She kept to this agreement for a while, working as a laundress, but eventually she slipped away from public health officials and started work as a cook again under assumed names. The pay and working conditions of a laundress were far below those of a cook for a wealthy family. She was eventually caught working at Sloane Hospital for Women, where an outbreak of typhoid infected 25 people, killing 2. She was sent back to North Brother Island, where she lived until she died in 1938 at the age of 69 (still carrying typhoid).

Typhoid Mary

Mary Mallon’s legacy is one of bioethical questions. In the early 20th century the science of communicable diseases was in its infancy and Mary’s suspicion of the New York Health Department was not unusual. She felt fine, so how could she be carrying/spreading a deadly disease?

Her quarantining raises ethical questions of how far the government should go to protect the general public. When weighing an individual’s civil liberties against the health of the public which is greater? Despite never being convicted of a crime she was imprisoned on North Brother Island for the safety of the public. Was it more ethical to quarantine her the first time or the second time, or at all? Knowing that other people were also asymptomatic carriers of typhoid why was she kept in isolation for nearly 30 years while others walked free? As a healthy carrier she was an unlucky victim of a disease, but she also chose to go back to cooking which she knew might endanger lives. The questions raised by Typhoid Mary are still relevant today.

Added info: There is a good hour-long documentary by PBS, The Most Dangerous Woman in America, on the story of Mary Mallon. You can also find a bootleg copy of the documentary on YouTube.

the Vulcan Salute

Leonard Nimoy got the Vulcan hand sign from a Jewish blessing.

For a 1967 episode of Star Trek: The Original Series, Leonard Nimoy’s Vulcan character Spock was to, for the first time in the series, appear with other Vulcans. Nimoy decided Vulcans would have their own greeting that wasn’t a human handshake or bow. He thought back to his childhood and remembered an Orthodox religious service he attended. The Jewish Kohanim performed a blessing where they brought their hands together, thumb to thumb, and parted their fingers between their middle and ring fingers (forming two Vs). This hand sign forms the Hebrew letter Shin, which is the first letter of “Shaddai”, one of the names of God.

Nimoy took this two-handed blessing and turned it into the one-handed Vulcan salute. This gesture is often accompanied by one of the most famous phrases from Star Trek, “Live long and prosper.” When the “Amok Time” episode aired the hand sign instantly became famous. People would make the sign to Nimoy everywhere he went. Many people thought it was just a fun variation on the peace sign but unbeknownst to them they were (in a way) actually blessing one another.

On the history of the Vulcan salute

Animal Names vs Meat Names

In English we have different names for animals vs those same animals as food because of the Norman conquest of England in 1066 CE.

From the 5th century until the 11th century England was ruled by the Anglo-Saxons. The Anglo-Saxons were descendants of Germanic tribes, which is why, if we look along the language family tree, we can see that English is related to a host of Germanic languages. The early English language of the Anglo-Saxons took a turn, however, in 1066 CE when the Normans invaded and conquered the country.

The Normans were a French-speaking people from Normandy, the northwestern area of France. After crossing the channel and conquering England, they became the ruling class. This led to a tri-lingual system where:

  • Latin was the language of the Church
  • Anglo-Saxon English was the language of the common people
  • Norman French was the language of the nobility, courts, and government administration

A portion of the Bayeux Tapestry documenting the Anglo-Saxon defeat by the Normans at the Battle of Hastings in 1066 CE.

What’s For Dinner?

Anglo-Saxons became the working-class hunters and farmers of England and, as they were the ones tending to the animals, they called the animals by their English names. The Norman rulers, however, more frequently encountered these animals when they were served on a plate, and in this culinary context called them by their French names.

Over the centuries this practice of using two different names was adopted into Middle English which then evolved into our Modern English. This linguistic duality, where a living animal is called one name in English while also being called by a different French name as food, has continued through to the present.

English animal vs French meat dual names include:

  • cow vs beef
  • calf vs veal
  • pig vs pork
  • sheep vs mutton
  • deer vs venison (although originally any hunted animal was called “venison”)
  • snail vs escargot

Interestingly we use the word “chicken” for both the animal and the meat. This is likely because chicken was one of the few meats that everyone could afford and since the common people were raising and eating them, their practice of using the English language name in both contexts carried on.

Also, the French word for “fish” is “poisson”, which is too close to the English word “poison”. It’s thought that this linguistic similarity, and the danger if you get them confused, is why we kept the English language word for both the animal and the meat. We also tend to use species names such as “salmon” or “flounder”, avoiding “fish” and “poisson” altogether.

Added info: this English vs French origin linguistic duality is found in a host of other examples beyond food. Deadly vs morbid, job vs profession, cookie vs biscuit, smell vs odor, calling vs vocation, etc.

Belvoir Castle, whose name means “beautiful view” in French, is a Norman castle in central England. The Normans pronounced it the French way as “bell-vwah”, but the local Anglo-Saxons had difficulty saying this and called it “Beaver Castle” instead, a practice that continues today.