Mount Tambora & Frankenstein

The eruption of Mount Tambora in 1815 led to the creation of Frankenstein.

Mount Tambora is a volcano on the island of Sumbawa, Indonesia. On April 5, 1815, it began a monumental multi-day eruption. It remains the largest volcanic eruption in recorded human history, the estimated equivalent of a 14,000-megaton nuclear bomb. It was so powerful that it removed the top 4,750 feet of the volcano, reducing it from roughly 14,100 feet to 9,350 feet tall as it sent more than 38 cubic miles of debris into the sky. The explosion was so loud it was heard 1,600 miles away, the equivalent of an explosion in Philadelphia being heard in Denver.

The 1815 eruption of Mount Tambora, on the Indonesian island of Sumbawa, caused global devastation.

The eruption immediately killed over 10,000 people on the island. All of the island’s vegetation was destroyed and the water was poisoned, leading to starvation and disease that killed a further 37,825 Sumbawanese people. As the tsunami it generated and the ash it expelled spread to other islands, they killed off more vegetation and more people. Over 71,000 people are believed to have died in Indonesia from the eruption. However, with so much material sent into the sky, the full impact of the eruption was only beginning.

Into the Stratosphere and Around the World

The long-term effects of the eruption were caused by the gases & ash sent as high as 141,000 feet into the stratosphere. The sulfur dioxide (SO2) released formed a veil of sulfate aerosols that blocked sunlight, cooling the planet and changing weather patterns. While the effects were spread around the world, they were worse in the northern hemisphere. The cold weather and constant rain (such as the 8 weeks of “unceasing and extraordinary rain” in Ireland) killed crops across Europe, causing food shortages in what became the worst famine in 19th century mainland Europe. Over 65,000 people died around the British Isles as a result of a typhus epidemic made worse by the volcano-induced weather. A new strain of cholera also developed in this weather, killing thousands more.

In North America a dry fog descended on the northeastern states and lasted for months. The extended cold was felt up & down the eastern seaboard. On the 4th of July the high in Savannah, Georgia was only 46° F. Rivers and lakes were still frozen in Pennsylvania in August. The extreme weather and bitter cold are believed to have been a catalyst for the westward expansion across America – people wanted to find a place that wasn’t awful. The eruption of Mount Tambora lowered global temperatures by 0.7 to 1.3 °F, but its particularly brutal effects on the northern hemisphere are why 1816 came to be known as the “year without a summer.” The initial volcanic eruption, the extreme cold, the unusual weather patterns, and the spread of disease resulted in a global death toll in the hundreds of thousands.

Silver Lining

Despite the adversity there were some positives. German inventor Karl Drais was motivated to find an alternate means of transportation to the horse (since horses require food which was in short supply at the time). He invented the first bicycle, the Laufmaschine, in 1817.

The eruption caused strangely dark colors in the skies, captured by a variety of painters of the day.

In the arts, painters were inspired by the unusual hazy skies. Particulate matter from Mount Tambora hung in the stratosphere, scattering out the shorter wavelengths of blue light. A study of paintings from 1500 to 1900 found that skies painted around 1816 were redder & darker than in other time periods. The polluted skies might have made for more depressing daily life but they made for some great paintings.

But perhaps the greatest byproduct of the year without a summer was in literature. In the summer of 1816 a group of English friends traveled to Cologny near Lake Geneva in Switzerland. They hoped to escape the bad weather of England but ended up in even more rain. Sitting around with nothing to do, Lord Byron proposed that everyone write a ghost story. John William Polidori, Byron’s personal physician, took a story idea by Byron and eventually wrote 1819’s The Vampyre, the first modern vampire story.

The year without a summer generated two of the most influential stories in Gothic horror.

An 18-year-old Mary Godwin had trouble coming up with a story until (literally) one dark & stormy night, sometime after midnight, she had a “waking dream” of a pale man kneeling beside the thing he had put together that showed signs of life. With the encouragement & help of her soon-to-be husband Percy Shelley, Mary (Godwin) Shelley had the beginnings of Frankenstein. In 1818 Mary Shelley published Frankenstein; or, The Modern Prometheus, considered the first science-fiction story.

From one miserable vacation, caused by a volcano thousands of miles away, two of the most defining works of the Gothic horror genre were born.

Auld Lang Syne

The nostalgic song toasting times gone by that has spread around the world.

Auld Lang Syne started as a traditional Scottish folk song. Its lyrics were written down, added to, and made famous by 18th century Scottish national poet Robert Burns. In the late 18th century Burns was touring Scotland collecting folk songs & poetry when he recorded Auld Lang Syne and, in 1788, submitted it to The Scots Musical Museum.

Burns contributed hundreds of songs to the Museum, whose intention was to preserve the fading Scots language & culture, which was becoming increasingly influenced by English culture. As such, Auld Lang Syne is written partially in English and partially in Scots (a Germanic-derived language, different from Scottish Gaelic, which is Celtic-derived). The lyrics were originally set to a few different melodies but in 1799 they were paired with the melody we know today.

Written down and added to by Robert Burns, Auld Lang Syne has become the unofficial theme song of New Year’s.

What is it and why New Year’s Eve?

Because the lyrics are partially in Scots, most people don’t know exactly what the song means. The title “auld lang syne” translates from Scots as “old long since” or, more loosely, “for the sake of the good old days gone by.” The song is a toast to friendship and to the fond memories of days gone by.

Given the song’s spirit of looking back while looking forward, it became a standard sung every Hogmanay (the Scottish New Year’s Eve). Its association with New Year’s in North America is because of Guy Lombardo. On New Year’s Eve 1928 Guy Lombardo and The Royal Canadians big band hosted a concert at the Roosevelt Hotel in New York City, and at the stroke of midnight they played Auld Lang Syne. For the next 47 years they played NYE concerts, and every midnight they played Auld Lang Syne, earning Lombardo the nickname of “Mr. New Year’s Eve”. When Dick Clark created Dick Clark’s New Year’s Rockin’ Eve from Times Square in 1972, he too played Lombardo’s version of Auld Lang Syne at midnight. Since then the song has become synonymous with New Year’s.

Guy Lombardo’s classic 1947 rendition of Auld Lang Syne.

Around the World

While the song is internationally recognized as the unofficial theme song of New Year’s Eve, the melody has been used in other ways. The Korean national anthem Aegukga was originally sung to the melody of Auld Lang Syne until 1948, when it was replaced with an original melody. The melody was also used for the national anthem of the Maldives, Qaumii salaam, until 1972, when it too was replaced with an original melody.

The Dutch song Wij houden van Oranje (which translates to “We Love Orange”) is a national soccer chant set to the melody of Auld Lang Syne. In Japan the melody is used for the graduation ceremony song Hotaru no Hikari and to mark the end of the day in department stores.

QI discusses the history of Auld Lang Syne

Toasting the Past, Looking Forward

Like the Roman god Janus, Auld Lang Syne is a seasonal reminder to look back at the days gone by but also look ahead to the future. It’s a nostalgic song that toasts the people with us today as well as the people with us in spirit.

The First Thanksgiving Menu

Lacking key ingredients, the menu at the first Thanksgiving of 1621 was a bit different from the traditional turkey dinner of today.

In the fall of 1621 the English Pilgrims and the Wampanoag came together in Massachusetts for what has subsequently become a much mythologized 3-day harvest festival. Having survived the winter that killed half of their fellow pilgrims, had their supplies fortified by the Wampanoag, and then completed a successful summer growing season, the pilgrims had a lot to be thankful for. What they ate as they gave thanks is debatable.

Definitely on the Menu

One food that was definitely served was venison. Massasoit, the leader of the Wampanoag, had 5 deer brought to the event. Another meat on the menu was “wild fowl”, but exactly what kind of birds these were is unknown. It’s possible that there was turkey at the first Thanksgiving but more likely it was goose or duck (or a combination). Other regional bird options at the time would have been swan and passenger pigeon.

Also definitely present was corn. The Wampanoag, who used the Three Sisters method of farming, had taught the pilgrims how to grow corn. The pilgrims had grown a successful crop of Flint corn (aka “Indian corn”), which was cooked into a porridge, baked into a bread, and/or cooked with beans.

Maybe on the Menu

Given that the Plymouth Colony was by the water it’s very likely that seafood was also served. Eels, clams, mussels, cod, bass, and/or lobsters were very likely a part of the meal. It’s worth noting though that, unlike today, lobster was considered a food of last resort.

There were certainly vegetables & fruits on the menu, but which ones (other than corn) was never recorded. Chestnuts, walnuts, beans, onions, carrots, cabbage, pumpkins, and various squashes were all grown in the area. Blueberries, plums, grapes, and raspberries were also grown in the area and could have been present. While cranberries might have been served, cranberry sauce definitely was not since the colonists lacked the necessary sugar (and cranberry sauce wouldn’t exist for another 50 years).

Not on the Menu

Even though pumpkins may have been present, pumpkin pie definitely was not. The pilgrims had neither the butter nor the flour necessary to make pumpkin pie – they didn’t even have an oven in 1621. Something pumpkin pie-esque that may have been prepared is a spiced pumpkin soup/custard cooked directly inside a pumpkin, which was roasted on hot ashes.

There was no stuffing because, again, the colonists lacked the necessary flour. There were also no potatoes (mashed or otherwise). Potatoes came from South America and, while they had made their way to Europe by the late 16th century via the Spanish, they had yet to make their way to New England. There also weren’t any forks on the table since they too hadn’t made their way to North America yet (but on the upside nobody present had an overbite).

A historical reenactment of how to cook some of the foods present at the first Thanksgiving.

Hookworm

The parasite responsible for giving American southerners a bad reputation.

For centuries American southerners were maligned as lazy, slow, shiftless, and dumb. Southerners had “the germ of laziness.” There was just something different about southerners that made them less than their northern counterparts. As it turned out there was something different about them, but it had nothing to do with genetics or social conditioning. That something was hookworm.

Hookworm

Hookworm, and specifically the New World hookworm Necator americanus, is a parasitic worm that arrived in America by way of the slave trade in the 17th century. In the larval stage hookworms live in warm, wet, shady soil where they wait to encounter a human. A person walking barefoot outdoors may make contact with a hookworm, at which point it can penetrate the skin and crawl into the foot. From there it travels through the circulatory system to the lungs where it triggers a dry cough. The human host then unknowingly coughs up the worm only to swallow it down to the small intestine, which is where the worm wanted to be the entire time. The worm then lives around 1-2 years (or longer) attached to the wall of the intestine, sucking blood, where a female worm can lay up to 30,000 eggs per day. Eventually these fertilized eggs are pooped out in a poorly built outhouse or in some bushes, contaminating the soil to start the process again. It’s disgusting.

Because hookworms thrive in warm humid environments they do particularly well in the southern climate of the United States. The area from southeastern Texas to West Virginia became nicknamed the “hookworm belt”. For poor southerners who couldn’t afford shoes and didn’t have indoor plumbing it was almost impossible to avoid hookworm. By 1910 it’s believed that around 40% of the people living in the south were infected with millions of worms.

Putting their gross lifecycle aside, the problem with hookworms is that they steal your blood. One worm alone won’t do much damage, but getting infested by multiple worms on a continual basis over years/decades has a severely damaging cumulative effect. By consuming your blood hookworms can cause an iron deficiency. People with hookworms become tired, lose weight, and have little strength to do anything. Pregnant women are at risk for anemia and a greater chance of dying in childbirth. Infected children can suffer irreversible developmental problems including stunted growth and intellectual disabilities. All of this matches the unfair characterization of southerners as slow rednecks.

A nurse brings hookworm medicine to a rural Alabama family, 1939.
A doctor and a nurse examine for hookworm in an Alabama school, 1939.

The Cumulative Effect

In 1902 zoologist Charles W. Stiles discovered that hookworms were endemic to the southern US. In 1909 John D. Rockefeller got involved by funding the creation of the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease. They campaigned across the south to educate, test, and help treat hookworm. Students in small country schoolhouses would submit stool samples to their teachers to be tested – some schools even required students be screened for hookworm. People would go to health clinics on the weekends to learn more. An estimated 7.5 million southerners had hookworms. While the Rockefeller Commission helped treat the problem, what greatly reduced hookworm was the urbanization of the south, which enabled more people to afford shoes and sanitary indoor plumbing.

The barefoot Texas boy on the right has hookworm, 1939.

Beyond the health consequences, the socioeconomic impact of hookworm was also destructive. The US regions with hookworm had lower rates of literacy and school attendance than areas without it. A 1926 study of Alabama children showed that the more worms a child had, the lower their IQ. Even today children with chronic hookworm face up to 40% lower future wage earnings. General productivity is measurably lower as a result of hookworm. The southern regions that were worst infected with hookworms saw the greatest income expansion after treatment, but unfortunately centuries of infection had a cumulative effect. Eight of the ten poorest US states are still southern states.

Hookworm in the US is typically thought of as a problem of the past but it is still very much a problem of the present. Given the severe income inequality in the US, hookworm is thriving in regions living below the poverty line. Hookworm lives in, and reinforces, poverty: 85% of the world lives on less than $30 a day, and 10% of the world is currently living in extreme poverty. Around the world an estimated 477 million people are currently living with hookworms inside them.

Egyptian Mummies: From Medicine to Paint

For hundreds of years Europeans used ground up Egyptian mummies as medicine and paint pigment.

The Arabic word mūmiyā (which later became “mummia”) was the name for the black sticky asphalt material that came out of the ground and was used as a sealant, an adhesive, and a medicine around the ancient world. Pliny the Elder and others wrote about the medicinal uses for mummia, which became a bit of a cure-all for a range of ailments.

Unfortunately, mummia the petroleum product looked like another black substance: a byproduct of the Egyptian embalming process. As such the word “mummia” came to mean both the petroleum product AND the product of Egyptian mummification, and was then further confused to mean an entire mummified body. This is how we got the word “mummy”. Unfortunately this series of mistakes also led to hundreds of years of cannibalism.

Cannibal Medicine

Since the petroleum-based mummia was both used externally as a salve and ingested internally, the Egyptian mummy version of mummia came to be used in the same ways. The 11th century physician Constantinus Africanus even described mummia as a “spice” found in the sepulchers of the dead. Soon the human version replaced the petroleum version and people began to crumble & grind human corpses for medicine.

With the Crusades, Europeans learned of mummia and its medicinal possibilities. This significantly increased European demand for Egyptian mummies, and by the 15th-16th centuries there was a thriving trade in mummies. Thousands of bodies were exhumed and shipped to Europe to be turned into medicines. In 1586 English merchant John Sanderson shipped 600 pounds of mummies to London to sell at various apothecaries. This trade was fueled in part by orientalism, the notion that Egyptian mummies held some sort of exotic ancient knowledge or power.

Europeans would consume portions of Egyptian corpses for help with general pain, ulcers, inflammation, epilepsy, cough, difficult labor, etc – none of which worked, or if they did work it wasn’t the mummy that was the active ingredient. The practice was so common that Shakespeare included mummy as an ingredient in the witches’ potion in Macbeth. Demand was so high that by the 17th century some mummy dealers were producing counterfeit mummies. Newly deceased people, animals, or prisoners who had been purposefully starved & executed were put through a process to simulate ancient Egyptian mummies.

After a few hundred years of medicinal cannibalism, Europeans began to express doubt as to the practice’s efficacy (and its ethics). The 16th century herbalist Leonhard Fuchs felt foreign mummies were acceptable but local ones were wrong. While doubts arose during the Renaissance in the 16th century, it took until the 18th century age of Enlightenment for the practice to fall out of fashion. As the consumption of mummies slowly ended, Egyptian mummies took on a new role: paint pigment.

The Egyptian Widow by Lourens Alma Tadema is an 1872 depiction of Egyptian life potentially painted using mummy brown.
Liberty Leading the People by Eugène Delacroix is another painting that’s theorized to contain mummy brown.

Mummy Brown

Around the end of the 16th century artists began using ground-up Egyptian mummies (mixed with other materials) to produce mummy brown, a shade of brown pigment. Apothecaries that were grinding up mummies for medicine began to grind them up for paint as well. As a paint it was good for shadows, flesh tones, and glazing. Artists Benjamin West, Martin Drolling, Lawrence Alma-Tadema, Edward Burne-Jones, Eugène Delacroix, and others all painted with mummy brown.

It wasn’t until the 19th century that mummy brown began to fall out of favor. That said, as recently as 1926 C. Roberson & Co. still sold mummy brown made with ground-up Egyptian corpses. As mummy brown died out, so too did hundreds of years of large-scale desecration of deceased Egyptians for medicines and paints.

Zombies: Sadder Than You Think

The concept of Haitian zombies was used as a threat to keep slaves working.

Before Haiti was an independent country it was the French colony of Saint-Domingue, which produced sugar, coffee, cotton, and other goods. The French brought more than a million West African people to the colony as slaves, more than any other colony in the Caribbean. Slavery in Saint-Domingue was particularly brutal: most people were poorly fed, they worked 12-hour days, pregnant slaves frequently didn’t live long enough to have babies, and torture was common. Life expectancy was about 3-6 years, with about half of the enslaved people of Saint-Domingue dying within the first few years of arriving.

The brutal conditions of Saint-Domingue left the enslaved people hoping that, in death, their souls would return home to West Africa.

Haitian Vodou & Zombies

The Code Noir was a 1685 decree that outlined how slavery was to be conducted in the French empire. Among other things it stated that slaves were prohibited from practicing African religions and instead were forcibly baptized into Catholicism. What resulted was Haitian Vodou, a religious blend of West African beliefs (practiced in secret) given a veneer of Catholicism.

Part of this belief system was the idea that, upon dying, you would return to lan guinée (i.e. Guinea, or West Africa). Their idea of heaven was to escape the slavery of Saint-Domingue and to simply go home. Feeling the allure of going home, some people decided to escape slavery on their own terms. As such suicide was very common in Saint-Domingue.

Initially suicide was seen as a viable way of getting to lan guinée, but at some point (oral tradition is murky on when/how) suicide became prohibited, and the punishment for committing suicide was that you’d be a slave forever – you’d become a zombie. The zombies of Haitian Vodou are not the Western pop culture shambling brain-eating zombies. The Haitian zombie was someone whose soul had been captured, denied entry to lan guinée, and turned into an undead field hand with no chance of escape. Plantation slave-drivers used this to their advantage, threatening slaves that if they killed themselves they would be turned into zombies to work forever under the control of a bokor/sorcerer. Unlike today, what was feared was the threat of becoming a zombie, not the actual zombies themselves.

1932’s White Zombie was the first zombie movie. It used some Haitian Vodou beliefs but took significant artistic license.

White Zombie

Over time the zombie concept evolved. The sensationalistic 1929 William Seabrook travel book The Magic Island introduced voodoo and zombies to mainstream Western culture. This inspired the 1932 film White Zombie, the first zombie movie. White Zombie stars Bela Lugosi as the villainous Murder Legendre (a bit on the nose), a bokor enslaving people as zombies to be his henchmen and to work in his sugarcane mill. White Zombie used Haitian Vodou ideas but with a lot of artistic license. Later zombie stories dropped the Saint-Domingue threat of eternal slavery, then they dropped the bokor master commanding the zombies. Aside from being mindless undead creatures, the zombies of today have little resemblance to their sadder, more terrifying origins.

Added info: following the Haitian revolution of 1791–1804, the 1883 Haitian Criminal Code outlawed the practice of turning someone into a zombie.

Cabinet of Curiosities

Before museums existed, people had cabinets/rooms to display their collected treasures.

There was a time when museums did not exist. The role of collecting, preserving, and displaying the art, artifacts, and wonders of the world belonged largely to individuals. As far back as the 4th century BCE Greeks were collecting exotic treasures from the East. More than just trading in commodities, the Greeks collected the art and textiles of these faraway cultures. Roman emperor Augustus decorated his homes not just with art but with rare objects and bones of giant animals. Over the centuries, as cultures explored & traded with increasingly distant lands, the trends in what was collectible grew & changed. By the 16th and 17th centuries wealthy European collectors had amassed enough objects that they created special cabinets and/or rooms to show off their collections. They created cabinets of curiosities.

Ole Worm’s Museum Wormianum is one of the most famous cabinets of curiosities.
Ferrante Imperato’s Dell’Historia Naturale is another famous wunderkabinett.

Wunderkabinett

From the German for art (kunst) or marvels (wunder) and cabinet (kabinett) or room (kammer), these cabinets & rooms were places where Renaissance scholars, merchants, royalty, and others could store their collections. Collecting was very fashionable in 17th century Europe and these cabinets were spaces dedicated to displaying all manner of objects. Like the contemporaneous maps of the world, some of these spaces were designed for show while others were more utilitarian.

A collection of cabinets and rooms displaying all manner of curiosities.

Some collectors had thousands of specimens. The objects in these cabinets were thoughtfully categorized and organized, each piece contributing to the larger whole. Collecting was a way to bring order to the world, to exert some level of control over something that is uncontrollable. What was stored & displayed in these cabinets depended on the collector, but broad categories of objects included:

  • Fine art
  • Applied art (scientific instruments, anthropological objects, etc.)
  • Natural materials (fossils, shells, rocks, etc.)
  • Historical objects

These categories, as well as these collections, served as the precursors to our modern museums. The Amerbach Cabinet was a collection of art, books, coins, etc. that was assembled by various members of the Amerbach family. It was eventually co-purchased by the city of Basel & the University of Basel and became the Kunstmuseum Basel in 1661, the first public museum in the world. Francesco I de’ Medici had his studiolo, a 26 x 10 foot room of curiosities that is part of the Palazzo Vecchio in Florence. Other Medici possessions served as the start of the Uffizi Gallery. Elias Ashmole, who amassed his fortune & collection through sometimes questionable means, gifted his collection to the University of Oxford, where it became the Ashmolean Museum in 1683.

Throughout the 18th century an increasing number of private collections were converted into public museums, some of which still exist today but all of which helped define what museums have become.

Added info: In 1784 Charles Willson Peale’s collection became the Philadelphia Museum, which was the United States’ first museum (and also the first to display a mastodon skeleton).

Rednecks & Hillbillies

The terms redneck and hillbilly both come from rebellious 17th century Scottish Protestants.

Rednecks

In the 17th century King Charles I pushed for greater religious uniformity across the British Isles. Scottish Presbyterians disapproved, as these reforms were increasingly Catholic in style & organization. In 1638 thousands of Scots signed the National Covenant (sometimes using their own blood as ink), signifying their preference for a Presbyterian Church of Scotland and their refusal to accept the reforms made by Charles. Going one step further, some of these “Covenanters” took to wearing red cloth on their necks as an outward sign of their resistance. These dissenting Scottish religious rebels were the original “red necks”.

Looking closely at The Signing of the National Covenant in Greyfriars Kirkyard, Edinburgh by William Allan, you can see that the man signing the Covenant at the center is having his blood drawn with a dagger to use as ink.

Hillbillies

Political and religious tension continued around the British Isles throughout the late 17th century, which led to the 1688 Glorious Revolution. On one side of this revolution were Catholic King James II and those who supported a strong monarchy; on the other were Protestants & Parliamentarians. Afraid of a Catholic dynasty, and that James would leave the throne to his Catholic son James Francis Edward, seven influential English nobles invited the Protestant Dutch Prince William of Orange to invade England and take the throne.

Around the same time, Scottish Presbyterian leader Richard Cameron was preaching a message of rebellion against the English. A religious nonconformist, Cameron took to field preaching, spreading his radical message outdoors away from Scottish towns. His followers (the Cameronians) were given the nickname “hillmen” due to their outdoor religious gatherings.

When William of Orange easily invaded England and successfully took the throne, he was supported by Scottish Protestants. The Scots living in Ulster at the time fought against the Jacobite supporters of King James. William of Orange was nicknamed “King Billy” and his Ulster Scots Protestant supporters were nicknamed “Billy boys”. Eventually these two Scottish Protestant rebel nicknames of “hillmen” and “Billy boys” got combined to form “hillbilly boys” and then just “hillbilly”.

Ulster Scot supporters of William of Orange became known as “Billy Boys” which, when combined with the Scottish Cameronian nickname of “hillmen”, eventually became “hillbilly”.

American Rednecks & Hillbillies

Despite their successful support for William, many Scots were still oppressed for being Presbyterians and for being Scottish. Searching for greater religious & personal freedom, they began to emigrate in larger numbers from Ulster to the British colonies in North America. An estimated 200,000 Ulster Scots (aka Scotch-Irish) emigrated to the American colonies between 1717 and 1775. Settling up and down the East coast and throughout Appalachia, these Scottish Protestants brought with them their religion, their rebelliousness, and their nicknames.

Over the centuries the meanings of both “redneck” and “hillbilly” have changed. During the “Redneck War” of 1920-21 “redneck” was used to label the unionizing coal miners (many of whom were Scotch-Irish) who wore red bandanas in solidarity. The term has also been used to describe early 20th century southern Democrats as well as more literally to describe poor farmers with sunburnt necks. Hillbilly also took on a more literal interpretation to describe the people who settled the rural hilly areas of Appalachia and the Ozarks. Today both terms are generally used as derogatory slurs for poor rural whites.

Sunglasses

Humans have been making devices to shield their eyes from the sun for thousands of years. Today one company dominates the market.

Living around the Arctic where the bright sunlight reflects off the ice & snow, the indigenous peoples of North America & Greenland developed the earliest sunglasses. These 4,000-year-old proto-sunglasses were carved from a variety of materials and featured very thin slits that allowed the wearer to see while keeping their eyes protected by blocking excessive sunlight. This idea has been recreated many times in a variety of styles from the 1930s to the present.

The traditional “sunglasses” of the Arctic have been reinvented many times over the years.

The Venetians, who had been making clear corrective eyeglasses since the 13th century, were among the first to produce sunglasses with glass. In the 18th century the glass makers of Murano produced green-tinted eyeglasses (as well as what resembled handheld mirrors but with transparent green glass) through which wealthy Venetians could look across the water while protecting their eyes from reflected light.

Venetians used green glass to protect their eyes while soldiers in the American Civil War used a variety of colors.

By the 19th century it was not uncommon for soldiers on both sides of the American Civil War to wear colored spectacles of blue/gray/green to protect their eyes while marching in the sun. But sunglasses were still primarily utilitarian. They didn’t become a fashionable part of mainstream culture until the 20th century.

20th Century Sunglasses

In the early 20th century Sam Foster had a plastics company that primarily sold women’s hair accessories, but as the trend in women’s hair changed to shorter styles (negating the need for so many hair accessories), he had to find a new product to sell. In 1929 he began selling inexpensive plastic sunglasses to beachgoers for 10 cents a pair on the Atlantic City boardwalk. This was the beginning of the Foster Grant eyewear company. Foster Grant sunglasses became the shades of Hollywood celebrities, which helped make sunglasses about style and fashion as well as eye protection.

In 1929 Bausch & Lomb, who were already making optical equipment for the military, began work for the U.S. Army Air Corps developing sunglasses that wouldn’t fog up and would reduce glare for pilots. This gave us the iconic “Ray-Ban Aviator” sunglasses. Aviator sunglasses were also the start of the Ray-Ban eyewear company, which began as the civilian division of Bausch & Lomb. Ray-Ban would go on to make another iconic model of sunglasses, the Wayfarer, in 1956.

(Side fact: Roy Orbison fell into his signature Wayfarer style accidentally while on tour with the Beatles in 1963. He forgot his regular glasses on the plane and had to wear his Wayfarer sunglasses on stage, and thus was born his iconic look.)

Sunglasses became about function & fashion in the 20th century. Also, Tom Cruise movies helped popularize two of Ray-Ban’s most famous models.

Luxottica

Today the sunglasses market is dominated by Luxottica, an Italian juggernaut that is the largest eyewear company in the world. They’re the company actually making the sunglasses of luxury brands such as Chanel, Prada, Ralph Lauren, Versace, etc. Luxottica’s dominance is due in large part to their vertically integrated control over the eyewear industry. They own major retail stores such as LensCrafters, Target Optical, Pearle Vision, and Sunglass Hut. They own major eyeglass brands including Oakley and Ray-Ban, and they manufacture the eyewear for all of the above. They even own EyeMed, the second largest vision insurance company in America. You could go from getting a vision prescription, to selecting a pair of glasses, to buying them at a retail store, and pay Luxottica at every step of the way.

Luxottica’s control over the market is why eyewear prices have gone up and not down. The proliferation of brands & stores competing for sales isn’t as competitive as it seems since Luxottica is behind many of them. In Luxottica-owned stores 89% of the products available are made by Luxottica. Most of these glasses are the same quality, just different styles. Because of Luxottica, frames that cost maybe $15 to produce can be sold for hundreds of dollars. As of 2019 Luxottica controlled around 40% of the eyewear market.

60 Minutes’s 2012 report on eyeglass juggernaut Luxottica.

Added info: Beyond just blocking excessive bright light, good sunglasses block most ultraviolet (UV) light from damaging your eyes. Darker glasses don’t necessarily block more UV light, so it’s worth buying reputable sunglasses that have been engineered & certified to offer UV protection. It’s better to wear no sunglasses at all than ones that don’t block UV light, because your pupils widen behind the dark lenses of junk sunglasses and in so doing let in more UV rays.

Pineapples as Status Symbols

Because of their rarity, pineapples became European decorative elements and status symbols.

The pineapple is native to South America but across thousands of years of cultivation it spread to Central America as well. The first European to encounter a pineapple was Columbus, who in 1493 brought some back to the Spanish royal court (along with tobacco, gold, chili peppers, and the people he kidnapped). Europeans had never tasted anything like pineapple before and, because of its scarcity, owning one quickly became an exotic status symbol of the ultra-wealthy.

Pineapples were in high demand but low supply, so enterprising individuals set out to grow pineapples in Europe. The tropical conditions pineapples require made growing them in Europe a challenge. It took until the 17th century for farmers in the Netherlands to succeed, followed by the English in the 18th century. Matthew Decker even memorialized his pineapple growing achievement by commissioning a painting in 1720. These efforts produced more, albeit still not many, pineapples for the European & American markets. A single pineapple could go for around $8,000 in today’s dollars. A cheaper alternative was to rent a pineapple, which people would do to show off at parties and such. These pineapples would be rented from person to person until the final person paid to eat it, unless it had rotted by then. A further down-market option was pineapple jam, which could be shipped from Central/South America.

Because of their popularity, pineapples became a decorative element in a host of artistic mediums.

Pineapple Art

The Caribbean custom of placing pineapples at the front of a friendly home as a welcome to strangers, combined with years of being displayed at happy European social gatherings, led pineapples to become international symbols of hospitality. This, combined with their association with wealth & high society, helped make the pineapple a popular artistic motif. From this we get carved pineapple embellishments: finials on staircases, the tops of columns, New England gateposts, ornaments above front doors, fountains, and furniture accents. Christopher Wren placed gilded copper pineapples on St. Paul’s Cathedral in London, and the centerpiece of the Dunmore Pineapple folly in Scotland is a massive pineapple.

Added info: any association of pineapple with Hawaii comes after the fruit was introduced there by the Spanish in the 18th century. Pineapple is not native to Hawaii.