Color Blindness

The visual condition that changes what colors you see.

To start, being color blind almost never means being blind to all color, as if living in a black & white movie. “Color blind” usually just means someone doesn’t see the full range of colors that most people see. To understand color blindness we have to understand two concepts: light and our eyes.

Let there be light

The colors that we see are photons moving at different wavelengths/frequencies. They’re part of the electromagnetic spectrum, which ranges from gamma rays (the shortest, highest-frequency waves – quite dangerous) to radio waves (the longest, lowest-frequency waves – not so dangerous). What we call visible light is radiation in a particular range of wavelengths. Within this band, violets and blues have the shortest wavelengths while oranges and reds have the longest. Through evolution we have developed two small biological machines capable of detecting this range of wavelengths … our eyes.
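To make the wavelength/frequency relationship concrete, here’s a quick back-of-the-envelope sketch (my illustration, not from the original article) using the standard relation frequency = speed of light ÷ wavelength, with approximate band edges of ~400 nm (violet) and ~700 nm (red):

```python
# Back-of-the-envelope: frequency = speed of light / wavelength.
# The wavelengths are approximate edges of the visible band.
C = 3.0e8  # speed of light, m/s (rounded)

for color, wavelength_nm in [("violet", 400), ("red", 700)]:
    frequency_thz = C / (wavelength_nm * 1e-9) / 1e12  # Hz -> THz
    print(f"{color}: ~{wavelength_nm} nm = ~{frequency_thz:.0f} THz")

# violet: ~400 nm = ~750 THz  (shorter wavelength, higher frequency)
# red:    ~700 nm = ~429 THz  (longer wavelength, lower frequency)
```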

The electromagnetic spectrum ranges from gamma rays to radio waves: photons moving at a variety of wavelengths & frequencies.

Doctor My Eyes

Our retinas have two kinds of photoreceptive cells: rods and cones. Rods see light & dark while cones see color. We have about 120 million rods per eye but only 6-7 million cones per eye. Instead of just one kind of cone cell we have three, and each kind is tuned to a certain range of wavelengths (short, medium, and long). To put it another way, our three kinds of cone cells are each tuned to see certain ranges of colors – blues (short), greens (medium), reds (long).

Bringing it all together, color blindness is when one (or more) of your cone cell types is either defective or missing entirely. The result is that you are unable to properly see certain wavelengths of light.

Color Blindness

Why does color blindness happen? While color blindness can be an acquired condition, most of the time it’s genetic. The most common forms of color blindness are carried on the X chromosome and, because men only have one X chromosome, if it’s defective they’re out of luck. This is why men are more commonly color blind than women: women have two X chromosomes, so a functioning X chromosome can compensate for a defective one. As a result around 8% of men are color blind compared to only around 0.5% of women. That said, color blindness isn’t evenly distributed across men – it has a higher prevalence amongst Caucasian men than amongst men of other ethnicities.
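The men-vs-women gap falls out of simple probability. Here’s a minimal sketch, assuming (for illustration – this simplifies the real genetics) a single X-linked recessive gene carried on roughly 8% of X chromosomes:

```python
# Simplified X-linked inheritance model (illustrative, not exact genetics).
# Men have one X chromosome: one defective copy is enough.
# Women have two: both copies must be defective.
q = 0.08  # assumed share of X chromosomes carrying the defective gene

p_men = q          # one copy needed
p_women = q ** 2   # two copies needed

print(f"men: {p_men:.1%}, women: {p_women:.2%}")
# men: 8.0%, women: 0.64% - close to the observed ~8% vs ~0.5%
```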

Red-green color blindness, a group of several different kinds of color blindness, is the most common form.

Because cones come in three varieties, and each variety can be defective or absent, the various combinations mean there are many forms of color blindness. The most common type is “red-green” color blindness (actually a few kinds grouped together) where reds and greens aren’t seen properly and shift to look more like yellows and browns. This is the result of the medium and long (green and red) cone cells being defective or absent. Red-green color blindness accounts for about 99% of all color blindness, affecting about 1 in 12 men and 1 in 200 women.
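For intuition on why reds and greens drift toward yellows and browns, here’s a deliberately crude sketch (my illustration – real simulators such as the Viénot/Brettel methods work in LMS cone space): if the red and green signals can no longer be told apart, each pixel effectively carries only their average.

```python
# Crude red-green color blindness intuition (NOT a colorimetric model):
# if red and green signals are indistinguishable, replace both with their average.
def crude_red_green(rgb):
    r, g, b = rgb
    avg = (r + g) // 2
    return (avg, avg, b)

print(crude_red_green((255, 0, 0)))  # pure red   -> (127, 127, 0), an olive/dark yellow
print(crude_red_green((0, 255, 0)))  # pure green -> (127, 127, 0), the same olive
print(crude_red_green((0, 0, 255)))  # pure blue  -> (0, 0, 255), unchanged
```

Pure red and pure green collapse into the same muddy yellow, which is roughly the confusion the condition produces.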

Blue-yellow color blindness is when blues and greens aren’t seen properly. It’s also genetic, but it’s not carried on the X chromosome, so men and women are affected relatively evenly. It’s quite rare – around 0.01% of men and women are blue-yellow color blind.

Are you seeing what I’m seeing?

The effects of color blindness range from the benign to the dangerous. Accidentally wearing clothes that don’t match can be embarrassing, but confusing “stop” for “go” on a traffic light can be dangerous. Color blind individuals can have difficulty determining the ripeness of fruits & vegetables. They can see opposing sports jerseys as similar and have difficulty tracking the game. The designer shorthand that red means error/bad while green means success/good (the traffic light analogy) can make a variety of safety features, dashboards, and websites more difficult to use. One positive: color blind individuals may be better at detecting camouflage.

What do other animals see?

The concept of “color blindness” is relative. What most humans consider normal is not what most bees would consider normal, or dogs, or any other species. So when people say that most other animals are color blind, it’s just that they can’t see the same spectrum of colors that humans normally see.

To start, most mammals are red-green color blind (which to them is normal). They tend to have only two cone cell types, lacking the third that lets us see a wider range of colors. So when a dog can’t find the green tennis ball in the green grass, it’s probably because they really can’t see it (especially if it has stopped rolling). Dogs rely on movement to distinguish between things more than we do. That said, dogs have more rods than humans, so when they seem to be looking at something in the dark and you can’t see anything, they’re probably seeing something beyond your vision.

The old idea that bulls dislike the color red is untrue – they’re red-green color blind. When a matador waves a red/pink cape to attract a bull, the bull is responding to the motion of the cape, not the color. Dolphins and other marine mammals see even less: having only the long-wavelength cone type, they are monochromats. Deer are red-green color blind but can see more short-wavelength colors than we can, including some amount of ultraviolet, which to us is invisible. Some laundry detergents contain brightening agents intended to brighten the colors of your clothes, but these can make clothes look bright blue to deer. The result is that, even if your clothes are camouflaged, the deer probably saw you long before you saw the deer.

Even beyond seeing ultraviolet, some animals can detect/see the Earth’s magnetic fields. It’s believed that robins can see magnetic field lines as a darker shading on the normal colors they already see, but they can only see it through their right eyes and only on clear days. When cryptochrome molecules in their right eyes are struck by blue light the molecules become active and allow robins to see magnetic fields which they can use to navigate as they migrate north & south. Interestingly, non-migratory bird species seem to have less sensitivity to magnetic fields than migratory birds.

Finally, contrary to popular misconception, bats are not blind and some actually have quite decent sight. While they are red-green color blind like most other mammals they have an ultraviolet sensitivity that helps them hunt as well as detect predators. All of this in addition to echolocation means they are quite capable of seeing and navigating the world around them. That said, some species of bats as well as other nocturnal animals have no cones at all and are really truly color blind.

Diamonds are not forever, but pretty close

Diamonds very slowly degrade into graphite … but the sun, the solar system, and most of the universe’s black holes will have died long before then.

When thinking about diamonds we can borrow the line from The Shawshank Redemption: “Geology is the study of pressure and time. That’s all it takes really… pressure… and time.” Diamonds mined from the Earth are carbon atoms that have been compressed over very long periods of time (billions of years) under enormous amounts of pressure.

The sands of time

The underground pressure that forms a diamond also holds it together. Once removed from the ground, the carbon atoms very, very slowly rearrange into graphite. Graphite is a more stable arrangement of carbon atoms than diamond, but “more stable” is extremely relative. Under normal conditions a diamond sitting in your house would take an estimated 10^80 years to become graphite, which is:

100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years

That’s a one with 80 zeros after it, in years. To put that in perspective, the universe is only around 14 billion (14,000,000,000) years old – a wildly shorter period of time.
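For scale, here’s a one-liner comparing the two timespans (using the article’s 10^80-year estimate and a ~14-billion-year-old universe):

```python
# How many current-universe-ages fit into the diamond-to-graphite estimate?
diamond_years = 10 ** 80   # estimated time for a household diamond to become graphite
universe_age = 1.4e10      # ~14 billion years

print(f"{diamond_years / universe_age:.0e} universe-ages")  # ~7e+69 universe-ages
```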

The slow death of the universe

So what will life be like when diamonds begin to lose their luster? Nothing lasts forever, including our sun and the universe as we know it. Our sun is scheduled to become a red giant star, expanding in size to engulf Mercury, Venus, and probably Earth, in about 5 billion years. But before that happens the increasing brightness of the sun will kill off all life on Earth in about 1 billion years.

If humanity takes its diamonds aboard a spaceship (with standard Earth-like conditions) and sails the universe through space & time, it will theoretically take until the Black Hole Era for the diamonds to become graphite. The Black Hole Era will run from 10^43 to approximately 10^100 years from now. Leading up to the Black Hole Era the stars will have burned out and the planets (and your diamonds) will have decayed because their protons fell apart. In this time of darkness the black holes of the universe will decay and evaporate into nothingness … and then, finally, your (theoretically still existent) diamonds will have become graphite.

Added info: in 1947 De Beers launched the “A Diamond Is Forever” marketing campaign (which they used to create our modern idea of the engagement ring). Also learn more about the carat measurement of diamonds (and the other karats, carets, & carrots).

Crash Course Astronomy discusses the long … long future in store for our universe.

And because it’s an incredible song, and it mentions diamonds (albeit as a metaphor), and it will surely be the soundtrack of our interstellar space travels … Pink Floyd’s Shine On You Crazy Diamond.

The Myth of 8 Glasses of Water

You don’t need to drink 8 glasses of water a day.

In short: you only need to drink water when you’re thirsty. For millions of years humans and our human ancestors survived using thirst as an indicator that it’s time for more water. It wasn’t until the 20th century that the idea of drinking 8 glasses of water a day began.

We all need water to live, but drinking plain water isn’t our only source. Coffee, tea, juice, soft drinks, fruits, vegetables, etc. all contain water. Depending on your diet you can get around 20% of the water you need just from food. Add in the beverages you already drink – coffee, milk, juice, tea – which are mostly water, and you’re probably already getting all the water you need each day without having to drink 8 more glasses of it.
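A rough sketch of the arithmetic (my numbers, not the article’s: the commonly cited adequate intake of ~2.7 L of total daily water for an adult woman, with ~20% arriving via food):

```python
# Rough daily water-balance sketch (illustrative figures).
total_need_l = 2.7                   # ~adequate total intake, all sources combined
from_food_l = total_need_l * 0.20    # ~20% of water arrives via food
eight_glasses_l = 8 * 8 * 0.0296     # the "8x8" rule: 8 glasses x 8 fl oz (~29.6 mL/oz)

print(f"from food:          ~{from_food_l:.1f} L")                 # ~0.5 L
print(f"left for beverages: ~{total_need_l - from_food_l:.1f} L")  # ~2.2 L
print(f"the 8x8 rule alone: ~{eight_glasses_l:.1f} L")             # ~1.9 L
# Coffee, tea, milk, and juice all count toward that remaining ~2.2 L.
```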

… But Maybe You Do Need More Water

Daily water consumption is about maintaining balance: you need to replace the water you lose. If you live in a hot climate, or you’re sweating from exercise, you lose water faster than someone sitting still in a temperate climate. As such you need to replace water faster than normal which means drinking more water.

Also, should you be lost on a hike somewhere, you should ration sweat, not water. Try to limit your physical exertion so you sweat less, but drink when you need to. A common mistake is rationing your water: while it’s true you don’t want to waste a limited resource, if you’re thirsty you should drink. Your water isn’t doing you any good sitting inside a bottle.

Water water everywhere

On the flip side, it’s possible to drink too much water. Exercise-associated hyponatremia happens when you’re engaged in an endurance activity such as running a marathon: you sweat out water and sodium, but then you drink only water. Drinking plain water replenishes your lost water but not your sodium. The result is low blood-sodium levels. This imbalance can cause poor nerve communication, which leads to poor muscle control, poor performance, etc. Athletes with hyponatremia can feel nauseous, develop muscle cramps, and become confused, leading some to think they’re dehydrated and drink even more water (making the situation worse).
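A simplified dilution sketch shows why plain water alone backfires (illustrative numbers: ~40 L of body water for a 70 kg adult, serum sodium normally ~135-145 mmol/L; real physiology also involves kidneys, sweat losses, and fluid compartments):

```python
# Oversimplified dilution model of exercise-associated hyponatremia.
# Assumes total sodium stays fixed while plain water is retained.
body_water_l = 40.0        # ~total body water of a 70 kg adult
sodium_mmol_per_l = 140.0  # mid-normal serum sodium

total_sodium = body_water_l * sodium_mmol_per_l  # 5600 mmol

for extra_water_l in (1.0, 2.0, 3.0):  # liters of plain water retained
    conc = total_sodium / (body_water_l + extra_water_l)
    status = "hyponatremic (<135)" if conc < 135 else "still normal"
    print(f"+{extra_water_l:.0f} L plain water -> ~{conc:.0f} mmol/L, {status}")

# +1 L -> ~137 mmol/L, still normal
# +2 L -> ~133 mmol/L, hyponatremic (<135)
# +3 L -> ~130 mmol/L, hyponatremic (<135)
```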

Hyponatremia is becoming more prevalent in sports as an increasing number of novice athletes participate in long-distance endurance activities. For example in the 2002 Boston Marathon 13% of runners were found to have hyponatremia from drinking too much water. Athletes need to replenish their sodium levels along with their water. Part of the solution (pardon the pun) is to drink sports beverages that contain electrolytes (which are salts and can replenish sodium levels). This is why sports drinks boast about having electrolytes.

So, if you’re thirsty, drink some water and if you’re engaged in an endurance sport remember to get some electrolytes along with your water.

Added info: to bust another myth, consuming caffeinated beverages won’t dehydrate you. While excessive caffeine has a number of downsides, drinking coffee or tea is an acceptable way to hydrate.

Adam Ruins Everything dives into the myth of 8 glasses of water a day.

Winter Warmer

Drinking alcohol in cold weather only warms you temporarily and then you need to get inside.

Our bodies are designed to regulate heat in order to stay alive. When you’re too hot, the blood vessels nearest the skin open wide (vasodilation) to allow heated blood to pass through and radiate heat away from the body, cooling you down.

When you’re cold your body does the opposite. In the cold your body uses vasoconstriction, closing off the blood vessels closest to the skin to prevent heat loss. To keep your core warm your body limits the blood flow to your extremities, conserving heat. This is why your fingers, toes, skin, etc. get stiff & cold before your core. Fingers are expendable; your organs are not. When your body is using vasoconstriction to keep warm, the last thing you want to do is dilate those blood vessels … which is exactly what alcohol does.

Thanks to old myths and advertisements, the idea that alcohol is a good way to warm up in the cold has been around for centuries.

Hot Shots

Part of why we think alcohol helps comes from the image of St. Bernards wearing small barrels of brandy, rescuing avalanche victims in the snow and warming them up. Unfortunately no St. Bernard has ever worn such a cask – it’s a myth popularized by Edwin Landseer’s 1820 painting Alpine Mastiffs Reanimating a Distressed Traveller.

Probably the main reason we think alcohol warms us up in the cold is because it sort of does. Alcohol is a vasodilator, so it isn’t the alcohol itself warming you but what alcohol does to the body. In the cold, alcohol opens the blood vessels that the body has closed down to preserve warmth, and the warm reserve of blood in your core is suddenly released out to your extremities. Unfortunately this sudden warmth comes at a cost: as the warm blood reaches your extremities it loses heat, and as it travels back to your core your overall body temperature drops. Further, alcohol reduces your body’s ability to shiver (another mechanism for generating warmth), so you’re cold and only getting colder. Now you no longer have a reserve of warmth and need to get indoors.

In this bootlegged video from MythBusters, they explore the myth that alcohol helps warm you up in cold weather.

Why are barns red?

Barns are red because of an abundance of iron from exploding stars.

Barns get painted because paint helps protect them from the elements; left untreated, the wood would rot. In the spirit of Louis Sullivan’s “form follows function”, barns are painted for practical reasons more than artistic ones. Red was originally chosen because it was the cheapest paint. It was the cheapest paint because of exploding stars.

Stardust

Iron is the most abundant element on Earth, making up 32.1% of the planet by mass. The iron in rocky planets such as Earth came from massive stars which produced iron atoms towards the end of their lives. Once a star is producing iron it’s on a one-way countdown towards going supernova. When the star finally explodes it sends its elements across space, including iron.

Because of an abundance of iron in the soil, the easiest (and cheapest) paint to produce was red.

Fast-forward billions of years: in the mid-19th century American farmers began painting their barns with homemade paint. Farmers mixed skimmed milk, lime, linseed oil, and the readily available red iron oxide found in clay soil to create red paint. Red paint protected the wood from rot as well as from mold, since iron oxide (essentially rust) kills mold. By the late 19th century, when commercially produced paints were more widely available, red paint was still cheaper than other colors, so farmers continued painting their barns red. Today barns are painted red largely out of tradition.

Hookworm

The parasite responsible for giving American southerners a bad reputation.

For centuries American southerners were maligned as lazy, slow, shiftless, and dumb. Southerners had “the germ of laziness.” There was just something different about southerners, the thinking went, that made them less than their northern counterparts. As it turned out there was something different about them, but it had nothing to do with genetics or social conditioning. That something was hookworm.

Hookworm

Hookworm, and specifically the New World hookworm Necator americanus, is a parasitic worm that arrived in America by way of the slave trade in the 17th century. In the larval stage hookworms live in warm, wet, shady soil where they wait to encounter a human. A person walking barefoot outdoors may make contact with a hookworm, at which point it can penetrate the skin and crawl into the foot. From there it travels through the circulatory system to the lungs, where it triggers a dry cough. The human host then unknowingly coughs up the worm only to swallow it down to the small intestine, which is where the worm wanted to be the entire time. The worm then lives around 1-2 years (or longer) attached to the wall of the intestine, sucking blood; a female worm can lay up to 30,000 eggs per day. Eventually these fertilized eggs are pooped out in a poorly built outhouse or in some bushes, contaminating the soil and starting the process again. It’s disgusting.

Because hookworms thrive in warm humid environments they do particularly well in the southern climate of the United States. The area from southeastern Texas to West Virginia became nicknamed the “hookworm belt”. For poor southerners who couldn’t afford shoes and didn’t have indoor plumbing it was almost impossible to avoid hookworm. By 1910 it’s believed that around 40% of the people living in the South were infected, collectively carrying millions of worms.

Putting their gross lifecycle aside, the problem with hookworms is that they steal your blood. One worm alone won’t do much damage, but being infested by multiple worms on a continual basis over years or decades has a severely damaging cumulative effect. By consuming your blood, hookworms can cause an iron deficiency. People with hookworms become tired, lose weight, and have little strength to do anything. Pregnant women are at risk for anemia and a greater chance of dying in childbirth. Infected children can suffer irreversible developmental problems including stunted growth and intellectual disabilities. All of this fed the unfair characterization of southerners as slow rednecks.

A nurse brings hookworm medicine to a rural Alabama family, 1939.
A doctor and a nurse examine for hookworm in an Alabama school, 1939.

The cumulative effect

In 1902 zoologist Charles W. Stiles discovered that hookworms were endemic to the southern US. In 1909 John D. Rockefeller got involved by funding the creation of the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease. The commission campaigned across the South to educate, test, and help treat hookworm. Students in small country schoolhouses would submit stool samples to their teachers to be tested – some schools even required students be screened for hookworm. People would go to health clinics on the weekends to learn more. An estimated 7.5 million southerners had hookworms. While the Rockefeller Commission helped treat the problem, what greatly reduced hookworm was the urbanization of the South, which enabled more people to afford shoes and sanitary indoor plumbing.

The barefoot Texas boy on the right has hookworm, 1939.

Beyond the health consequences, the socioeconomic impact of hookworm has also been destructive. The US regions with hookworm had lower rates of literacy and school attendance than areas without it. A 1926 study of Alabama children showed that the more worms a child had, the lower their IQ. Even today children with chronic hookworm face up to 40% lower wage earnings when they grow up. General productivity is measurably lower as a result of hookworm. The southern regions that were worst infected with hookworms saw the greatest income expansion after treatment, but unfortunately centuries of infection had a cumulative effect. Eight of the ten poorest US states are still southern states.

Hookworm in the US is typically thought of as a problem of the past, but it is still very much a problem of the present. Given the severe income inequality in the US, hookworm is thriving in regions living below the poverty line. Hookworm lives in, and reinforces, poverty: 85% of the world lives on less than $30 a day, and 10% is currently living in extreme poverty. Around the world an estimated 477 million people are currently living with hookworms inside them.

Astrology

Astrology, the idea that the stars are influencing your life, is completely fake.

Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using this information to predict the future or explain human personalities is not logical (but it is understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos, so it’s understandable that people would look to astrology (much as people fall for conspiracy theories).

While there are different kinds of astrology, their shared basis is combining complex series of real astronomical calculations with made-up traits assigned to different constellations/alignments/times to “gain insights” into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200-100 BCE (which itself is based on the much older Babylonian astrology). It’s from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.

Despite millennia of study & measurements, nobody is any closer to explaining why astrology is supposedly real.

Bunk

That said, astrology is completely fake. It’s pseudoscience, superstition, hooey. To start, there’s no reason a distant configuration of stars which looks vaguely like a crab or a bull would have any relationship with events on Earth. But even if there were some kind of relationship, there would need to be a force connecting us to these heavenly bodies, affecting us here on Earth. Science hasn’t found or been able to measure any such force at work. Neither gravity nor electromagnetism works like this. Maybe there is some other force – somehow strong yet undetectable – interacting with us from distant stars trillions of miles away, which has yet to be discovered.

Another problem is that astrological assessments/predictions should be at least consistent, if not accurate. In 1985 scientist Shawn Carlson conducted a double-blind experiment in which astrologers tried to match personality test results to natal charts (essentially participants’ zodiac profiles). If personality types are immutably governed by the stars, matching a natal chart to a participant’s corresponding personality type should be easy. It was apparently not easy, as astrologers performed about the same as pure chance. Worse, participants performed poorly at even finding their own personality profiles.

Maybe astrology succeeds despite the human failings of astrologers. Time twins – people born at the same time on the same day, sometimes even in the same hospital – should then have similar personalities. Unfortunately there is no correlation at all. Even without astrologers involved, astrology is inconsistent.

Part of the blame also lies with adherents who want astrology to be real. Paranormal skeptic James Randi conducted an exercise where he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi gave the same horoscope to everyone in the class. What the students in Randi’s experiment fell for was the Barnum effect.

Barnum Effect

The Barnum effect (aka the Forer effect) is found in fortune telling and astrology: an assessment/reading seems to be specifically about you but in reality could apply to almost anyone. These are statements carefully worded to feel specific and yet be universal. For example, one might say that …

“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”

These statements are in fact what psychologist Bertram Forer gave to his test subjects in his 1948 study. When assessing the accuracy of these statements, participants in Forer’s experiment gave an average rating of 4.3 out of 5 (5 being the most accurate) – and every participant was given the exact same statements. Horoscopes and other astrological readings frequently use the Barnum effect to seem specific to you while in reality applying to almost anyone.

Confirmation Bias

Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn’t. When someone has an emotional desire for a certain outcome they respond more favorably to evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it’s not.

Other contributing factors: people who believe in astrology tend to score lower on measures of intelligence, and higher on narcissism, than non-believers. A potentially “self-centered worldview” (along with a shaky understanding of science) could be among the factors leading people to believe in astrology.

Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”

Added info: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters: when to deliver speeches, when to hold presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.

Egyptian Mummies: From Medicine to Paint

For hundreds of years Europeans used ground up Egyptian mummies as medicine and paint pigment.

The Arabic word mūmiyā (which later became “mummia”) was the name for the black sticky asphalt material that came out of the ground, used as a sealant, an adhesive, and a medicine around the ancient world. Pliny the Elder and others wrote about the medicinal uses of mummia, which became a bit of a cure-all for a range of ailments.

Unfortunately mummia, the petroleum product, looked like another black substance: a byproduct of the Egyptian embalming process. As such the word “mummia” came to mean both the petroleum product AND the byproduct of Egyptian mummification, which was then further confused as meaning an entire mummified body. This is how we got the word “mummy”. This series of mistakes also led to hundreds of years of cannibalism.

Cannibal Medicine

Since the petroleum-based mummia was used both externally as a salve and ingested internally, the Egyptian mummy version of mummia came to be used in the same ways. The 11th century physician Constantinus Africanus even described mummia as a “spice” found in the sepulchers of the dead. Soon the human version replaced the petroleum version, and people began to crumble & grind human corpses for medicine.

With the Crusades, Europeans learned of mummia and its medicinal possibilities. This significantly increased European demand for Egyptian mummies, and by the 15th-16th centuries there was a thriving trade in mummies. Thousands of bodies were exhumed and shipped to Europe to be turned into medicines. In 1586 English merchant John Sanderson shipped 600 pounds of mummies to London to sell at various apothecaries. The trade was fueled in part by orientalism – the notion that Egyptian mummies held some sort of exotic ancient knowledge or power.

Europeans would consume portions of Egyptian corpses for help with general pain, ulcers, inflammation, epilepsy, cough, difficult labor, etc. – none of which worked, or if they seemed to work it wasn’t the mummy that was the active ingredient. The practice was so common that Shakespeare included mummy as an ingredient in the witches’ potion in Macbeth. Demand was so high that by the 17th century some mummy dealers were producing counterfeit mummies: newly deceased people, animals, or prisoners who had been purposefully starved & executed were put through a process to simulate ancient Egyptian mummies.

After a few hundred years of medicinal cannibalism, Europeans began to express doubt about the practice’s efficacy (and its ethics). The 16th century herbalist Leonhard Fuchs felt foreign mummies were acceptable but local ones were wrong. While doubts arose during the Renaissance in the 16th century, it took until the 18th century Age of Enlightenment for the practice to fall out of fashion. As the consumption of mummies slowly ended, Egyptian mummies took on a new role: paint pigment.

The Egyptian Widow by Lourens Alma Tadema is an 1872 painting of Egyptian life potentially painted using mummy brown paint.
Liberty Leading the People by Eugène Delacroix is another painting that’s theorized to contain mummy brown.

Mummy Brown

Around the end of the 16th century artists began using ground-up Egyptian mummies (mixed with other materials) to produce mummy brown, a shade of brown pigment. Apothecaries that were grinding up mummies for medicine began to grind them up for paint as well. As a paint it was good for shadows, flesh tones, and glazing. Artists Benjamin West, Martin Drolling, Lawrence Alma-Tadema, Edward Burne-Jones, Eugène Delacroix, and others all painted with mummy brown.

It wasn’t until the 19th century that mummy brown began to fall out of favor. That said, as recently as 1926 C. Roberson & Co. still sold mummy brown made with ground-up Egyptian corpses. As mummy brown died out, so too did hundreds of years of large-scale desecration of deceased Egyptians, using human beings for medicines and paints.

Cabinet of Curiosities

Before museums existed, people had cabinets/rooms to display their collected treasures.

There was a time when museums did not exist. The role of collecting, preserving, and displaying the art, artifacts, and wonders of the world belonged largely to individuals. As far back as the 4th century BCE, Greeks were collecting exotic treasures from the East. More than just trading in commodities, the Greeks collected the art and textiles of these faraway cultures. Roman emperor Augustus decorated his homes not just with art but with rare objects and the bones of giant animals. Over the centuries, as cultures explored & traded with increasingly distant lands, the trends in what was collectible grew & changed. By the 16th and 17th centuries wealthy European collectors had amassed enough objects that they created special cabinets and/or rooms to show off their collections. They created cabinets of curiosities.

Ole Worm’s Museum Wormianum is one of the most famous cabinets of curiosities.
Ferrante Imperato’s Dell’Historia Naturale is another famous wunderkabinett.

Wunderkabinett

From the German for art (kunst) or marvels (wunder) and cabinet (kabinett) or room (kammer), these cabinets & rooms were places where Renaissance scholars, merchants, royalty, and others could store their collections. Collecting was very fashionable in 17th century Europe, and these cabinets were dedicated spaces for displaying all manner of objects. Like the contemporaneous maps of the world, some of these spaces were designed for show while others were more utilitarian.

A collection of cabinets and rooms displaying all manner of curiosities.

Some collectors had thousands of specimens. The objects in these cabinets were thoughtfully categorized and organized, each piece contributing to the larger whole. Collecting was a way to bring order to the world, to exert some level of control over something that is uncontrollable. What was stored & displayed in these cabinets depended on the collector, but broad categories of objects included:

  • Fine art
  • Applied art (scientific instruments, anthropological objects, etc.)
  • Natural materials (fossils, shells, rocks, etc.)
  • Historical objects

These categories, as well as these collections, served as the precursors to our modern museums. The Amerbach Cabinet was a collection of art, books, coins, etc. assembled by various members of the Amerbach family. It was eventually co-purchased by the city of Basel & the University of Basel and became the Kunstmuseum Basel in 1661, the first public museum in the world. Francesco I de’ Medici had his studiolo, a 26 x 10 foot room of curiosities that is part of the Palazzo Vecchio in Florence. Other Medici possessions served as the start of the Uffizi Gallery. Elias Ashmole, who amassed his fortune & collection through sometimes questionable means, gifted his collection to the University of Oxford, where it became the Ashmolean Museum in 1683.

Throughout the 18th century an increasing number of private collections were converted into public museums, some of which still exist today but all of which helped define what museums have become.

Added info: In 1784 Charles Willson Peale’s collection became the Philadelphia Museum, the United States’ first museum (and also the first to display a mastodon skeleton).

MSG (Safe to Eat)

Reports that MSG is dangerous stem from one anecdotal letter and years of racism.

Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc.). Usually when it’s mentioned, people are referring to the synthesized food-additive version which is added to meals to bring out their umami flavors. It’s been commercially produced as a food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it’s dangerous. The cause of this fear goes back to one letter.

Chinese Restaurant Syndrome

The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled “Chinese-Restaurant Syndrome” by Dr. Robert Ho Man Kwok on his experiences eating American Chinese food. Kwok wrote that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the likely answer. This single letter was the beginning of decades of mistrust of MSG.

The ideas of MSG side effects and “Chinese Restaurant Syndrome” have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the “Orient”, and/or a general lack of knowledge has led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but no regulations on MSG in general. If the supposed adverse reactions to MSG were real, consumers should have been wary of any food containing MSG, yet Chinese food in particular got singled out and maligned. Lots of processed Western foods contain MSG, and lots of plants naturally contain significant levels of glutamate, yet Doritos and shiitake mushrooms never got singled out quite like Chinese food did.

Asian restaurants were singled out and maligned for their use of MSG, but Western processed foods were not.

Safe to Eat

There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to have a sensitivity to MSG were given MSG without knowing it and had no ill effects. These tests were unable to reproduce any of the claimed side effects.

MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and their customers’ taste buds.