Rock Columns

As lava/magma cools & contracts it can form polygonal stone columns.

Columnar jointing is a rock formation where fractures (joints) occur in cooling volcanic rock. As the lava/magma cools from the outside inward it shrinks towards center points. This shrinking then continues from the top down, forming columns of rock. These stone columns are frequently hexagonal (a shape very common in nature) but columns with other numbers of sides occur as well.

These rock formations can be straight vertical columns (the Giant’s Causeway, Devil’s Tower, Svartifoss, etc.) but can also form with sideways irregularities due to how the molten rock moved and cooled (such as Alcantara Gorge where the lava cooled diagonally).

Devil's Tower
Devil’s Tower in Wyoming features the tallest columns in the world. On the right is a climber ascending the Tower, giving scale to the enormity of the columns.

Rock Folktales

Because they look carved and intentionally organized, these formations don’t seem natural. Cultures around the world have come up with a variety of explanations for them. Devil’s Tower in Wyoming is a striking example of columnar jointing which features the tallest columns in the world. Its name varies among different Native American groups, but most versions translate to something like “Bear’s Home.” According to tradition, as children were fleeing a great bear (or bears) the animal’s claws dragged down the rock face, carving the columns we see today.

the Giant's Causeway
The Giant’s Causeway in Northern Ireland, created by Irish giant Fionn mac Cumhaill, features thousands of columns. It’s also featured on the album cover of Led Zeppelin’s 1973 album Houses of the Holy.

The Giant’s Causeway in Northern Ireland is perhaps the most famous example of columnar jointing. As the legend goes, the causeway was formed by the Irish giant (or just regular-sized superhero) Fionn mac Cumhaill (Finn MacCool). He built a series of stepping stones to connect Ireland to Scotland. Long story short, the Scottish giant Benandonner smashed the causeway during his retreat from Ireland back to Scotland. Today the remnants of this “causeway” are the rocks at the Giant’s Causeway and a similar rock formation on the Isle of Staffa (the most famous part of Staffa is Fingal’s Cave, named after Fionn, which was a very popular source of inspiration for the arts in the 19th century).

Svartifoss
Iceland’s Svartifoss falls has served as the inspiration for Hallgrímskirkja church among other places.

In Iceland, a volcanic hotspot in the past as well as today, there are several examples of rock columns. The rock formations along the black sand beach of Reynisfjara are said to be two trolls who were dragging a three-masted ship to land. As they fought over the ship they lost track of time and dawn came, turning the trolls into the rock columns. The rock columns at Svartifoss were the inspiration for Hallgrímskirkja church in the center of Reykjavik. The hexagonal shapes of Svartifoss also influenced the design of the Harpa concert hall.

Added info: Columnar joint formations are not limited to Earth. Volcanic activity on Mars also created stone columns in much the same way we have them here.

Nick on the Rocks explains columnar jointing.

New England Vampires and Tuberculosis

The effects of tuberculosis led some 19th century New Englanders to believe that vampires were preying on the living.

In the late 18th and much of the 19th century there was a vampire panic in New England. People across New England feared that vampire-like creatures, using some kind of sympathetic magic, were slowly killing their friends & family from inside the grave (as opposed to traditional vampires who rise from the grave to attack). People would exhume their family members, look for the one who might be a vampire, and take various precautions to stop them. New Englanders might remove & burn the heart of a suspected vampire, turn the skeleton over facedown, decapitate the corpse, put a brick in its mouth, or use a wooden stake to pin their relative to the ground, among other methods.

This panic was more than just a few isolated incidents. Henry David Thoreau mentions attending an exhumation in his journal on September 26, 1859. In February of 1793 over 500 people attended the ceremonial burning of the heart, liver, and lungs of supposed vampire Rachel Harris in Manchester, Vermont. After Nancy Young died in Rhode Island in 1827, her father thought that she might be preying on her still-living little sister Almira. The family exhumed Nancy’s coffin, burned it on a pyre, and stood in the smoke to breathe in the vapors thinking it would free/cure them of this affliction – it did not work, and Almira and two more of her siblings later died. Ingesting the cremated remains of a suspected vampire, or breathing in the smoke of the cremation pyre, were not uncommon last-resort treatments after traditional medicine had failed to heal sick relatives.

The 1892 exhuming of suspected vampire Mercy Brown in Exeter, Rhode Island became an international story – Bram Stoker based part of the Lucy character in Dracula on Mercy Brown. With 18 confirmed vampire cases, Rhode Island even became known as the “Vampire Capital of America.” The reason all of this happened was twofold: tuberculosis and decomposition.

The story of Mercy Brown influenced Bram Stoker’s Dracula.

Wasting away

Tuberculosis is an airborne disease that attacks the lungs (among other areas). Active tuberculosis kills about half of those infected and in 2018 it was the ninth leading cause of death worldwide (killing more people than malaria or HIV/AIDS). In 19th century New England tuberculosis was the leading cause of death, responsible for an estimated 25% of all deaths.

Tuberculosis can develop over months or even years, slowly eating away at someone. A person with active TB develops a chronic cough as their lung tissue breaks down, their mucus starts to contain blood, they develop fevers, night sweats, and lose weight. Because of the weight loss the disease has been historically known as “consumption.” As the infected person wastes away they also develop ashen skin, giving them an overall sickly drained appearance.

Vampires, or, a lack of scientific understanding

The effect of tuberculosis (the slow draining of life), combined with some of the infected saying their deceased relatives were visiting them (as Almira Young claimed), was enough for some New Englanders to suspect there were vampires at work. Bodies of suspected vampires were exhumed to look for signs of vampirism. Some of the corpses seemed to have grown longer fingernails and longer hair, some were bloated, some had blood in their organs, while others seemed to have not decayed at all. These were surefire signs of a vampire … or, rather, just normal parts of body decomposition.

As bodies decay they become dehydrated, causing the skin to recede and shrink. This gives the illusion of longer fingernails & hair as the base of the nails and the hair that was once under the skin is now exposed. The bodies that seemed to have not decayed at all were those of people who died in the cold New England winters (as was the case with Mercy Brown, who had died in January), where the cold slows the decomposition process. These unremarkable signs of decomposition were mistaken for proof of life after death by the untrained eyes of 19th century New Englanders.

The dawn of a new era

The Mercy Brown story brought unwanted attention to New England. It was embarrassing that, while the light bulb was being invented and Henry Ford was building his first car, people were worried about folkloric undead monsters. The vampire panic rose and fell with the tuberculosis epidemic in New England. Over time, with advancements in science and the dissemination of knowledge, belief in vampires faded away.

Added info: porphyria is another disease whose symptoms can be similar to vampire activity. It’s a liver disease that, for some, can cause sensitivity to sunlight (leading some to only come out at night) as well as sensitivity to garlic.

“Ask a Mortician” goes through the history of the New England vampire panic and the realities of tuberculosis in 19th century New England.

A crash course on tuberculosis.

Brain Freeze

The short headache triggered by cold food and/or drinks touching the inside of your mouth.

To start, brain freeze (aka “ice cream headache” or “cold-stimulus headache”) only affects about 30-50% of the population. Most people can eat ice cream and drink extra cold drinks without any fear of reprisal from their nervous system.

Brain freeze occurs when the roof of your mouth or the back of your throat suddenly comes into contact with cold food, cold drinks, or even cold air. The trigeminal nerve in your head reacts to the cold by telling the arteries connected to the meninges (the membranes surrounding your brain) to contract to conserve warmth (much like how our bodies react to the cold in general). Then the body sends more warm blood up to the head telling those same arteries to expand. This quick succession of vasoconstriction and vasodilation of blood vessels triggers pain receptors along the trigeminal nerve which creates the pain you feel behind the eyes or forehead during a brain freeze.

A lot of nerve

While we all have a trigeminal nerve, its varying sensitivity may explain why not everyone gets brain freeze. For example, 37% of Americans may get brain freeze but only around 15% of Danish adults do. Further, 93% of people who get migraines are also susceptible to brain freeze.

Color Blindness

The visual condition that changes what colors you see.

To start, being color blind almost never means someone is blind to color, as if they’re living in a black & white movie. “Color blind” usually just means someone doesn’t see the full spectrum of colors like the rest of us. To understand color blindness we have to understand two concepts: light and our eyes.

Let there be light

The colors that we see are photons moving at different wavelengths/frequencies. They’re part of the electromagnetic spectrum. The full electromagnetic spectrum ranges from gamma rays (the shortest, highest frequency waves – quite dangerous) to radio waves (the longest, lowest frequency waves – not so dangerous). What we call visible light is radiation in a particular range of wavelengths. Within this bandwidth the colors of violet and blue have the shortest wavelengths while oranges and reds have the longest. Through evolution we have developed two small biological machines capable of detecting this range of wavelengths … our eyes.
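The wavelength/frequency relationship above can be sketched with the formula c = λf (the speed of light equals wavelength times frequency). Here’s a rough illustration in Python, using the common 380–700 nm approximation of visible light (the exact bounds are an assumption, not from the article):

```python
# Converting the approximate wavelength bounds of visible light into
# frequencies using c = wavelength * frequency.

C = 2.998e8  # speed of light in m/s

def wavelength_to_frequency(wavelength_nm):
    """Return the frequency (Hz) of light with the given wavelength (nm)."""
    return C / (wavelength_nm * 1e-9)

violet = wavelength_to_frequency(380)  # shortest visible wavelength
red = wavelength_to_frequency(700)     # longest visible wavelength

print(f"violet: ~{violet:.2e} Hz")  # higher frequency
print(f"red:    ~{red:.2e} Hz")     # lower frequency
```

Violet comes out around 7.9 × 10^14 Hz while red is around 4.3 × 10^14 Hz, matching the shorter-wavelength/higher-frequency ordering described above.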

the electromagnetic spectrum
The electromagnetic spectrum ranges from gamma rays to radio waves, photons moving at a variety of wavelengths & frequencies.

Doctor My Eyes

Our retinas have two kinds of photoreceptive cells: rods and cones. Rods see light & dark while cones see color. We have about 120 million rods per eye but only 6-7 million cones per eye. Instead of just one kind of cone cell we have three and each kind is tuned to a certain range of wavelengths (short, medium, and long). To put it another way, our three kinds of cone cells are each tuned to see certain ranges of colors – blues (short), greens (medium), reds (long).

Bringing it all together, color blindness is when one (or more) of your cone cell types are either defective or missing entirely. The result is that you are unable to properly see certain wavelengths of colors.

Color Blindness

Why does color blindness happen? While color blindness can be an acquired condition, most of the time it’s genetic. The most common forms of color blindness are carried on the X chromosome and because men only have one X chromosome, if it’s defective they’re out of luck. This is why men are more commonly color blind than women. Women have two X chromosomes so a functioning X chromosome will compensate for a defective one. As a result around 8% of men are color blind compared to only around 0.5% of women. That said, color blindness isn’t evenly distributed across men – it has a higher prevalence amongst Caucasian men than other ethnicities.
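The X chromosome logic above can be sketched with a simplified Hardy-Weinberg model. This assumes a single recessive allele with one frequency (real color blindness involves several genes, so treat this as illustrative only):

```python
# A simplified Hardy-Weinberg sketch of why X-linked color blindness is
# more common in men. Assumes a single recessive allele with frequency q;
# the real genetics involve multiple genes, so this is only illustrative.

q = 0.08  # frequency of the defective allele (≈ male prevalence)

male_prevalence = q         # men have one X: one defective copy is enough
female_prevalence = q ** 2  # women need defective copies on both X chromosomes

print(f"men:   {male_prevalence:.1%}")   # 8.0%
print(f"women: {female_prevalence:.2%}") # 0.64%
```

Even this toy model predicts roughly 0.64% of women affected – the same ballpark as the observed ~0.5%.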

red-green color blindness is the most common form of color blindness
Red-green color blindness is a group of different kinds of color blindness. It’s the most common form of color blindness.

Because cones come in three varieties, and those cones can be defective or absent, the various combinations of factors means there are many forms of color blindness. The most common type is “red-green” color blind (which is a few kinds grouped together) where reds and greens aren’t seen properly and shift to look more like yellows and browns. This is the result of the medium and long (green and red) cone cells being defective or absent. Red-green color blindness accounts for about 99% of all color blindness with about 1 in 12 men and 1 in 200 women having it.

Blue-yellow color blindness is where blues and greens aren’t seen properly. It’s also genetic but it’s not carried on the X chromosome so men and women are affected relatively evenly. It’s quite rare – around 0.01% of men and women are blue-yellow color blind.

Are you seeing what I’m seeing?

The effects of color blindness range from the benign to the dangerous. Accidentally wearing clothes that don’t match can be embarrassing but confusing “stop” for “go” on a traffic light can be dangerous. Color blind individuals can have difficulty determining the ripeness of fruits & vegetables. They can see sports jerseys as similar and have difficulty tracking games. The designer shorthand that red means error/bad while green means success/good (the traffic light analogy) can make a variety of safety features, dashboards, and websites more difficult to use. One positive is that color blind individuals may be better at detecting camouflage.

What do other animals see?

The concept of “color blindness” is relative. What most humans consider normal is not what most bees would consider normal, or dogs, or any other species. So when people say that most other animals are color blind, it’s just that they can’t see the same spectrum of colors that humans normally see.

To start, most mammals are red-green color blind (which to them is normal). They tend to only have two cone cell types, lacking the third we have to see a wider range of colors. So when a dog can’t find the green tennis ball in the green grass, it’s probably because they really can’t see it (especially if it has stopped rolling). Dogs rely on movement to distinguish between things more than we do. That said dogs have more rods than humans so when it seems like they’re looking at something in the dark, and you can’t see anything, they’re probably seeing something beyond your vision.

The old idea that bulls dislike the color red is untrue – they’re red-green color blind. When a matador waves a red/pink cape to attract a bull, the bull is responding to the motion of the cape, not the color. Dolphins and other marine animals see even less: having only the long-wavelength cone type, they are effectively monochromatic. Deer are red-green color blind but can see more shorter-wavelength colors than we can, including some amount of ultraviolet, which to us is invisible. Some laundry detergents contain brightening agents that are intended to brighten the colors of your clothes but can make clothes look bright blue to deer. The result is that, even if your clothes are camouflaged, the deer probably saw you long before you saw the deer.

Even beyond seeing ultraviolet, some animals can detect/see the Earth’s magnetic fields. It’s believed that robins can see magnetic field lines as a darker shading on the normal colors they already see, but they can only see it through their right eyes and only on clear days. When cryptochrome molecules in their right eyes are struck by blue light the molecules become active and allow robins to see magnetic fields which they can use to navigate as they migrate north & south. Interestingly, non-migratory bird species seem to have less sensitivity to magnetic fields than migratory birds.

Finally, contrary to popular misconception, bats are not blind and some actually have quite decent sight. While they are red-green color blind like most other mammals they have an ultraviolet sensitivity that helps them hunt as well as detect predators. All of this in addition to echolocation means they are quite capable of seeing and navigating the world around them. That said, some species of bats as well as other nocturnal animals have no cones at all and are really truly color blind.

Diamonds are not forever, but pretty close

Diamonds very slowly degrade into graphite … but the sun, the solar system, and many of the black holes will have died long before then.

When thinking about diamonds we can think of the Shawshank Redemption line that, “Geology is the study of pressure and time. That’s all it takes really… pressure… and time”. Diamonds mined from the Earth are carbon atoms that have been compressed over long periods of time (in the billions of years) under enormous amounts of pressure.

The sands of time

The underground pressure that forms a diamond also holds it together. Once removed from the ground the carbon atoms very, very, slowly rearrange into graphite. Graphite is a more stable arrangement of carbon atoms than a diamond, but “more stable” is extremely relative. Under normal conditions a diamond sitting in your house would take an estimated 10 to the power of 80 years (10^80 years) to become graphite, which is:

100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years

A 1 with 80 zeros after it, in years. To put that in perspective, the universe is only around 14 billion years old, or 14,000,000,000 years – a wildly shorter period of time.
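As a back-of-the-envelope check of just how mismatched these two timescales are:

```python
# Comparing the estimated diamond-to-graphite timescale (10^80 years)
# with the current age of the universe (~14 billion years).

diamond_lifetime_years = 10 ** 80
universe_age_years = 14_000_000_000  # ~14 billion years

ratio = diamond_lifetime_years // universe_age_years
print(f"ratio ≈ 10^{len(str(ratio)) - 1}")  # roughly 10^69 ages of the universe
```

In other words, the diamond-to-graphite timescale is roughly 10^69 times the entire current age of the universe.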

The slow death of the universe

So what will life be like when diamonds begin to lose their luster? Nothing lasts forever, including our sun and the universe as we know it. Our sun is scheduled to become a red giant star, expanding in size to engulf Mercury, Venus, and probably Earth, in about 5 billion years. But before that happens the increasing brightness of the sun will kill off all life on Earth in about 1 billion years.

If humanity takes its diamonds aboard a spaceship (with standard Earth-like conditions), and sails the universe through space & time, it will theoretically take until the Black Hole Era for the diamonds to become graphite. The Black Hole Era will be from 10^43 to approximately 10^100 years from now. Leading up to the Black Hole Era the stars will have burned out and the planets (and your diamonds) will have decayed because their protons fell apart. In this time of darkness the black holes of the universe will decay and evaporate into nothingness … and then, finally, your (theoretically still existent) diamonds will have become graphite.

Added info: in 1947 De Beers launched the marketing campaign that “A Diamond Is Forever” (which they used to create our modern idea of the engagement ring). Also learn more about the carat measurement of diamonds (and the other karats, carets, & carrots).

Crash Course Astronomy discusses the long … long, future in store for our universe.

And because it’s an incredible song, and it mentions diamonds (albeit as a metaphor), and it will surely be the soundtrack of our interstellar space travels … Pink Floyd’s Shine On You Crazy Diamond.

the Myth of 8 Glasses of Water

You don’t need to drink 8 glasses of water a day.

In short: you only need to drink water when you’re thirsty. For millions of years humans and our human ancestors survived using thirst as an indicator that it’s time for more water. It wasn’t until the 20th century that the idea of drinking 8 glasses of water a day began.

We all need water to live but liquid water isn’t our only source. Coffee, tea, juice, soft drinks, fruits, vegetables, etc. all contain water. Depending on your diet you can get around 20% of the water you need just from food. Because the drinks you consume are mostly water, you’re probably already getting all the water you need each day without having to drink 8 more glasses of it.

… But Maybe You Do Need More Water

Daily water consumption is about maintaining balance: you need to replace the water you lose. If you live in a hot climate, or you’re sweating from exercise, you lose water faster than someone sitting still in a temperate climate. As such you need to replace water faster than normal which means drinking more water.

Also, should you be lost on a hike somewhere, you should ration sweat, not water. Try to limit your physical exertion to sweat less, but drink when you need to. A common mistake is rationing your water: while you don’t want to waste a limited resource, if you’re thirsty you should drink. Your water isn’t doing you any good sitting inside a bottle.

Water water everywhere

On the flip side it’s possible to drink too much water. Exercise-associated hyponatremia occurs when you’re engaged in an endurance activity such as running a marathon: you sweat out water and sodium, but then drink only water. In drinking plain water you replenish your lost water but not your sodium. The result is low blood-sodium levels. This imbalance can cause poor nerve communication which leads to poor muscle control, poor performance, etc. Athletes with hyponatremia can feel nauseous, develop muscle cramps, and become confused, leading some to think they’re dehydrated and drink even more water (making the situation worse).
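The dilution effect can be sketched with an idealized calculation. The numbers here (42 L of total body water for a 70 kg adult, 140 mmol/L normal serum sodium, hyponatremia below 135 mmol/L) are textbook-style assumptions for illustration only; real physiology is far more complicated:

```python
# An idealized sketch of dilutional hyponatremia: drinking plain water
# replaces volume but not sodium, lowering blood sodium concentration.
# All numbers are illustrative assumptions, not clinical guidance.

NORMAL_SODIUM = 140.0     # mmol/L, typical serum sodium
HYPONATREMIA = 135.0      # mmol/L, below this is considered hyponatremic
body_water_liters = 42.0  # rough total body water for a 70 kg adult

def sodium_after_drinking(plain_water_liters):
    """Sodium concentration after diluting with sodium-free water."""
    total_sodium = NORMAL_SODIUM * body_water_liters
    return total_sodium / (body_water_liters + plain_water_liters)

print(f"{sodium_after_drinking(2.0):.1f} mmol/L")  # 133.6 mmol/L — below 135
```

Even in this toy model, a couple of liters of plain water is enough to dip below the 135 mmol/L threshold – which is why replacing sodium along with water matters.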

Hyponatremia is becoming more prevalent in sports as an increasing number of novice athletes participate in long-distance endurance activities. For example, in the 2002 Boston Marathon 13% of runners were found to have hyponatremia from drinking too much water. Athletes need to replenish their sodium levels along with their water. Part of the solution (pardon the pun) is to drink sports beverages that contain electrolytes (which are salts and can replenish sodium levels). This is why sports drinks boast about having electrolytes.

So, if you’re thirsty, drink some water and if you’re engaged in an endurance sport remember to get some electrolytes along with your water.

Added info: to bust another myth, consuming caffeinated beverages won’t dehydrate you. While excessive caffeine has a number of downsides, drinking coffee or tea is an acceptable way to hydrate.

Adam Ruins Everything dives into the myth of 8 glasses of water a day.

Winter Warmer

Drinking alcohol in cold weather only warms you temporarily and then you need to get inside.

Our bodies are designed to regulate heat in order to stay alive. When you’re too hot the blood vessels nearest the skin use vasodilation to open wide and allow heated blood to pass through and radiate heat away from the body, cooling you down.

When you’re cold your body does the opposite. In the cold your body uses vasoconstriction to close off the blood vessels closest to the skin to prevent heat loss. In order to keep your core warm your body limits the blood flow to your extremities, conserving heat. This is why your fingers, toes, skin, etc. get stiff & cold before your core. Fingers are expendable, your organs are not. When your body is using vasoconstriction to keep warm the last thing you want to do is dilate those blood vessels … which is exactly what alcohol does.

From old myths and advertisements, the idea that alcohol is a good way to warm up in the cold has been around for centuries.

Hot Shots

Part of why we think alcohol helps comes from the image of St. Bernards wearing small barrels of brandy while rescuing avalanche victims in the snow. Unfortunately no St. Bernard dog has ever worn such a cask – it’s a myth popularized by Edwin Landseer’s 1820 painting Alpine Mastiffs Reanimating a Distressed Traveller.

Probably the main reason we think alcohol warms us up in the cold is because it sort of does. Alcohol is a vasodilator and so it isn’t the alcohol warming you but what alcohol does to the body that warms you. In the cold alcohol opens the blood vessels that the body has closed down to preserve warmth, the result of which is that the warm reserve of blood in your core is suddenly released out to your extremities. Unfortunately this sudden warmth comes at a cost. As the warm blood reaches your extremities there is a loss of heat and as it travels back to your core your overall body temperature drops. Further, alcohol reduces your body’s ability to shiver (which is another mechanism used to increase warmth) so you’re cold and only getting colder. Now you no longer have a reserve of warmth and need to get indoors.

In this bootlegged video from MythBusters, they explore the myth that alcohol helps warm you up in cold weather.

Why are barns red?

Barns are red because of an abundance of iron from exploding stars.

Barns get painted because it helps protect them from the elements where, if left untreated, they would rot. In the spirit of Louis Sullivan’s “form follows function,” barns are painted for practical reasons more so than artistic ones. Red was originally chosen because it was the cheapest paint. It was the cheapest paint because of exploding stars.

Stardust

Iron is the most abundant element on Earth, making up 32.1% by mass. The iron on rocky planets such as Earth came from massive stars which produced iron atoms towards the end of their lives. Once a star is producing iron it’s on a one-way path toward going supernova. When the star finally explodes it sends elements across space, including iron.

Because of an abundance of iron in the soil, the easiest (and cheapest) paint to produce was red.

Fast-forward billions of years: in the mid 19th century American farmers began painting their barns with homemade paint. Farmers mixed skimmed milk, lime, linseed oil, and the readily available red iron oxide found in clay soil to create red paint. Red paint protected the wood from rot as well as from mold since iron oxide (rust, essentially) kills mold. By the late 19th century, when commercially produced paints were more widely available, red paint was still cheaper than other colors and so farmers continued painting their barns red. Today barns are painted red largely out of tradition.

Hookworm

The parasite responsible for giving American southerners a bad reputation.

For centuries American southerners were maligned as lazy, slow, shiftless, dumb. Southerners had “the germ of laziness.” There was just something different about southerners that made them less-than their northern counterparts. As it turned out there was something different about them but it had nothing to do with genetics or social conditioning. That something was hookworm.

Hookworm

Hookworm, and specifically the New World hookworm Necator americanus, is a parasitic worm that arrived in America by way of the slave trade in the 17th century. In the larval stage hookworms live in warm, wet, shady soil where they wait to encounter a human. A person walking barefoot outdoors may make contact with a hookworm, at which point it can penetrate the skin and crawl into the foot. From there it travels through the circulatory system to the lungs where it triggers a dry cough. The human host then unknowingly coughs up the worm only to swallow it down to the small intestine, which is where the worm wanted to be the entire time. The worm then lives around 1-2 years (or longer) attached to the wall of the intestine, sucking blood, where a female worm can lay up to 30,000 eggs per day. Eventually these fertilized eggs are pooped out in a poorly built outhouse or in some bushes, contaminating the soil to start the process again. It’s disgusting.

Because hookworms thrive in warm humid environments they do particularly well in the southern climate of the United States. The area from southeastern Texas to West Virginia became nicknamed the “hookworm belt”. For poor southerners who couldn’t afford shoes and didn’t have indoor plumbing it was almost impossible to avoid hookworm. By 1910 it’s believed that around 40% of the people living in the south were infected with millions of worms.

Putting their gross lifecycle aside, the problem with hookworms is that they steal your blood. Alone one worm won’t do much damage, but getting infested by multiple worms on a continual basis over years/decades has a severely damaging cumulative effect. By consuming your blood hookworms can cause an iron deficiency. People with hookworms become tired, lose weight, and have little strength to do anything. Pregnant women are at risk for anemia and a greater chance of dying in childbirth. Infected children can suffer irreversible developmental problems including stunted growth and intellectual disabilities. All of this matches the unfair characterization of southerners as slow rednecks.

A nurse brings hookworm medicine to a rural Alabama family, 1939.
A doctor and a nurse examine for hookworm in an Alabama school, 1939.

The cumulative effect

In 1902 zoologist Charles W. Stiles discovered that hookworms were endemic to the southern US. In 1909 John D. Rockefeller got involved by funding the creation of the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease. They campaigned across the south to educate, test, and help treat hookworm. Students in small country schoolhouses would submit stool samples to their teachers to be tested – some schools even required students be screened for hookworm. People would go to health clinics on the weekends to learn more. An estimated 7.5 million southerners had hookworms. While the Rockefeller Commission helped treat the problem, what greatly reduced hookworm was the urbanization of the south, enabling more people to afford shoes and sanitary indoor plumbing.

The barefoot Texas boy on the right has hookworm, 1939.

Beyond the health consequences the socioeconomic impact of hookworm is also destructive. The US regions with hookworm had lower rates of literacy and school attendance than areas without it. A 1926 study of Alabama children showed that the more worms a child had the lower their IQ. Even today children with chronic hookworm face up to 40% lower future wage earnings when they grow up. General productivity is measurably lower as a result of hookworm. The southern regions that were worst infected with hookworms saw the greatest income expansion after treatment, but unfortunately centuries of infection had a cumulative effect. Eight of the ten poorest US states are still southern states.

Hookworm in the US is typically thought of as a problem of the past but it is still very much a problem of the present. Given the severe income inequality in the US, hookworm is thriving in regions living below the poverty line. Hookworm lives in, and reinforces, poverty: while 85% of the world lives on less than $30 a day, 10% of the world is currently living in extreme poverty. Around the world an estimated 477 million people are currently living with hookworms inside them.

Astrology

Astrology, the idea that the stars are influencing your life, is completely fake.

Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using this information to predict the future or explain human personalities, is not logical (but understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos so it’s understandable that people would look to astrology (like how people fall for conspiracy theories).

While there are different kinds of astrology, the shared basis is that they use a complex series of real astronomical calculations combined with made-up traits assigned to different constellations/alignments/times to “gain insights” into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200-100 BCE (which itself is based in the much older Babylonian astrology). It’s from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.

Despite millennia of study & measurements, nobody is any closer to explaining why astrology is supposedly real.

Bunk

That said, astrology is completely fake. It’s pseudoscience, superstition, hooey. To start, there’s no reason a distant configuration of stars which looks vaguely like a crab or a bull would have any relationship with events on Earth. But even if there were some kind of relationship, there would need to be a force connecting us to these heavenly bodies, affecting us here on Earth. Science hasn’t found or been able to measure any kind of force at work. Neither gravity nor electromagnetism work like this. Maybe some other force, somehow strong yet undetectable, is interacting with us from distant stars trillions of miles away, but no such force has ever been discovered.

Another problem is that astrological assessments/predictions should be at least consistent, if not accurate. In 1985, scientist Shawn Carlson conducted a double-blind experiment with astrologers to match personality test results to natal charts (essentially their zodiac symbols). If personality types are immutably governed by the stars, matching a zodiac sign to a participant’s corresponding personality type should be easy. It was apparently not easy, as astrologers performed about the same as pure chance. Worse, the astrologer participants performed poorly even at finding their own personality profiles.
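To see what “about the same as pure chance” means in practice, here is a quick simulation (a hypothetical sketch, not Carlson’s actual protocol, assuming a pick-one-of-three design): an astrologer with no real information about a subject, choosing among three candidate personality profiles, should succeed roughly a third of the time.

```python
import random

def simulate_matching(n_trials=10_000, n_choices=3, seed=42):
    """Simulate guessing which of n_choices personality profiles
    belongs to a subject, with no real information to go on.
    Returns the fraction of correct matches (expected ~1/n_choices)."""
    rng = random.Random(seed)
    # Arbitrarily call profile 0 the "correct" one for each trial;
    # a blind guess is a uniform pick among the n_choices profiles.
    correct = sum(rng.randrange(n_choices) == 0 for _ in range(n_trials))
    return correct / n_trials

rate = simulate_matching()
print(f"chance-level success rate: {rate:.3f}")  # close to 1/3
```

If astrology conveyed real information, astrologers should beat this baseline; in Carlson’s study they didn’t.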

Maybe astrology succeeds despite the human failings of astrologers. Time twins, people born at the same time on the same day, sometimes even in the same hospital, should have similar personalities. Unfortunately, there is no correlation at all. Even without astrologers being involved, astrology is inconsistent.

Part of the blame for astrology’s persistence lies with its adherents. Paranormal skeptic James Randi conducted an exercise where he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi gave the same horoscope to everyone in the class. What the students in Randi’s experiment fell for was the Barnum effect.

Barnum Effect

The Barnum effect (aka the Forer effect) is found in fortune telling and astrology, where an assessment/reading seems to be about you but in reality can apply to almost anyone. These are statements that have been carefully worded to feel specific and yet remain universal. For example, one might say that …

“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”

In fact, these statements are from what psychologist Bertram Forer gave to his test subjects in his 1948 study. When assessing the accuracy of these statements, participants in Forer’s experiment gave an average rating of 4.3 out of 5 (5 being the most accurate). It turns out every student was given the exact same statements. Horoscopes and other astrological readings frequently use the Barnum effect to seem specific to you while in reality applying to almost anyone.

Confirmation Bias

Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn’t. When someone has an emotional desire for a certain outcome, they can respond more favorably towards the evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it’s not.

Other contributing factors are that people who believe in astrology tend to score lower on measures of intelligence, and higher on narcissism, than non-believers. A potentially “self-centered worldview” (along with a shaky understanding of science) could be an influencing factor leading people to believe in astrology.

Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”

Added bonus: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters. She advised the president on when to deliver speeches, when to hold presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.