Drinking alcohol in cold weather warms you only temporarily, and then you need to get inside.
Our bodies are designed to regulate heat in order to stay alive. When you're too hot, the blood vessels nearest the skin open wide (vasodilation) to allow heated blood to pass through and radiate heat away from the body, cooling you down.
When you're cold your body does the opposite. In the cold your body uses vasoconstriction to close off the blood vessels closest to the skin, preventing heat loss. To keep your core warm your body limits blood flow to your extremities, conserving heat. This is why your fingers, toes, skin, etc. get stiff & cold before your core. Fingers are expendable, your organs are not. When your body is using vasoconstriction to keep warm, the last thing you want to do is dilate those blood vessels … which is exactly what alcohol does.
Hot Shots
Part of why we think alcohol helps comes from images of St. Bernards rescuing avalanche victims in the snow while wearing small barrels of brandy to warm the victims up. Unfortunately no St. Bernard has ever worn such a cask – it's a myth popularized by Edwin Landseer's 1820 painting Alpine Mastiffs Reanimating a Distressed Traveller.
Probably the main reason we think alcohol warms us up in the cold is that it sort of does. Alcohol is a vasodilator, so it isn't the alcohol itself warming you but what alcohol does to the body. In the cold, alcohol opens the blood vessels that the body has closed down to preserve warmth, and the warm reserve of blood in your core is suddenly released out to your extremities. Unfortunately this sudden warmth comes at a cost. As the warm blood reaches your extremities it loses heat, and as it travels back to your core your overall body temperature drops. Further, alcohol reduces your body's ability to shiver (which is another mechanism for generating warmth) so you're cold and only getting colder. Now you no longer have a reserve of warmth and need to get indoors.
Barns are red because of an abundance of iron from exploding stars.
Barns get painted because paint helps protect them from the elements; left untreated, the wood would rot. In the spirit of Louis Sullivan's maxim that "form follows function", barns are painted for practical reasons more so than artistic ones. Red was originally chosen because it was the cheapest paint. It was the cheapest paint because of exploding stars.
Stardust
Iron is the most abundant element on Earth, making up 32.1% of the planet by mass. The iron on rocky planets such as Earth came from massive stars, which produce iron atoms towards the end of their lives. Once a star is producing iron it's on a one-way countdown towards going supernova. When the star finally explodes it scatters its elements across space, including iron.
Fast-forward billions of years to the mid 19th century, when American farmers began painting their barns with homemade paint. Farmers mixed skimmed milk, lime, linseed oil, and the readily available red iron oxide found in clay soil to create red paint. Red paint protected the wood from rot as well as from mold, since iron oxide (essentially rust) kills mold. By the late 19th century, when commercially produced paints were more widely available, red paint was still cheaper than other colors and so farmers continued painting their barns red. Today barns are painted red largely out of tradition.
The parasite responsible for giving American southerners a bad reputation.
For centuries American southerners were maligned as lazy, slow, shiftless, dumb. Southerners had "the germ of laziness." There was just something different about southerners that made them seem less than their northern counterparts. As it turned out there was something different about them, but it had nothing to do with genetics or social conditioning. That something was hookworm.
Hookworm
Hookworm, and specifically the New World hookworm Necator americanus, is a parasitic worm that arrived in America by way of the slave trade in the 17th century. In the larval stage hookworms live in warm, wet, shady soil where they wait to encounter a human. A person walking barefoot outdoors may make contact with a hookworm, at which point it can penetrate the skin and crawl into the foot. From there it travels through the circulatory system to the lungs, where it triggers a dry cough. The human host then unknowingly coughs up the worm only to swallow it down to the small intestine, which is where the worm wanted to be the entire time. The worm then lives around 1-2 years (or longer) attached to the wall of the intestine, sucking blood; a female worm can lay up to 30,000 eggs per day. Eventually these fertilized eggs are pooped out in a poorly built outhouse or in some bushes, contaminating the soil to start the process again. It's disgusting.
Because hookworms thrive in warm, humid environments they do particularly well in the southern climate of the United States. The area from southeastern Texas to West Virginia became nicknamed the "hookworm belt". For poor southerners who couldn't afford shoes and didn't have indoor plumbing it was almost impossible to avoid hookworm. By 1910 it's believed that around 40% of the people living in the south were infected, collectively carrying millions of worms.
Putting their gross lifecycle aside, the problem with hookworms is that they steal your blood. Alone, one worm won't do much damage, but being infested by multiple worms continually over years or decades has a severely damaging cumulative effect. By consuming your blood hookworms can cause an iron deficiency. People with hookworms become tired, lose weight, and have little strength to do anything. Pregnant women are at risk for anemia and a greater chance of dying in childbirth. Infected children can suffer irreversible developmental problems including stunted growth and intellectual disabilities. All of this matches the unfair characterization of southerners as slow rednecks.
The Cumulative Effect
In 1902 zoologist Charles W. Stiles discovered that hookworms were endemic to the southern US. In 1909 John D. Rockefeller got involved by funding the creation of the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease. The commission campaigned across the south to educate, test, and help treat hookworm. Students in small country schoolhouses would submit stool samples to their teachers to be tested – some schools even required students be screened for hookworm. People would go to health clinics on the weekends to learn more. An estimated 7.5 million southerners had hookworms. While the Rockefeller Commission helped treat the problem, what greatly reduced hookworm was the urbanization of the south, which enabled more people to afford shoes and sanitary indoor plumbing.
Beyond the health consequences, the socioeconomic impact of hookworm was also destructive. The US regions with hookworm had lower rates of literacy and school attendance than areas without it. A 1926 study of Alabama children showed that the more worms a child had, the lower their IQ. Even today, children with chronic hookworm can face up to 40% lower wages as adults. General productivity is measurably lower as a result of hookworm. The southern regions that were worst infected with hookworms saw the greatest income expansion after treatment, but unfortunately centuries of infection had a cumulative effect. Eight of the ten poorest US states are still southern states.
Hookworm in the US is typically thought of as a problem of the past, but it is still very much a problem of the present. Given the severe income inequality in the US, hookworm is thriving in communities living below the poverty line. Hookworm lives in, and reinforces, poverty: 85% of the world lives on less than $30 a day, and 10% of the world is currently living in extreme poverty. Around the world an estimated 477 million people are currently living with hookworms inside them.
Astrology, the idea that the stars are influencing your life, is completely fake.
Humans have been following the movements of the sun, the moon, the planets, and the stars for thousands of years. Using this celestial information to understand the seasons and the passage of time is logical. Using this information to predict the future or explain human personalities is not logical (but it is understandable). People want to understand why things happen, the world can be scary, and finding some system in the stars is an attractive idea. A relatable narrative is more appealing than unpredictable chaos, so it makes sense that people would look to astrology (much as people fall for conspiracy theories).
While there are different kinds of astrology, the shared basis is that they combine complex series of real astronomical calculations with made-up traits assigned to different constellations/alignments/times to "gain insights" into the workings of the world. The Western astrological system is rooted in Hellenistic astrology from the Mediterranean around 200-100 BCE (which is itself based on the much older Babylonian astrology). It's from Hellenistic astrology that we get the Zodiac, horoscopes, star signs, and the kind of astrology we typically encounter in blogs and newspapers.
Bunk
That said, astrology is completely fake. It's pseudoscience, superstition, hooey. To start, there's no reason a distant configuration of stars which looks vaguely like a crab or a bull would have any relationship with events on Earth. But even if there were some kind of relationship, there would need to be a force connecting us to these heavenly bodies, affecting us here on Earth. Science hasn't found or been able to measure any such force at work. Neither gravity nor electromagnetism work like this. Maybe there is some other force, strong yet undetectable, interacting with us from distant stars trillions of miles away, but it has yet to be discovered.
Another problem is that astrological assessments/predictions should be at least consistent, if not accurate. In 1985 scientist Shawn Carlson conducted a double-blind experiment in which astrologers tried to match people's personality test results to their natal charts (the maps of the sky at the moment of their birth). If personality types are immutably governed by the stars, matching a natal chart to a participant's corresponding personality type should be easy. It was apparently not easy, as the astrologers performed about the same as pure chance. Worse, participants performed poorly at even picking out their own personality profiles.
Maybe astrology succeeds despite the human failings of astrologers. Time twins (people born at the same time on the same day, sometimes even in the same hospital) should have similar personalities. Unfortunately there is no correlation at all. Even without astrologers being involved, astrology is inconsistent.
Part of the blame for astrology's staying power lies with its adherents. Paranormal skeptic James Randi conducted an exercise where he gave detailed horoscopes to a class full of students. Most of the students said the horoscope they received was quite accurate. The trick was that Randi gave the same horoscope to everyone in the class. What the students in Randi's experiment fell for was the Barnum effect.
Barnum Effect
The Barnum effect (aka the Forer effect) is found in fortune telling and astrology, where an assessment/reading seems to be about you but in reality could apply to almost anyone. These are statements carefully worded to sound specific and yet remain universal. For example, one might say that …
“You have a tendency to be critical of yourself. You have a great need for other people to like and admire you. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved.”
In fact these statements come from what psychologist Bertram Forer gave to his test subjects in his 1948 study. When assessing the accuracy of these statements, participants in Forer's experiment gave an average rating of 4.3 out of 5 (5 being the most accurate). It turns out every student was given the exact same statements. Horoscopes and other astrological readings frequently use the Barnum effect in just this way.
Confirmation Bias
Another way astrology can seem real is through confirmation bias. Believers remember the predictions that came true more than the ones that didn't. When someone has an emotional desire for a certain outcome they can respond more favorably towards the evidence that supports their beliefs and dismiss or undervalue contradictory evidence. Selectively remembering the horoscopes that came true can make astrology seem real, even though it's not.
Other contributing factors are that people who believe in astrology tend to be of lower intelligence, and more narcissistic, than non-believers. A potential "self-centered worldview" (along with a shaky understanding of science) could be among the factors leading people to believe in astrology.
Ultimately astrology is inconsistent, inaccurate, and unable to explain why any of it is supposedly happening. From Cicero to modern scientists we have compelling arguments and mountains of scientific evidence showing again and again that astrology isn’t real. As professor Ivan Kelly of the University of Saskatchewan wrote, “Astrology is part of our past and has undeniable historical value, but astrologers have given no plausible reason why it should have a role in our future.”
Added bonus: one famous believer in astrology was President Ronald Reagan. Astrologer Joan Quigley (the Rasputin of the Reagan White House) regularly consulted her star charts to advise the president on a host of matters. She advised the president on when to deliver speeches, when to have presidential debates, when he should schedule his cancer surgery, and even when to land Air Force One. It was generous of the Christian Moral Majority to overlook Reagan’s pagan beliefs.
For hundreds of years Europeans used ground up Egyptian mummies as medicine and paint pigment.
The Arabic word mūmiyā (which later became "mummia") was the name for the black sticky asphalt material that seeped out of the ground and was used as a sealant, an adhesive, and a medicine around the ancient world. Pliny the Elder and others wrote about the medicinal uses of mummia, which became a bit of a cure-all for a range of ailments.
Unfortunately mummia the petroleum product looked like another black substance: a byproduct of the Egyptian embalming process. As such the word "mummia" came to mean both the petroleum product AND the byproduct of Egyptian mummification, which was then further confused as meaning an entire mummified body. This is how we got the word "mummy". This series of mistakes also led to hundreds of years of cannibalism.
Cannibal Medicine
Since the petroleum-based mummia was used both externally as a salve and internally as a medicine, the Egyptian mummy version of mummia came to be used in the same ways. The 11th century physician Constantinus Africanus even described mummia as a "spice" found in the sepulchers of the dead. Soon the human version replaced the petroleum version and people began to crumble & grind human corpses for medicine.
With the Crusades, Europeans learned of mummia and its medicinal possibilities. This significantly increased European demand for Egyptian mummies and by the 15th-16th centuries there was a thriving trade in mummies. Thousands of bodies were being exhumed and shipped to Europe to be turned into medicines. In 1586 English merchant John Sanderson shipped 600 pounds of mummies to London to sell at various apothecaries. This was fueled in part by orientalism, the idea that Egyptian mummies held some sort of exotic ancient knowledge or power.
Europeans would consume portions of Egyptian corpses for help with general pain, ulcers, inflammation, epilepsy, cough, difficult labor, etc. – none of which worked, or if they seemed to work it wasn't the mummy that was the active ingredient. The practice was so common that Shakespeare included mummy as an ingredient in the witches' potion in Macbeth. Demand was so high that by the 17th century some mummy dealers were producing counterfeit mummies. Newly deceased people, animals, or prisoners who had been purposefully starved & executed were put through a process to simulate ancient Egyptian mummies.
After a few hundred years of medicinal cannibalism, Europeans began to express doubt about the practice's efficacy (and ethics). The 16th century herbalist Leonhard Fuchs felt foreign mummies were acceptable but local ones were wrong. While doubts arose during the Renaissance in the 16th century, it took until the 18th century Age of Enlightenment for the practice to fall out of fashion. As the consumption of mummies slowly ended, Egyptian mummies took on a new role: paint pigment.
Mummy Brown
Around the end of the 16th century artists began using ground up Egyptian mummies (mixed with other materials) to produce mummy brown, a shade of brown pigment. Apothecaries that were grinding up mummies for medicine began to grind them up for paint as well. As a paint it was good for shadows, flesh tones, and glazing. Artists Benjamin West, Martin Drolling, Lawrence Alma-Tadema, Edward Burne-Jones, Eugène Delacroix, and others all painted with mummy brown.
It wasn't until the 19th century that mummy brown began to fall out of favor. That said, as recently as 1926 C Roberson & Co. still sold mummy brown made with ground up Egyptian corpses. As mummy brown died out, so too did hundreds of years of large-scale desecration of deceased Egyptians, using human beings for medicines and paints.
Added info: in another example of “what’s old is new again”, in 2024 some Sierra Leoneans were digging up dead bodies for their bones to make the highly addictive street drug kush.
Before museums existed, people had cabinets/rooms to display their collected treasures.
There was a time when museums did not exist. The role of collecting, preserving, and displaying the art, artifacts, and wonders of the world belonged largely to individuals. As far back as the 4th century BCE Greeks were collecting exotic treasures from the East. More than just trading in commodities, the Greeks collected art and textiles from these faraway cultures. Roman emperor Augustus decorated his homes not just with art but with rare objects and the bones of giant animals. Over the centuries, as cultures explored & traded with increasingly distant lands, the trends in what was collectible grew & changed. By the 16th and 17th centuries wealthy European collectors had amassed enough objects that they created special cabinets and/or rooms to show off their collections. They created cabinets of curiosities.
Wunderkabinett
From the German for art (kunst) or marvels (wunder) and cabinet (kabinett) or room (kammer), these cabinets & rooms were places where Renaissance scholars, merchants, royalty, and others could store their collections. Collecting was very fashionable in 17th century Europe and these cabinets were spaces dedicated to displaying all manner of objects. Like the contemporaneous maps of the world, some of these spaces were designed for show while others were more utilitarian.
Some collectors had thousands of specimens. The objects in these cabinets were thoughtfully categorized and organized, each piece contributing to the larger whole. Collecting was a way to bring order to the world, to exert some level of control over something that is uncontrollable. What was stored & displayed in these cabinets depended on the collector, but broad categories of objects included:
Fine art
Applied art (scientific instruments, anthropological objects, etc.)
Natural materials (fossils, shells, rocks, etc.)
Historical objects
These categories, as well as these collections, served as the precursors to our modern museums. The Amerbach Cabinet was a collection of art, books, coins, etc. that was assembled by various members of the Amerbach family. It was eventually co-purchased by the city of Basel & the University of Basel and became the Kunstmuseum Basel in 1661, the first public museum in the world. Francesco I de' Medici had his studiolo, a 26 x 10 foot room of curiosities that is part of the Palazzo Vecchio in Florence. Other Medici possessions served as the start of the Uffizi Gallery. Elias Ashmole, who amassed his fortune & collection through sometimes questionable means, gifted his collection to the University of Oxford, which became the Ashmolean Museum in 1683.
Throughout the 18th century an increasing number of private collections were converted into public museums, some of which still exist today but all of which helped define what museums have become.
Added info: In 1784 Charles Willson Peale's collection became the Philadelphia Museum, which was the United States' first museum (and also the first to display a mastodon skeleton).
Reports that MSG is dangerous stem from one anecdotal letter and years of racism.
Monosodium glutamate (MSG) is a compound made up of sodium and glutamate (an amino acid) found naturally in our bodies and in a variety of foods (tomatoes, cheeses, anchovies, mushrooms, etc.). Usually when it's mentioned people are referring to the synthesized food additive version, which is added to meals to bring out their umami flavors. It's been commercially produced as a food additive since 1909 but, despite being used by tens of millions of people, 42% of Americans today think it's dangerous. The cause of this fear goes back to one letter.
Chinese Restaurant Syndrome
The April 4, 1968 edition of the New England Journal of Medicine contained a letter titled "Chinese-Restaurant Syndrome" by Dr. Robert Ho Man Kwok about his observations after eating American Chinese food. Kwok said that about 15 to 20 minutes after eating at a Chinese restaurant he developed a headache, weakness, heart palpitations, and numbness. He proposed several possible causes but singled out MSG as the answer. This single letter was the beginning of decades of mistrust in MSG.
The ideas of MSG side-effects and "Chinese Restaurant Syndrome" have largely been fueled by racism. Suspicion or fear of East Asian cultures, the exoticism of the "Orient", and/or a general lack of knowledge have led some people to be suspicious of Asian cuisine. In 1969 New York City imposed regulations on MSG use in Chinese restaurants but not on MSG in general. If the supposed adverse reactions to MSG were real, consumers should have been wary of any food containing MSG, yet Chinese food in particular got singled out and maligned. Lots of processed western foods contain added MSG, lots of plants naturally contain significant levels of glutamate, and yet Doritos and shiitake mushrooms never got singled out quite like Chinese food did.
Safe to Eat
There is no connection between MSG and the symptoms Kwok described. The US Food & Drug Administration states that MSG is safe to eat and that there is no evidence to support claims of headaches and nausea from eating normal amounts of MSG. In double-blind studies, subjects who claimed to be sensitive to MSG were given MSG without knowing it and had no ill effects. These tests were unable to reproduce any of the claimed side-effects.
MSG, like any food additive, is safe in moderation; an excess of anything can make you sick. Because of the association of Chinese food with MSG, some Asian restaurants in the US have reduced their usage of MSG just to satisfy public opinion, to the detriment of the food and their customers' taste buds.
Calendars based on the cycles of the moon have a shorter year than solar calendars. How that time discrepancy is dealt with depends on the culture.
Ancient cultures typically had two options for creating calendars: solar or lunar. Solar calendars track time based on the movement of the sun in the sky; it takes 365.24 days for the Earth to travel around the sun and make up a year. Lunar calendars are instead based on the phases of the moon, which restart every 29.5 days (29.53, more precisely); twelve of these lunar months add up to only 354.37 solar days. This leaves an 11-day discrepancy between lunar and solar calendars.
Intercalation of “Lunar” Calendars
To account for this 11-day difference some cultures engage in a practice known as "intercalation": adding extra days/weeks/months to synchronize the calendar with the 365.24-day solar year. Many lunar calendars are, in reality, lunisolar calendars, as they intercalate extra time to keep their lunar year roughly aligned with the solar year.
A variety of cultures use lunisolar calendars, especially in East Asia. For example the traditional Chinese calendar is based on lunar cycles but adds a 13th month every few years (sketched in code below). This is why Chinese New Year doesn't have a fixed date (on our calendar). Intercalation keeps the lunar New Year from straying too far, holding it somewhere between late January and late February.
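To see why "a 13th month every few years" is the right amount, here is a minimal Python sketch of the arithmetic using the 19-year Metonic cycle, which the text above doesn't name but which underlies many lunisolar calendars (including the Chinese one); the day lengths are standard astronomical averages, not figures from this article:

```python
# Why lunisolar calendars insert about 7 leap months every 19 years
# (the Metonic cycle). Day lengths are commonly cited averages.
SOLAR_YEAR = 365.2422   # days for Earth to orbit the sun
LUNAR_MONTH = 29.5306   # days for the moon's phases to repeat

# 19 solar years and 235 lunar months are almost exactly equal:
print(19 * SOLAR_YEAR)    # ~6939.60 days
print(235 * LUNAR_MONTH)  # ~6939.69 days

# 19 lunar years of 12 months each contain only 228 months, so
# 235 - 228 = 7 extra "leap" months keep the calendars aligned:
leap_months = 235 - (19 * 12)
print(leap_months)  # 7 -> roughly a 13th month every 2-3 years
```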
The alternative to adding time is to do nothing about the 11-day discrepancy, which has the cumulative effect of pushing holidays further and further around the calendar. This hands-off approach can put spring holidays in the fall, winter months in the summer, etc. The Islamic calendar (the Hijri calendar) operates this way, which explains why Muslim religious holidays move around our solar-based calendar so much. It takes about 33 years for a holiday on a lunar calendar to come back around to its original position, as the arithmetic below shows.
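The 33-year figure falls straight out of the numbers above; a quick sketch:

```python
# How long until a holiday on a pure lunar calendar (like the Hijri
# calendar) drifts all the way around the solar year and returns to
# its original position. Figures are the ones given in the text.
SOLAR_YEAR = 365.24
LUNAR_YEAR = 354.37

drift_per_year = SOLAR_YEAR - LUNAR_YEAR      # ~10.87 days per year
years_to_cycle = SOLAR_YEAR / drift_per_year  # ~33.6 years
print(round(drift_per_year, 2), round(years_to_cycle, 1))
```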
Leap Day
It's not just lunisolar calendars that intercalate time. Our calendar year is 365 days, but it takes the Earth 365.24 days to travel around the sun. We add time to our Gregorian calendar to account for the extra 0.24 days: a leap day every 4 years, with a small exception for century years to keep the average right (see the code below).
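The full Gregorian rule is slightly more involved than "every 4 years": century years skip the leap day unless they're divisible by 400, which nudges the average year to 365.2425 days. In code:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year, except
    century years, which must also be divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2024))  # True  (divisible by 4)
print(is_leap_year(1900))  # False (century year not divisible by 400)
print(is_leap_year(2000))  # True  (divisible by 400)
```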
Added info: The oldest known calendars are a group of carvings from around 32,000 BCE created by the Aurignacian people. Found on antlers, bones, and cave walls in France, these carvings use crescents, dots, and lines to diagram the cycle of the moon. These early lunar calendars document that, for tens of thousands of years, humans have tracked the passage of time by looking to the skies.
The asteroid belt is mostly empty space, not the obstacle course of the movies.
Between Mars and Jupiter lies the asteroid belt (aka the "main asteroid belt", as there are other areas with asteroids in our Solar System). Within this belt there are millions to billions of asteroids made up of rock and metals. Some are tiny particles but the largest is Ceres, which is 580 miles in diameter. Large or small, they're hurtling through space at speeds up to 40,000 mph, so if one flew into a spacecraft it could be disastrous. Fortunately this isn't really a problem.
Far Out
Unlike the asteroid belts of sci-fi movies, our main asteroid belt is not an obstacle course. Most of the asteroid belt is empty space. The four largest asteroids alone make up more than half the total mass of the entire belt, and if you combined all of the asteroids together they would still be smaller than our moon. The average distance between asteroids is around 600,000 miles. According to Alan Stern of the Southwest Research Institute, "… if you want to come close enough to an asteroid to make detailed studies of it, you have to aim for one." The odds of a spacecraft hitting one are less than 1 in a billion. It's easier to fly through the asteroid belt than it is to actually hit an asteroid.
A plot device that isn’t as dangerous as movies & TV led us to believe.
Quicksand was once a very common plot device in TV shows & movies. From Lawrence of Arabia to The Incredible Hulk, Gilligan's Island, Batman, and even in space in Lost in Space, quicksand was all over pop culture in the 1960s. Nearly 3% (or 1 in every 35) of the movies made in the 1960s featured quicksand. Characters step on what looks to be solid ground but, surprise, it's quicksand. They begin sinking like they're going down some sort of Earth elevator, with the looming possibility of being totally submerged unless a handy vine or person can save them … this is not how quicksand really works. Real quicksand is not as sudden, dramatic, or dangerous as fictional quicksand.
Non-Newtonian Fluid
Quicksand is a mixture of water and sand/silt where the sand particles are suspended in water and spaced further apart than in typical sand. It's a non-Newtonian fluid, meaning that applying pressure momentarily changes its viscosity (higher viscosity substances move more like mud, lower viscosity substances move more like water). Stepping on quicksand applies pressure that makes it momentarily less viscous: the sand particles get pushed out of the way, making the mixture more watery, which allows your foot to sink. This is quickly followed by the sand settling into place around your foot, which is how you get stuck. The more you move, the more you agitate the mixture, the deeper you go, as the toy model below illustrates.
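Quicksand's "agitate it and it gets runnier" behavior is often modeled as shear-thinning. A toy sketch using a generic power-law fluid model; the parameters here are illustrative assumptions, not measured properties of real quicksand:

```python
# Shear-thinning (power-law) fluid: effective viscosity drops as the
# shear rate rises. K and n below are illustrative values only.
def effective_viscosity(shear_rate: float, K: float = 10.0, n: float = 0.4) -> float:
    """Power-law model: mu_eff = K * shear_rate**(n - 1).
    With n < 1 the fluid is shear-thinning, so stirring or
    stepping lowers the effective viscosity."""
    return K * shear_rate ** (n - 1)

for rate in (0.1, 1.0, 10.0):  # faster agitation -> lower viscosity
    print(rate, round(effective_viscosity(rate), 2))
```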
The Good News
You cannot totally sink into quicksand like it's some sort of bottomless pit. One reason is that quicksand is rarely more than a few feet deep. Further, the human body is less dense than quicksand, which means that, regardless of the quicksand's depth, it's not possible to sink further than your waist (see the sketch below). That said, there are dangers.
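That waist-deep limit is just Archimedes' principle: a floating body sinks until it displaces its own weight, so the submerged fraction equals the ratio of the densities. A hedged sketch, assuming the commonly cited approximations of about 1 g/cm³ for the human body and about 2 g/cm³ for quicksand (the text above doesn't give exact numbers):

```python
# A floating body sinks until it displaces its own weight, so the
# submerged fraction is body density / fluid density (Archimedes).
BODY_DENSITY = 1.0       # g/cm^3, approximate human body density
QUICKSAND_DENSITY = 2.0  # g/cm^3, commonly cited approximation

submerged_fraction = BODY_DENSITY / QUICKSAND_DENSITY
print(f"{submerged_fraction:.0%} submerged at equilibrium")  # 50%
# i.e. you float about half-in (roughly waist deep), not fully under.
```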
Since quicksand can form beside larger bodies of water, there is the possibility of drowning due to flash flooding, tidal changes, etc. Other dangers include hypothermia, sunburn, predators, and/or the pain of having part of your body under pressure for a prolonged period of time. Most of the time though, quicksand is fairly harmless as long as you stay calm while getting out of it.
To get out of quicksand the first thing you should do is not go any further in – stop moving around. If you can't use your other foot to simply step back out, and you really feel stuck, it's time to sit or lie down, extending away from the quicksand. Making yourself wider spreads out your weight, reducing the pressure on the quicksand, which helps free your foot. Then slowly work your leg back and forth, lowering the viscosity & making the quicksand more watery, and patiently pull your leg out.
Bonus: The “King of Quicksand” has a whole YouTube channel devoted to intentionally getting stuck, and then escaping from, quicksand. Watching any of his videos shows that you really have to work to get yourself stuck in quicksand, which is reassuring.
You can also watch a playlist full of scenes from TV shows and movies (old and new) of characters getting stuck in quicksand.