The idea that “… the kids of today aren’t as good as when I was a kid …” has been around for thousands of years.
Generation Y, more commonly referred to as “Millennials”, are people born between 1981 and 1996 (though the exact years vary). The usual criticisms of millennials are that they’re lazy, entitled, and self-obsessed. The general narrative is that this younger generation is not as disciplined as the hard-working older generations, frequently accompanied by a “things were better when I was younger” mindset. While millennials are a recent target, this kind of criticism is nothing new.
From Hesiod to Baby Boomers
Adults have been complaining about the up-and-coming younger generation for as long as there have been people. One of the earliest examples is by the classical Greek writer Hesiod, who around the 8th century BCE wrote “I see no hope for the future of our people if they are dependent on the frivolous youth of today, for certainly all youth are reckless beyond words.” A few centuries later Aristotle echoed this idea when he said of younger people, “They think they know everything, and are always quite sure about it.”
The song remains the same
This kind of thinking is reductive and condescending – it says more about the out-of-touch nature of the people doing the criticizing than about the younger generation being criticized. Despite thousands of years of older people complaining about younger people, civilization has, somehow, managed to continue and progress.
People don’t change that much from generation to generation and no generation is a cultural monolith. Every generation has hard workers, selfless givers, narcissists, the lazy, the good, the bad, and everything in between. Shakespeare continues to be relevant because the fundamental human condition has changed very little over the centuries.
The kids are alright
Myths that millennials eat avocado toast all the time, that they fail to save for retirement, that they’re lazy, that they’re all socialists, etc. have all been debunked. After maligning and blaming millennials for a variety of society’s problems, baby boomers seemed surprised and insulted by the “audacious”, terse, and somewhat snarky millennial reply of “OK boomer”. Meanwhile, these same baby boomers seem to have forgotten (or just don’t realize) that the political, economic, and environmental realities of today are very different from when the boomers were young.
As for narcissism, younger people of every generation tend to be more narcissistic but become less so as they age – the older people who are currently less narcissistic didn’t start out that way. Our values also change as we age. Despite being on the receiving end of this criticism, the younger people of today will become the older people of tomorrow and will inevitably forget what they were like when they were young. They’ll judge younger generations by their present mindsets and not by the attitudes they held back when they were that age. The more things change, the more things stay the same.
The taxidermy oddity that attracted thousands of people to P.T. Barnum’s American Museum.
In 1841 P.T. Barnum opened his American Museum in New York City. For the previous 31 years the building had housed Scudder’s American Museum, which was part science museum, part zoo, part history museum, and part collection of oddities. After Barnum bought it he took these ideas and amped them up, making the museum one of the most popular attractions in America. With around 500,000 items in the collection, the museum was both educational and entertaining – it was history and spectacle. Over its 24-year run the Barnum American Museum had 38 million customers at a time when the population of the US was only around 32 million.
Being a P.T. Barnum enterprise, marketing was critical to its success. He transformed the facade of the building into a giant billboard for the museum itself, and he had posters advertising (and exaggerating) the attractions inside. One of the first attractions he marketed, using most of the front of the building to do so, was the Fiji mermaid.
The Little Mermaid
The Fiji mermaid was brought to America in 1842 by Dr. J. Griffin of the British Lyceum of Natural History. It was the mummified remains of a mermaid from the Fiji islands in the South Pacific. Barnum generated interest in the mermaid by sending anonymous letters about it to various newspapers. He even cooked up a story that he was trying to convince Dr. Griffin to exhibit the mermaid and that Griffin was reluctant. It was a sensation before it was even exhibited to the public.
Barnum negotiated to display the mermaid for one week but it proved to be so popular that it went on the road, touring southern states. Dr. Griffin gave lectures about mermaids and cited the ancient Greek idea that everything on land had a counterpart in the sea. At a time when new species were being discovered in the remote areas of the world perhaps a mermaid had finally been found.
Eventually the Fiji mermaid split its time between Barnum’s American Museum and the Boston Museum. Its ultimate fate is unknown; it went missing and was most likely destroyed in either the 1865 fire that consumed Barnum’s museum or the 1880 fire that consumed the Boston Museum.
A sucker born every minute
In truth, the “mermaid” was Barnum’s first hoax at his American Museum (his very first hoax had been exhibiting Joice Heth, a woman he bought, claiming she had been George Washington’s former nurse … which she hadn’t been). At about 3ft long, the mermaid was a taxidermy combination of a monkey torso and the tail of a fish (most likely a salmon). Far from being the beautiful humanoid mermaid seen in Barnum’s advertisements, it was a ghastly animal mashup. The Charleston Courier wrote that “… the Feejee lady is the very incarnation of ugliness.”
Instead of originating in the Fiji islands, the mermaid was actually one of many created by Japanese fishermen. This particular mermaid was bought by the American sea captain Samuel Edes in 1822; his son sold it to Moses Kimball of Boston in 1842. Kimball then leased the mermaid to Barnum for his museum. As for Dr. J. Griffin, he was actually Barnum’s associate Levi Lyman, who was in on the ruse from the very beginning, pretending to vouch for the mermaid’s authenticity. There’s also no such thing as the “British Lyceum of Natural History”. Nothing about the Fiji mermaid was real except the public’s excitement.
There is a Barnum-esque blurry gray area between “hoax” and “entertaining joke”. While Barnum liked to categorize things like the Fiji mermaid as “humbugs” (which are things designed to deceive), he felt they were always in playful fun. Barnum wanted the audience, even when deceived, to still have a good time. He did not like deception at the expense of the public. For example he spoke out publicly (and testified in court) against spiritual mediums who tricked people out of money, lying to them about communicating with deceased loved ones.
Over the years numerous other Fiji mermaids have made the rounds in museums, curiosity shops, sideshows, and private collections. They’re made from all manner of materials (animal parts, wood, papier-mâché, wire, plastic, etc.). You can find higher-quality ones for sale in shops that specialize in curious objects, but there are also cheaper ones on eBay. You can also learn to build your own.
Added info: The Jenny Haniver is a related taxidermy hoax. It’s a sea animal, frequently a ray or skate, that’s been modified to look like the mummified remains of a demon, angel, basilisk, etc.
L’esprit de l’escalier, or “staircase wit”, is when you think of the perfect thing to say … but it’s too late. The French name for this phenomenon comes from thinking of the perfect retort on your way down the stairs after leaving the conversation/argument. It’s a common enough experience that the phenomenon has a name. Thinking of what you should have said, after the fact, happens to everyone.
Staircase wit touches on counterfactual thinking, where we imagine alternate scenarios for events that have already happened. Deliberating on how things could have played out can lead to arguing with ourselves, replaying the discussion in our minds and trying to come up with the best response (witty or otherwise).
In the heat of the moment
When our ideas are challenged we can become flustered and emotional. It can be difficult to think straight, let alone to be witty, when you’re uncomfortable. Fear and anxiety can cause us to focus on a single line of thinking (depth-first processing) which, if you are trying to be witty, makes it more difficult to formulate a creative response.
When you’re in a good mood you’re more open to new ideas and are more creative (breadth-first processing). Wit requires creativity, confidence, and timing. Staying relaxed can help you be witty in the moment … and not after the fact on the staircase.
The song about an Egyptian girl that became a surf rock classic.
At its height the Ottoman Empire controlled lands across North Africa, through the Middle East, Asia Minor (modern day Turkey), and up into the Balkans. By the early 20th century the empire had been greatly reduced in size, but culturally it was still a diverse mix of elements from the lands it once ruled as well as its neighbors. It’s in this environment that rebetiko music was formed.
Rebetiko is Greek urban music that began in the early 20th century in Asia Minor. It’s a blend of styles pulling from Greek, Turkish, Armenian, Arabian, and Jewish music. It’s been referred to as the Blues of Greece due to its working class origins and its sometimes scandalous themes.
The song Misirlou is a rebetiko song of the early 20th century (its exact origins are unknown). The title is a Greek pronunciation of the Turkish word “Mısırlı”, which translates as “Egyptian girl”. It’s a passionate song about the singer’s longing desire for a beautiful Egyptian girl. Played in the traditional style, the song’s Middle Eastern influences are easy to hear. The earliest known recording was by Theodotos Demetriades in 1927. Since then numerous other versions have been recorded in the rebetiko style, but the song reached new audiences through 1960s American surf rock.
The King of Surf Guitar
Surf rock began in the late 1950s in Southern California. It started as instrumental music with lots of reverb, later evolving into vocal surf with bands such as the Beach Boys, Jan and Dean, etc. While a host of bands contributed to the creation of instrumental surf, perhaps the most notable pioneer was Dick Dale, aka “The King of the Surf Guitar”.
In 1962 Dale (whose Lebanese-American uncle used to play Misirlou on the oud) recorded an instrumental version of Misirlou, changing the spelling to Miserlou. At a blistering pace of 173 beats per minute (the traditional version is around 78 bpm), Dick Dale’s surf rock version is one of the most famous instrumentals of all time. Miserlou found new fans when it was used in the opening of 1994’s Pulp Fiction, which brought new life to both the song and Dick Dale’s career.
It used to be that, if you owned land, you used it to grow plants for some kind of profit (food, timber, fabric, etc.). Decorative manicured grounds produced no monetary value. To keep a grassy lawn was a sign of wealth – a status symbol showing you had so much money you could use some of your land for pure ornamentation. Beyond being a “waste of space”, you also had to pay people to maintain the lawn, making it even more expensive.
Our modern idea of a meticulously manicured grassy lawn has its roots in 18th century European aristocracy. While earlier palaces featured intensely manicured gardens with topiaries and geometric lines (such as the Palace of Versailles), 18th century English garden design drew inspiration from the pastoral landscapes of Italian paintings. This new style featured wide open spaces that, while manicured, looked more natural. For example, some estates used ha-ha walls as barriers to keep grazing animals away from the house while offering the illusion of an uninterrupted natural view of the grounds.
As for the upkeep, grazing animals were sometimes used to maintain the lawn in the distance (and were a visual addition to the “natural” scene) but the areas closest to the house were tended to by men using hand tools. Even after the invention of the lawn mower in 1830, which helped increase the number of grassy lawns, these trimmed green fields were found primarily around the homes of the wealthy.
17th century colonists arriving in North America were generally preoccupied with trying to stay alive and didn’t have the time for decorative lawns. They were also missing the grass itself: the East Coast lacked the types of grasses necessary for lawns, and, worse, these were also the kinds of grasses that best served as food for the colonists’ grazing animals. As such the animals overgrazed the available native plants, eventually turning in desperation to eating poisonous plants (to their detriment).
To solve this problem colonists began to import grass from Europe for their cows, sheep, etc., which is how many of the grasses so common in America today got here. For example Kentucky bluegrass, one of the most popular grasses in America, is a non-native/invasive species imported from Europe.
As settlers spread around North America so too did grass. Throughout the 19th century, as people became more established, grassy lawns slowly became a feature of homes and parks. After the Civil War the more prosperous northern states adopted lawns sooner than southern states, and public parks and cemeteries increased the popularity of grassy lawns. Landscape architect Frederick Law Olmsted designed one of the earliest suburbs in 1868 with his plans for Riverside, Illinois: he set the homes back 30ft from the street and placed grassy lawns out front. What really democratized lawns, however, was the housing boom of the mid-20th century.
With the 1944 G.I. Bill millions of veterans were able to receive home loans, which helped them buy homes and move to the suburbs. Abe Levitt, who created Levittowns, said that “A fine lawn makes a frame for a dwelling …”. Millions of homes were suddenly being created with millions of lawns. As so many families became homeowners, lawns became less about economic status and more about cultural conformity: a well-maintained lawn was the sign of a good neighbor, and an unkempt lawn was subversive. Lawn care became big business and articles about lawn care surged in post-war America. With color TV more people could watch professional sports (especially golf) and see what was possible for their own lawns.
Today there are an estimated 40 million acres of grass in America. Grass is America’s largest crop, all while being (generally) inedible – lawns serve almost no functional purpose other than looking nice. Cutting grass regularly encourages it to spread out, edging out other plants and reducing biodiversity. Interestingly, more affluent homes, which can afford the time & money needed for a more manicured lawn, actually have lower biodiversity than lower-income homes. The nicest looking lawns are, paradoxically, the worst for the environment.
As for carbon emissions, grass is a carbon sink (which is a good thing), meaning it captures carbon dioxide and stores it in its roots. Unfortunately the act of mowing the lawn releases far more carbon dioxide than the grass captures. Gas-powered lawn equipment produces more air pollution than cars over comparable periods of time (for example, the air pollution of 1 hour of mowing equals around 100 miles of driving), and lawn mowers account for around 5% of America’s air pollution. Having and maintaining a lawn ultimately produces more carbon dioxide than it captures. Further, lawn equipment in America uses around 800 million gallons of gasoline annually, of which about 17 million gallons are spilled and never even used.
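As a quick back-of-the-envelope check on those figures (using only the numbers quoted above; the number of mows per season is a hypothetical chosen for illustration):

```python
# Back-of-the-envelope check of the lawn equipment figures above.
used_gal = 800_000_000    # gasoline used by US lawn equipment per year
spilled_gal = 17_000_000  # gasoline spilled while refueling, never burned

spill_rate = spilled_gal / used_gal
print(f"~{spill_rate:.1%} of the gasoline is spilled before use")  # ~2.1%

# 1 hour of mowing ~ 100 miles of driving, per the comparison above.
mows_per_year = 25  # hypothetical homeowner mowing weekly in season
print(f"equivalent driving: ~{mows_per_year * 100:,} miles per year")
```

So roughly 1 in 50 gallons never even makes it into the mower, and a season of weekly mowing is in the same pollution ballpark as a cross-country road trip.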
Homeowners use 10 times the amount of pesticides and fertilizers per acre as farmers, and many of these chemicals find their way into the water supply. Watering lawns uses 30-60% of urban fresh water – all for a crop that isn’t eaten and just sits there.
Alternatives to lawns include trees and other native plants, which require less maintenance (fewer gas-powered machines) and improve biodiversity. Native plants are better for butterflies, bees, and other helpful insects, which in turn is better for birds and other animals. Planting native plants, not using pesticides, reducing the size of your grass lawn, etc. creates a healthier and more bird friendly yard. Break free of the conformist thinking that you must have a green carpet around your house.
The short headache triggered by cold food and/or drinks touching the inside of your mouth.
To start, brain freeze (aka “ice cream headache” or “cold-stimulus headache”) only affects about 30-50% of the population. Most people can eat ice cream and drink extra cold drinks without any fear of reprisal from their nervous system.
Brain freeze occurs when the roof of your mouth or the back of your throat suddenly comes into contact with cold food, cold drinks, or even cold air. The trigeminal nerve in your head reacts to the cold by telling the arteries connected to the meninges (the membranes surrounding your brain) to contract to conserve warmth (much like how our bodies react to cold in general). The body then sends more warm blood up to the head, telling those same arteries to expand. This quick succession of vasoconstriction and vasodilation triggers pain receptors along the trigeminal nerve, which creates the pain you feel behind the eyes or forehead during a brain freeze.
A lot of nerve
While we all have a trigeminal nerve, its varying sensitivity may explain why not everyone gets brain freeze. For example, 37% of Americans may get brain freeze but only around 15% of Danish adults do. Further, 93% of people who get migraines are also susceptible to brain freeze.
The most sampled drum beat of all time used in thousands of songs and helped launch new genres of music.
The 1963 film Lilies of the Field stars Sidney Poitier as a traveling jack-of-all-trades who encounters a group of German-speaking nuns in the Arizona desert. As he performs odd jobs for them he also helps teach them English through song, in particular the traditional gospel song Amen. The song, along with the movie, inspired a young Curtis Mayfield, who recorded a new version in 1964 with his band The Impressions.
The version of Amen recorded by The Impressions then served as inspiration in 1969 for an even funkier instrumental version of the song by The Winstons titled Amen, Brother. At 1:26 the song breaks for a 5.2-second drum solo by drummer Gregory Coleman, which has become one of the most sampled drum solos of all time.
Sampling and the rise of Hip Hop
In 1980s New York the sampler, combined with the turntable, helped create hip hop. The sampler allowed musicians to take pieces of existing music, especially drum beats, and transform them into new songs: they could loop audio clips, rearrange the notes, change the pitch, change the tempo, etc. An additional asset in this new genre was bootleg records of collected beats that artists could sample. In 1986 Amen, Brother was included on Ultimate Breaks and Beats, which was immediately popular for the drum solo that became known as the “Amen break”.
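A toy sketch of those sampler operations, in Python over a plain list standing in for audio samples (the function names here are illustrative, not any real sampler’s API):

```python
# Toy sampler operations on a "break": an audio clip stored as a
# list of samples. All names here are illustrative, not a real API.

def loop(clip, times):
    """Repeat the clip end-to-end, like looping a sampled break."""
    return clip * times

def chop(clip, pieces):
    """Slice the break into equal chunks (e.g. individual drum hits)."""
    n = len(clip) // pieces
    return [clip[i * n:(i + 1) * n] for i in range(pieces)]

def speed_up(clip, factor):
    """Crude resample: keep every `factor`-th sample, raising tempo
    and pitch together, like spinning a record faster."""
    return clip[::factor]

beat = [0, 1, 2, 3, 4, 5, 6, 7]      # stand-in for 8 audio samples
hits = chop(beat, 4)                 # [[0, 1], [2, 3], [4, 5], [6, 7]]
rearranged = hits[2] + hits[0] + hits[3] + hits[1]  # reorder the hits
looped = loop(rearranged, 2)         # the rearranged break, played twice
faster = speed_up(beat, 2)           # [0, 2, 4, 6] -- double speed
```

Chopping a break into isolated hits and reordering them is exactly the trick that, at scale, made the Amen break so versatile.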
The beat that launched a thousand songs
The Amen break soon became a staple of sampling, and its popularity and influence can be heard throughout early hip hop. The break became even more versatile once it was broken down into its individual components, with each sound isolated, allowing musicians to rearrange the pieces. Entirely new genres of music such as Hardcore, Jungle, Drum and Bass, etc. wouldn’t exist without the Amen break. While early hip hop tended to slow down the Amen break (such as in NWA’s Straight Outta Compton), DJs in Jungle sped it up into a frenzy (as heard in Incredible by M-Beat).
The Amen break can be found in at least 5,617 songs. Some examples of songs using the Amen break include Salt-N-Pepa’s I Desire, Jay-Z’s Can’t Knock the Hustle, UK Apachi’s Original Nuttah, The Invisible Man’s The Beginning, the theme song to the TV show Futurama, etc.
Success or “Success”
The Winstons were never compensated for any of this; the Amen break took on a life of its own without the band. Today you would clear the use of a sample and pay royalties to the original artist, but the Amen break became popular at a time when artists weren’t concerned with copyright laws and were more focused on their art. Richard Spencer of The Winstons says he only became aware in 1996 that the drum solo from Amen, Brother had become the Amen break, at which point the beat was everywhere.
Over the years there have been multiple attempts to raise money for Spencer and for The Winstons’ drummer Gregory Coleman to compensate them for the unlicensed sampling of the song, with mixed success. In 2006 Coleman died, reportedly homeless, having never seen any royalties from his contribution to music history.
Added info: the Winstons’ Amen, Brother was actually the B-side to Color Him Father, which won the 1970 Grammy award for Best R&B song.
Another incredibly popular sample of the time was the Think break, from the 1972 song Think (About It) by Lyn Collins and James Brown, famous for its “Woo! Yeah”. The Think break is perhaps most famously used in 1988’s It Takes Two by Rob Base and DJ E-Z Rock.
Charles Dickens’s pet raven Grip helped inspire Edgar Allan Poe’s poem The Raven.
In the first half of the 19th century Charles Dickens had a pet raven named Grip who, by all accounts, was quite the handful. Grip was talkative, bossy, and aggressive. She intimidated the family’s mastiff Turk (she would steal food from his bowl) and would also bite the Dickens children. Eventually Dickens exiled Grip to the shed where, being a mischievous raven, she got into a can of white paint (which contained lead). On March 12, 1841 Dickens wrote to his friend, the illustrator Daniel Maclise, that Grip had died.
Because he loved Grip, Dickens had her stuffed and mounted in a case complete with a woodland setting of branches and leaves, and he also had Maclise create a portrait of her. Grip’s difficult personality didn’t put Dickens off having more ravens as pets, the next of which he also named Grip (and who, according to Dickens’s daughter Mamie, was also a handful).
Quoth the Raven …
In 1842 Dickens and his wife traveled to America. As part of his tour around the states he met with Edgar Allan Poe, who had favorably reviewed Dickens’s 1841 novel Barnaby Rudge. In the novel the titular character has a talkative pet raven whose name just happens to be Grip. Poe was particularly interested in Grip, whom he described as “intensely amusing”, and liked that Grip the character was based on Dickens’s own real pet.
A few years after learning about Grip, Poe would write his most defining work, 1845’s The Raven. In the poem a raven flies into the room of the grief-stricken narrator, tormenting him that he will never be reunited with his lost love. It’s widely believed by Poe scholars that the inspiration for the bird in the poem was Grip the raven (both the real Grip and the fictional Grip). There are numerous similarities between the bird in The Raven and Grip the raven in Barnaby Rudge.
Grip the mischievous raven inspired two literary giants. The Raven then went on to inspire untold others, including the naming of the Baltimore Ravens (the only NFL team named after a piece of literature).
Added info: Grip was not the only Dickens pet that had a life after death. After Bob the family cat died Dickens had one of his paws turned into a letter opener.
Also, while crows and ravens are fairly similar there are some easy ways to tell them apart. It’s frequently written that “ravens are larger than crows”, but without seeing the two side-by-side size is hard to judge. Perhaps the easiest way is the tail: in flight a raven’s tail feathers come to a point like a “V” (like the “v” in “raven”), while a crow’s tail feathers form more of a straight line.
The visual condition that changes what colors you see.
To start, being color blind almost never means someone is blind to all color, as if they’re living in a black & white movie. “Color blind” usually just means someone doesn’t see the full spectrum of colors that most people do. To understand color blindness we have to understand two concepts: light and our eyes.
Let there be light
The colors that we see are photons moving at different wavelengths/frequencies; they’re part of the electromagnetic spectrum. The full spectrum ranges from gamma rays (the shortest, highest frequency waves – quite dangerous) to radio waves (the longest, lowest frequency waves – not so dangerous). What we call visible light is radiation in a particular range of wavelengths: within this band, violets and blues have the shortest wavelengths while oranges and reds have the longest. Through evolution we have developed two small biological machines capable of detecting this range of wavelengths … our eyes.
Doctor My Eyes
Our retinas have two kinds of photoreceptive cells: rods and cones. Rods see light & dark while cones see color. We have about 120 million rods per eye but only 6-7 million cones per eye. Instead of just one kind of cone cell we have three, and each kind is tuned to a certain range of wavelengths (short, medium, and long). To put it another way, our three kinds of cone cells are each tuned to see certain ranges of colors – blues (short), greens (medium), reds (long).
Bringing it all together, color blindness is when one (or more) of your cone cell types is either defective or missing entirely. The result is that you are unable to properly see certain wavelengths of color.
Why does color blindness happen? While color blindness can be an acquired condition, most of the time it’s genetic. The most common forms of color blindness are carried on the X chromosome, and because men only have one X chromosome, if it’s defective they’re out of luck. This is why men are more commonly color blind than women: women have two X chromosomes, so a functioning X chromosome can compensate for a defective one. As a result around 8% of men are color blind compared to only around 0.5% of women. That said, color blindness isn’t evenly distributed across men – it has a higher prevalence amongst Caucasian men than other ethnicities.
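That arithmetic can be sketched with a simplified inheritance model (a rough assumption for illustration, ignoring multiple gene variants and acquired cases): if a fraction p of X chromosomes carry a defective gene, a man (one X) is affected with probability p, while a woman (two Xs) needs both copies defective, probability p².

```python
# Simplified model of X-linked color blindness prevalence.
# Assumes a fraction p of all X chromosomes carry the defective
# gene, and that a woman is affected only if both her copies are.

def prevalence(p):
    male = p        # men have a single X chromosome
    female = p * p  # women need two defective copies
    return male, female

male, female = prevalence(0.08)  # ~8% of men are red-green color blind
print(f"men: {male:.1%}, women: {female:.2%}")  # men: 8.0%, women: 0.64%
```

With p = 0.08 the model predicts about 0.64% of women affected, in the same ballpark as the ~0.5% figure quoted above.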
Because cones come in three varieties, and each can be defective or absent, the various combinations mean there are many forms of color blindness. The most common type is “red-green” color blindness (really a few kinds grouped together), where reds and greens aren’t seen properly and shift to look more like yellows and browns. This is the result of the medium and long (green and red) cone cells being defective or absent. Red-green color blindness accounts for about 99% of all color blindness, with about 1 in 12 men and 1 in 200 women having it.
Blue-yellow color blindness is where blues and greens aren’t seen properly. It’s also genetic but it’s not carried on the X chromosome so men and women are affected relatively evenly. It’s quite rare – around 0.01% of men and women are blue-yellow color blind.
Are you seeing what I’m seeing?
The effects of color blindness range from the benign to the dangerous. Accidentally wearing clothes that don’t match can be embarrassing but confusing “stop” for “go” on a traffic light can be dangerous. Color blind individuals can have difficulty determining the ripeness of fruits & vegetables. They can see sports jerseys as similar and have difficulty tracking games. The designer shorthand that red means error/bad while green means success/good (the traffic light analogy) can make a variety of safety features, dashboards, and websites more difficult to use. One positive is that color blind individuals may be better at detecting camouflage.
What do other animals see?
The concept of “color blindness” is relative. What most humans consider normal is not what most bees would consider normal, or dogs, or any other species. So when people say that most other animals are color blind, it’s just that they can’t see the same spectrum of colors that humans normally see.
To start, most mammals are red-green color blind (which to them is normal). They tend to have only two cone cell types, lacking the third type we have that lets us see a wider range of colors. So when a dog can’t find the green tennis ball in the green grass, it’s probably because they really can’t see it (especially if it has stopped rolling) – dogs rely on movement to distinguish between things more than we do. That said, dogs have more rods than humans, so when it seems like they’re looking at something in the dark that you can’t see, they’re probably seeing something beyond your vision.
The old idea that bulls dislike the color red is untrue – they’re red-green color blind. When a matador waves a red/pink cape to attract a bull the bull is responding to the motion of the cape, not the color. Dolphins and other marine animals see even less due to having only the long wavelength cone type and are monochromatic. Deer are red-green color blind but can see more shorter wavelength colors than we can including some amount of ultraviolet which to us is invisible. Some laundry detergents contain brightening agents that are intended to brighten the colors of your clothes but can make clothes look bright blue to deer. The result is that, even if your clothes are camouflaged, the deer probably saw you long before you saw the deer.
Even beyond seeing ultraviolet, some animals can detect/see the Earth’s magnetic fields. It’s believed that robins can see magnetic field lines as a darker shading on the normal colors they already see, but they can only see it through their right eyes and only on clear days. When cryptochrome molecules in their right eyes are struck by blue light the molecules become active and allow robins to see magnetic fields which they can use to navigate as they migrate north & south. Interestingly, non-migratory bird species seem to have less sensitivity to magnetic fields than migratory birds.
Finally, contrary to popular misconception, bats are not blind and some actually have quite decent sight. While they are red-green color blind like most other mammals they have an ultraviolet sensitivity that helps them hunt as well as detect predators. All of this in addition to echolocation means they are quite capable of seeing and navigating the world around them. That said, some species of bats as well as other nocturnal animals have no cones at all and are really truly color blind.
The intentionally confusing language of business, politics, and advertising that helps the speaker fit in, lie, and pretend to say something when saying nothing.
After WWII there was increasing interest in the sociology of leadership, how groups of people interact, etc. The military as well as corporations (such as General Electric, AT&T, IBM, etc.) wanted to know the most efficient ways to run their organizations. They wanted to know how workers could find personal fulfillment in the workplace while also increasing profits. They turned to researchers and consultants to help them manage their growing workforces. This was the dawn of corporate jargon.
Corporate jargon (e.g. customer-centric, CSAT, flywheel, hard stop, disrupt, in the loop, stakeholders, value added, value stream, synergy, restructure, circle back, think outside of the box, paradigm shift …) is a product of post-WWII consulting. Corporate jargon is the language of white-collar business – it’s the metaphors, acronyms, euphemisms, and other linguistic tools used to dress up ideas.
Mid-century consultants peppered their advice with this new business speak. Their clients heard these terms and used the same jargon towards their coworkers, who then told other coworkers, etc. Over time the business lexicon changed & grew as it spread around the world like a virus.
Corporate jargon is a form of doublespeak and doublespeak is designed to deceive. It’s a way to obfuscate the truth. George Orwell’s ideas of “doublethink” and “newspeak” in Nineteen Eighty-Four are the basis of our modern idea of doublespeak. You find doublespeak not just in business but in politics and advertising as well. It’s a way of speaking that can make it seem like you’re saying something when you’re saying nothing at all. It can make the simple seem complex. More dangerously it can make intolerable concepts seem benign – “downsizing” instead of “we’re laying people off”, “gaming” instead of “gambling”, “collateral damage” instead of “we accidentally killed/hurt civilians.”
In the closing of his 1946 essay Politics and the English Language, Orwell says that “Political language — and with variations this is true of all political parties, from Conservatives to Anarchists — is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” Doublespeak isn’t about communication; it’s designed to achieve conformity, or as Joseph Goebbels said, “We do not talk to say something, but to obtain a certain effect.”
Despite knowing that corporate jargon is nonsense people keep using it, and not just to lie or confuse. Using this kind of speech can serve as a signifier that you’re part of the powerful in-crowd, that you’re a serious member of the workplace. Linking right back to how corporate jargon spread in the first place, people use the words & phrases they hear their manager say and they, in turn, use the same words when talking with coworkers.
Using corporate speak is but the latest example in a long line of things subordinates have done to curry favor with their superiors. In the mid 17th century French King Louis XIV began to lose his hair (a side effect of syphilis) and turned to wearing a wig to hide it. Soon other members of court also took to wearing wigs so as to copy the style of the king and seek his favor. More extreme, when Louis required surgery for an anal fistula, other members of court, again to be like the boss, also got the surgery (even if they didn’t need it). In the court of Louis XVI & Marie Antoinette some women got special pouf hairstyles constructed to advertise that they had been inoculated against smallpox, just like the king & queen had been. In every era people have found ways to signal that they are (or want to be) like the people in power.
People have always found ways to appeal to those in power and to signal their membership in a tribe. People want to be a part of the in-crowd. While corporate jargon is relatively new the motivations behind it are nothing new.