I received this question right after I finished crocheting all of the giftwrap ribbons into flowers and thought, “Huh, why am I doing this?”
The short answer is that I don’t know.
At the most direct and obvious level, I knit (or crochet, but for the sake of this post, I will be collapsing most yarn-related arts under the term “knitting”) because it’s fun, fast, and easy, and in the end you actually make something.
Knitting is very portable. I love my 3D printer, but I can’t exactly pop it in my purse and take it to the park with me. I skateboard, but I can’t skateboard at the mall or on an airplane (well, I could, but then I’d be having an awkward discussion with security). Sometimes I make chainmail, but that’s full of fiddly little bits that you can’t really balance on your knees.
Knitting is also very cheap. I’d love to learn something like carpentry or glass blowing, but these skills require a lot of time, room and expensive equipment to learn. Learning to knit only requires two pencils (smooth pencils make perfectly passable knitting needles) and a few dollars in yarn. Crochet requires an actual crochet hook, which might set you back a few more dollars, but either way, you can get started for less than $10.
The learning curve is a lot steeper on these skills, too–if I mess up while building a table, I’ve got a ruined table; if I mess up while knitting, I just pull on the yarn and undo the piece.
So that’s why you’ll see people knitting: it’s easy, cheap, and portable.
But this is only a superficial analysis, for Rubik’s cubes are also cheap and portable, and if not exactly “easy” to solve, you can certainly fiddle with them without any training.
But Rubik’s cubes are pointless, outside of the intellectual activity. With knitting, you get an actual item at the end.
To be honest, I think most women find many male-dominated hobbies, like sports or video games, pointless. I’ve seen many football games, and I can tell you the outcome of every single one: one team wins and the other loses. The world keeps on spinning and nothing changes except that some of the players get hurt. Similarly, men will sink hours into a video game and get nothing tangible as a reward.
I’ve said before that men seem to prefer hobbies in which they get to tinker. They like building their own rigs, repairing their own cars, optimizing settings, or trying to figure out the most efficient ways to do things. Women prefer to just get a product straight off the shelf, use it, and get the job done.
(Of course this tinkering does, long term, produce a lot of good things.)
The one sort of exception to this general rule is arts and crafts , where women dominate. A Norwegian study, for example, found that about 30% of Norwegian women between the ages of 18 and 50 had knitted something in the past year, but less than 7% of Norwegian men. Among older Norwegians, the gender divide was much wider–over 60% of women over 60 knitted, but the number of male knitters rounds to zero. So while most women you meet probably don’t knit, even fewer men are likely to pick up a ball of yarn.
I love the arts and crafts store; it’s like a candy shop for adults.
The desire to make little things for the home probably stems from the nesting instinct–a real instinct, found most prominently in pregnant women, who are often struck by a sudden urge to make their homes as baby-friendly as possible. This urge is often far in excess of reason, resulting in women compulsively scrubbing the kitchen tile with a toothbrush or rearranging all of the furniture in order to vacuum under the couch. Personally, aside from all of the cleaning, I made things, including a child-sized easel, a train table, and stuffed animals.
So the ultimate cause is probably a mild version of the nesting instinct–a desire to make one’s home warm and comfortable.
These skeletons can be divided into two groups: those for whom we have some historical evidence (eg, Goliath, famous literary villain), and those with no evidence except images like this one.
Incidentally, modern man does not average 6 feet tall. The average American man, hailing from a well-fed cohort, is only 5’9″ (you think men are taller than they are because they all lie). The global average is a bit smaller, at about 5’7″.
Historically, people tended to be a bit shorter, probably due to inconsistent food supplies.
I have often seen it claimed that heights fell when people adopted agriculture, but most hunter-gatherers aren’t especially tall. The Bushmen, for example, are short by modern standards; I suspect that the pre-agricultural human norm was more Bushman than Dinka.
If we roll back time to look at our pre-sapiens ancestors, Homo erectus skeletons are estimated to have been between 4’8″ and 6’1″, which puts them about as tall as we are, but with a lot of variation (we also have a lot of variation). Neanderthals are estimated at about 5’4″–5’5″; Homo habilis was shorter, at a mere 4’3″. Lucy the Australopithecine, while female, was even shorter, similar to modern chimps.
On net, a few food-related hiccups aside, humans seem to have been evolving to be taller over the past few million years (but our male average still isn’t 6 feet).
But does this mean humans couldn’t be taller?
The trouble with being unusually tall is that, unlike apatosauruses, we humans aren’t built for it. The tallest confirmed human was Robert Wadlow, at 8 feet, 11 inches. According to acromegalic gigantism specialist John Wass, quoted by The Guardian, it would be difficult for any human to surpass 9 feet for long:
First, high blood pressure in the legs, caused by the sheer volume of blood in the arteries, can burst blood vessels and cause varicose ulcers. An infection of just such an ulcer eventually killed Wadlow.
With modern antibiotics, ulcers are less of an issue now, and most people with acromegalic gigantism eventually die because of complications from heart problems. “Keeping the blood going round such an enormous circulation becomes a huge strain for the heart,” says Wass.
Ancient people, of course, did not have the benefit of antibiotics.
What about Bigfoot?
Well, Bigfoot isn’t real, but Gigantopithecus probably was.
Gigantopithecus … is an extinct genus of ape that existed from two million years to as recently as one hundred thousand years ago, during the same period as Homo erectus would have been dispersed, in what is now Vietnam, China and Indonesia, placing Gigantopithecus in the same time frame and geographical location as several hominin species. The primate fossil record suggests that the species Gigantopithecus blacki was the largest known primate that ever lived, standing up to 3 m (9.8 ft) and weighing as much as 540–600 kg (1,190–1,320 lb), although some argue that it is more likely that they were much smaller, at roughly 1.8–2 m (5.9–6.6 ft) in height and 180–300 kg (400–660 lb) in weight.
They’re related to orangutans; unfortunately it’s difficult to find their remains because the Chinese keep eating them:
Fossilized teeth and bones are often ground into powder and used in some branches of traditional Chinese medicine. Von Koenigswald named the theorized species Gigantopithecus.
Since then, relatively few fossils of Gigantopithecus have been recovered. Aside from the molars recovered in Chinese traditional medicine shops, Liucheng Cave in Liuzhou, China, has produced numerous Gigantopithecus blacki teeth, as well as several jawbones.
Please stop eating fossils. They’re not good for you.
Unfortunately, since we only have teeth and jawbones from this creature, it’s hard to tell exactly how tall it was.
Let’s just estimate, then, a maximum human height around 10 feet. After that, your heart explodes. (Joking. Sort of.)
Let’s start with Goliath.
The Philistines were a real people–one of the “Sea Peoples” who showed up in the Mediterranean during the Bronze Age Collapse:
In 2016, a large Philistine cemetery was discovered near Ashkelon, containing more than 150 dead buried in oval-shaped graves. A 2019 genetic study found that, while all three Ashkelon populations derive most of their ancestry from the local Levantine gene pool, the early Iron Age population was genetically distinct due to a European-related admixture… According to the authors, the admixture was likely due to a “gene flow from a European-related gene pool” during the Bronze to Iron Age transition…
The inscriptions at Medinet Habu consist of images depicting a coalition of Sea Peoples, among them the Peleset, who are said in the accompanying text to have been defeated by Ramesses III during his Year 8 campaign. In about 1175 BC, Egypt was threatened with a massive land and sea invasion by the “Sea Peoples,” a coalition of foreign enemies which included the Tjeker, the Shekelesh, the Deyen, the Weshesh, the Teresh, the Sherden, and the PRST. … A separate relief on one of the bases of the Osirid pillars with an accompanying hieroglyphic text clearly identifying the person depicted as a captive Peleset chief is of a bearded man without headdress. This has led to the interpretation that Ramesses III defeated the Sea Peoples including Philistines and settled their captives in fortresses in southern Canaan; another related theory suggests that Philistines invaded and settled the coastal plain for themselves. The soldiers were quite tall and clean shaven. They wore breastplates and short kilts, and their superior weapons included chariots drawn by two horses. They carried small shields and fought with straight swords and spears.
Goliath’s height increased over time: the oldest manuscripts, namely the Dead Sea Scrolls text of Samuel from the late 1st century BCE, the 1st-century CE historian Josephus, and the major Septuagint manuscripts, all give it as “four cubits and a span” (6 feet 9 inches or 2.06 metres)…
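As a quick sanity check (nothing is assumed here beyond the standard inch-to-metre conversion), the figure in the quoted passage does work out:

```python
# "Four cubits and a span" is given as 6 feet 9 inches;
# converting to metres (1 inch = 2.54 cm):
inches = 6 * 12 + 9           # 81 inches total
metres = inches * 2.54 / 100  # 2.0574 m
print(round(metres, 2))       # 2.06, matching the quoted 2.06 metres
```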
It looks like Goliath was tall, but only basketball player tall, not Guinness Book of World Records tall.
The shortest guy in the picture, “Maximinus Thrax,” was a real person and emperor of Rome from 235 to 238 AD. 8’6″ is at least within the range of heights humans can achieve, and he was, according to the accounts we have, very tall. Unfortunately, we don’t know how tall he was–the ancient accounts are considered unreliable, the Roman “foot” is not the same as the modern “foot,” and crucially, no one has dug up his skeleton and measured it.
So Maximinus was probably a tall guy, though not 8’6″ (that would require the Roman foot to equal our modern foot).
Og the Rephaim:
Og, King of the Bashan, is only known from the Bible, but might have been an actual king. We don’t have any chronicles from other countries that mention him (kings often show up in such chronicles because they make war, get defeated, send tribute, sign treaties, etc.), though there is one Agag of the Amalekites who does have a similar name.
Interestingly, there is one Og attested in archaeology, found in a funerary inscription which appears to say that if the deceased is disturbed, “the mighty Og will avenge me.”
The Bible claims that Og’s bed was 13 feet long. Wikipedia offers us an alternative explanation for this mysterious bed: a megalithic tomb:
It is noteworthy that the region north of the river Jabbok, or Bashan, “the land of Rephaim”, contains hundreds of megalithic stone tombs (dolmen) dating from the 5th to 3rd millennia BC. In 1918, Gustav Dalman discovered in the neighborhood of Amman, Jordan (Amman is built on the ancient city of Rabbah of Ammon) a noteworthy dolmen which matched the approximate dimensions of Og’s bed as described in the Bible. Such ancient rock burials are seldom seen west of the Jordan river, and the only other concentration of these megaliths are to be found in the hills of Judah in the vicinity of Hebron, where the giant sons of Anak were said to have lived (Numbers 13:33).
Og might have actually been a very tall person, though it is doubtful he was 13 feet tall. He might have been a fairly normal-sized person who had a very impressive megalithic tomb which came to be known as “Og’s Bed,” inspiring local legends. He also might not have existed at all. Until someone digs up Og’s body and measures it, we can’t say anything for sure.
Interestingly, I found two French giants, though neither of them, as far as I know, near Valence.
The Giant of Castelnau is known from three pieces of bone uncovered in 1890. If they are human, they are unusually large, but no research has been done on them since 1894, and even a crack team of Wikipedia editors has failed to uncover anything more recent on the subject.
I’d hold off judgment on these until someone within the past century actually sees them and confirms that they didn’t come from a cow.
Teutobochus, king of the Teutons, was a giant skeleton unearthed in France in 1613. Unfortunately, he seems to have been a deinotherium–that is, an extinct variety of elephant.
This is the last of the reasonable skeletons. The rest exist only in graphics like the one at the top of the post and articles discussing them–in other words, there’s more evidence for Paul Bunyan.
So far I’ve found no sources on the 15 foot Turkish giant. Yes, there are lots of people claiming it exists; no, there is not one photo of it.
Was a 19’6″ human skeleton found in 1577 A.D. under an overturned oak tree in the Canton of Lucerne? There are no records of it.
Any 23 foot tall skeletons near an unidentified river in Valence, France? Can’t find any.
And what about the 36 foot tall Carthaginian skeletons?
Giraffes, currently the tallest animals on earth, only reach 19 feet. T-rex was 12-20 feet tall. Even the famous Apatosaurus was a mere 30 feet tall (though we don’t know how high he could swing his head).
If you’re talking about humans who were bigger than an Apatosaurus, you’re really going to have to pause and take a biology check–and also check to make sure you aren’t holding an Apatosaurus femur.
Humans could be bigger (or smaller) than they currently are, just as dinosaurs came in many different sizes (some, like hummingbirds, are quite small), but different sizes require different anatomy. That’s why people with gigantism have heart trouble and tall people die younger: we aren’t built for it. Humans aren’t designed to handle Apatosaurus-level weights; our hearts aren’t designed to pump blood that far. A 36 foot tall human couldn’t be a single individual with gigantism, nor even a whole family or tribe of unusually tall people–they’d have to have evolved that way over millions of years. They’d be their own species, and we’d have actual evidence that their bones exist.
Incidentally, most of the sources I found discussing these skeletons, including ones using the graphic above, claim that evidence of these giants is being actively hidden or suppressed or destroyed by The Smithsonian, National Geographic, etc., because they would somehow disprove evolution by showing that humans have gotten shorter instead of taller.
This is absurd. Gigantopithecus is taller than any living ape (including humans) but he doesn’t disprove evolution. He doesn’t even disprove orangutans. A giant human skeleton would simply show that there was once a giant human–not that humans didn’t evolve.
Humans can evolve to be shorter–it has happened numerous times. Pygmies are living humans who are much shorter than average–adult male Pygmies average only 5 feet, 1 inch tall. Pygmoid peoples are just a little taller, and are found in many parts of the world.
Even shorter, though, were Homo floresiensis and Homo luzonensis. The remains we have so far uncovered of H. floresiensis stood a mere 3 feet, 7 inches, and luzonensis was similarly petite. Both of these hominins descended from much taller ancestors.
Evolutionists don’t need to hide the existence of giant skeletons, because evolution can’t be disproven by the existence of a tall (or short) skeleton. That’s just not how it works. The Smithsonian would love to display giant skeletons–if it had any. National Geographic would love to run articles on them. They’d make money hand over fist on such sensational relics.
The problem is that no one can actually find any of these skeletons.
So. I encountered this “TV” thing while on vacation (they had DirecTV at the hotel, and I needed the kids to stay put while I packed and unpacked).
Now, obviously we watch some TV, mostly Minecraft videos and some educational things, but regular TV is something else.
My kids actually demanded that we turn it off and maintained this policy throughout the trip (even nixing Cartoon Network).
How do people watch this thing?
I didn’t find the basic content of the programs themselves objectionable. We saw a program featuring amateur music and dance numbers that had plenty of nice performances, for example. However, I find the way these programs are structured very unappealing:
Onscreen clutter: any news program, for example, will have scrolling tickers, waving flags, and other distracting on-screen motion that has nothing to do with the things being discussed.
Frequent camera movement: like the onscreen clutter, frequent camera movement and moving transitions between video clips keep changing what’s on the screen.
Too many cuts in the footage: this contributes to the visual clutter and makes it harder to keep track of what’s going on, because the subjects keep changing.
Ads. Ads, ads, and more ads. They are guilty of all of the above and more.
Many ads have the additional problem of making me feel like advertisers think I am an idiot, which makes me angry.
We saw one ad on Cartoon Network in which kids (teens? I forget) made smoothies out of disgusting things and then drank them. This was not entertaining. This did not make my children want to watch the show being advertised. I have seen many absurd YouTube videos, but this took the cake.
I think it was Sesame Street that was first written with the idea that children have very short attention spans and thus the show needs to cut to something new every few minutes. This was obviously wrong, as kids will happily play for hours, day after day, with toys that they like. Crayons, bikes, slides, trains, dolls, trees, other kids–the average kid has no problem paying attention.
The difficulty was getting kids to pay attention to TV, which was still pretty new in the 60s and featured mostly black and white programs aimed at adults. Getting kids who wanted to go ride their bikes to pay attention to a black and white TV was hard. Sesame Street, as an educational project, began with the then-novel idea of using research on children to get them to pay attention so they could learn from the show.
So they pioneered the technique of using frequent visual/narrative switches to constantly ping your “Hey! Pay attention!” reflex.
I don’t know what the technical term for this reflex is, or if it even has one, but you’ve surely noticed it if you’ve ever heard your name randomly spoken at a crowded dinner party. Here you were, conversing with one person, not paying attention to the other conversations around you, when suddenly, ping, you heard your name and your head snapped up. Your brain efficiently filters out all of the noise that you don’t want to listen to, but lets that one word–your name–through all of the gates and filters, up to the conscious level where it demands your attention.
Sudden scene changes, well, they don’t happen in nature. If the lake you are looking at suddenly transforms into a mountain in real life, something has gone very wrong. But things do suddenly move in nature–pouncing lions, fleeing gazelles, occasionally boulders falling down a mountain. Moving things are important, so we pay attention to them.
At least Sesame Street had good intentions. Car advertisers, not so much.
So now television programming and advertisements, in order to keep you from getting bored and wandering away, have been optimized to constantly ping your “pay attention!” reflex. They have hijacked your basic survival instincts to get you to watch their ads so they can make money selling you things that you probably didn’t need in the first place (otherwise they wouldn’t have needed to work so hard to get you to watch their ads).
And you pay for this thing!
The whole thing is like a scaled down version of an arcade or casino, where the whole point is to get you to enjoy paying for the privilege of being separated from your money.
To be fair, I don’t hate all advertising. Sometimes it is useful. I understand that when I download some silly little free game, it has ads. The ads pay for the game, and since it’s on my tablet, I never have sound on and I can just put it down and ignore the ads. But I also spend very little time playing such games.
I feel like the whole thing is designed to turn your brain to jelly. If you thought for too long, you might realize that this entire storyline is stupid, that you’re wasting your time, that you don’t actually care about this thing on the news, and you’d really rather read a book or go for a walk. Instead the scene changes every few minutes so you never have time to concentrate on how meaningless it all is. (Yes, it’s all Harrison Bergeron, all the time.)
While researching my post on Music and Sex, I noticed a consistent pattern: fame is terrible for people.
Too many musicians to list have died from drug overdoses or suicide. Elvis died of a drug overdose. John Lennon attracted the attention of a crazy fan who assassinated him. Kurt Cobain killed himself (or, yes, conspiracy theorists note, might have been murdered). Linkin Park’s Chester Bennington committed suicide. Alice in Chains’s Layne Staley died of heroin. The list continues.
Far more have seen their personal relationships fail, time after time. The lives of stars are filled with breakups and drama, not just because tabloids care to report on them, but also because of the drugs, wealth, and easy availability of other partners.
At least musicians get something (money, sex) out of their fame, and most went willingly into it (child stars, not so much). But many people today are thrust completely unwillingly into the spotlight and get nothing from it–people caught on camera in awkward incidents, or whose absurd video suddenly went viral for all the wrong reasons, or who caught the notice of an internet mob.
Here we have people like the students from Covington Catholic, or the coffee shop employee who lost her job after not serving a black woman who arrived after the shop had closed, or, for that matter, almost all of the survivors of mass shootings, especially the ones that attract conspiracy theorists.
It seems that fame, like many other goods, is a matter of decreasing returns. Going from zero fame to a little fame is nearly always good. Companies have to advertise products so customers know they exist. Being known as an expert in your field will net you lots of business, recommendations, or just social capital. Being popular in your school or community is generally pleasant.
At this level, increasing fame means increasing numbers of people who know and appreciate your work, while still remaining obscure enough that people who don’t like or care for your creations will simply ignore you.
Beyond a certain level of fame, though, you’ve already gotten the attention of most people who like you, and are now primarily reaching people who aren’t interested or don’t like you. If you become sufficiently famous, your fame alone will drive people who dislike your work to start complaining about how stupid it is that someone who makes such terrible work can be so famous. No one feels compelled to talk about how much they hate a local indie band enjoyed by a few hundred teens, but millions of people vocally hate Marilyn Manson.
Sufficient fame, therefore, attracts more haters than lovers.
This isn’t too big a deal if you’re a rock star, because you at least still have millions of dollars and adoring fans. This is a big deal if you’re just an ordinary person who accidentally became famous and wasn’t prepared in any way to make money or deal with the effects of a sudden onslaught of hate.
Fame wasn’t always like this, because media wasn’t always like this. There were no million-album recording artists in the 1800s. There were no viral internet videos in the 1950s. Like everything in Texas, fame in our winner-take-all economy is bigger–and thus so are its effects.
I think we need to tread this fame-ground very carefully. Recognize when we (or others) are thrusting unprepared people into the spotlight, and withdraw from mobbing tactics. Teenagers, clearly, should not be famous. But more mundane people, like writers who have to post under their real names (or well-known pseudonyms), probably also need to take steps to insulate themselves from the spasms of random mobs of haters. The current trend of writers taking mobs–at least SJW mobs–seriously and trying to appease them is another effect of people having fame thrust upon them that they don’t know how to deal with.
I read recently (my apologies, I can’t find the link) that in every country where we have reliable testing data, a consistent pattern emerges: girls tend to do slightly better on reading/writing tasks than on mathematical tasks, and boys slightly better on mathematical tasks than on language tasks.
This is an interesting dynamic because it creates different “optimal” outcomes depending on what you are trying to optimize for.
If you optimize for individual achievement–that is, get each student to go into the field where they, personally, can do the best–the vast majority of girls will go into language-related fields and the vast majority of boys will go into math-based fields. This leaves us with a strongly gender-divided workforce.
But if we optimize instead for getting talented people into a particular field, the gender divide would be narrower. Most smart students are good at both math and language, and could excel in either domain. You could easily have a case where the best mathematician in a class is even more talented in language, or where the most verbally talented person is even more talented at mathematical tasks (but not both at once).
If we let people choose the careers that best suit them, some fields may end up sub-optimally filled because talented people go elsewhere. If we push people into particular fields, some people will end up sub-optimally employed, because they could have done a better job elsewhere.
Relatedly, we find that people show more gendered job preferences in developed countries, and less gendered preferences in undeveloped countries. In Norway, women show a pretty strong preference, on average, for careers involving people or language skills, while in the third world, they show a stronger preference for “masculine” jobs involving math, science, or technical skills. This finding is potentially explained by different countries offering different job opportunities. In Norway, there are lots of cushy jobs, and people feel comfortable pursuing whatever makes them happy or they’re good at. In the third world, technical skills are valued and thus these jobs pay well and people strive to get them.
People often ascribe the gender balance in different jobs to nefarious social forces (i.e., sexism), but it is possible that it is an entirely mundane side effect of people simply having the wealth and opportunity to pursue careers in the things they are best at.
There has been a lot of chatter lately about whether the development of human musical abilities can be explained via some form of sexual selection. Most of this debate has been needlessly heated and involved more insults than the subject warrants, so I don’t want to pick on any particular people, but all of it seems to have overlooked some basic facts:
Musical success–at least as expressed in our culture–is strongly dimorphic in favor of men.
Music groupies–that is, fans who want to have sex with musicians–are strongly dimorphic in favor of women (especially teens).
Successful musicians have tons of sex.
Let’s run through a little evidence on each of these points. First, talent:
Wikipedia has a nice list of musicians/bands by # of albums sold. It probably doesn’t include folks like Beethoven, but that’s for the best since it would muck up the data to have artists whose work has been for sale for so long.
The top selling artists, with 250 million or more record sales, are:
If this list surprises you, you might want to listen to more music.
Men dominate women here 3:1.
I’m not going to list the rest of the top-selling artists on the page, but if we total them up, I count 27 women/female bands (including two bands that are half women) and 83 male (including the two half-male bands).
Remarkably, 83:27 (and 89:29) is almost exactly 3:1.
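For anyone who wants to double-check the arithmetic, both tallies really do land within a hair of 3:1:

```python
# Check the male:female ratios cited above:
# 83 male acts to 27 female, and the alternative tally of 89 to 29.
for male, female in [(83, 27), (89, 29)]:
    ratio = male / female
    print(f"{male}:{female} is about {ratio:.2f} to 1")  # ~3.07 in both cases
```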
Now, some people object that “people liking their music enough to fork over money for it” is not a good measure of “musical talent,” but it is definitely a measure of musical success. If someone is super talented but no one wants to listen to them, well, I am a bit skeptical of the claim that they are talented.
The other common response I get to this runs along the lines of “But we tested musical ability in a lab, and in our experiments, men and women did equally well.”
All that shows is that you got different results; it doesn’t explain why the dimorphism exists in the real world. There are exceedingly few top-selling musicians in the world (118 on Wikipedia’s list, plus or minus a few deaths), and it’s highly doubtful that anyone of this caliber wandered into a university music lab. It may be that musicians of average quality show no dimorphism at all (or are even biased toward women) while exceptional musicians are disproportionately male, simply because there is no particular reason to assume that two different groups of people have the same range of abilities even if they have the same average. In fact, men have a greater range than women in many documented areas, like height and IQ–that is, while there are more men than women in Mensa, there are also more boys than girls in Special Ed.
The first time Scottish concert promoter Andi Lothian booked the Beatles, in the frozen January of 1963, only 15 people showed up. The next time he brought them north of the border… it was as if a hurricane had blown into town.
The night almost unravelled when nervous local police insisted Lothian bring the Beatles on early to satisfy rowdily impatient fans, even though his bouncers were still in the pub. “The girls were beginning to overwhelm us,” remembers Lothian, now 73 and a business consultant. “I saw one of them almost getting to Ringo’s drumkit and then I saw 40 drunk bouncers tearing down the aisles. It was like the Relief of Mafeking! It was absolute pandemonium. Girls fainting, screaming, wet seats. The whole hall went into some kind of state, almost like collective hypnotism. I’d never seen anything like it.”
Gone are all the jerky body movements that once earned Elvis Presley the nickname of ‘The Pelvis’. Gone are all the actions that were dubbed vulgar by his critics. Presley’s stage performance is now restrained. But that did not stop 5,500 wildly excited spectators at the Bloch Arena, Pearl Harbour, Hawaii from going outrageously wild with unreserved enthusiasm last Saturday night. Never have I heard anything like it. Their enthusiasm was fever-pitch, and they were screaming non-stop from start to finish, making it impossible to identify some of the songs he sang. Whether he was talking, singing, raising his eyebrows or just breathing, it was a signal for the volume of excitement to rise higher and higher throughout this fantastic concert.
Hundreds of naval police at this U.S. Navy fortress were detailed to restrain fanatical fans from invading the stage, and they were kept busy for the entire show. …
The climax came when he closed with the all-out rocker ‘Hound Dog’, the signal for the greatest bout of unlimited pandemonium, many of the younger girls going completely berserk! Then came the trickiest part of all – ‘Operation Exit Elvis’ – to get Presley out of the building before the crowd could tear him apart from sheer adoration.
“Screaming girls”—that was a recurring theme in newspaper reviews of Elvis’s stage shows in 1956 and 1957. At almost every stop, the girls screamed so loud that no one could hear Elvis sing. Even the musicians on stage had trouble hearing each other. … Elvis himself explained that at times in 1957 he had to cover his ears with his hands so that he could hear himself sing. …
When I spoke with some women who had attended an Elvis concert back in 1957, most of them admitted they had screamed. …
“We screamed when he came out. I didn’t know I was going to yell and scream. I’d never done that in my whole life. It was spontaneous. … He could excite you with his music so much. My mom’s gone; I guess she wouldn’t care if I said it now … it was like a sexual experience. It went through your body kind of like that.”
A rumor went around in ninth grade English class. We went home and turned on MTV to find out for sure. I remember girls crying in the hallway. …
I was watching the news when I heard, and cried. It was believable and unbelievable, all at the same time. It’s our generation’s “Where were you…?” moment. My husband, our friends, all remember where we were when we heard the news and how devastated we were. …
I was in the bathroom getting ready for school, and my dad yelled “Hey, some guy from that band you like is dead.”
I walked into the living room and saw them playing footage from one of their performances on the TV. And then they said his name. I immediately started bawling. I don’t think my mom made me go to school that day.
Seattle bid goodbye to Kurt Cobain on April 10 in true grunge-rock style, bursting the ranks of a quickly organized public vigil and leaping into the nearby international fountain, a giant, water-spouting structure some 50 yards wide and ten feet deep that flanks the Flag Pavilion. … Weeping girls wore beauty pageant banners around their middles, made out of the plastic yellow, “POLICE LINE DO NOT CROSS” tape, the same kind of tape which, three days earlier, had criss-crossed the driveway to Cobain and Courtney Love’s home.
At this point, denying that women (especially teen girls) seem to have some sort of thing for rock stars is right up there with denying that men have a thing for fertile young women with hourglass figures.
Third, the sex:
Groupie sex, oh groupie sex. How many groupies have rockstars actually boned?
Cracked has a pretty good overview if you’ve never heard of groupies before:
We’ve already written about the sex tents that Van Halen’s Sammy Hagar had installed wherever he performed so that he could disappear mid-solo and indulge himself in a groupie or nine. But that’s not the only way Van Halen was entrepreneurial with his young fans. Let’s take a minute and discuss how original frontman David Lee Roth amused his roadies by sending them out on groupie scavenger hunts.
From his lofty position on the stage, Roth would instruct his roadies to dive into the crowd and collect very specific girls for him to have sex on. The lucky girl would be given a special backstage pass with the initials of the roadie who approached her written in the top corner. If that pass was then among the ones strewn on his floor the next morning, Roth would reward the roadie with a $100 bonus at breakfast the next morning, because exchanging money for sex works up an appetite.
Motley Crue came up with the, uh, creative solution of rubbing burritos on their crotches so their girlfriends wouldn’t smell the scents of groupie sex on them:
He tells Hustler magazine, “We were always f**king other chicks at the studio and backstage… We would take Tommy’s (Lee) van to a restaurant called Noggles to buy these egg burritos and then rub them on our crotches to cover the smell of the girls we had just f**ked.
Before they became a quartet of endless punchlines, Van Halen used to be one of the coolest bands in the world, and they demonstrated their status by having sex with every female who wandered within one mile of their powerful aura. Their career is a filthy memorial to how being in a band is a more powerful aphrodisiac than things like “not looking completely ridiculous,” …
One tour saw the band build a tent directly beneath the stage specifically for Sammy Hagar’s erection. During the mid-show 20-minute guitar solos Eddie Van Halen would launch into each night, Hagar would disappear to the tent and discover a group of naked fans waiting to swallow his penis.
Mick Jagger, by the way, has (at least) eight children via five different women.
Look, I feel a little silly having to spell out in great detail the fact that rock stars get laid a lot. You probably feel a little silly reading it, yet there are people who seem hellbent on arguing that there’s no particular evidence in favor of sexual selection for musical talent.
And no, you can’t explain this away by saying that musicians are “famous” and that women want to have sex with all sorts of famous people. Donald Trump is famous, but he doesn’t have sex tents. Leonardo DiCaprio is famous and has legions of fans, but as far as I know, he also doesn’t have sex tents.
I agree that we can’t definitively prove how musical talent evolved among the first humans, (because we don’t have time machines,) but the correlation between sex and music today, in our own society, is overwhelming. A claim that it didn’t have similar effects on our ancestors needs to explain what changed so radically between then and now.
Likewise, we can’t assume that just because music works like this in our own society, it must also work this way in every other society. But conversely, just because something doesn’t work in one society doesn’t imply it doesn’t work in any society. There are a lot of groups out there, and some of them are obviously weird in ways that aren’t relevant to everyone else. Some people, for example, like to dress up like anthropomorphic animals and go to conventions. We should be cautious about over-generalizing from small examples. Sure, there might be a random tribe somewhere with weird traditions like killing any women who see a musical instrument being played, but these tribes generally have fewer people in them than one concert’s worth of screaming Elvis fans.
The results were striking. Various combinations of height, weight, and head shape were significantly related to 90% of the negative C-BARQ behavioral traits. Further, in nearly all cases, the smaller the dogs, the more problematic behaviors their owners reported. Here are some examples.
Height – Short breeds were more prone to beg for food, have serious attachment problems, be afraid of other dogs, roll in feces, be overly sensitive to touch, defecate and urinate when left alone, and be harder to train. They also were more inclined to hump people’s legs.
So what’s up with small dogs? Let’s run through the obvious factors first:
Culling: Behavioral and psychological problems obviously get bred out of large dogs more quickly. An anxious pug is cute; an anxious doberman is a problem. A chihuahua who snaps at children is manageable; a rottweiler who snaps at children gets put down.
Training: Since behavioral problems are more problematic in larger dogs, their owners (who chose them in the first place,) are stricter from the beginning about problematic behaviors. No one cares if a corgi begs at the dinner table; a St. Bernard who thinks he’s going to eat off your plate gets unmanageable fast.
Rational behaviors: Since small dogs are small, some of the behaviors listed in the article make sense. They pee indoors by accident more often because they have tiny bladders and just need to pee more often than large dogs (and they have to drink more often). They are more fearful because being smaller than everything around them actually is frightening.
Accident of Breeding: Breeding for one trait can cause other traits to appear by accident. For example, breeding for tameness causes changes to animals’ pelt colors, for reasons we don’t yet know. Breeding for small dogs simultaneously breeds for tiny brains, and dogs with tiny brains are stupider than dogs with bigger brains. Stupider dogs are harder to train and may just have more behavioral issues. They may also attempt behaviors (guarding, hunting, herding, etc) that are now very difficult for them due to their size.
Accident of training: people get small dogs and then stick them in doggy carriages, dress them in doggy clothes, and otherwise baby them, preventing them from being properly trained. No wonder such dogs are neurotic.
And finally, That’s not a Bug, it’s a Feature: Small dogs have issues because people want them to.
Small dogs are bred to be companions to people, usually women (often lonely, older women whose children have moved out of the house and don’t call as often as they should). As such, these dogs are bred to have amusing, human-like personalities–including psychological problems.
Lonely people desire dogs that will stay by them, and so favor anxious dogs. Energetic people favor hyperactive dogs. Anti-social people who don’t want to bond emotionally with others get a snake.
There’s an analogy here with other ways people meet their emotional/psychological needs, like Real Dolls and fake babies (aka “reborns”). The “reborn” doll community contains plenty of ordinary collectors, many grieving parents whose babies died or were stillborn, and some older folks with Alzheimer’s, as well as some folks who clearly take it too far and enter creepy territory.
Both puppies and babydolls are, in their way, stand-ins for the real thing (children,) but dogs are also actually alive, so people don’t feel stupid taking care of dogs. Putting your dog in a stroller or dressing it up in a cute outfit might be a bit silly, but certainly much less silly than paying thousands of dollars to do the same thing to a doll.
And unlike dolls, dogs actually respond to our emotions and have real personalities. As Jon Katz argues, we now use dogs, in effect, for their emotional work:
In an increasingly fragmented and disconnected society, dogs are often treated not as pets, but as family members and human surrogates. The New Work of Dogs profiles a dozen such relationships in a New Jersey town, like the story of Harry, a Welsh corgi who provides sustaining emotional strength for a woman battling terminal breast cancer; Cherokee, companion of a man who has few friends and doesn’t know how to talk to his family; the Divorced Dogs Club, whose funny, acerbic, and sometimes angry women turn to their dogs to help them rebuild their lives; and Betty Jean, the frantic founder of a tiny rescue group that has saved five hundred dogs from abuse or abandonment in recent years.
Normally we’d call this “bonding,” “loving your dog,” or “having a friend,” but we moderns have to overthink everything and give it fussy labels like “emotional work.” We’re silly, but thankfully our dogs put up with us.
The ancestors of horses–small, multi-toed quadrupeds–emerged around 50 million years ago, but horses as we know them (and their wild cousins) evolved from a common ancestor around 6 million years ago. Horses in those days were concentrated in North America, but spread via the Bering land bridge to Eurasia and Africa, where they differentiated into zebras, asses, and “wild” horses.
When humans first encountered horses, we ate them. American horses became extinct around 14,000-10,000 years ago, first in Beringia and then in the rest of the continent–coincidentally about the time humans arrived here. The first known transition from hunting horses to herding and ranching them occurred around 6,000 years ago among the Botai of ancient Kazakhstan, not far from the proto-Indo European homeland (though the Botai themselves do not appear to have been pIEs). These herds were still managed for meat, of which the Botai ate tons, until some idiot teenager decided to impress his friends by riding one of the gol-dang things. Soon after, the proto-Indo-Europeans got the idea and went on a rampage, conquering Europe, Iran, and the Indian subcontinent, (and then a little later North and South America, Africa, Australia, and India again). Those horses were useful.
Oddly, though, it appears that those Botai horses are not the ancestors of the modern horses people ride today–but instead are the ancestors of the Przewalski “wild” horse. The Przewalski was thought to be a truly wild, undomesticated species, but it appears to have been a kind of domesticated horse* that went feral, much like the mustangs of the wild west. Unlike the mustang, though, the Przewalski is a truly separate species, with 66 chromosomes. Domesticated horses have 64, so the two species cannot produce fertile hybrids. When exactly the Przewalski obtained their extra chromosomes, I don’t know.
*This, of course, depends on the assumption that the Botai horses were “domesticated” in the first place.
Instead, modern, domesticated horses are believed to have descended from the wild Tarpan, though as far as I know, genetic studies proving this have not yet been done. The Tarpan is extinct, but survived up to the cusp of the twentieth century. (Personally, I’m not putting odds on any major tarpan herds in the past couple thousand years having had 100% wild DNA, but I wouldn’t classify them as “feral” just because of a few escaped domestics.)
Thus the horse was domesticated multiple times–especially if we include that other useful member of the genus Equus, the ass (or donkey, if you’d prefer). The hardworking little donkey does not enjoy its cousin’s glamorous reputation, and Wikipedia reports,
Throughout the world, working donkeys are associated with the very poor, with those living at or below subsistence level. Few receive adequate food, and in general donkeys throughout the Third World are under-nourished and over-worked.
The donkey is believed to have been domesticated from the wild African ass, probably in ancient Nubia (southern Egypt/northern Sudan). From there it spread up the river to the rest of Egypt, where it became an important work animal, and from there to Mesopotamia and the rest of the world.
Wild African asses still exist, but they are critically endangered.
I have no idea why equines have so much chromosomal diversity; dogs have been domesticated for much longer than horses, but are still interfertile with wolves and even coyotes (tbf, maybe horses could breed with tarpans.)
Interestingly, domestication causes a suite of changes to a species’ appearance that are not obviously useful. Recently-domesticated foxes exhibit pelt colors and patterns similar to those of domesticated dogs, not wild foxes. We humans have long hair, unlike our chimp-like ancestors. Horses also have long manes, unlike wild zebras, asses, and tarpans. Horses have evolved, then, to look rather like humans.
Also like humans, horses have different male and female histories. Male horses were quite difficult to tame, and so early domesticators only obtained a few male horses. Females, by contrast, were relatively easy to gentle, so breeders often restocked their herds with wild females. As a result, domesticated horses show far more variation in their mitochondrial DNA than their Y chromosomes. The stocking of herds from different groups of wild horses most likely gave rise to 17 major genetic clusters:
From these sequences, a phylogenetic network was constructed that showed that most of the 93 different mitochondrial (mt)DNA types grouped into 17 distinct phylogenetic clusters. Several of the clusters correspond to breeds and/or geographic areas, notably cluster A2, which is specific to Przewalski’s horses, cluster C1, which is distinctive for northern European ponies, and cluster D1, which is well represented in Iberian and northwest African breeds. A consideration of the horse mtDNA mutation rate together with the archaeological timeframe for domestication requires at least 77 successfully breeding mares recruited from the wild. The extensive genetic diversity of these 77 ancestral mares leads us to conclude that several distinct horse populations were involved in the domestication of the horse.
The wild mustangs of North America might have even more interesting DNA:
The researchers said four family groups (13.8%) with 31 animals fell into haplogroup B, with distinct differences to the two haplogroup L lineages identified.
The closest mitochondrial DNA sequence was found in a Thoroughbred racing horse from China, but its sequence was still distinct in several areas.
The testing also revealed links to the mitochondrial DNA of an Italian horse of unspecific breed, the Yunnan horse from China, and the Yakutia horse from central Siberia, Russia.
Haplogroup B seems to be most frequent in North America (23.1%), with lower frequencies in South America (12.68%) and the Middle East (10.94%) and Europe (9.38%).
“Although the frequency of this lineage is low (1.7%) in the Asian sample of 587 horses, this lineage was found in the Bronze Age horses from China and South Siberia.”
Westhunter suggests that this haplogroup could have originated from some surviving remnant of American wild horses that hadn’t actually been completely killed off before the Spanish mustangs arrived and bred with them. I’d offer a more prosaic possibility: the Russians brought these horses while colonizing Alaska and the coast down to northern California. Either way, it’s an intriguing finding.
The horse has been man’s companion for thousands of years and helped him conquer most of the Earth, but the recent invention of internal and external combustion engines (e.g., the Iron Horse) has put most horses out to pasture. In effect, they have become obsolete. Modern horses have much easier lives than their hard-working plow- and wagon-pulling ancestors, but their populations have shrunk enormously. They’re not going to go extinct, because rich people still like them (and they are still useful in parts of the world where cars cannot easily go,) but they may suffer some of the problems of inbreeding found in genetically narrow dog breeds.
Maybe someday, significant herds of wild horses will roam free again.
Well, there’s a clickbaity title if ever I wrote one.
Nevertheless, human breasts are strange. Sure, all females of the class Mammalia are equipped with mammary glands for producing milk, but humans alone possess permanent, non-functional breasts.
Yes, non-functional: the breast tissue that develops during puberty and that you see on women all around you is primarily fat. Fat does not produce milk. Milk ducts produce milk. They are totally different things.
Like all other mammals, the milk-producing parts of the breasts only activate–make milk–immediately after a baby is born. At any other time, milk production is a useless waste of calories. And when mothers begin to lactate, breasts noticeably increase in size due to the sudden production of milk.
A number of factors associated with low milk supply have been identified, such as nipple pain, ineffective nursing, hormonal disorders, breast surgery, certain medications, and maternal obesity. … Research into breast size and milk production shows that milk supply is not dependent on breast size, but rather on the amount of epithelial tissue contained in a breast that is capable of making milk …
However, in addition to baby attachment issues, accumulating evidence shows that a major factor preventing overweight and obese mothers to breastfeed is the inability of their breast epithelial cells to start producing copious amounts of milk after birth. This is often referred to as unsuccessful initiation of lactation. …
a recent study took advantage of breast epithelial cells non-invasively isolated from human milk. In these cells, certain genes are turned on, which enable the cells to gradually make milk as the breast matures during pregnancy, and then deliver it to the baby during breastfeeding.
The study reported a negative association between maternal BMI (body mass index), and the function of a gene that represents the milk-producing cells. This suggested that the breast epithelial tissue is not as mature and ready to make copious amounts of milk in mothers with higher BMI. Most likely, the large breasts of overweight or obese mothers contain more fat cells than milk-making cells, which can explain the low milk supply of many of these mothers.
Therefore, breast size does not necessarily translate to more milk-producing cells or higher ability to make milk.
More fat=less room for milk production.
Interestingly, average cup size varies by country. Of course the data may not be 100% accurate, and lumping everyone together at the national level obscures many smaller groups, like Siberians, but it still indicates some general trends that we can probably trust.
If breasts don’t actually make milk, then why on Earth do we have them? Why are women cursed with lumpy fat blobs hanging off their chests that have to be carefully smushed into specialized clothing just so we can run without them flopping around painfully?
And for that matter, why do we think they look nice?
One reasonable theory holds that breasts are really just front-butts. Our apish ancestors, like modern chimpanzees, most likely did not copulate ad libitum like we do, but only when females were fertile. Female fertility among our chimpish relatives is signaled via a significant swelling and reddening of their rear ends, a clear signal in a species that wears no clothes and often walks on four limbs.
When humans began walking consistently on two legs, wearing clothes, and looking at each other’s faces, this obvious signal of female fertility was lost, but not our desire to look at rear-ends. So we simply transferred this desire to women’s fronts and selectively had more children with the women who piqued our interests by having more butt-shaped cleavage.
In support of this theory, many women go to fair lengths to increase the resemblance between their ample bosoms and an impressive behind; against this theory is the fact that no other bottom-obsessed species has accidentally evolved a front-butt.
I realized yesterday that there is an even simpler potential explanation: humans are just smart enough to be stupid.
Most of us know that breasts produce milk. Few of us really understand the mechanism of how they produce milk. I had to explain that fat lumps don’t produce milk at the beginning of this post because so few people actually understand this. Far more people think “Big breasts=lots of milk” than think “big breasts=lactation problems.” Humans have probably just been accidentally selecting for big breasts for millennia while trying to select for milk production.
Of course there are smart people who are insane, and dumb people who are completely rational. But if we define intelligence as having something to do with accurately understanding and interpreting the information we constantly receive from the world, necessary to make accurate predictions about the future and how one’s interactions with others will go, there’s a clear correlation between accurately understanding the world and being sane.
In other words, a sufficiently dumb person, even a very sane one, will be unable to distinguish between accurate and inaccurate depictions of reality and so can easily espouse beliefs that sound, to others, completely insane.
Is there any way to distinguish between a dumb person who believes wrong things by accident and a smart person who believes wrong things because they are insane?
Digression: I have a friend who was homeless for many years. Eventually he was diagnosed as mentally ill and given a disability check.
“Why?” he asked, but received no answer. He struggled (and failed) for years to prove that he was not disabled.
Eventually he started hearing voices, was diagnosed with schizophrenia, and put on medication. Today he is not homeless, due at least in part to the positive effects of anti-psychotics.
The Last Psychiatrist has an interesting post (deleted from his blog, but re-posted elsewhere,) on how SSI is determined:
Say you’re poor and have never worked. You apply for Welfare/cash payments and state Medicaid. You are obligated to try and find work or be enrolled in a jobs program in order to receive these benefits. But who needs that? Have a doctor fill out a form saying you are Temporarily Incapacitated due to Medical Illness. Yes, just like 3rd grade. The doc will note the diagnosis, however, it doesn’t matter what your diagnosis is, it only matters that a doctor says you are Temporarily Incapacitated. So cancer and depression both get you the same benefits.
Nor does it matter if he medicates you, or even believes you, so long as he signs the form and writes “depression.”(1) The doc can give you as much time off as he wants (6 months is typical) and you can return, repeatedly, to get another filled out. You can be on state medicaid and receive cash payments for up to 5 years. So as long as you show up to your psych appointments, you can receive benefits with no work obligation.
“That’s not how it works for me”
you might say, which brings us to the whole point: it’s not for you. It is for the entire class of people we label as poor, about whom comic Greg Giraldo joked: “it’s easy to forget there’s so much poverty in the United States, because the poor people look just like black people.” Include inner city whites and hispanics, and this is how the government fights the War On Poverty.
In the inner cities, the system is completely automated. Poor person rolls in to the clinic, fills out the paperwork (doc signs a stack of them at the end of the day), he sees a therapist, a doctor, +/- medications, and gets his benefits.
There’s no accountability, at all. I have never once been asked by the government whether the person deserved the money, the basis for my diagnosis– they don’t audit the charts, all that exists is my sig on a two page form. The system just is.
Enter SSI, Supplemental Security Income. You can earn lifetime SSI benefits (about $600/mo + medical insurance) if “you” can “show” you are “Permanently Disabled” due to a “medical illness.”
“You” = your doc who fills out a packet with specific questions; and maybe a lawyer who processes the massive amounts of other paperwork, and argues your case, and charges about 20% of a year’s award.
“show” has a very specific legal definition: whatever the judge feels like that day. I have been involved in thousands of these SSI cases, and to describe the system as arbitrary is to describe Blake Lively as “ordinary.”
“Permanently disabled” means the illness prevents you from ever working. “But what happens when you get cured?” What is this, the future? You can’t cure bipolar.
“Medical illness” means anything. The diagnosis doesn’t matter, only that “you” show how the diagnosis makes it impossible for you to work. Some diagnoses are easier than others, but none are impossible. “Unable to work” has specific meaning, and specific questions are asked: ability to concentrate, ability to complete a workweek, work around others, take criticism from supervisors, remember and execute simple/moderately difficult/complex requests and tasks, etc.
Fortunately, your chances of being awarded SSI are 100%…
It’s a good post. You should read the whole thing.
TLP’s point is not that the poor are uniformly mentally ill, but that our country is using the disability system as a means of routing money to poor people in order to pacify them (and maybe make their lives better.)
I’ve been playing a bit of sleight of hand, here, subbing in “poor” and “dumb.” But they are categories that highly overlap, given that dumb people have trouble getting jobs that pay well. Despite TLP’s point, many of the extremely poor are, by the standards of the middle class and above, mentally disabled. We know because they can’t keep a job and pay their bills on time.
“Disabled” is a harsh word to some ears. Who’s to say they aren’t equally able, just in different ways?
Living under a bridge isn’t being differently-abled. It just sucks.
Normativity bias happens when you assume that everyone else is just like you. Middle and upper-middle class people tend to assume that everyone else thinks like they do, and the exceptions, like guys who think the CIA is trying to communicate with them via the fillings in their teeth, are few and far between.
As for the vast legions of America’s unfortunates, they assume that these folks are basically just like themselves. If they aren’t very bright, this only means they do their mental calculations a little slower–nothing a little hard work, grit, mindfulness, and dedication can’t make up for. The fact that anyone remains poor, then, has to be the fault of either personal failure (immorality) or outside forces like racism keeping people down.
These same people often express the notion that academia or Mensa are crawling with high-IQ weirdos who can barely tie their shoes and are incapable of socializing with normal humans, to which I always respond that furries exist.
These people need to get out more if they think a guy successfully holding down a job that took 25 years of work in the same field to obtain and that requires daily interaction with peers and students is a “weirdo.” Maybe he wears more interesting t-shirts than a middle manager at BigCorp, but you should see what the Black Hebrew Israelites wear.
I strongly suspect that what we would essentially call “mental illness” among the middle and upper classes is far more common than people realize among the lower classes.
As I’ve mentioned before, there are multiple kinds of intellectual retardation. Some people suffer physical injuries (like shaken baby syndrome or encephalitis), some have genetic defects like Down’s Syndrome, and some are simply dull people born to dull parents. Intelligence is partly genetic, so just as some people are gifted with lucky smart genes, some people are visited by the stupid fairy, who only leaves dumb ones. Life isn’t fair.
Different kinds of retardation manifest differently, with different levels of overall impairment in life skills. There are whole communities where the average person tests as mentally retarded, yet people in these communities go on providing for themselves, building homes, raising their children, etc. They do not do so in the same ways as we would–and there is an eternal chicken and egg debate about whether the environment they are raised in causes their scores, or their scores cause their environment–but nevertheless, they do.
All of us humans are descended from people who were significantly less intelligent than ourselves. Australopithecines were little smarter than chimps, after all. The smartest adult pygmy chimps (bonobos), like Kanzi, only know about 3,000 words, which is about the same as a 3 or 4 year old human. (We marvel that chimps can do things a kindergartener finds trivial, like turn on the TV.) Over the past few million years, our ancestors got a lot smarter.
How do chimps think about the world? We have no particular reason to assume that they think about it in ways that substantially resemble our own. While they can make tools and immediately use them, they cannot plan for tomorrow (dolphins probably beat them at planning.) They do not make sentences of more than a few words, much less express complex ideas.
Different humans (and groups of humans) also think about the world in very different ways from each other–which is horrifyingly obvious if you’ve spent any time talking to criminals. (The same people who think nerds are weird and bad at socializing ignore the existence of criminals, despite strategically moving to neighborhoods with fewer of them.)
Even non-criminal communities have all sorts of strange practices, including cannibalism, human sacrifice, wife burning, genital mutilation, coprophagy, etc. Anthropologists (and economists) have devoted a lot of effort to trying to understand and explain these practices as logical within their particular contexts–but a different explanation is possible: that different people sometimes think in very different ways.
For example, some people think there used to be Twa Pygmies in Ireland, before that nefarious St. Patrick got there and drove out all of the snakes. (Note: Ireland didn’t have snakes when Patrick arrived.)
(My apologies for this being a bit of a ramble, but I’m hoping for feedback from other people on what they’ve observed.)