Approximately once a week, I get an email from the genetic-testing startup 23andme that says it has updated its reports on my five-year-old spit sample. As far as I can tell, the only thing that ever changes is my percentile score for Neanderthal admixture, and it only ever increases. At first, the emails told me that I had more Neanderthal DNA than 92 percent of users. But now, that number has risen to 96 percent — I have 318 instances of Neanderthal-gene variants across my 23 chromosome pairs, which is somehow more than either of my parents. While this doesn’t quite add up mathematically, it certainly checks out aesthetically — I have the wide, robust body, projecting mid-face and large nose, as well as an unusually prominent occipital bun on the back of my skull. Like the Neanderthal, I flush easily and am deeply intolerant of heat, and scientists are still debating whether I am capable of complex symbolic thought.
Despite these coincidences, I remain unfazed about being less than 100 percent human. Given that the last interbreeding between Homo sapiens and Homo neanderthalensis probably took place 2,000 generations ago, it is more reasonable to assume that my caveman-like attributes are a gift from the more recent proto-human populations (Boston Irish, French Canadian) in my family tree. Bemusement like mine seems to be the majority reaction to users’ high Neanderthal scores, if the 23andme subreddit is any guide, but there are also quite a few examples of ex-Mormon 23andme customers using their results to initiate arguments with their families about the Mormon Church’s idiosyncratic creationist doctrine.
But what does it mean to be more Neanderthal than 96 percent of 23andme customers? Theological considerations aside, the implications are far from clear. According to the U.S. National Library of Medicine, such data points “do not provide practical information about a person’s current health or chances of developing particular diseases. Having more or less DNA in common with archaic humans says nothing about how ‘evolved’ a person is, nor does it give any indication of strength or intelligence.” Sad. Studies have suggested that Neanderthal genes may play a role in the relative risks of depression and nicotine addiction, but 23andme — which has battled the Food and Drug Administration in the past over its authority to give health advice — now tends to steer clear of heavier subjects. Instead, the site tells me that there are Neanderthal variants associated with height, straight hair, and the “reduced tendency to sneeze after eating dark chocolate.” Somehow, none of these attributes is present in my own genome, leaving me with nothing substantive to which I could attribute my uncommonly high percentile score.
One-to-one mappings like these, which attempt to neatly connect a specific, visible trait to a specific Neanderthal gene, may be missing the point entirely. A 2017 paper by researchers at the Max Planck Institute for Evolutionary Anthropology found that the effect of Neanderthal variants on modern human phenotypes is complex and often unintuitive. Individuals in the U.K. with red hair had the lowest frequency of Neanderthal hair-color alleles (1.4 percent). Neanderthal hair-color alleles were found at higher rates (10 to 11 percent) among blond-, brown- and black-haired individuals, with distinct Neanderthal genes contributing equally to each phenotype. Similarly, the researchers found two separate Neanderthal contributions to variance in skin pigmentation — one is associated with darker skin, and the other grants us increased susceptibility to sunburn.
The first solid evidence that modern humans interbred with Neanderthals appeared in 2010, when researchers were able to sequence the Neanderthal genome using bone fragments from three individuals who met their end in a cave in Croatia around 40,000 years ago. Comparison with the Homo sapiens genome has yielded estimates that modern humans outside sub-Saharan Africa are between one and four percent Neanderthal. (In my case, that number is closer to four.) This admixture likely took place in the Levant region, where the first physical evidence of cohabitation was discovered in 2015. These discoveries would have greatly embarrassed earlier anthropologists, who were eager to emphasize the remains’ resemblance to apes and in 1866 proposed giving the species the scientific name Homo stupidus.
The last remnants of the Neanderthal species died out approximately 40,000 years ago. There are many theories for why Homo sapiens survived and Homo neanderthalensis did not, including violence between the two species, for which there is little evidence; our domestication of the dog and the resultant hunting advantages; the higher daily caloric requirement of Neanderthals; and the unintended transmission of exotic diseases. The best-supported theories revolve around the timeless villain of climate change. Between 41,000 and 39,000 years ago, the waning days of the last ice age brought rapid environmental changes to Europe, including multiple centuries-long cold and dry periods that turned much of the continent from forest to unforgiving steppe. This would have seriously reduced the habitat of the large mammals Neanderthals had adapted to hunt over the preceding 400,000 years. Their population, which may have numbered fewer than 10,000 individuals prior to this disaster, was unable to recover.
The other major factor was adaptability. A study earlier this year found that Neanderthals had smaller cerebellar hemispheres than modern humans. Cerebellum size is correlated with, among other things, cognitive flexibility and the ability to innovate in response to changing circumstances. The authors suggest that this led to a “critical difference in cognitive and social ability between the two species” which may have underpinned their extinction, particularly given the widespread climate change that marked their last millennia as a species. Some of the last Neanderthals may have lived on artifact-rich Gibraltar, a peninsula on the southern coast of Spain, where the climate would have been one of the mildest in Europe during the cold/drought periods that coincided with their final disappearance. (Coincidentally, Gibraltar is also home to Europe’s only wild population of non-human primates.) An eight-mile sea voyage across the Strait of Gibraltar would have taken them to the ice-free climate of North Africa — a tiny jump compared to what Homo sapiens did in Polynesia with wooden canoes — but something, possibly the non-firing of a few key neurons, seems to have stood in the way.
While infinite adaptability is essential for modern life, it would have seemed superfluous to the vast majority of our forebears, both neanderthalensis and sapiens. Prehistory (excluding the span between the inventions of agriculture and writing) wasn’t merely a time before written history — it was a state of being that changed so little that it precluded the very concept of history. Without history, there are no historical narratives. Without historical narratives, the past, present and future blend together.
As a Neanderthal or a Cro-Magnon, you would have used stone tools to hunt large game. Your tenth-great-grandparents, and your tenth-great-grandchildren, would have used the same stone tools to hunt the same large game. Except for climate patterns, there was precious little change to adapt to for most of the 200,000 years anatomically modern humans have existed. The slight cognitive superiority of sapiens would have been the result of a few chance mutations, and its advantages would have taken thousands of years to become apparent in the two species’ relative rates of survival. In short, we have been massively overqualified for the job for most of our tenure here.
We now find ourselves nearing the limits of our adaptability. The rate of social and technological change, which was effectively zero for so long, is growing exponentially. When humans invented the revolutionary technologies of agriculture and metallurgy, we had tens or hundreds of generations to adapt. That adjustment period has been narrowing ever since. The concept of a “generation gap” would have been entirely alien to anyone living before the 20th century, when parents first started noticing profound cultural and technological rifts between themselves and their offspring. This process has accelerated to the point that we now have intra-generation gaps, like the newly theorized divide between “old millennials” and “young millennials,” and generations are far more divided on issues like race, immigration and religion than they were 50 years ago. Most notably, there is a large age gap when it comes to our views on climate change, meaning that the survival of the human species could hinge on this very, very new axis of political division. If the pattern holds, it will only get worse — particularly if the vampires of Silicon Valley fulfill their promises of life extension, thereby allowing Gen X to live and vote concurrently with the cybernetic mecha-newborns of 2100.
Where we once sneered at Homo stupidus for hitting the upper limit of its adaptability during a cold snap, we now find ourselves perilously close to the same fate. As we begin to suffer the consequences of man-made climate change, information overload and the extirpation of the largest, meatiest mammals, the ghosts of the last species ever to seriously compete with us are ever present. The remains of our defeated competitors seem to appear at portentous times — after 40,000 years of us completely forgetting our closest relatives existed, the first bone fragments were discovered in 1829, at the apex of the first Industrial Revolution, and the type specimen was found in the Neandertal Valley in 1856, the same year Sir Henry Bessemer patented his method for the mass production of steel. Their spirit inhabits the Barbary macaques of Gibraltar, surviving eternally at the very tip of Europe, groping and stealing sunglasses from tourists to taunt us for our hubris. Slowed down, their shrieks sound suspiciously like laughter.