From Prehistoric Naivety to Hypermodern Alienation
A metahistorical analogy and the Promethean revolt
There’s a self-congratulatory way of understanding the metahistory that arose after the Scientific Revolution, according to which history is progressive and can be divided into stages that reflect the diminishing degrees of our collective ignorance.
Prehistory and early history would have been marked by myth, magic, and religion. Those periods eventually led to philosophy and protoscience, which produced more rational, naturalistic explanations. Finally, there was the period of science and hyperskepticism in which we tapped into overflowing sources of information and learned to circumvent our personal biases by subjecting our suspicions to scientific tests.
This “modern” interpretation of history was partly propagandistic since the notion of secular progress figured in the culture war between faith and reason. The positivistic story about how our species progressed by solving its problems with courage, reason, and industrious ambition was meant to contrast with the Christian story that we ought to surrender to tyrannical supernatural powers — and to the earthly theocracies that supposedly speak for them. Instead of hard-won, human-made progress, there would be an apocalyptic overhaul of the fallen, natural domain at the end of time, rendering secular advances futile. God would even punish us for that hubris.
In any case, that self-serving appraisal overlooked a more objective basis for comparing the historical stages. Oswald Spengler’s theory of the cyclical progression of civilizations was closer to the mark, although he too presupposed a Romantic outlook.
A civilization, he said, begins with pure commitment to an ideal or a creative vision that distinguishes the culture’s character. Eventually, the population loses confidence in that ideal or can no longer remember the principles that define its collective identity, perhaps due to the population’s mixing with foreigners. In time, the civilization becomes a husk of its former self, and its people turn aimless and nihilistic, awaiting a social rebirth or conversion to a new way of life.
Spengler’s theory is much more complicated than that summary and includes numerous insights on a host of topics. But his basic point about the cyclicality of civilizations, which is to say their necessary growth and inevitable decline, suggests a comparison with the universal stages of each individual’s growth. The collective stages of history map onto the phases of the individual’s psychological development, not because of any zeitgeist or cosmic design or telos, but because the accumulation of knowledge has similar consequences on either scale.
The analogy between historical and individual stages of growth
Children are relatively carefree because they’re unburdened by responsibilities, and they lack duties because they don’t yet know much and haven’t done much of anything. Social responsibilities are supposed to be assigned to those who’ve shown they can handle them. Children are still growing and learning, so their job, as it were, is mainly to play.
By contrast, adults eventually become aware of too much to be as content as a simple-minded child. There are Eastern religions that teach ways of regaining that initial peace of mind, but there’s no way for an adult to return to a child’s state of naivety without sacrificing her intellectual integrity. Not only does the adult know enough about how the real world works to throw into doubt conventional reassurances and easy religious and therapeutic answers, but she’s weighed down by her experiences, particularly by her memories of what she’s done and of what the world has done to her.
This looks like the underlying dynamic behind Spengler’s idea of the nostalgic loss of our formative purity and vitality. Civilizations would begin in a state of relative ignorance and inexperience and would proceed to develop a history that ends in excessive cultural memory. That memory would bear the bad news that whereas the culture’s founders might have had high hopes, based on their puritanical idealism, the later generations that attempt to apply those ideals discover that the real world at large is perfectly indifferent to the founding vision.
The ideals run up against reality, and the latter grinds down the believers until the later generations naturally lose faith. They enter a “hypermodern” phase of jadedness. These latter-day generations may enjoy wealth, material comforts, and other fruits of economic growth, but these comforts only exacerbate the malaise. Their birth rate declines as they become inward-looking and decadent, since they no longer want to throw children to the wolves. Their politics are corrupt because the population no longer trusts the founding principles that are supposed to rein in the baser social dynamics of parasitism, predatory exploitation, and the calcification of dominance hierarchies.
But Spengler’s civilizational scope makes for a somewhat muddled account because of the cross-fertilization of ideas in history and the cultural mixing at all stages, including the early ones. The borders of a civilization aren’t always so clear, and we risk losing sight of subcultures if we focus only on the broadest tendencies.
Prehistoric naivety and the accumulation of knowledge
A less equivocal comparison, then, would be between the stages of individual human development and those of our entire species’ accumulation of knowledge and historical memory. This comparison would be objective because of the technological limits on prehistoric people’s ability to remember their past and investigate their present.
In that respect, nomadic people in the Stone Age were childlike. That’s not to say they led carefree lives, of course, since they struggled through ice ages and faced many dangers in the wild with little to protect them; as a result, their average lifespan was relatively short (33 years).
However, the hunter-gatherers also didn’t need to work as hard, because their groups were small and stable; they worked less for their food than the farmers who succeeded them, who had to feed the ballooning populations of their sedentary societies. But the most obvious childlike aspect of the hunter-gatherer clans was the naivety of their conception of how they fitted into the world, due to their absence of historical memory.
By definition, in prehistory, there was no reliable sharing of knowledge between generations because there was no objective record. There was an oral transmission of techniques for hunting and the crafting of tools, but there were no sophisticated languages, written records, or long-lasting artifacts to sustain an appreciation of a shared human effort, a historic “race” to achieve a common objective.
To be sure, the prehistoric wanderers had a deep knowledge of their immediate surroundings since they had to track their prey and study the weather and the flora to distinguish between the advantageous conditions and the threatening ones. This is to say they were experts on how to survive in their evolutionary context — but that’s only an animal’s scope of knowledge. All animals that flourish master their terrain in occupying some niche. Prehistoric people’s expertise in surviving by playing their biologically assigned role may have been boundless, but that adaptation wouldn’t have amounted to the type of far-flung awareness that defines behaviourally modern people.
That awareness likely began to accelerate in the Later Upper Paleolithic with the transmission of culture, owing to the pooling of knowledge in ever-larger human populations; the inventions of language and, much later, of writing and more durable materials; and the advent of hierarchical social structures, with leaders marshaling their subordinates through myths and religions that brought the question of their shared undertaking to the forefront.
As for the prehistoric animist’s mindset, the projection of human spirituality onto the entirety of nature would have made for a charming mode of experience. Animists treated the environment as an extension of themselves, one they could negotiate with rather than responsibly objectify, somewhat like how the child’s social instincts spill out into a world the child perceives as magical because she hasn’t yet developed a fixed personal identity (her brain is still growing).
The child talks to invisible friends and experiences every new sensation as uncanny because her brain is laser-focused on making sense of what she encounters for the first time. Similarly, because of its lack of historical memory, each generation of prehistoric animists encountered the world anew and deemed it animated by sociable projections of the animists’ will and intelligence (the “spirits” of nature).
Facets of Neolithic domination
The Neolithic period begins with the formation of villages that retained the egalitarian mores of the hunter-gatherers and even their practices of hunting and of cultivating the land for food. Agriculture and the domestication of livestock emerge after the advent of permanent settlements, not as radical breaks from the long childhood of our species, but as renovations of prehistoric methods.
As Peter Watson points out in Ideas: A History of Thought and Invention from Fire to Freud, there was a half-way stage between hunter-gatherers and full-blown farmers who domesticated animals and sowed crops. In that middle stage, villagers cultivated “both plants and animals, in the sense of clearing areas and planting grasses or vegetables or fruits,” as well as practicing “rough herding” and keeping pets.
The archaeologist Jacques Cauvin highlights how religion grew with the rise of agriculture and the domestication of animals, and he takes this trifecta to have been crucial to behavioral modernity. Whereas the animistic hunter-gatherers worshipped animals, 10,000–12,000 years ago a more anthropocentric faith emerged, one, for example, in which goddesses were worshipped with their bull or masculine consorts (symbolizing nature’s untamable character), as shown by carvings of the period. For the first time, says Cauvin, people were depicted as subordinate to gods who, though human in form, were portrayed as superhuman.
This effective deification of human nature and the privileging of humans over animals arose from the shift to sedentary village life, since the villagers had separated themselves from the wild and had begun to adapt to an environment composed largely of other people.
As Watson summarizes Cauvin’s view, it was:
…the cultivation [not the complete domestication] of wild species of cereals that grew in abundance in the Levant and allowed sedentism to occur. It was sedentism which allowed the interval between births to be reduced, boosting population, as a result of which villages grew, social organisation became more complicated and, perhaps, a new concept of religion was invented, which in some ways reflected the village situation, where leaders and subordinates would have emerged. Once these changes were set in train, domesticated plants at least would have developed almost unconsciously as people ‘selected’ wild cereals which were amenable to this new lifestyle.
Cauvin makes the point starkly in The Birth of the Gods and the Origins of Agriculture: “Animal domestication was above all a response to the human desire for domination over the animal kingdom.” Indeed, the same can be said about sedentism, the removal of people from the wild with the introduction of artificial, fixed structures (circular and later rectangular houses), and about the earliest organized religions. Cauvin quotes Jean-Pierre Digard as saying that the “stupefying zeal for domestication” is explained by the image this practice gives to us of “a power over life and living things.”
After the half-way measures of cultivation, domestication was effectively a form of enslavement of dogs, goats, donkeys, horses, pigs, sheep, cows, chickens, and the rest. Sedentism was the creation of an artificial, culture-laden environment that rivaled and displaced the wilderness. Theistic religions featuring human-like gods were a means of boosting pride in human nature and rationalizing larger-scale societies in which an elite class of people eventually dominated the lower classes just as humans began to dominate the animals.
Midlife crisis and the historic dawn of alienation
I emphasize a few details of this transition to draw out some existential implications of the analogy at hand. The rise of domestication, religion, and cities was comparable to the individual child’s formation of her mind, as she builds up a self-image based on her accumulated memories, experience, and developing character. The inventions of writing and a more permanent record in buildings, megaliths, cave paintings, and texts were like the routine ways of thinking that mark the child’s graduation to her teen and young adult years.
Eventually, the child loses the freedom of her potentiality, as she becomes trapped by her choices. By fixating on this or that, by pursuing one path rather than another in her formative years, she acquires a personality with which she’ll identify for the rest of her life. The fixation of her memories and her personality is like the leaving of artifacts by earlier generations for later historians and anthropologists to find and interpret.
The Neolithic individuals became confined, too, to their social roles as they were forced to work not for the equal benefit of the group, as in a nomadic clan, but to serve the priests and royals who represented the gods. After the loss of our initial nomadic freedom, which corresponded in some respects to childhood play in individual development, the emerging specialization of societal classes, as codified by the ranks of gods in a polytheistic pantheon, reflected the burdens of adulthood.
As the population expands, each member must learn his or her place in an inflexible hierarchy and submit to conventional expectations. Most civilized human adults, too, become domesticated (virtually enslaved), although the elite class has more luxury to indulge its appetites.
As the techniques of investigating the world and spreading culture — our collective character — improve, especially in the last few centuries but also in waves with the innovations of ancient empires, we enter something like a globalized midlife crisis, a phase of hyperskepticism, cynicism, and apathy as we begin to appreciate the relativity and ultimate insignificance of all our works. The knowledge we painstakingly gather consists of so many messages that undermine our once-naïve confidence in our united enterprise.
At least since the advent of historical memory, we know we’re part of the human race, Team Human. If we separate the special case of prehistoric people from the adulthood of our species, in the same way that individual children are segregated from adult activities, we can say that all civilized people face the same universal conditions.
In particular, we pass from the naivety of childhood to the anxiety of our teen years, then to adult productivity and relative servitude, and finally to some existential crisis, a period of doubt and alienation. Many people attempt to overcome that later stage by immersing themselves in their societal roles or retreating to childish ways of thinking, such as by maintaining religious faith in antiquated dogmas. But almost all adults have at least the rational capacity to doubt their convictions and to understand that the world isn’t as magical and innocent as it seemed when we were ignorant children.
Promethean revolt
There are at least two existential lessons to draw from this analogy between our collective and individual developments. First, although the duration of our prehistoric period is vastly greater than that of the civilized period that has unfolded to date, the character of our species was defined by the transition from our collective “childhood” to our adulthood. Just as the individual teen forms her personality and suffers some anxiety from having to leave behind her earlier freedom and naivety, our ancestors chose to leave a life of roaming the wilderness for a settled lifestyle in which they took up certain decisive responsibilities.
In so far as we participate in mass society, our collective identity is defined specifically by a hubristic revolt against the natural world that disappointed us for being less than what it seemed when we knew no better in our prehistory. Regardless of whether we identify as Muslim or Christian, Chinese or Australian, or as a policeman, teacher, politician, or chef, we’re fundamentally all Prometheans. In Western mythological terms, the demonized version of Prometheus, the Greek Titan who stole fire and gave us the ingenuity that set us on a path of technological progress, is akin to Satan, the devil and rebel angel who despises the creator God and seeks to spoil Creation.
In short, behavioral modernity (especially the cultured, civilized human project) is essentially satanic: not in any confused theological sense, but in the mythic one that captures the gravity of civilization’s challenge to wild nature. The Anthropocene testifies to the magnitude of our impact on the planet, just as the ecological catastrophes we face represent the “divine” punishment for that “sin” of hubris. All of this began with the building of villages, the application of techniques for harnessing plants and animals, and the shift from egalitarian animism to egocentric, implicitly theocratic religions.
The magnitude of this ambition helps explain the tenacity of Neolithic people, since the early years of farming were risky. Many of the earliest farmers would have had drastically reduced variation in their diet and suffered undernourishment. Large populations living in close quarters with animals would have been a breeding ground for disease. As a Kurzgesagt video points out, “Virtually every infectious disease caused by microorganisms that have adapted to humans arose in the last 10,000 years,” including cholera, smallpox, influenza, measles, malaria, and chickenpox. Child mortality grew, although so did the overall population because the sedentary lifestyle enabled women to bear more children.
Moreover, droughts would have caused famines, killing off massive numbers of villagers. Early farming methods weren’t yet perfected or capable of generating enough stores of food to withstand all foreseeable downturns.
So, even if the transition to agriculture and domestication happened gradually, this was still a case of eventually putting all your eggs in one basket. What motivated this choice wasn’t initially sheer “satanic” ambition to overthrow the created order of enchanted nature. But soon enough, the taming of nature by farming, domestication, and theistic religion marked the conviction that humans deserve to rule, and that nature therefore deserves to be ruled because of its fundamental flaws (what we’d come to see as its savagery and absurd neutrality towards our moral intuitions).
The creative view from nowhere
The second lesson is that there’s a neglected compensation for this hubristic, potentially doomed adventure we call “human progress,” and that’s the formation of the alienated existential perspective itself, the kind of relentless objectivity that the philosopher Thomas Nagel called the “view from nowhere.” From this perspective, everything seems absurd, because this outlook is akin to the aesthetic stance from which we appreciate only the surface features of art objects, bracketing our utilitarian concerns to let the art wash over us.
The meaning of art is only intended or designed by us, not inherent in the artwork as a physical object. Likewise, science tells us the real world is inherently pointless because science deals only with what can be thoroughly objectified.
We can discern something like the Hegelian evolution of divinity precisely in our darkest collective hour, in hypermodern alienation, which brings us ever closer to reckoning with the world as it really is, stripped of its childish enchantments. “Postmodern” irony, cynicism, and incredulity towards all the metanarratives that prop up our values leave us with the kind of dreaded detachment that existential philosophers adopt in their analyses of the fundamental, “phenomenological” aspects of human life. For example, existentialists dwell on the implications of human freedom and our knowledge of our death’s inevitability.
The point is that life is stark when viewed in its brute existential terms. This dehumanizing objectivity, which lies towards the end of our collective endeavor and which culminates in the individual’s terror of dying, can also be construed as the “God’s-eye view” associated with Baruch Spinoza, the perspective of eternity that grasps everything’s causal role and natural necessity.
Here, then, is a kind of mystical, philosophical maturity, the birth of which is hard-won indeed, since the brutal knowledge of the objective, inhuman reality becomes inescapable only after millennia of accumulated historical memory and technological advances. Prehistoric animists foreshadowed universal spirituality in their naivety, but they eventually grew into their adult form, as it were, into the later decadent and hyperskeptical generations that would drastically accelerate the rate of technological progress. These later generations perceive the world as ripe for human conquest, owing to the physical world’s monstrous dearth of mentality and cosmic purpose.
But again, that later, science-centered stage of ours also has the technical capacity to fill the void with meaning, to humanize the wilderness and redeem our childhood intuition that nature is animated by intelligent designs; that is, in our hands, the aesthetic stance is liable to re-enchant nature by casting all phenomena as products of sublime creativity.
Human technology supplies the designs and purposes we discovered were never really there, despite the animistic illusions sustained by our formative naivety. It’s almost as if our collective childhood dreamt of the future it would be forced to create in its haggard, alienated maturity.