Richard J. Severson
I
In essence, the cognitive evolution of Homo sapiens necessitated a reckoning with time. Consider our Great Ape ancestors as a starting point. According to Merlin Donald, the Great Apes have stood at the pinnacle of episodic culture for millions of years.1 They are especially adept at understanding complex social events, such as hierarchical grooming protocols, yet incapable of extrapolating abstract knowledge from their experience. They live mostly in the present, responding to the events occurring around them. They can be taught to use sign language at a rudimentary level, but they can’t invent their own languages with which to think abstractly about the world.
The ability to perceive disparate activities as unified events is the high point of ape culture. Complex events function like proto-narratives, which means that apes have some inkling of the past and future even though they live almost exclusively in the present. It takes imagination—some sense of what could happen next—for one ape to comprehend that another ape’s aggressive display gestures represent a personal threat. Apes can’t integrate the entire flow of episodes occurring around them into an overarching narrative structure that features their own agency, as we do. They perceive the past in episodic flashbacks, not as recollections of biographical details in their life experience. Yet in Gallup’s mirror studies, some great apes were able to recognize themselves.2 Chimpanzees are especially good at using their eyes to control the activities of their hands, a reflexive ability that at least foreshadows human-like self-awareness.
The earliest hominids—the australopithecines—didn’t arrive on the scene already possessing the ability to speak and think like we do today. On the contrary, their minds were much more like those of the Great Apes from which they evolved. It wasn’t until Homo erectus appeared approximately two million years ago that a significant cognitive breakthrough occurred. The “missing link” between the episodic culture of apes and the semantic culture of modern humans is the mimetic culture of erectus, with its own rituals, games, social conventions, pedagogical learning, and other distinctively human practices. Erectus migrated across the entire Eurasian landmass, adapting to a wide variety of environmental conditions along the way. They were the first hominids to use fire, and are renowned for the sophistication of their tools.
By pressing our hands to our hearts, we express our sorrow mimetically. Mimetic acts are nonverbal metaphors, inventive physical behaviors that enabled erectus to share feelings and knowledge. That’s not something an ape could do. Apes respond to what they see egoistically; they don’t offer mimetic commentaries about what it all means. Although erectus couldn’t speak with words, they still made noises such as grunts to communicate feelings. Laughter is one of the earliest shared mimetic experiences; so is crying. Gesturing for somebody to move away from an unstable cliff is a mimetic act that builds upon the perception of imminent danger. It marks the difference between perception and re-presentation, episodic memory and mimetic cognition. The latter involves the physical interpretation of what episodic memories mean to those who share them.
It is a principle of evolution that previous adaptations are incorporated into newer ones. That means the mimetic representation system of erectus still operates in modern humans. We still enjoy dancing and playing the ancient game of charades; we still communicate with facial expressions, groans, shoulder shrugs, and the like. Children begin pointing around the age of 14 months, interacting with their environment through the nonverbal mimetic system long before they learn to speak. Indeed, as Donald put it, “the mimetic level of representation underlies all modern cultures and forms the most basic medium of human communication” (p. 188). The acquisition of verbal language was a crucial development in our cultural and cognitive evolution, but it didn’t suppress the mimetic system. Instead, the two operate as parallel communication systems. The verbal system is foremost in our lives, to be sure, but words don’t alter our nonverbal exchanges much.
Mimesis is an expression of a prehistoric root culture even today. In order to touch our deepest feelings, we still need to engage that ancient dialect of the body. Every impassioned speech we make incorporates a great deal of prosodic voice modulation and fist shaking. The limbic system still manages to make our bodies tremble when startled or terrified. I remember attending a Native American powwow as a college student in South Dakota. The high-pitched vocable (non-lexical) singing and drumming made the hair on my arms stand up. It was a deeply moving experience, probably because it reconnected me to the emotionally impactful language of my body. I think it is the sort of experience we prize dearly. Too much idle chatter cheapens our existence, cutting us off from our embodied roots.
As I said earlier, our Great Ape predecessors were largely confined to living in the present. Is that true for erectus as well? No, it is not. Mimesis is a feat of memory—shadowing the past—reworking prior experiences into messages that embody the protean thoughts and innermost feelings of humans, beginning with erectus. Mimesis is an artful form of self-expression, and it is a crucial part of most artforms to this day. The theater, to take the most obvious example, is awash with primping, grimacing, gesticulating, slouching, pantomime, etc. Even painting re-presents the events of our lives in interpretive forms that can shock and disturb us like body blows. Imagine the physical impact on our bodies when we stand before the hauntingly distorted creatures in Picasso’s Guernica.
The temporal horizon of erectus was memory-bound in rituals and practices that precisely repeated past performances. Some of their theatrical dance rituals probably represented the first hint of budding mythological narratives about their own origins. Not surprisingly, their conception of the future was minimal compared to ours. Even the advent of language couldn’t make much progress in tackling the infinite permutations of the future, at least not right away. The repetition of stories is a memorial form of understanding; the projection of future (new) possibilities, a theoretical form of it. It took tens of thousands of years to reach the point where the future could finally eclipse the past as the more significant temporal horizon of our experience. That reversal in perspective regarding time (idealizing the future instead of the past) is what defines the modern world.
By the time anatomically modern humans arrived in Europe approximately 50,000 years ago, the transition from mimetic culture to mythic culture was complete. The most obvious difference between erectus and modern humans is the latter’s use of language. Yet the acquisition of language didn’t occur out of the blue; people didn’t just suddenly begin talking. Language emerged as a means of servicing a totally new kind of integrative mythic cognitive system. Myths were designed to extract the Gestalt, or “big picture,” from our life experiences, not the nitty-gritty details of disparate events which preoccupied the mimetic performance-based cognition of erectus. Words and grammar—the details of our linguistic communication system—were by-products of mythic perception, not its precursors. As every phenomenologist knows, the world has priority of place in our experience. How, for instance, could I know what to do with a hammer if I didn’t already live in a world within which hammers served a defined purpose? The “world” is a unifying mental tool that we use to put everything into its proper place, and language was invented to satisfy the cognitive need to expand and unify the inner world of the mind. We differ from erectus not because we speak with words, but because of the expansive mental models of the world that we employ. Our models are more comprehensive and integrative, calling forth the use of symbolic invention to keep building a deeper understanding of the universe itself.
Symbols were invented for the purpose of putting the world into order. Aphasic neurology patients can’t think well because they have no words to do it with; their mental world has been stripped of its lexical tools. Narrative is the natural form of our mental worlds, and myth is the highest form of narrative. Essentially, myths impose an interpretation—a framework for understanding—upon the entire world. Myths are the product of generations of story-swapping conversations regarding the meaning of life and death. When one group conquers another, the conquerors impose their myths upon the conquered. The natural instinct is to resist such an imposition; to lose one’s myth is a loss of identity, a collapse of the world. Myth sat at the top of the cognitive pyramid in every Stone Age society. The mythic mind is a world-modeling device, and myths were our first symbolic models. Much like history, all myths reconstruct the past, establishing the original order of creation and what it means to be in the world.
II
In cognitive evolution, early humans mastered the past first, and only in recent centuries have we managed to displace the importance of the past for the sake of the future. That great reversal in time-consciousness required the invention of external memory devices to supplement our own modest biological memory capacities, which are best suited for narrative constructions. It is simply not possible to plan a trip to the moon on a rocket ship without the support of mathematics, blueprints, computers, books, precision manufacturing, etc. In short, early humans mastered the past first because it was the easier task, and the most amenable to biological memory. We had to look outside of ourselves and invent a new kind of memory to construct the modern world.
The glory of the mythic mind that dominated the Stone Age for 30,000 years or more has been swept aside by theoretical culture, which embodies the kind of cognitive wherewithal we possess today. From its earliest inception in ancient Greece, theoretical culture has undertaken the task of demythologizing the mind. It has been an agonizing process stretching across several millennia. The first theories pertained to astronomy and the pragmatic predictions necessary for mundane activities such as navigating a sailing ship or harvesting a crop at the appropriate time of the season. Stonehenge is a prehistoric monument dedicated to the precise analysis of astronomical events. By interpreting what the massive stones revealed, ancient humans knew such things as the exact times of the summer and winter solstices.
The invention of visual symbols, beginning with pictorial art in ceremonial caves such as Lascaux in Southern France, enabled early humans to store their memories externally. That was the first crucial step toward theoretical culture. Oral cultures rely almost exclusively upon stories to preserve their knowledge. Narrative is the natural organizing principle of the mind; it’s how our memories work—by weaving what’s important into narrative patterns. Myths, as I said earlier, are the highest form of narrative construction. They make sense of our existence by imposing a meaningful framework—a creation story—upon the known universe. Without the aid of a different kind of memory that isn’t beholden to narrative structures, we wouldn’t be able to study nature with scientific precision. Our natural mode of thought is storytelling. To break out of that mold and think differently—analytically, abstractly, logically, collaboratively, scientifically—required the use of external memory. Painting and sculpture were the first ventures in the long process of externalizing our own thoughts and memories. By storing what we think in visual artifacts, we began to reach beyond the confines of the unaided mind’s narrative limitations.
From the humble beginnings of cave art, ancient humans steadily increased the scope of their visual inventions, including ceramics, metallurgy, maps, diagrams, mathematical symbols, and, most significant of all, writing. The first cuneiform tablets appeared in Mesopotamia more than 5,000 years ago. Mostly, they were lists of goods sold by the royal courts of Uruk and other ancient city-states. Only a handful of professional scribes were able to master the cumbersome details of literacy until the ancient Greeks invented the first fully phonetic alphabet. Even then, literacy was the province of an elite educated class until well after the invention of the printing press in the 15th century. It’s no coincidence that modern science began to flourish soon thereafter. Writing and other forms of graphic invention enabled us to preserve our thoughts and make them part of a public domain that fostered their continuous improvement. Scientific inquiry is a form of scholarship, a collaborative project that exceeds the ambitions, imagination, and limitations of any one person. Writing enabled us to create a public working space, a shared sketchpad: a new kind of communal “mind” for modeling the world. The human mind has become a hybrid system of representation, symbiotically wedded to its own inventions of external memory.
For most of its history, formal education has been burdened with the task of training students how to speak well, read and write, interpret texts, make arguments, and navigate the burgeoning external memory output of theoretical culture. Not until the 19th century did it begin emphasizing the specialized knowledge of the sciences. The proliferation of knowledge beyond the confines of even the largest university libraries continues unabated to this day. We are all librarians and Google aficionados now, adrift on the great sea of human creativity and invention. Never before has the capacity of the human mind been dwarfed to such an extent by its own artifacts and machinations. Currently, the output of scientific research doubles every nine years, and the pace is quickening.3
Biological memory has always tethered us to the past as the dominant horizon of our existence. Instinctively, we still look back to our childhood, or some other happy (mythic) time, when we feel threatened. Slowly but surely, the steady progress of theoretical culture has tilted our minds toward a future that finally surpassed the past even in the realm of imagination. Science fiction is our uniquely modern myth-generating literary invention, Star Trek and Star Wars our dreamy destiny. The future unleashed from the past has no loyalties to the well-trodden pathways of traditional societies. To be a child of the future is a new iteration of life’s inexorable march toward unfettered freedom.
III
It would not be an exaggeration to suggest that the cognitive evolution of human beings was all about mythology and time. Mythology is a feat of imagination and time-consciousness, the purpose of which was to transport our distant ancestors back in time to an original place of creation. That’s what myths do: they tell us how we got here, and what it was like at the beginning of time. “In the beginning, God created the heavens and the earth.” The book of Genesis begins with those memorable words, the kernel of every creation myth. Every human culture—every distinctive tribe—has its own peculiar version of how its ancestors (the Original People) got here at the beginning of time.
The theoretical mind shifted the gravitational weight of our cognitive abilities away from the past, toward the future. The modern experiment demonstrates that humans are, by their own making, bio-cultural hybrid creatures. Our being in the world is dependent upon the artifacts of our own imaginative self-creation. Perhaps that has always been the case, not just since we invented writing, mathematics, and all the other external memory devices that enabled us to create a collaborative scientific mind capable of understanding the intricacies of the universe itself. Perhaps, too, it would be unfair to conclude that only our Stone Age ancestors were captivated by a mythological mindset. For every demythologization project, I suspect there is a companion “remythologization” project because, at its biological core, the mind still craves a good story/myth about why we are here.
The new myths of our time use the language of science and technology, not the language of the lost soul and its eternal search for meaning. I find it interesting that many of the most common modern mythologies seem to appear in diametrically opposed pairings. For every optimistic Star Trek-like tale of a future filled with discovery and progress, there is a corresponding Star Wars-like dystopian future of eternal (Manichean) conflict between good and evil. Which narrative strain will win the day? It is hard to predict. “Longtermism” is a moral concept that is gaining traction amongst the billionaire coterie. Based upon the moral philosophy of William MacAskill and other like-minded futurists, its projections of cultural longevity/plasticity into an unlimited future (billions of years, trillions more humans) seem terribly anthropocentric at best.4 More prevalent among the everyday masses of television- and movie-addicted people is a car-wreck fascination with post-apocalyptic mythical stories that foretell the demise of human civilization. HBO’s Game of Thrones is a perfect example.5
One final observation: the making of the modern theoretical mindset involved much more than the externalization of biological memory. Just as significant, I think, was the externalization of human morality, and the self-protective caring arts of medicine and its allied fields. We couldn’t have invented a shared scientific mental workspace without an extensive institutional infrastructure to support such hive-like collaborations. The safeguarding institutional repository of externalized memory devices is our massive educational system, including universities, research journals, external funding sources, and all the rest. Likewise, there is no way we could manage the moral conundrums that bedevil our efforts to live together in large cities and nation-states without governmental bureaucracies and their laws that require a modest level of civility and public spiritedness. Without hospitals—sprawling campuses that now dwarf all but the largest universities—how could we provide adequate levels of wellbeing for all the citizen-participants in the greatest futural myth of ongoing progress and Enlightenment? Whatever else our modern mythologies are or are not, they are (for better or worse) necessarily embedded in massive public institutions that safeguard the externalized resources of our all-too-fallible minds.
Notes:
1. Merlin Donald, Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition (Harvard University Press, 1991).
2. Gordon Gallup, “Self-awareness and the Emergence of Mind in Primates,” American Journal of Primatology 2 (1982), 237-48.
3. Lutz Bornmann and Ruediger Mutz, “Growth Rates of Modern Science,” Journal of the Association for Information Science and Technology 66, no. 11 (2015).
4. William MacAskill, What We Owe the Future (Basic Books, 2022).
5. Game of Thrones (television series), created by David Benioff and D. B. Weiss (HBO Entertainment, 2011-19).