Richard J. Severson

The achievement of self-awareness could never have occurred without a sense of time-consciousness.  Without personal memories of the past, there would be no self to recall; and, without expectations of the future, there would be no opportunities to express oneself further.  Our self-identities persist as part of a narrative flow that incorporates memories of the past and future plans into the present moments of lived experience.  This edifice of the narrative self requires all three perspectives of time-consciousness to work together in a seamless march forward.  The past without a future would slip into the stillness of eternity where nothing further could transpire.  This finality to the flow of time is what ancient mystics longed to experience.  The reverse is also true: the future without a past would be just as incomplete.  Minus the reality of what has already occurred, the future would disappear just as all unmoored fantasies eventually do.  Like life and death, past and future are forever entwined in a paradoxical give and take that undergirds all of existence. 

In the evolution of the human mind, the past was mastered first because it was the easier temporal horizon to grasp.  The future is infinite in its possibilities, the past merely eternal in its fullness.  I have heard that there are more potential moves in a game of chess than there are atoms in the known universe.  It is a startling assertion, especially given the confined parameters of a chess board when compared to the possibilities of a lifetime.  It is testimony to the daunting prospect of trying to master the future. 

Without the assistance of external memory technologies, including writing, drawing, mathematics, CAD software, and so forth, the mind by itself is incapable of conceiving even a relatively simple plan for how to construct a stretch of paved highway, not to mention how to manage a naval war game or a trip to the moon.  Our distant ancestors didn’t master the future beyond the level of setting up a modest hunting party because, quite simply, they couldn’t.  They didn’t yet have the internal and external mental resources to do so.  The human mind is a marvel of nature, but it is no match for the infinite possibilities of the future, at least not in its raw, unaided state.  The minds of early humans were fine-tuned for the construction of narrative mythologies about their origins.  The capacity to plan and create vast civilizations was light-years away.    

From the beginning, human consciousness has been predisposed to favor the past, always looking to that temporal horizon for guidance and succor.  Even the future anticipated by our ancestors was conceived upon the model of the past, a repetition of what had always occurred, not a true opening to new possibilities.  A future unfettered by the past is a modern experiment that is unprecedented in our history. 

The cognitive evolution of humans, according to Merlin Donald, is a story about the development of a semantic representation system.1  The earliest hominids—the australopithecines—didn’t arrive on the scene already possessing the ability to speak and think like we do today.  On the contrary, their minds were much more similar to those of the great apes from which they evolved.  It wasn’t until Homo erectus emerged approximately two million years ago that a significant cognitive breakthrough occurred.  The missing link between the episodic culture of apes and the semantic culture of modern humans is the mimetic culture of erectus, with its own rituals, games, social conventions, pedagogical learning, and other distinctively human practices.  Erectus migrated across the entire Eurasian landmass, adapting to a wide variety of environmental conditions along the way.  They were the first hominids to use fire, and are renowned for the sophistication of their tools. 

By pressing our hands to our hearts, we express our sorrow mimetically.  Mimetic acts are nonverbal metaphors, inventive physical behaviors that enabled erectus to share feelings and knowledge.  That’s not something an ape could do.  Apes respond to what they see egoistically; they don’t offer mimetic commentaries about what it all means.  Just because they couldn’t speak with words doesn’t mean erectus didn’t make noises such as grunts to communicate feelings.  Laughter is one of the earliest shared mimetic experiences; crying is another.  Gesturing for somebody to move away from an unstable cliff is a mimetic act that builds upon the perception of imminent danger.  It marks the difference between perception and re-presentation, episodic memory and mimetic cognition.  The latter involves the physical interpretation of what episodic memories mean to those who share them. 

It is a principle of evolution that previous adaptations are incorporated into newer ones.  That means the mimetic representation system of erectus still operates in modern humans.  We still enjoy dancing and playing the ancient game of charades; we still communicate by making facial expressions, groans, shoulder shrugs, etc.  Children begin pointing around the age of 14 months, interacting with their environment using the nonverbal mimetic system long before they learn to speak.  Indeed, as Donald put it, “the mimetic level of representation underlies all modern cultures and forms the most basic medium of human communication” (p. 188).  The acquisition of verbal language was a crucial development in our cultural evolution, but it didn’t suppress the mimetic system.  Instead, they both operate as parallel communication systems.  The verbal system is foremost in our lives, to be sure, but words don’t alter our nonverbal exchanges much. 

Mimesis is an expression of a prehistoric root culture even today.  In order to touch our deepest feelings, we still need to engage that ancient dialectic of the body.  Every impassioned speech we make incorporates a great deal of prosodic voice modulation and fist shaking.  The limbic system still manages to make our bodies tremble when startled or terrified.   I remember attending a Native American pow wow as a college student in South Dakota.  The high-pitched vocable (non-lexical) singing and drumming made the hair on my arms stand up.  It was a deeply moving experience, probably because it reconnected me to the emotionally impactful language of my body.  I think it is the sort of experience we prize dearly.  Too much idle chatter cheapens our existence, cutting us off from our embodied roots. 

To reconnect with the vestigial mimetic system of our bodies can be a powerful healing experience.  We seek such encounters in meditation, yoga, prayer, sweat lodge rituals, drumming, and other New Age appropriations of shamanism and Buddhism in particular.  The use of complementary and alternative medicine (CAM) in conjunction with scientific medicine has interesting analogous ties to the parallel use of the deep-seated language of the body in conjunction with spoken language.  Doctors still employ the very ancient bedside habit of prompting their patients to show them where it hurts, a directive that probably predates the use of spoken language.  How hard would it be to get the same message across with a gesture?  It could be argued that the gist of medicine—past, present, and future—lies in the art of interpreting the language of the body.  What is this body telling me about itself?  That’s what every doctor’s intelligent gaze tries to discern. 

What can be said about changes in time-consciousness with the advent of mimetic representation?  Our great ape predecessors were largely confined to living in the present.  Is that true for erectus as well?  I think our predisposition to favor the past begins with erectus, actually.  That is because mimesis is a system of representation that favors the rehearsal of what we perceive so that we can mine new meaning from it.  Poets belabor their choice of words until they hit upon just the right metaphor to suit their composition.  It is similar to what erectus achieved concerning episodic memories.  Mimesis shadowed the past, reworking it into messages that embodied the protean thoughts and innermost feelings of erectus.  Mimesis is an artful form of self-expression, and it is a crucial part of most artforms to this day.  The theater, to take the most obvious example, is awash with primping, grimacing, gesticulating, slouching, pantomime, etc.  Even painting re-presents the events of our lives in interpretive forms that can shock and disturb us like body blows.  Imagine the physical impact on our bodies when we stand before the hauntingly distorted creatures in Picasso’s Guernica. 

The temporal horizon of erectus was memory-bound in rituals and practices that precisely repeated past performances.  Some of their theatrical dance rituals probably represented the first hint of budding mythological narratives about the origins of life.  Not surprisingly, their conception of the future was minimal compared to ours.  Even the advent of language couldn’t make much progress in tackling the infinite permutations of that other temporal horizon, at least not right away.  The repetition of stories is a memorial form of understanding; the projection of possible interpretations onto the future, a theoretical form of it.  It took tens of thousands of years to reach the point where the future could finally eclipse the past as the more significant temporal horizon of our experience. 
     

Approximately 300,000 years ago the transition from the mimetic culture of Homo erectus to the mythic culture of Homo sapiens began its fateful journey.  It is the second of three major transitions in Donald’s schema of cognitive evolution.  By the time anatomically modern humans arrived in Europe approximately 50,000 years ago, the transition to mythic culture was complete.  The most obvious difference between erectus and modern humans is the latter’s use of language.  The acquisition of language didn’t occur out of the blue, however; people didn’t just suddenly begin talking.  Words would be meaningless if the mind wasn’t able to make use of them for its own purposes.  As Donald put it: “It is an empty statement to say simply that language is the crucial adaptation that led to the first semiotic cultures” (p. 212). 

The human mind expanded its reach beyond the confinement to concrete events that had held mimetic cultures in developmental check.  It took half a million years for erectus to domesticate fire.  By inventing myths, humans were able to integrate disparate events into an abstract conceptual model of the entire world.  “The myth,” said Donald, “is the prototypal, fundamental, integrative mind tool” (p. 215).  Language emerged as a means for servicing a totally new kind of integrative system of thought.  Unlike that of erectus, human perception is designed to extract the big picture (Gestalt) first, not the nitty-gritty details of disparate events.  Words and grammar—the details of our linguistic communication system—were by-products of myth, not its precursors.  The world is a unifying mental tool that we use to put everything into its proper place, and language was invented to satisfy the cognitive need to expand and unify the inner world of the mind. 

The expansion of the world as a mental working space was the cognitive development that led to the creation of language.  Even today, young children use words to explore the world experimentally.  They point and reach as they struggle to articulate their desires in just the right words.  In human ontogeny, the mimetic system of gestural communication emerges alongside spoken language.  Not surprisingly, the first words ever uttered were probably derived from standardized mimetic gestures.  Emblems such as waving, shaking the head, and smiling are readily understood without the need for words.  Erectus conveyed messages of that sort a million years before the invention of language.  Gradually, such gestures became more complex and truly linguistic in their scope.  Drawing an air image about how to throw a spear at an antelope is an example of a linguistic gesture that functioned like a proto-sentence.  We differ from erectus not because we speak with words, but because of the expansive mental models we employ.  Our models are more comprehensive and integrative, calling forth the use of symbolic invention to keep building a deeper understanding of the universe itself. 

The use of language changed human anatomy, including tongue dexterity, facial musculature, larynx and vocal cords, brain structure, breathing patterns, etc.  Apes control their voices from subcortical limbic regions of the brain, which is why they can only produce a few dozen distinct sounds.  The human voice, on the other hand, maps to a newly expanded area of the cerebral cortex.  We take the complex mental planning necessary to coordinate rapid speech movements entirely for granted because it is a subconscious routine for us.  It wasn’t always that way, of course.  Imagine the conscious effort that went into learning how to speak fluently; it must have been a painfully slow process.  Not only did our speech mechanisms develop and expand with the acquisition of language, but so did our auditory system.  We have the ability to perceive words and phrases as discrete objects rather than random sounds.  We also have an internal articulatory loop that enables us to rehearse what we are about to say to make sure it sounds right.  Sound is crucial to memory.  It is hard to memorize similar-sounding words, for instance.  In order to remember things, we repeat them to ourselves over and over.  Articulation is a mnemonic device.  It’s probably an extension of the repetitive ritual behaviors that shaped the mimetic memory of erectus.  Semantic memory internalizes ritual behavior in that sense. 

Symbols were invented for the purpose of putting the world into order.  Aphasic stroke patients can’t think well because they have no words to do it with; their mental world has been stripped of its lexical tools.  Narrative is the natural form of our mental worlds, and myth is the highest form of narrative.  Essentially, myths impose an interpretation—a framework for understanding—upon the entire world.  Myths are the product of generations of story swapping conversations regarding the meaning of life and death.  When one group conquers another, they impose their myths upon them.  The natural instinct is to resist such an imposition; to lose one’s myth is a loss of identity, a collapse of the world.  Myth sat at the top of the cognitive pyramid in every Stone Age society.  Shamans were its regulators; also, I suspect, its creators.  To be sick is analogous to losing one’s identity, which is why sickness is experienced as a disruption of the world.  The mythic mind is a world modeling device, and myths were our first symbolic models.  Much like history, all myths reconstruct the past, establishing the original order of creation and what it means to be in the world.   

Imagine what it was like to create those very first myths.  There were no words yet, no narrative forms.  Time-consciousness barely extended beyond the past of recent events or the near future of tasks yet to be accomplished in the days ahead.  Erectus lived in a very concrete world dominated by basic needs for food, shelter, security, tools, and so forth.  Dreaming of new possibilities wasn’t their forte.  How do you invent mental capacities that don’t yet exist?  It can’t be done, at least not intentionally; to imagine something unimaginable would have to be classified as an accident of fate.  Modern humans didn’t start from scratch in their invention of myths; they couldn’t possibly have done so.  Instead, they relied upon old mimetic habits to accidentally—not deliberately, at least not at first—expand the boundaries of their mental worlds.  Repetition of sounds and movements in rhythmic dance, drumming, prosodic singing, animal calls, ghostly pantomime… these were the ingredients of shamanic séance.  I think the healing séance was the experimental birthplace of the mythic cognition system that distinguished our ancestors from erectus. 

The final transition in Donald’s schema concerning cognitive evolution was from the mythic outlook of our distant ancestors to the theoretical gaze of modern societies.  Whereas mythic stories attribute human significance to the universe as the natural setting for life’s mortal adventure, theory strips away all mythic attributions in order to analyze the universe from a purely objective perspective.  From its inception in ancient Greece, theoretic culture has undertaken the task of demythologizing the mind.  It has been an agonizing process stretching across several millennia thus far.  The first theories pertained to astronomy, and the pragmatic predictions necessary for mundane activities such as navigating a sailing ship or harvesting a crop at the appropriate time of the season.  Stonehenge is a prehistoric monument dedicated to the precise analysis of astronomical events.  By interpreting what the massive stones revealed, ancient humans knew such things as the exact time of summer and winter solstices. 

According to Donald, “Greeks were the first to exploit the new cognitive architecture that had been made possible by visual symbolism” (p. 342).  The invention of visual symbols, beginning with pictorial art in ceremonial caves such as Lascaux in Southern France, enabled early humans to store their memories externally.  That was the first crucial step toward theoretic culture.  Oral cultures rely almost exclusively upon stories to preserve their knowledge.  Narrative is the natural organizing principle of the mind; it’s how our memories work—by weaving what’s important into narrative patterns.  Myths, as I said earlier, are the highest form of narrative construction.  They make sense of our existence by imposing a meaningful framework—a creation story—upon the known universe.  Without the aid of a different kind of memory that isn’t beholden to narrative structures, we wouldn’t be able to study nature with scientific precision.  Our natural mode of thought is storytelling.  To break out of that mold and think differently—analytically, abstractly, logically, collaboratively, scientifically—required the use of external memory.  Painting and sculpture were the first ventures in the long process of externalizing our own thoughts and memories.  By storing what we think in visual artifacts, we began to reach beyond the confines of the unaided mind’s narrative limitations. 

From the humble beginnings of cave art, ancient humans steadily increased the scope of their visual inventions, including ceramics, metallurgy, maps, diagrams, mathematical symbols, and, most significant of all, writing.  The first cuneiform tablets appeared in Mesopotamia more than 5,000 years ago.  Mostly, they were lists of goods sold by the royal courts of Uruk and other ancient city-states.  Only a handful of professional scribes were able to master the cumbersome details of literacy until the ancient Greeks invented the first phonetic alphabet.  Even then, literacy was the province of an elite educated class until well after the invention of the printing press in the 15th century.  It’s no coincidence that modern science began to flourish soon thereafter.  Writing and other forms of graphic invention enabled us to preserve our thoughts and make them part of a public domain that fostered their continuous improvement.  Scientific inquiry is a form of scholarship, a collaborative project that exceeds the ambitions, imagination, and limitations of any one person.  Writing enabled us to create a public working space, a shared sketchpad, for such collaborations.  The human mind has become a hybrid system of representation, symbiotically wedded to its own invention of external memory. 

For most of its history, formal education has been burdened with the task of training students how to speak well, interpret texts, make arguments, and, generally, navigate the burgeoning external memory output of theoretic culture.  Not until the 19th century did it begin emphasizing the specialized knowledge of the sciences.  The proliferation of knowledge beyond the confines of even the largest university libraries continues unabated to this day.  We are all librarians and Google aficionados now, adrift on the great sea of human invention.  What that means for the diminishing role of our biological memories, and our being in the world—perhaps even our health and well-being—is unclear.  Never before has the capacity of the human mind been dwarfed to such an extent by its own artifacts and machinations.  Currently, the output of scientific research doubles every nine years, and the pace is quickening.2       

The transition from mythic culture to theoretic culture fostered a monumental reversal in time-consciousness.  Biological memory has always tethered us to the past as the dominant horizon of our existence.  Instinctively, we still look back to our childhood, or some other happy time, when we feel threatened.  As I said earlier, the infinite possibilities of the future are far too complex for unaided human memory to master.  Future planning was not the forte of our hunter-gatherer ancestors.  That’s something modern humans are good at only because we have created an external locus for memory and cognition that enables us to model and predict the complicated mathematical probabilities of future events.              

At some point in the past few hundred years, the tethers connecting us to the past were truly untied for the very first time.  Slowly but surely, the steady progress of theoretic culture tilted our minds toward a future that finally surpassed the past even in the realm of imagination.  At that tipping point, perhaps the greatest reversal in the history of our species occurred.  The meaning of our existence took on a different tone and significance as a result.  We still look to the starry heavens above for our myths of belonging, but now they represent human possibility, not the monumental grave markers of creation.  Science fiction is our myth-generating literary invention, Star Trek and Star Wars our dreamy destiny.  This great reversal in time-consciousness is epochal on many levels.  Like a stone dropped in a still pond, it created ripples that are still spreading.  Many of the milestones and paradoxes of our time make more sense in light of it—the break with the past that we call “modernity,” for instance.  It truly does represent a breakdown in the continuity of time, and a fundamental reversal of outlook on life.  Ancient philosophers craved the stillness of eternity, and a retiring, contemplative life to go along with it; modern adventurists want to build spaceships to explore the infinite universe in a life dedicated to endless future discovery.  The growing skepticism toward institutional religions—the last vestiges of our animistic/mythic past—seems even more inevitable, as does the New Age yearning for a medicalized form of spiritual self-care that is blithely indifferent to the cultures from which it pilfers rites and ceremonies to serve an ahistorical purpose that is baggage-free and perpetually sunny. 

The future unleashed from the past has no loyalties to the well-trodden pathways of traditional societies.  To be a child of the future is a new iteration of life’s inexorable march toward unfettered freedom.  Life constantly reaches beyond itself in order to continue on its journey.  Only the future can represent unadulterated liberty because liberty, like the future, is made from pure potential.  If every dream suddenly became reality, the future would disappear, as would liberty itself.  How is it possible to have a future when you are already there living and breathing it?  It seems like a strange predicament to be in, the exact reversal, perhaps, of what our earliest forebears must have experienced regarding the haunting specter of the past. 

Every step forward in our cognitive evolution has been accompanied by an increase in memory capacity and time-consciousness.  The question is, what comes next?  What sort of biological memory awaits us in the shadows of computational algorithms, artificial intelligence, genetic engineering, cyborg implants, and all the rest of the triumphal prospects that promise to transform the future?  A new kind of memory is emerging alongside of a new kind of self-identity.                

Notes:

  1. Merlin Donald, Origins of the Modern Mind (Harvard University Press, 1991).
  2. Lutz Bornmann and Ruediger Mutz, “Growth Rates of Modern Science,” Journal of the Association for Information Science and Technology 66, no. 11 (2015): 2215-22.

2 Comments

  1. Anonymous says:

    Extraordinary! This sure tickles the synapses. I am sort of training a new pup. I keep trying to just talk to her in English. It occurs to me that nonverbal cues might be better. She solidly recognizes my whistle. It gets her attention quickly away from all the other worldly wonders she is sniffing out. But, this essay makes me think that hand, face and body gestures in addition to my words might impress my intentions on her faster than me just talking. Your thoughts? We don’t think of hand signing as a communication tool for dogs in the same way we have accomplished with apes, but we do know that hand signals and gestures work well for a variety of animals. And it has the advantage of being able to communicate across a big field, useful for hunting and herding.

    1. I agree that the language of the body–gestures, for instance–ties us more closely to our animal past, and to the other species of animals that we still live amongst. Good luck training that pup!
