Richard J. Severson
This is the Introduction to my book, A Moral Theory of Sports (2019).
Growing up, I participated in a wide range of sports, mostly at the neighborhood backyard level. We played everything—football, baseball, basketball, hockey, volleyball, tennis, golf, badminton, horseshoes, racquetball, ping-pong, foosball, marbles, even square ball. We wrestled, boxed, raced our bikes and electric cars, swam, dived, bowled, chased each other in foot races, skied on the modest prairie hills of eastern South Dakota, and organized neighborhood-wide snowball fights. “I’m going to kill you!” was a common (empty) threat when you couldn’t quite catch the person or persons who swiped your long-tailed stocking cap and wouldn’t give it back.
The competition wasn’t cutthroat, but we got to know who was good at what, and who the alpha kids on the block were. Growing up is about learning your place, and sports was an important proving ground for my tribe of friends.
None of us was groomed to be a professional athlete like the lucky Canadian kids who happened to be born in January that Malcolm Gladwell wrote about in Outliers.[1] According to Gladwell, they were the ones most likely to end up in the National Hockey League because the eligibility cutoff date for youth hockey made them the oldest kids in their cohort. So they got most of the playing time and attention from coaches, which meant they were afforded the best opportunities to further succeed as they got older. Eventually there came a point when the rest of the kids would never be able to catch up.
The rich get richer, accumulating their advantages.
We weren’t treated like young prodigies in my neighborhood, and many of us bear the minor scars of being passed over in the sports selection processes that determined our fates. I wanted to be a pitcher on my Little League team, for example. The coach had me throw a few balls, then shook his head and sent me to the outfield. C’est la vie.
In high school, the only organized sport I participated in was wrestling. It wasn’t because I had no interest in football, basketball, or other sports; I simply wasn’t good enough to make the teams. Even though I grew up in a rural state, with a population of far fewer than a million souls (and far more cows and pigs than that), my high school in Sioux Falls had 2,400 students. The competition was stiff.
I’d have had better opportunities if I’d gone to the smaller Catholic high school, like my parents wanted. (It went against the grain of being a rock & roll bewitched teenager in 1969 to follow the sensible advice of one’s parents.)
As a sophomore I wrestled in the 105-pound weight class, and I made the varsity team simply because all the upperclassmen had grown too big. As it was, I had to lose ten pounds from my already rawboned frame. I remember chewing gum in the evening (after skipping supper) so that I could get the juices flowing and spit about ten ounces’ worth of saliva into a glass while watching television with the family. It was as gross then as it sounds now.
Another trick we had for making weight was rolling ourselves up into the wrestling mats after practice until we could slide in a pool of our own sweat.
As a junior I wrestled at 132 pounds, meaning I finally had the big growth spurt I had been praying for half my life. I was in the 145-pound class as a senior, and by the time I graduated, I weighed 155.
I was a good wrestler in the practice room, but not in the actual matches. I got too nervous and froze up. I lost a lot of matches with low scores of 2 to 1, or 3 to 2. I did just enough not to be humiliated, but I didn’t have the “killer” instinct to try to win. I have often wondered how many people resort to that kind of play-to-survive-but-not-to-win strategy. Is surviving itself a kind of success? Is hanging with the best without actually beating them a kind of winning?
When I got to college, I quit playing sports. My only aerobic exercise came from strutting through bars with a bottle of beer in my hand. I was smoking about a pack of Marlboros a day. If you smoke, I reasoned, then you shouldn’t be a hypocrite and try to exercise for your health as well. We all have our standards.
That’s got to be about the dumbest bit of higher reasoning ever confabulated to justify bad habits and a wayward life.
Then something terrible happened. My mom and dad were on their way home from a weekend trip to Las Vegas with friends. They had flown out of the Twin Cities, and were driving back to South Dakota. The car drifted over the highway line on a foggy Halloween morning, crashing head on into a farm truck that was full of grain from the fall harvest. Only the woman in the front passenger seat survived. My parents were sitting in the back seat.
At the time of their deaths, roughly 50,000 Americans were dying every year in automobile accidents. Of small comfort to families affected by such a routine modern tragedy, that number has now dropped to under 35,000. The byways of America are getting safer. There’s another noteworthy numbers comparison from that era: approximately 58,000 Americans (mostly young men) lost their lives in the Vietnam War.
I turned 18 in 1973 and registered for the Selective Service as required by law. Fortunately, my friends and I were never drafted because the Nixon administration had ended that practice the year before.
Those of us of college age in the 1970s (before adult learning came along) were mostly too young to be hippies. Sure, we wore our hair long and dressed like shabby hobos. We smoked some pot and grooved to the music at Minneapolis rock concerts when we could afford tickets and gas money. One of my friends even drove a VW bus. But we weren’t real hippies. They wouldn’t have wanted to hang out in South Dakota in any case. The winters are too harsh.
With encouragement from friends who cared about me when I didn’t care enough about myself, I rediscovered my love of sports and got caught up in the jogging craze. At first, I could barely run a mile without heart palpitations and fear of a collapsed lung. If cell phones had been around, I probably would have called for an ambulance at least once. I kept at it though, and eventually started to run in road races. The first one was a 10-miler at the Howard Wood Relays in Sioux Falls. It surprised me that I was able to finish.
Running made me feel good, and helped wean me off cigarettes. Buying my first Nike Waffle Trainers was a memorable experience. There were no specialty stores for athletic shoes in South Dakota at that time. But two elderly brothers who were fiercely passionate runners set up a little Nike shop in the basement of their furniture store. They didn’t advertise the running shoe business. You just had to hear about it through word of mouth. They also organized a running club called Prairie Striders.
A friend introduced me to the Bartling brothers, and the younger of the two took me downstairs to purchase my Nikes. They were baby blue with a yellow swoosh and a black waffled rubber tread on a white foam cushion. They also sold me a tube of liquid goop and a little plastic mold for building up the waffle bumps as they wore down so that the shoes would last longer. I believe I paid $40, a sizeable sum for sneakers, or plain old “tennis shoes” as we called any athletic footwear at the time.
Bill Bowerman was the track coach at the University of Oregon who cofounded Nike with one of his former athletes, Phil Knight. In Bowerman’s 24 years at the helm, Oregon won four NCAA track team championships and had 16 top-10 finishes.
Blue Ribbon Sports, which eventually became Nike, was originally an athletic footwear distribution company founded on nothing more than a handshake. At first it was a 50-50 proposition, with Bowerman staying put at the University of Oregon in Eugene and Knight managing the business from Portland. Bowerman experimented with different athletic shoe designs, famously ruining his wife’s waffle iron while concocting a lightweight rubber sole. A few years later the Waffle Trainer was born, fueling the meteoric growth of their legendary company.
Before inventing the Waffle Trainer, Bowerman learned about recreational jogging for health and fitness from Arthur Lydiard in New Zealand. He brought the concept back to America. Jogging, the book he authored, sold more than a million copies and is credited with igniting the jogging craze that helped me recover from my grief and self-neglect.
There is something about running outdoors that seems to satisfy an aboriginal need deep inside of us. That’s what I began to ponder as I scampered hither and yon in my Waffle Trainers: through snowstorms and rain showers, bucking headwinds that could tear the hair from my scalp; on blistering hot days with humidity so high that my lungs felt more like gills; and on frigid mornings when even my bundled-up penis—not to mention toes, fingers, and nose—was in danger of frostbite. I ran down gravel roads, paved streets and neglected highways; golf courses, park trails and campus green spaces; even farm fields and cemeteries. I loved my baby blue Nikes!
Of course, I read about the “runner’s high,” and no doubt experienced some of its euphoric effects. For decades, it was believed that strenuous exercise releases endorphins, feel-good natural opioid peptides that seduce the brain like an illicit drug. Recently, however, German researchers have found evidence in mouse studies that the endocannabinoid reward system is responsible for the runner’s high.[2] It turns out endorphin molecules are too big to quickly cross the blood-brain barrier (a protective membrane).
The endocannabinoid responsible for the runner’s high is similar to the active compound in marijuana (cannabis). According to Peter Kramer, author of Listening to Prozac, most psychotropic drug research follows a homologous arc.[3] That means researchers design new synthetic drugs by imitating the natural ones found in our bodies. Recreational street drugs follow the same arc—they make us high because they are able to imitate natural feel-good hormones like the endocannabinoids. Anabolic steroids, the most demonized performance-enhancing drugs (PEDs) in all of sports, are homologous with testosterone, the body’s most potent natural performance enhancer.
In my senior biology seminar, a capstone class for biology majors, I was required to make a presentation about new research opportunities in a biological discipline. I chose to talk about ethology, the study of animal behavior through observation of living animals in their natural setting. Squeamish as I was about killing animals and dissecting them in the lab, ethology appealed to my soul.
Jane Goodall was famous by that time for her observational studies of wild chimpanzees at Gombe National Park in Tanzania. Louis Leakey, the renowned paleontologist, raised the funds to sponsor Goodall’s Gombe research, and sent her back to England to finish her education as an ethologist.
At that time, ethologists were vigilant about the dangers of anthropomorphizing their animal subjects. They meticulously described what they saw in value-free language, never imputing human feelings, cognitions, or motivations. Goodall didn’t get the memo apparently.
She named the chimpanzees that she studied, instead of numbering them as protocol required. Nor did she shy away from using human categories to describe their personalities and loving relationships (violent ones as well). The first chimp to accept her presence and allow her to approach him in the forest was called David Greybeard. “My own relationship with David was unique,” Goodall wrote, “and will never be repeated. He allowed me to groom him, and on one never-to-be-forgotten occasion, gave me a gesture of reassurance when I held out my hand to him, offering a palm nut.”[4]
The scientific understanding of primate behavior has exploded since the 1970s. It is now beyond dispute that chimpanzees and bonobos (formerly, pygmy chimpanzees) share many of our feelings and cognitive capabilities. They can recognize themselves in a mirror, which could indicate that they are one of the few species capable of human-like self-awareness. As primatologist Frans de Waal has argued, they practice their own brand of coalition building politics, and have proto-moral instincts regarding fairness and sympathy.[5]
By the early 1990s, genetic comparisons disclosed just how closely related to chimpanzees and bonobos we actually are. They share 98% of our genome, far more than imagined. Nor are gorillas far behind. Because gene mutations accumulate at a predictable rate, it is now possible to pinpoint when the hominid lineage that produced the archaic Homo species split from the Pan species (chimps and bonobos). We shared a common ancestor as recently as 6 million years ago.[6]
In my seminar presentation, I also mentioned Lucy, the australopithecine hominid skeleton that had just been discovered in Ethiopia by Donald Johanson, an American paleoanthropologist. Lucy lived 3.2 million years ago. She was small—3 feet, 7 inches and 64 pounds—and probably looked like a chimpanzee except for one thing. Her pelvis and leg bones were similar to modern humans. That meant she walked upright with the bipedal gait of a hominid.
Fruit-eating chimpanzees walk only one to two miles per day. According to Daniel Lieberman, early hominids ate tubers and other hard-to-chew raw plant materials.[7] They were forced to trek up to nine miles per day in search of food as open savannas replaced forests due to dramatic climate changes. A rhythmic bipedal gait is four times more efficient than the crouched, arm-and-leg ambling locomotion of the other great apes.
The need to travel so far to find food in the dry, hot African savanna led to evolutionary adaptations that made us different from our primate relatives: our legs grew longer, making it harder to climb and live in trees; the foot’s arch became more pronounced and stiff; leg joints got bigger and bones grew thicker to absorb the impact of long-distance running. The fur on our bodies also disappeared as sweating, not panting, became a more efficient method for cooling down the body; and the nose began to protrude, creating air turbulence in the breath, adding another cooling and humidifying effect.
In addition to being foragers for plant-based foods and carrion, hominids experimented with hunting after they began making stone tools approximately 2.6 million years ago. In order to hunt faster, bigger animals than themselves, they used persistence-hunting tactics. They ran long distances at moderate speed, tracking an animal that could easily sprint away from them, then stop, then sprint away again when the hunters caught up with it. Eventually, the hunted creature would suffer heat stroke, unable to keep cool by panting as the blazing sun beat down on its fur-covered body. The vertical body orientation of the bipedal hunter is less exposed to the sun than the horizontal body orientation of the quadruped quarry, another advantage for the long-distance runner.
So it is true, what I suspected when I began jogging in the mid-1970s—evolution adapted us to be runners. It’s why we have big gluteus maximus muscles, short toes, narrow waists, slow-twitch leg muscles, and brains responsive to neurochemicals that minimize the pain while rewarding the metronomic monotony of putting one foot in front of another for hours on end if need be. For those first athletes in Africa, the runner’s high was a foretaste of the feast hopefully to come.
Louis Leakey foresaw the convergence of primate ethology and paleoanthropology as a strategy for better understanding human evolution. That’s why he sent Goodall to Gombe. The kind of history I was taught in college, on the other hand, began with the invention of agriculture at the end of the Stone Age. It also heavily favored the great books of western civilization. It was as if nothing that occurred prior to that (“prehistory,” it was called) actually mattered.
But it did matter, more than any other period in human history. Almost everything significant about us—our bipedal gait, large brains, use of language, music, art, culture, self-awareness, moral conscience, spirituality, caring for the sick, tool making, curiosity about the world, superstition, cooking, storytelling, collaborative hunting, playing games, competing, warfare, pair bonding, and on and on—evolved and took shape in the late Pleistocene epoch.
Thanks to the efforts of innumerable scientists and scholars from a wide range of disciplines, we now know a great deal about “prehistory,” or the ancestral environment as it is often called: the roughly 200 millennia between the time when Homo sapiens first emerged in southern Africa and the agricultural revolution that marked the beginning of civilization as we understand it today.
After I graduated from college, I never stopped running or participating in sports at some level. How could I surrender what is bred into my bones? After a few years of working in less than satisfying jobs, I went back to graduate school, eventually becoming an academic librarian. I also studied religion and ethics, completing my PhD at the University of Iowa.
I have been curious about the archaic origins of morality for almost as long as I have been wondering about why it felt so instinctively natural—so joyful—when I took up jogging. In The Descent of Man, Darwin suggested that morality is a group fitness adaptation, meaning that moral societies, on average, will outcompete nonmoral ones because they are more cooperative.[8] That view was strongly rejected in the 1960s and 70s by evolutionary biologists who fell under the spell of the “selfish gene.”[9] They viewed evolution as strictly a matter of replicating genes. In that milieu, morality came to seem more like what de Waal called a “thin veneer” of false niceties covering up a nasty creature beneath the surface.[10]
In recent decades, Darwin’s point of view has come back in favor as evidence has mounted for natural selection occurring on multiple levels, not just at the level of gene mutations. Even individual organisms can be viewed as social groups in the sense that their bodies represent collaborative communities of specialized cells. Multi-level selection theory is more attuned to the importance of groups in the evolution of social animals, particularly their natural tendencies toward altruistic behavior.
The idea that morality and the joy of running are closely linked in our ancestral past has been kicking around in the back of my mind for years. There is no question that running, jumping, throwing, climbing, hitting, grappling, and other skills of archaic humans are the basis for many modern sports. The link with archaic morality is less obvious. The moral life as it evolved in the ancestral environment has been transformed into something less relevant and effective—at times almost unrecognizable—in complex modern societies, which are forced to rely upon legal systems and other bureaucratic mechanisms to monitor and control bad behavior. There is, however, at least one exception: A remnant of what moral life was like when all humans lived in small bands as egalitarian hunter-gatherers survives in modern sports. To interpret modern sports from the perspective of our ancestral morality is the purpose of this book.
This interpretive lens works the other way around, too: the moral life in modern times, I suspect, can be better understood, perhaps even reinvigorated, by analyzing it from the point of view of sports. In my rendition of them, morality and sports function as reciprocating metaphors, entangled narratives of aboriginal origin that are especially capable of illuminating the shadowy depths and passions of the human condition.
The play of sports is a competition (a play fight), a re-enactment on artificial terms of the blood and tooth survival games of real life. In their ability to help us both forget and imitate harsh realities, sports grant us safe passage through dangerous terrain. They enable us to find leisure in practices that spill joy upon us like the lost fountain of youth that Ponce de Leon was never able to find in his explorations of the New World.
Likewise, I am going to argue that morality is a sporting contest in essential ways. We compete with one another in our moral practices, just as we do in our sports. At this point, that probably sounds oddly mistaken. That is because the competitive side of our morality has been largely whittled away by civilization. That was not the case in the ancestral environment, however, as we shall see.
This is an adventure story, and it began millions of years ago when our species had yet to emerge on the African savannas of the late Pleistocene epoch. So far, we are the only creature with a fully formed moral life. Yet primatologists have discerned the building blocks of morality, the predispositions for it, in the empathy, sharing, play, intelligence, political alliances, trickster behavior, peace-making gestures, etc., of our nearest animal relatives.
In its existential purpose, the verve of sports is moral drama. That is my moral theory of sports in a nutshell. In the next chapter, I am going to investigate how bullying behavior led to the development of conscience in the ancestral environment. Teasing also played a seminal role in the evolution of morality. The line between the two—bullying and teasing—can be hard to appreciate sometimes. That is especially the case in modern sports. It’s time to get into the thick of things.
[1] Malcolm Gladwell, Outliers (Little, Brown, 2008), 16.
[2] Johannes Fuss, et al., “A Runner’s High Depends on Cannabinoid Receptors in Mice,” PNAS 112, no. 42 (October 20, 2015), 13105-08. Accessed October 23, 2017, https://www.researchgate.net/publication/282604724_A_Runner’s_high_depends_on_cannabinoid_receptors_in_mice.
[3] Peter Kramer, Listening to Prozac (Penguin, 1993), 54.
[4] Jane Goodall, The Chimpanzees of Gombe (Belknap, 1986), 61.
[5] Frans de Waal, Chimpanzee Politics (Johns Hopkins, 2007).
[6] Christopher Boehm, Moral Origins (Basic Books, 2012), 90.
[7] Daniel Lieberman, The Story of the Human Body (Pantheon, 2013), 103.
[8] Charles Darwin, The Descent of Man (John Murray, 1882), 166.
[9] David Sloan Wilson, Darwin’s Cathedral (University of Chicago, 2002), 17.
[10] De Waal, Primates and Philosophers (Princeton University, 2006), 7-12.
