Leonardo. Michelangelo. Newton. Mozart. Einstein. They have altered the shape of Western civilization; some, though dead for over 500 years, we still know by their first names. We venerate them, hold them in awe—we have even pickled Einstein’s brain in the hope of someday grasping the source of his brilliance. Their contributions are so extraordinary that when we hear the term “creative genius,” our first association is likely to be with one of them. Indeed, one of the synonyms for genius listed in the thesaurus is “Einstein.”
The term “creative genius” is freighted with intimations of the divine. Creation is a sacred act. All religions have creation stories about our origins to explain the miracle of our existence and to answer the most haunting existential question: “Why is there something rather than nothing?” Something appears, inexplicably, beyond our imagination, and we are left only to wonder. So, too, do we wonder about the unfathomable acts of creation by people like Leonardo and Einstein. The word “genius” also intimates wonderment and suggests supernatural origins: The Latin origin of the word refers to an “inborn deity or spirit that guides a person through life.”1 And we do confer quasi-deity status on them.
This deification, however, misleads. Although indeed remarkable, these individuals could not have conjured their creations without the benefit of being in a very special place and time, being afforded opportunities, and being embedded in a deep and extensive socio-cultural network that not only gave rise to the occasion for their creative acts, but also provided the means for their execution.
Newton is reported to have discovered the law of gravity after observing an apple fall from a tree. Miraculous, indeed. But if Newton had been an Eskimo, I can confidently say: no Newtonian discovery. Same for Mozart and the rest of the pantheon of creative geniuses. Not simply because the Arctic lacks apple trees or pianos, and certainly not because Eskimos lack the capacity for such genius. Consider Newton’s law of gravity with its bizarre jumble of letters and numbers. What the hell does this mean? You need many years of very intense, specialized schooling to grasp its meaning. Or, take a score of Mozart’s music. Again, impenetrable, except with years of specialized study.
Each of these geniuses profited from the good fortune of opportunity and circumstances. Newton was educated—an extremely rare and privileged opportunity—and was a professor at Cambridge University in the 17th century. At the time, England was the driving force of the scientific revolution, and Cambridge was the white-hot center of English science. A similar pattern of good fortune and opportunity is found for the rest of the quintet of geniuses.
We venerate these individuals but fail to venerate the remarkable cultural networks, contexts and opportunities that provided the soil for their individual genius to arise and flourish.
These notable individuals are not apart from the rest of us, but are the apex of the creativity that is our human birthright. Our survival as a species depends on our inborn capacity to create and share ingenious discoveries that are drawn from and, in return, profit the larger community. Genetic mutation enables biological adaptation, but the process is very slow, spanning many generations. The creation of new, adaptive innovations, in contrast, can occur in leaps and gain widespread use within a generation. Human adaptation occurs primarily through memes, not genes.
Our lives are densely packed with the gifts of the long history of human ingenuity, almost all by nameless individuals. The simple screw affords an example. Who invented the screw? We don’t know. When was it invented? Again, not sure, maybe ancient Greece, maybe ancient Egypt. Our world is literally held together by screws; by this original innovation created by unknown, ingenious individuals passed to us from long ago.
What is true for screws holds for sewing needles, bread, chocolate, and most of the rest of what composes our modern lives—all products of nameless creative geniuses. We are part of this ongoing history, making our contributions, some far reaching, most more local. We each contribute to the network in our own, often humble way. It takes a village for a Newton to thrive. For a screw to be invented. And to create a village.
Planet Earth has never bothered to speak—until now. What follows is an unedited transcription of Planet Earth’s remarks to this reporter.
I am Planet Earth. I am sure you have seen pictures of me, which have, lately, been accompanied by your pleas to save me. I am bloody sick of all the posters and dire warnings about my fate by you trumpeting, self-absorbed critters presuming to speak for me when, of course, it is, as usual, all about you. So, here is my response, although I do not expect any of you to listen, or care, especially as I am not going to bolster your hollow sense of self-importance.
Let me begin by saying that the pictures you have seen of me are only a snapshot of my life, and a very brief moment at that. I would like to share the history of my earlier life as well my thoughts about my future, so you will know, really, who I am and not presume to speak for me.
My life began over 4 ½ billion years ago. That’s billion. I know it is hard, if not impossible, for you creatures of not-even-an-eyeblink to grasp, but it is important for you to appreciate the entire arc of my life before you make claims on my behalf.
The first billion years I lived a simple but volatile life. I began as a molten ball of metals and minerals, bombarded by asteroids and cosmic debris large and small, and even experienced a cataclysmic collision with another planet. All this added to my mass and girth. I was roiling with volcanic activity, enveloped in a mixture of gases (none that you would find pleasant), formed a crust, and water condensed on my surface. Above is a simulated picture. Obviously, no actual pictures exist, but you get the picture. During this time, single-celled life, anaerobes, appeared and thrived in the methane, ammonia, water vapor, neon and carbon dioxide gaseous haze that surrounded me.
After about a billion years or so, cyanobacteria appeared. These are also single-celled life forms, but unlike the anaerobes, they rely on the novel process of using energy from the sun to combine water and carbon dioxide into carbohydrates; a process called photosynthesis. Oxygen is a waste product. The oxygen produced was mostly absorbed in my oceans, seabed rock and land surfaces. I experienced a relatively steady atmospheric state for the next 2 billion years where anaerobes continued to flourish; oxygen did not constitute an appreciable part of my atmosphere.
Take note: This very brief account covers 2/3 of my existence. What follows is a more detailed description of the next 1/3. Should you be keeping score (and you should), you, Homo sapiens, have been around for 0.02% of this remaining 1/3.
Oxygen Holocaust! Oxygen Catastrophe! The Great Oxidation Event (GOE)! “The greatest pollution crisis!” These terms and phrases describe what occurred about 1.5 billion years ago, when the oxygen content of my atmosphere rose from 0.0001% to 3%—a 30,000-fold increase. Sunlight and oxygen were lethal to almost all the existing life forms and I became enveloped in a bubble of poisonous gas: oxygen.1
Terms like “Oxygen Catastrophe,” and the reference to oxygen as a poisonous gas, call attention to your use of words that you think are “objective.” Words like “pollution,” which doesn’t really mean anything to me; it is a self-serving, value-loaded assessment, like “weeds” and “disease,” and the best one: “invasive species.” “Invasive?!” As if there are boundaries, like your imaginary, militantly enforced political boundaries, that have been breached by “alien” species, often imported by you, requiring extermination because they disrupt your sense of order. Want to know the most “invasive” of species on the planet at this moment? Look in the mirror.2
The oxygen content continued to increase, giving rise to radically new forms of life. Multi-celled life began around 1.5 billion years ago in this hyper-oxygenated atmosphere, but the bizarre menagerie of clinging, crawling, burrowing, flapping, splashing, slithering creatures didn’t make their appearance for another 750 million years. One of the hallmarks of these creatures is they are fragile; they don’t last long. Slight changes in my temperature or atmosphere, relative to what I have experienced, result in wholesale extinctions. 444 million years ago, glaciers amassed at my poles; 85% of the species died. Life reformed, rejuvenated, then 60 million years later enormous volcanic eruptions caused oxygen concentration to plummet; 75% of the species died.
Once again, life reformed, rejuvenated, and once again, after a period of stability, massive volcanic eruptions raised sea and soil temperatures by 25 to 34 degrees Fahrenheit, the sea surface temperature at my equator reached 104 degrees, oxygen levels plunged, and the atmosphere filled with methane and other greenhouse gases; 96% of all marine species perished and 75% of land species died.
Do you see a pattern? Let me continue, just so you get the full scope of my quite recent experience with oxygenated life.
After many millions of years, life regenerated only to be mostly destroyed around 200 million years ago by, you guessed it, another round of huge volcanic explosions. Carbon dioxide levels quadrupled, temperatures rose by 5 to 11 degrees, and many species perished. The next round of extinction was precipitated by a novel cause: an asteroid crashed into my surface, raising clouds of sun-blocking dust and killing 76% of all species.
A period of relative stability gives rise to new species. My environment changes, most species die. Stability is achieved for a while, novel species arise by adapting to the new conditions. Things change. They die. Repeat. What has remained stable during this oxygenated stage is not species, but the predictable cycle that leads to their routine rise and fall.
Narcissism. You only see a very small sliver of Planet Earth’s existence—that which directly applies to you. The stability you think you see is self-serving. Your time horizon, while conceptually encompassing, is woefully blinkered and overwhelmed by your own needs, your own concerns, your own dread and panic. But this is your biological destiny. All life is about self-preservation, is “narcissistic” about its own interests, its own importance; driven, determined, desperate to preserve its own life. You are trapped by your own biology. I understand. But your grandiosity is particularly grandiose. That too is your birthright, born, as you are, with a head size so huge you cannot even support it for the first 6 months of life. This should have been a warning to you; a clue of your inability to manage your own brain and, ultimately, to be destroyed by it.
But you have distinguished yourself. Every other species has perished because the environment changed beyond their ability to cope. You have, yourselves, changed the environment that will end up killing you. That is a first. Something that sets you apart. God-like, if you will, which is a term you like to apply to yourselves. Congratulate yourselves—this, too, is one of your biological instincts.
I am rather young, not even middle aged, as I have another 7 billion years ahead of me. My existence has been marked by dramatic, abrupt, tumultuous change, often precipitated by unexpected events and surprising developments. I know my future will continue to be riotously volatile and cataclysmic. This does not surprise me—it is the cosmic order. Look up. Look around. Steady states are brief (cosmically speaking). You are simply an insignificant dust mote. Even to me, an only slightly less insignificant pile of cosmic debris, your presence amounts to less than an hour’s worth of my time, using your temporal metric.
So, you see, you are not saving me, Planet Earth. It is not my termination that looms—it is yours. And, to put my attitude toward your ending in terms you can understand: “I don’t give a shit.” When you finally leave, which will be much sooner than you think, if I happen to notice, my response will be: “Adios.”
If you were to ask a fish, “What is water?”, they would likely say, “What the hell are you talking about?”1 They live in it, are enveloped by it, inhale it; it pervades everything in every moment of their life, so ubiquitous, yet invisible. How would a fish know about water? Not from the currents and surges, as these are simply “the way things are”. Perhaps a clue might come from the experience of non-water, the dangerous seam above that confers light and darkness, imposes severe restrictions to movement and threats to breathing, and offers the prospect of being an edible while pursuing edibles. But, more likely, not.
We are fish of another kind, denizens of a sea of air. We all know what air is, but this insight was not easily gained. The undulation and waving of trees, plants and grasses, and the savage violence of hurricanes, tornadoes and water spouts have been viewed as obvious evidence of the pervasive presence and power of unseen supernatural spirits. We breathe and can create our own local “blowing wind-storm”, which offers further evidence for how wind and storms must be caused by cosmic forces. Human history is densely populated with gods and demonic spirits of the wind, blowing good fortune and wreaking destruction.
The water-seam, viewed from our looking-down vantage point, does not necessarily clue us to air. Water is a disorienting and potentially dangerous “something”, where we lose our sure-footedness, where creatures of dreams and nightmares glide and thrash, and where we will sink to a quick gasping death or, with considerable effort, remain afloat—for a while.2 Water is a “something” that contrasts with our “nothing”. We discovered air as a “something” possessing various components, some deadly, some life-giving, only relatively recently, in the 18th century.3 This discovery, not coincidentally, occurred as the demon-haunted world was beginning to be illuminated by the candlelight of science.4
Another invisible “nothing” is gravity. We, of course, are aware that things have different weight, that some things are heavier than others. Historically, gravity was understood as a quality that inheres in objects. It is also a word used to describe the quality of “seriousness”. It is only recently, relatively speaking, since the 17th century with Newton’s famous apple-falling-from-a-tree, that we understand it as an all-pervading, invisible force acting on everything; not just on our planet, but in the entire cosmos. Obviously, apples falling from trees, or falling objects of any kind, are not “Eureka!” moments for most of us.5 What is signally significant about Newton’s apple is he understood that its fall was not caused by an inherent quality of the apple, but a manifestation of an invisible force between earth and apple.
As I look across our kitchen to our screened porch, “acorn” hummingbirds flutter above the sink (thank you, Mr. Calder6), a 2-foot fabric jellyfish and a stained glass sunflower dangle in front of the sliding glass door, and on the porch, a brass wind spinner twirls, wind chimes sing, and glass “bugs” attached to the screen glow.
Hanging in the window of my upstairs study is a school of tropical fish (thank you, again, Mr. Calder); it is especially delightful to see from the street, suggesting the room is a water-filled aquarium. Two translucent colored glass ornaments and a glass hummingbird are suspended in another window. Our living room also has 3 translucent glass ornaments in a window and another is suspended in Sharon’s study (near a hanging wood goldfinch).
These fragile glass hangings appear to be stopped, mid-fall, in their descent to a shattering end. The thin thread that suspends them makes visible the tugging force of gravity. The mobiles, spinners, and wind chimes not only defy gravity, they also give visibility and voice to the air.
Ours is a funhouse riot of beloved, wind-blown, gravity-defying delights. They are whimsical reminders to this wheezing bottom feeder of the astonishing funhouse we all inhabit, where our every moment is given life, heft, and in-formed by “invisibles.”
The sun has been a source of veneration, worship and deification throughout human history. The pantheon of sun gods is extensive and spans cultures, continents, and times: The Egyptian sun god, Ra, and the oldest known monotheistic god, Aten; Inca, Mayan, and Aztec sun temples and rituals of human sacrifice to the sun, giver of life; North American tribes’ sacred sun dances; the Hindu sun god, Surya, creator of the universe and the source of all life; the Shinto sun goddess Amaterasu, the great divinity illuminating the heavens; the Greek sun gods, Helios and Apollo; the Druids of England and their Stonehenge, built as part of their solstice worship; the Sun Day worship of Christians, which was legislated by the Roman emperor, and pagan-turned-Christian, Constantine, in honor of the Sun, which he called “Unconquered Sun, my companion”.2
We modern, indoor-dwelling sophisticates who possess more “advanced” religious beliefs, or whose world has been desacralized by a secular worldview, typically view such obsessions as pagan sacrilege, or as historical curiosities. And yet, might the sun still be fervently worshiped by us, the worship shrouded from our awareness by our smug sense of superiority? Might we be the unenlightened ones?
Burn, Baby, Burn
Manure, peat, and coal. Trees, whale oil, and petroleum. Animal bones, natural gas, and corn. Such a bizarre diversity of things, yet they are all united by one essential fact: all have been used by humans to keep us warm and light the darkness. We have survived, and thrived, at the sacrificial altar of these “burnables”. It is difficult to imagine how, or even if, human life would be possible without them.
The common element of all these “burnables” is that they are composed of organic material; once a part of life in some form. This is obviously so for many “burnables”, like peat, trees, and whale oil. Less obvious are coal, petroleum, and natural gas; all perished life that has been compressed in the earth for millions of years.
The reason why we burn life in its various forms is: energy. We all know intuitively what energy is. We feel it in our body, we “have energy” to do something; to push, pull, lift, twist, throw—to do work. The remarkable transformation of human life wrought by the Industrial Revolution was launched by the discovery of the physical laws governing energy, force and work. These terms have very precise meanings and measurable values, and we have invented many clever ways to put energy to work.
The most important form of energy that, literally, drives our modern life is heat. “Burnables” combust in the engines that propel us and ignite in the furnaces that heat us. Furnaces also generate electricity that lights the night, animates the machines of our modern world, allows commerce and communication across the globe at the speed of light, gives life to our digital world—and so much more.
Where does all this “burnable” energy come from? Plants. And where do plants get their energy? The sun. Plant photosynthesis converts solar energy into potential energy that is stored chemically in the molecular bonds of glucose. Carbon dioxide and water are combined to create these sugars, and oxygen is released in the process. Plants then “burn” this stored energy to grow, flower, and develop seeds. Humans, and all other animals, survive by devouring plants and other animals, converting the stored energy in other living forms into their own chemical “batteries” that store energy to be used for growing, “flowering”, and “seeding”.
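The energy bookkeeping described above can be summarized in the standard textbook equation for photosynthesis (a conventional chemistry summary, not drawn from the essay itself):

```latex
\underbrace{6\,\mathrm{CO_2} + 6\,\mathrm{H_2O}}_{\text{carbon dioxide + water}}
\;\xrightarrow{\;\text{sunlight}\;}\;
\underbrace{\mathrm{C_6H_{12}O_6}}_{\text{glucose}} \;+\; 6\,\mathrm{O_2}
```

Combustion and respiration simply run this reaction in reverse: glucose (or its fossilized descendants) plus oxygen yields carbon dioxide, water, and the stored solar energy released as heat.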
The heat and light created by “burnables”, such as coal, petroleum and natural gas, is the captured energy from the sun radiated millions of years ago, stored in compressed organic matter, released into intense flame. The sun—rekindled in our furnaces.
The Cost of “Burnables”
All the energy and work fueled by these furnaces is purchased at a steep cost: Exhaust. Deadly exhaust. The process that converted carbon dioxide into glucose and released oxygen is reversed; oxygen is consumed and carbon dioxide is released. The carbon dioxide we send into the sky accumulates in the atmosphere, covering the planet like a blanket, preventing the sun’s energy from escaping and heating the planet. Ironically, and tragically, our obsession with “burnable” furnaces has led to our frying ourselves, and many other fellow organisms that share the planet with us.
Our voracious need for energy, as heat and light, has ravaged the planet. Vast areas have been denuded and polluted, millions of species killed or hunted to near extinction, entire mountain ranges reduced to rubble or riddled with miles of toxic tunnels. The benefits, however, have been great. Our modern life, with all the comforts of home, a cornucopia of food, life-saving medical treatments, and unimaginable goods and entertainments are the bounty of our quest for heat and light.
We moderns are the most fanatical worshipers of the sun. Almost everyone across the entire planet kneels to the sacrificial sun altars, dreads even a momentary halt in the offerings, and pursues “burnables” with fanatical religious fervor. Earlier, “primitive” sun worshipers were but simple beginners. We are appalled by the images and repulsed at the thought of the human sacrifices of the Aztecs, standing at the top of their sun temple, holding up the beating heart of a just-sacrificed victim to the sun. Yet we are oblivious to the colossal planetary destruction of life, human and otherwise, wrought by our own, much more ruinous sun worship practices.
We are on a trajectory akin to that of Icarus, who was given wings of feathers and wax but warned not to fly too close to the sun. Icarus, however, did not heed the warning. He rose toward the sun, his wings melted—and he tumbled to his death.
We, too, are being warned of the self-immolating consequences of our “burnable” sacrifices; that we are “flying too close to the sun”. Will we heed the warning? Does Icarus’ fate await?3
Hello, sun in my face. Hello, you who make the morning and spread it over the fields and into the faces of the tulips and the nodding morning glories, and into the windows of, even, the miserable and crotchety–
best preacher that ever was, dear star, that just happens to be where you are in the universe to keep us from ever-darkness, to ease us with warm touching, to hold us in the great hands of light– good morning, good morning, good morning.
Watch, now, how I start the day in happiness, in kindness.
Child abuse did not exist before 1960.

You probably read that statement and thought, “That can’t be true.” You would be half right. The term “child abuse” was first used around 1960 and encompasses physical abuse, sexual abuse, emotional abuse, and neglect. These can occur, not just with babies and young children, but at any point in childhood, which, in most states, extends to 18 years of age.
The behavioral characteristics of child abuse have likely been ever-present features of human life across time and cultures. So, in that regard, you would be correct. However, a second component of our contemporary understanding of child abuse is quite new: the identification and valuation of these behaviors as morally repugnant and legally punishable. The first appearance of organized concern for the physical maltreatment of children was the New York Society for the Prevention of Cruelty to Children, founded in 1874. It was an extension of the already established Society for the Prevention of Cruelty to Animals.1
The concern for cruelty to children emerged in concert with the appearance of other concerns about children’s welfare, including child labor laws. Prior to the 19th century, the economy of most societies consisted of agriculture and handcrafts. Children worked on family farms, as indentured servants, in homecrafts, and some were apprenticed to trade guilds, typically between 10 and 14 years of age. Children were critical to family finances and the labor force, employed like adult workers, and essential for survival in a time of struggle and hardship. Children lacked legal rights or any recourse. Cruelty to children, and child abuse, were behaviorally integral to the social order; simply a part of life. However, there was no identification or valuation of these behaviors as moral or legal injustices.
It is difficult to appreciate that what is now so obviously, and profoundly, morally repugnant could have been invisible; an accepted fact of life. Differentiating the behaviors from their moral valuation allows us to better grasp how the dramatic transformation in material circumstances, political and economic contexts, and cultural values can reconfigure our moral universe.
Racism did not exist before the 20th century.
You probably read that statement and thought, again, “That can’t be true.” You would be half right, again, for the same reasons just discussed for child abuse.
The word “racism” first appeared in 1902 to describe and condemn the practice of segregating a race of people from the rest. The current definition of racism, “a belief that race is a fundamental determinant of human traits and capabilities and that racial differences produce inherent superiority of a particular race”2 gained its full meaning in the 1930s to describe the political ideology of the Nazis concerning Jews.
Although the term “racist” did not exist prior to the 20th century, what did exist was the abolitionist movement, propelled by the conviction that no race had the right to enslave another, and that freedom was a right due to everyone regardless of race. The abolitionist movement began in the 18th century, and the first abolitionist organization in the United States was founded in 1775.
Slavery has been an ever-present part of civilizations since the first settlements in Mesopotamia, beginning about 3500 BCE, and has flourished on all inhabited continents since that time. Slavery in many societies was not always, or often, based on race. Slavery in Colonial America and, later, the United States, however, was largely racial, as most slaves were brought from Africa by Europeans. Here, slavery and race were inseparably linked. The belief that one race, Whites, is superior to another, Blacks, was simply assumed (by Whites) as part of the natural hierarchical order. Behaviorally, slavery and racism were an ever-present given.
No systematic philosophical or religious arguments questioning the moral injustice of slavery existed in the West before the 18th century. It was simply a fact of life. So, what changed in the 18th century that led to a moral awakening to the evils of slavery and racism; to the appearance of the abolitionists?
The idea that all individuals have rights that are not conferred by a divine ruler, privileged personages, or an institutional decree, but are intrinsic to being human, first appeared in the 17th century and was most powerfully argued by John Locke. Locke’s arguments provided the foundation for the American Revolution; a revolution that forged a radically new political, social, and moral order.3
The Declaration of Independence, written by Thomas Jefferson, announces this new ordering in the very first lines: We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable rights, that among these are life, liberty and the pursuit of happiness. These truths were NOT self-evident at that time. They challenged the existing self-evident beliefs that all are NOT created equal; NOT endowed by the Creator with rights; and do NOT possess the right to life, liberty, or pursuit of happiness.4
The Declaration was a missile aimed directly at monarchy. Rights are not innate to a divinely sanctioned ruler; they are conferred by the Creator to all individuals, who establish a governmental order that derives its legitimacy from these individual rights. It inverts the social hierarchy. It shatters the political order.
We fail to appreciate the enormous risks taken by Jefferson and his colleagues—fortune, family, position, reputation, their very lives—all staked on a bold declaration to lead an insurrection against the world’s most powerful, wealthy, militarily strong country, one that controlled every corner of Colonial government, every courthouse, every harbor, every form of authority and power in the land, and was supported by many passionate Loyalists. It was not only a declaration of war against British monarchial rule, it was a trigger for a civil war. These unlikely rebels had no army, no navy, no organized government, no currency, no allies. This was a wild, crazy, mad gamble.
Unappreciated by our country’s founders was how far-reaching “All are created equal” would become; not only a bugle call to overthrow monarchy, but a clarion call for a revolutionary moral order. It is not an accident that abolitionist movements first appeared in human history at the same moment as the revolution against the monarchial order.
The 250 years since the Declaration have been marked by a growing appreciation of the reach of the ethics of rights and equality. The abolitionist movement to end slavery has been joined by other liberation movements that continue to our day: civil rights, equality of women, the rights of children, of animals, gender equality, the rights of persons with disabilities. All these moral uprisings have involved struggle, hardship, sacrifice, and bloodshed. They continue. This history affords appreciation of the long, arduous, and tumultuous process required to wrench behaviors out of taken-for-granted darkness into the hot glare of moral injustice.
Our highest ideal, that all are created equal, is inherently disruptive, calling us to continually confront injustice and reaffirm our commitment to a more perfect union.
Jefferson a Racist?
Was Jefferson a racist? Of course. As was everyone else of his time—and before. The pointed critiques of how racism, sexism, and the other heretofore invisible injustices have shaped human history, and killed, maimed, and destroyed so many lives, are a necessary corrective to the blindness of the past. But simply condemning Jefferson and his brethren who championed equal rights, at great personal cost, without recognizing their contributions flattens the moral landscape; it affords easy self-righteous moralizing at the expense of understanding.
250 years from now, will any of us escape the moral condemnation certainly due us “fossilists”5 for our wanton, heedless destruction of the biosphere that may, perhaps, render human life extinct by then? How many of us have sworn off using fossil fuels; taken a Jeffersonian stance to create a revolutionary political, social, and moral order to save our species and our planet? Very, very few.
But damning the climate patriots among us as equally morally reprehensible as the CEOs of the oil and gas companies, and the commercial and political interests that denude, pollute, and destroy the planet’s forests, rivers, and oceans, flattens our moral landscape, obliterating the crucial differences between individuals that make a difference. This is how revolutions happen: Individuals, in the trenches, relentlessly, aggressively, tirelessly, in the face of long odds and bleak prospects, at great personal cost, confront and attempt to upend powerful vested interests. The future is forged by the efforts of courageous, imperfect individuals acting in the uncertain, messy present.
Jefferson and the other Declaration instigators were imperfect visionaries in their messy, uncertain time who championed ideals they were prepared to die for. They bequeathed to us a moral order that, ironically, we use to condemn them for their moral shortcomings. This paradox is their legacy. When we critique them, we need to do so with gratitude. And humility.
I had never given much thought to genealogy, thinking it a form of navel-gazing into a distant past that has no bearing on me, and my life, now. That changed after my mother died. My father’s family was Dutch and my mother’s was German and Scots-Irish—solidly Christian, White, and Northern European. Checks all the boxes. However, my mother’s father died in 1934, when she was 12, and she told us she thought he had a family secret, but didn’t know what it might be and didn’t want to find out. After my mother died, my sister did a genealogy of my mother’s family and discovered my grandfather’s secret—he was Jewish. He had changed his name when he immigrated to the U.S. as a 16-year-old.
I was stunned. At a subterranean level, I felt very vulnerable. I certainly understood antisemitism and abhorred it, but from a position with my feet firmly planted on the “mainland”, waving with empathy at those on a close but offshore island (e.g., Jews and the other “outsiders”). Now I shared a past with some of those offshore, and given the long history of parsing ancestry to sniff out Jewish “blood” for extermination, my understanding of myself, my identity, changed. The low rumble of antisemitism was now quite audible and personally menacing. The living power of the past, its relevance for my life in the present, and its possible consequences for my future—pile-driven home.
The seismic power of the past is also the engine of effective psychotherapy. We all construct stories about our identity: who we are, what formed us, who influenced us, what memorable events mark our lives. These stories compose our identity and are distilled into habits, assumptions, and reactions that reflexively govern much of what we believe, say, and do. The past is relived at the visceral level, guiding our present lives and our expectations for the future. When these guideposts falter or break down, we can find ourselves unmoored; anxiety, angst, anger, and despair become our companions. Therapy, in its various forms, focuses on changing these visceral assumptions, and this often involves reexamining the stories we tell ourselves about ourselves.
Therapy for trauma evinces this in its starkest form. Trauma therapy must directly confront shocking past experiences to forge new narratives, new habits, new reactions about the trauma’s meaning and its bearing on who we are. This therapy can be destabilizing, painful, and terrifying, requiring great courage. This is a major reason therapy is often avoided; the pain we know is less frightening than the destabilizing unknown pain that awaits. But the courage to confront the horrors of the past is rewarded with a reclaimed life and hope for the future. The only way out is through.
A process similar to therapy occurs at the national level. Our narratives about our past, the history we tell ourselves about ourselves, form our identity. These narratives, typically, are heroic: obstacles faced, challenges met, and adversity overcome against great odds. They are morality plays where, after grappling with demons for 40 days and 40 nights, virtue and righteousness triumph over evil. Our national pride is, after all, pride about our past, which defines who we are now and what we hope for our future; it shapes our political landscape and national conversations, our laws, institutions, legislation, and elections.
Slavery is a 400-year indelible stain on American history. It was integral to our country’s founding, essential to its economic viability and vitality, and intrinsic in its social structure. Unimaginable cruelty, brutality, suffering, and murder of slaves, and their descendants, have been routine in American life for centuries. The presence of the progeny of slaves in our midst—in their very appearance—is a stark reminder of the horrors of our past, evoking reflexive, habitual reactions conditioned by the longstanding narratives about race; about Black and White and what they mean.1
The current upheaval in our country about racism is a challenge to the dominant stories of our history, of our identity. As with therapy, and genealogy, our past is complicated, filled with facts—known, unknown, avoided, denied, and ignored—selectively highlighted and given structure and meaning by our narratives. Was Robert E. Lee the hero of the “Lost Cause”, fighting for “States’ Rights”, a “Noble Defender of the South”? Or was he a traitor leading a rebellion to preserve slavery and destroy the Union? The struggle over the narratives about our past is not simply an esoteric debate among academics; it is a struggle for our nation’s soul: Who were we? Who are we? Who do we want to be?
The Abolitionist movement, the Civil War, the 13th, 14th, and 15th Constitutional Amendments, and the Civil Rights movement of the 1960s are all landmarks of progress toward equality; achievements certainly worthy of note. Our narratives, however, highlight only these accomplishments, avoiding the 400 years of murder and misery. We comfort ourselves with hagiography to avoid honest history.
The power of the trauma at the very heart of our national identity—slavery— threatens the foundations of our body politic, our civil life, our personal engagements. We, collectively, face the same onerous challenge as many combat vets: Do we choose to continue with the pain we know, or do we have the courage to face the searing hard truths of our past, our moral failings of great consequence, and endure the disturbing uncertainties and disruptive pain that will result?
Germany offers a model of what this might look like. Monuments to its Nazi past are scattered throughout Germany, with an especially dense concentration in its capital, Berlin, marking the sites of momentous dark happenings and egregious atrocities, and paying homage to the victims who were tortured and murdered. These monuments testify to this past and bear witness to grievous moral failings. They are also, however, bold statements of Germany’s values, now, and of its commitment to a future informed by this past. They display a unique kind of heroism worth emulating: moral courage.
National monuments to the past are values we hold, now, about ourselves, made visible.2
A Niagara of books has poured off the presses in recent years extolling the ever-expanding opportunities available to the elderly and retirees. “80 is the new 60”, we are told. “Start a new business.” “Follow your dreams.” The AARP magazine carries pictures of aging movie stars, in their 60s, 70s, and sometimes 80s, facelifts intact, smiling with perfect teeth, celebrating the virtues and pleasures of aging.
There is some truth to this. Americans are retiring with greater health, wealth, and opportunities than ever before. Rarely acknowledged, however, is the black-hooded figure, circulating through this party, striking down revelers, sometimes mid-sentence, to the horrified glances of the others. No one is exempt, and the most susceptible are those who cannot afford the books and magazines trumpeting geriatric wellbeing. Since the onset of the pandemic, much of the chatter about the joys of aging has subsided, and the disparities and injustices in health, wealth, and happiness are now glaringly apparent.
Statistical Life Expectancy
We do, nonetheless, live in extraordinary times. The statistical definition of life expectancy at birth is the average number of years a newborn can expect to live, computed across all births. Life expectancy, for most of human history, remained remarkably constant. Based on the best estimates from the historical record, life expectancy across all civilizations—from ancient Greece and Rome, to the Inca and Teotihuacan empires, to Renaissance Italy and medieval Japan—indeed, up to the mid-19th century, was around 30 years.1 These data lead to the seemingly obvious conclusion that 30 years is the biological limit of life expectancy for the human species. Every species has its lifecycle, and this is ours.
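The averaging in that statistical definition is worth making concrete: because it spans all births, widespread early death pulls the mean far below the ages that many people actually reached. A minimal sketch, using purely hypothetical numbers rather than historical data:

```python
# Life expectancy at birth: the mean age at death across all births.
# Hypothetical toy population, for illustration only: of 10 births,
# 4 die in infancy and 6 survive to age 55.
ages_at_death = [1, 1, 1, 1, 55, 55, 55, 55, 55, 55]

life_expectancy = sum(ages_at_death) / len(ages_at_death)
print(life_expectancy)  # 33.4, near the historical ~30, though the survivors reached 55
```

The point is only that a mean near 30 does not mean people rarely lived past 30; the figure is an average over every birth, infant deaths included.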
Dramatic changes, however, occurred in the last 150 to 200 years. In the United States, life expectancy in 1850 was 38 years; in 1900, 48 years; in 1950, 66 years; in 2000, 77 years.
The change in life expectancy for the world as a whole is even more startling. Prior to 1900, global life expectancy was 30 years. By 2013, it had risen to 72 years. The global average today is higher than it was in any country in 1950. Life expectancy for the entire human population has doubled in 200 years! This is an astonishing improvement in human life.2
Why? Science. A radically new way to understand the material world, based on doubt, systematic methods of experimentation, and material explanations that can be objectively verified, was developed in the 16th and 17th centuries. When this method began to be applied to medical conditions in the 19th century, astonishing discoveries and advances occurred. We now have the power to intervene in the course of nature, able to change our biological destiny.
The full benefits of this power are not equally distributed in the United States. Communities of color who have been targets of systemic racism and injustice, and those living in poverty, have significantly lower life expectancies. We must ask ourselves, “Why?”
Semantic Life Expectancy
Life expectancy also has a semantic meaning: What do we expect from life? What do we anticipate for our future? What possibilities are open to us? Semantic life expectancy, too, has remained relatively constant throughout most of human history. The answer to the question, “What do you expect from life?”, was simple: our fate is the same as our parents’, and their fate was the same as their parents’. Our birth determined our destiny, and little changed from generation to generation. And this fate was likely grim. Ninety-five percent of the population were laborers, serfs, peasants, and poor. Life was toil, suffering, degradation, and hardship. The most important semantic life expectancy in a nasty, brutish, and short life was the question of afterlife expectancy: what fate awaits beyond this mortal coil of suffering.
Now, in our current times, the semantic question, “What do we expect from life?”, extends beyond the straitjacket of our birth, embracing possibilities unimaginable to our ancestors. Our youth is shadowed by the question, “What do you want to be when you grow up?”; we experience “identity crises” and spend the first 18 years of our lives, and oftentimes many more, going to school; many of us spend almost the entire life expectancy of earlier times preparing for life.
We enjoy a surfeit of food, delivered to “super” markets in fleets of tractor trailers, miles of railroad cars, and flotillas of ocean tankers from distant places, offering a cornucopia of choices. We travel at breakneck speeds across town, across the country, and across oceans. In our heated and air-conditioned homes, we turn faucets to shower and to drink purified water. Music, information, and entertainment to amuse and inform us are at our fingertips, 24/7. We enjoy comforts that would have made even the royalty of bygone eras jealous. And after 30 or so years of work, many of us retire to enjoy 20 or more years of labor-free life, embrace “80 is the new 60”, and ponder: “What do I want to do with my life?”
This dreamscape is not evenly distributed. The semantic life expectancy of those outside the circumference of health, wealth, and opportunity is much grimmer and bleaker. The happy talk about the “Golden Years”, and the wrinkle-free faces and cheerleading smiles of aging celebrities, are marketed to a select group; the fate of the impoverished, and of communities of color who have been targets of racism, is airbrushed out of the picture. Again, we must ask ourselves, “Why such disparities?”
The answer to this troubling question about semantic life expectancy is, not surprisingly, the same as for statistical life expectancy: both result from the lack of good health care, nutrition, housing, education, opportunity, and employment.
The 19th century not only ushered in a dramatic rise in statistical life expectancy but also another startling growth: in the size of the population.
This growth begins around 1850. We would expect that a population growth like this would usher in mass starvation and universal misery. It has not.
The exponential increase in life expectancy and in population trace almost identical paths. These have been accompanied by a host of other exponential growths, referred to in scientific circles as the “Great Acceleration”.3
This graph captures the costs to the earth’s biosystems of our long life of luxury, including atmospheric composition, stratospheric ozone, the climate system, water and nitrogen cycles, marine ecosystems, land systems, tropical forests, and terrestrial biosphere degradation. You can see the initial jump out of the steady state beginning in 1850, about the same time as the jump in statistical life expectancy; biosphere destruction picks up steam into the 20th century.
Great inequities lie behind these trends as well. Currently, 18% of the world’s population controls 75% of the world’s wealth and consumes much of the world’s resources. The biggest consumer of the world’s resources is the United States.
This second graph traces trends in the socioeconomic factors contributing to our affluent life and includes economic growth, primary energy use, fertilizer use, large dams, water use, paper production, transportation, telecommunications, and international tourism. These make our remarkable lives possible.
The great acceleration of socioeconomic changes, as we can see in this graph, is much steeper than the prior one. It took a century of environmental exploitation to build the infrastructure that enabled the abrupt, explosive growth in socioeconomic benefits. The appearance of science and modern medicine, the dramatic rise in statistical life expectancy, the rapid industrialization, the steep acceleration of socioeconomic factors, and the transformation of semantic life expectancy are all interconnected, all of a piece of a profound, historically unprecedented alteration in human life.
This all came together around 1950, as this second graph indicates. After WWII, the United States emerged as the only combatant nation untouched by bombs or invasion, its economy humming at wartime production and possessing a GDP equal to that of the entire rest of the devastated world. Rarely, if ever, in human history has so much of the world’s wealth resided in one country.
We have been living in this historically anomalous time, in this historically anomalous place, where many of us have enjoyed the bountiful life expectancies of a unique and narrow place-time envelope that could not even have been dreamed of by most humans who have ever walked this planet.
And it is ending. The entire edifice that undergirds our privileged life is unsustainable. The bill has come due on the costs: the biosphere is being degraded beyond repair, species are being exterminated at a rate unseen for 65 million years, essential resources are being depleted, and our planet is irrevocably changed. We are in the midst of rapid and profound changes to the entire biosphere that no previous single generation in human history has experienced.
There will not be a return to “normal”, as normal was decidedly not normal.
The pandemic is simply a baby-step dress rehearsal for the cataclysmic changes rushing toward us. It provides a preview of how capable we are of responding effectively to a known, impending catastrophe. We, in the United States, have failed. Miserably. We can’t even get cooperation on the simple inconvenience of wearing a mask. This response is a sign of a deeper unraveling of American society.
Furthermore, the life expectancies of our children and grandchildren are being dramatically altered. So, too, for those of us who came of age in the midst of 9/11, the 2008 financial meltdown, the shadow of global warming, and, now, the pandemic.
Ethical Life Expectancy
Embedded in the statistical and semantic meanings of life expectancy is a third meaning: Ethical.
We now have the power to intervene in the course of nature, able to change not only our own biological destiny but that of the entire planet. It is a fearsome power with equally fearsome responsibilities. We live in an apocalyptic age. There are two meanings for this term. The one we are most familiar with is “the impending destruction of the world”. The second is the original Greek meaning: “a revealing of things not previously known”. This meaning beckons a response, poses a challenge to confront a new reality, to forge new paths, to plant seeds for new possibilities from the ashes of what has been lost.
Both definitions apply. We face the impending destruction of the world. We also are beckoned—courage, vision, and an unprecedented marshaling of the talents, energy, and collaboration of the entire human community are urgently needed.
Each of us must choose. We are at a high-leverage point in time where actions now will have huge consequences for the future—even for whether there will be a future for our children and their children. The onrushing catastrophe of biosphere destruction, the appalling disparities and injustices between the wealthy, privileged few and the impoverished many, and the societal unraveling pose a most dire moral challenge:
How should I live? What is the nature of the Good? How should we live together? By what authority? These life quandaries, often not explicitly stated, have haunted humans since the time of our cave-dwelling ancestors. Religion provides an explicit, sanctifying framework that situates our lives within a cosmic horizon, providing meaning, purpose, and moral grounding. Answers to fundamental moral quandaries are conferred by supernatural powers beyond the frail groping of humans—something clear, universal, unassailable, absolute.
Christianity and morality have been synonymous in the West for nearly two millennia, the Bible providing the moral pillar supporting church and state and the grounding for adjudicating good and evil. The worst crime in Christendom was not murder (“Thou Shalt Not Kill”), whose punishment could be mitigated by circumstances1, but heresy, whose punishment usually could not. Indeed, heretics received especially intense condemnation and persecution, and for good reason. Heresy doesn’t violate a commandment. It is much more dangerous—it challenges the legitimacy of the commandments.
Not surprisingly, one of the deepest divides in contemporary American life and politics is between moral absolutists and their rivals, often called “moral relativists” by the absolutists. Absolutists, led by the Christian Right2, claim the country was founded as a Christian nation and therefore should adhere to Christian moral dictates. The “relativists”, in stark contrast, allow for a multiplicity of moral codes and religious beliefs. Indeed, they argue for respecting diverse moral orientations, and strive to be open and non-judgmental, acknowledging the claims of legitimacy of many, often competing, moral frameworks. Debate and disagreement are to be expected, but no one approach is inherently superior to another, or should hold sway simply because those in power say so. What is valued is a plurality of voices and possibilities.
The absolutists raise challenging questions about this seemingly all-embracing doctrine of fairness and acceptance. How are we to arrive at any moral certainties, to find any moral basis on which to act, to discover the answers to: “How shall I live?” “What is the nature of the Good?” And how, specifically, are we to address foundational political questions: “What should be the rule of law?” “By what justification?” It is easy to conclude that the “relativists’” guiding moral principle is that none should hold sway, that morality is arbitrary, that “anything goes”—simply another name for amorality. It is also easy to understand the absolutists’ opposition, even militant resistance, to this apparent descent into the moral abyss.
The absolutist code, free of the confusions of mortals, offers the promise of clarity, safety, and security. As alluring as this is, it raises the question: Whose moral code? Christians were among the first settlers to arrive in America, en masse, from Europe. Most made the harrowing journey to this distant shore because they were persecuted minorities in their country of origin, heretics to the ruling orthodoxy. Methodists, Baptists, Congregationalists, Presbyterians, Lutherans, Quakers, Mennonites, Huguenots, Catholics, and Moravians all fled to the “New World” seeking freedom to practice their unique orthodoxies without persecution.3 European history, that is, the history of Christendom, is written in the blood of the vicious slaughter of millions over disagreements about orthodoxy. The lesson to be learned from over a millennium of Christendom’s history is that Christian absolutism leads to absolute chaos, wanton murder, and brutal persecution of individuals whose sole moral failing is to believe a different interpretation of biblical text.
The framers of the Constitution of the United States, having just won a war of independence from a despotic monarch who was also head of the state church, were acutely aware of this legacy of Christian absolutism. They also were acutely aware they were creating a new order, free of absolutism. Monarchy was countered by an elected president and a system of checks and balances. Christian absolutism was countered by the first Constitutional amendment guaranteeing freedom of religion. James Madison, the principal author of the Constitution, understood that government sanction of a religion is a threat to religion: “Who does not see that the same authority which can establish Christianity, in exclusion of all other Religions, may establish with the same ease any particular sect of Christians, in exclusion of all other Sects?”4 God is not mentioned once in the Constitution.
America was not founded as a Christian nation. It was founded as a nation defined by the Constitution, establishing a form of government unlike any in human history5; one that has become a beacon for many other peoples across the globe seeking liberty. It is a radical alternative to absolutism in its many forms. It is more than a political document. The Constitution is a Revolutionary Moral Order. It allows a multiplicity of moral codes and religious beliefs, respects diverse moral orientations, and is open and non-judgmental, acknowledging the claims of legitimacy by many, often competing, moral frameworks. Debate and disagreement are to be expected, but no one approach is inherently superior to another, or should hold sway simply because those in power say so. What is valued is a plurality of voices and possibilities.
This moral order is not “moral relativism”. It embodies the values of democracy, explicitly crafted to avoid the plagues of moral absolutism, religious warfare, arbitrary justice, and the gross mistreatment of the many by the few. It is a statement of ethical principles of relationship, of respect for each person. It is the basis for justice and order of a different kind than that offered by absolutists; it forbids as much as it allows. It also is not the opposite of absolutism—it is an alternative. The opposite of moral absolutism, as well as of democratic morality, is anarchy, the true morality of “anything goes.”
We live in a large multicultural society with an untold number of congregations and believers subscribing to diverse, often absolutist, moral codes and commandments. We are confronted with the same urgent question as the American founders: How can we live together if there is NOT a superordinate moral and political framework that allows a multiplicity of moral codes and religious beliefs, respects diverse moral orientations, and acknowledges the claims of legitimacy by many, often competing, moral visions? Democratic morality allows each of us to live a moral life free from persecution, and in doing so necessarily results in disagreement, confusion, and uncertainty. It also can be quite distressing and disturbing, requiring strength, fortitude, faith, and humility. Living a moral life is never easy. But it is necessary. It is what living in a democracy involves. Living together peacefully, but with considerable discord, in the 21st century requires that we embrace, with courage and conviction, the demands of democratic morality.