What is a weed? I am not a gardener and have always found the question puzzling. As I gaze at a garden, it is never obvious which are weeds. The definition of weeds explains why: “Any plant that grows in an unwanted place…”2 Weed is a moral term. The world is partitioned into the “wanted” and the weeds. We do not simply plant a garden, we also impose a moral grid on the earth. And we do not do this alone. We may not care about what plants spring up in our patch of earth—but others do. Try not weeding, mowing, or tending your lawn. Community ordinances in most urban areas prohibit the proliferation of weeds, and you could receive a citation. Even if the ordinance is not enforced, there are consequences: Weeds lower property values and raise neighborly ire.
Good friends of ours are native garden enthusiasts and offered to do “missionary work” on a small patch in our backyard. They carefully designed and planted a garden of native plants with beautiful flowers that bloom at different times and also attract birds, insects, and butterflies. They told us that we would have to weed the garden. I was surprised, as I thought “native” meant “no weeding necessary.” Despite being native, the division endures: the “wanted” and the weeds.
The day after the garden was completed, I watched a bunny munch away in our precious, newly planted patch. I was horrified! I quickly went to the garden supply store and bought sturdy two-foot-high chicken wire to protect the plants from marauding rabbits. I began thinking of these intruders in different terms. They are no longer bunnies or even rabbits. They are varmints, pests, felons.
It is not lost on me that the words “varmint,” “pest,” and “felon” are all moral terms applied to undesirables who have trespassed into my space. It is also not lost on me that my moral compass about rabbits swung from “cute-harmless” to “objectionable-destructive” when I altered my values about our small plot of earth. And it is not lost on me that claiming it to be “our plot” is the height of god-like presumptuousness. I am, of course, not alone in making such a claim. Every square inch of this entire continent is “owned” by someone or some collective entity.
One of the reasons we must fence out rabbits is the lack of predators. Rabbits proliferate and hop around our yard with impunity. They do this because most rabbit predators are deemed even more objectionable than the rabbits themselves: wolves, foxes, wild dogs, feral cats, raccoons. Those that prey on rabbits from the sky, including hawks, kites, falcons, merlins, eagles, and owls, are limited in number because of urban habitat degradation. Rabbits thrive because the larger communal moral grid banishes or inadvertently diminishes their predators.
The “civilized space” in many urbanized areas is typically neatly partitioned into garden and lawn. The garden is an enhancement, a beauty mark set off from the grassy green lawn. Grass. My puzzlement about weeds is accompanied by my loathing of grass. My father bought 250 acres in the rugged, forested mountains in the Southern Tier of New York State when I was 12 years old. Our weekends were spent clearing large areas of natural growth, making trails, planting—and mowing—grass. This is not how a teenaged urban boy wants to spend his weekends. It is where my loathing of grass was fostered.
Turf grass is America’s biggest irrigated crop, larger than corn, wheat, and fruit orchards combined. Over 1/3 of urban water use goes to lawns, and up to 60% of this is wasted. Lawnmowers use gas and pollute—over 17 million gallons of gas are spilled filling lawnmowers. Grass is environmentally unsustainable in many parts of the country, especially the Southwest, where climate change, drought, and fires render grass an extravagant folly. Nonetheless, grass still flourishes in many of these areas, where the extravagance of grass is viewed not as folly but as a badge of wealth and privilege. The moral compass has not yet swung from privilege to folly, but movement is afoot: Las Vegas, for example, has banned certain types of grass, water rationing is in effect across the Southwest, and fines have been imposed on those using water for their grass.
A grassy, manicured, weeded lawn is a moral imperative for much of the rest of the country; a communally enforced aesthetic. It is a moral aesthetic imported into this country in the 19th century from England, where the large manor houses and palaces are fronted with huge lawns. Americans now front (and “rear”) their “estates”, mimicking the privileged class of Old England. The pernicious grip of grass-fed entitlement reaches across continents and centuries.
The moral compass about grass will be harder to swing in these water-enriched areas, but early signs suggest the needle is starting to move. The environmental advantages of native gardening are many: Native plants are hardy, do not require fertilizer, need less water, reduce pollution, promote biodiversity, and are beautiful. Native gardening is becoming a booming industry, sales of native plants have dramatically increased, and standards for certification of native gardens have been established. Posting a sign of certification provides moral standing to a yard that might otherwise be judged a neglected eyesore. On my small street alone there are several yards that are covered with plants, not grass, and another with a certification notice.
Gardening is inescapably moral, having consequences for the birds, insects and butterflies, the plants and animals, and the planet that cascade from our decisions about what constitutes a weed. Grass is a weed.
Planet Earth has never bothered to speak—until now. What follows is an unedited transcription of Planet Earth’s remarks to this reporter.
I am Planet Earth. I am sure you have seen pictures of me, which have, lately, been accompanied by your pleas to save me. I am bloody sick of all the posters and dire warnings about my fate by you trumpeting, self-absorbed critters presuming to speak for me when, of course, it is, as usual, all about you. So, here is my response, although I do not expect any of you to listen, or care, especially as I am not going to bolster your hollow sense of self-importance.
Let me begin by saying that the pictures you have seen of me are only a snapshot of my life, and a very brief moment at that. I would like to share the history of my earlier life as well as my thoughts about my future so you will know, really, who I am and not presume to speak for me.
My life began over 4 ½ billion years ago. That’s billion. I know it is hard, if not impossible, for you creatures of not-even-an-eyeblink to grasp, but it is important for you to appreciate the entire arc of my life before you make claims on my behalf.
The first billion years I lived a simple but volatile life. I began as a molten ball of metals and minerals, bombarded by asteroids and cosmic debris large and small, and even experienced a cataclysmic collision with another planet. All this added to my mass and girth. I was roiling with volcanic activity, enveloped in a mixture of gases (none that you would find pleasant), formed a crust, and water condensed on my surface. Above is a simulated picture. Obviously, no actual pictures exist, but you get the picture. During this time single-celled life, anaerobes, appeared and thrived in the methane, ammonia, water vapor, neon, and carbon dioxide gaseous haze that surrounded me.
After about a billion years or so, cyanobacteria appeared. These are also single-celled life forms, but unlike the anaerobes, they rely on a novel process that uses energy from the sun to combine water and carbon dioxide into carbohydrates; a process called photosynthesis. Oxygen is the waste product. The oxygen produced was mostly absorbed in my oceans, seabed rock, and land surfaces. I experienced a relatively steady atmospheric state for the next 2 billion years where anaerobes continued to flourish; oxygen did not constitute an appreciable part of my atmosphere.
Take note: This very brief account covers 2/3 of my existence. What follows is a more detailed description of the remaining 1/3. Should you be keeping score (and you should), you, Homo sapiens, have been around for .02% of this remaining 1/3.
Oxygen Holocaust! Oxygen Catastrophe! The Great Oxidation Event (GOE)! “The greatest pollution crisis!” These terms and phrases describe what occurred about 1.5 billion years ago, when the oxygen content of my atmosphere rose from .0001% to 3%—a 30,000-fold increase. Sunlight and oxygen were lethal to almost all the existing life forms and I became enveloped in a bubble of poisonous gas: oxygen.1
Terms like “Oxygen Catastrophe” and reference to oxygen as a poisonous gas call your attention to your use of words that you think are “objective.” Words like “pollution,” which doesn’t really mean anything to me; it is a self-serving value-loaded assessment, like “weeds” and “disease,” and the best one: “invasive species.” “Invasive?!” As if there are boundaries, like your imaginary, militantly enforced political boundaries, that have been breached by “alien” species, often imported by you, requiring extermination because they disrupt your sense of order. Want to know the most “invasive” of species on the planet at this moment? Look in the mirror.2
The oxygen content continued to increase, giving rise to radically new forms of life. Multi-celled life began around 1.5 billion years ago in this hyper-oxygenated atmosphere, but the bizarre menagerie of clinging, crawling, burrowing, flapping, splashing, slithering creatures didn’t make their appearance for another 750 million years. One of the hallmarks of these creatures is that they are fragile; they don’t last long. Slight changes in my temperature or atmosphere, relative to what I have experienced, result in wholesale extinctions. Some 444 million years ago, glaciers amassed at my poles; 85% of the species died. Life reformed, rejuvenated; then 60 million years later enormous volcanic eruptions caused oxygen concentration to plummet; 75% of the species died.
Once again, life reformed, rejuvenated, and once again, after a period of stability, massive volcanic eruptions raised sea and soil temperatures by 25 to 34 degrees, the sea surface temperature at my equator reached 104 degrees, oxygen levels plunged, and the atmosphere filled with methane and other greenhouse gases; 96% of all marine species perished and 75% of land species died.
Do you see a pattern? Let me continue, just so you get the full scope of my quite recent experience with oxygenated life.
After many millions of years, life regenerated only to be mostly destroyed around 200 million years ago by, you guessed it, another round of huge volcanic explosions. Carbon dioxide levels quadrupled, temperatures rose by 5 to 11 degrees, and many species perished. The next round of extinction was precipitated by a novel cause: an asteroid crashed into my surface, raising dust and killing 76% of all species.
A period of relative stability gives rise to new species. My environment changes, most species die. Stability is achieved for a while, novel species arise by adapting to the new conditions. Things change. They die. Repeat. What has remained stable during this oxygenated stage is not species, but the predictable cycle that leads to their routine rise and fall.
Narcissism. You only see a very small sliver of Planet Earth’s existence—that which directly applies to you. The stability you think you see is self-serving. Your time horizon, while conceptually encompassing, is woefully blinkered and overwhelmed by your own needs, your own concerns, your own dread and panic. But this is your biological destiny. All life is about self-preservation, is “narcissistic” about its own interests, its own importance; driven, determined, desperate to preserve its own life. You are trapped by your own biology. I understand. But your grandiosity is particularly grandiose. That too is your birthright, born, as you are, with a head size so huge you cannot even support it for the first 6 months of life. This should have been a warning to you; a clue of your inability to manage your own brain and, ultimately, to be destroyed by it.
But you have distinguished yourself. Every other species has perished because the environment changed beyond their ability to cope. You have, yourselves, changed the environment that will end up killing you. That is a first. Something that sets you apart. God-like, if you will, which is a term you like to apply to yourselves. Congratulate yourselves—this, too, is one of your biological instincts.
I am rather young, not even middle aged, as I have another 7 billion years ahead of me. My existence has been marked by dramatic, abrupt, tumultuous change, often precipitated by unexpected events and surprising developments. I know my future will continue to be riotously volatile and cataclysmic. This does not surprise me—it is the cosmic order. Look up. Look around. Steady states are brief (cosmically speaking). You are simply an insignificant dust mote. Even to me, an only slightly less insignificant pile of cosmic debris, your presence does not comprise an hour’s worth of my time, using your temporal metric.
So, you see, you are not saving me, Planet Earth. It is not my termination that looms—it is yours. And, to put my attitude toward your ending in terms you can understand: “I don’t give a shit.” When you finally leave, which will be much sooner than you think, if I happen to notice, my response will be: “Adios.”
The sun has been a source of veneration, worship, and deification throughout human history. The pantheon of sun gods is extensive and spans cultures, continents, and times: the Egyptian sun gods, Ra, and the oldest known monotheistic god, Aten; Inca, Mayan, and Aztec sun temples and rituals of human sacrifice to the sun, giver of life; North American tribes’ sacred sun dances; the Hindu sun god, Surya, creator of the universe and the source of all life; the Shinto sun goddess Amaterasu, the great divinity illuminating the heavens; the Greek sun gods, Helios and Apollo; the Druids of England and their Stonehenge, built as part of their solstice worship; the Sun Day worship of Christians, which was legislated by the Roman emperor, and pagan-turned-Christian, Constantine, in honor of the Sun, which he called “Unconquered Sun, my companion”.2
We modern, indoor-dwelling sophisticates who possess more “advanced” religious beliefs, or whose world has been desacralized by a secular worldview, typically view such obsessions as pagan sacrilege, or as historical curiosities. And yet, might the sun still be fervently worshiped by us, shrouded from our awareness by our smug sense of superiority? Might we be the unenlightened ones?
Burn, Baby, Burn
Manure, peat, and coal. Trees, whale oil, and petroleum. Animal bones, natural gas, and corn. Such a bizarre diversity of things, yet they all are united by one essential fact: All have been used by humans to keep us warm and light the darkness. We have survived, and thrived, at the sacrificial altar of these “burnables”. It is difficult to imagine how, or even if, human life would be possible without them.
The common element of all these “burnables” is that they are composed of organic material; once a part of life in some form. This is obviously so for many “burnables”, like peat, trees, and whale oil. Less obvious are coal, petroleum, and natural gas; all perished life that has been compressed in the earth for millions of years.
The reason why we burn life in its various forms is energy. We all know intuitively what energy is. We feel it in our body; we “have energy” to do something: to push, pull, lift, twist, throw—to do work. The remarkable transformation of human life wrought by the Industrial Revolution was launched by the discovery of the physical laws governing energy, force, and work. These terms have very precise meanings and measurable values, and we have invented many clever ways to put energy to work.
The most important form of energy that, literally, drives our modern life is heat. Heat is derived from “burnables” that combust in engines to propel us and ignite in furnaces to heat us. Furnaces also generate electricity that lights the night, animates the machines of our modern world, allows commerce and communication across the globe at the speed of light, gives life to our digital world—and so much more.
Where does all this “burnable” energy come from? Plants. And where do plants get their energy? The sun. Plant photosynthesis converts solar energy into potential energy that is stored chemically in the molecular bonds of glucose. Carbon dioxide and water are combined to create these sugars, and oxygen is released in the process. Plants then “burn” this stored energy to grow, flower, and develop seeds. Humans, and all other animals, survive by devouring plants and other animals, converting the stored energy in other living forms into their own chemical “batteries” that store energy to be used for growing, “flowering”, and “seeding”.
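The conversion described above can be summarized in the familiar net equation for photosynthesis (a simplification, of course; the actual process proceeds through many intermediate reactions):

```latex
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{light energy} \;\longrightarrow\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
```

Combustion, whether in a cell or a furnace, runs this reaction in reverse, consuming oxygen and releasing the stored solar energy as heat.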
The heat and light created by “burnables”, such as coal, petroleum and natural gas, is the captured energy from the sun radiated millions of years ago, stored in compressed organic matter, released into intense flame. The sun—rekindled in our furnaces.
The Cost of “Burnables”
All the energy and work fueled by these furnaces is purchased at a steep cost: Exhaust. Deadly exhaust. The process that converted carbon dioxide into glucose and released oxygen is reversed; oxygen is consumed and carbon dioxide is released. The carbon dioxide that we send into the sky is trapped in the atmosphere, covering the planet like a blanket that prevents the sun’s energy from escaping, heating the planet. Ironically, and tragically, our obsession with “burnable” furnaces has led to our frying ourselves, and many of the fellow organisms that share the planet with us.
Our voracious need for energy, as heat and light, has ravaged the planet. Vast areas have been denuded and polluted, millions of species killed or hunted to near extinction, entire mountain ranges reduced to rubble or riddled with miles of toxic tunnels. The benefits, however, have been great. Our modern life, with all the comforts of home, a cornucopia of food, life-saving medical treatments, and unimaginable goods and entertainments are the bounty of our quest for heat and light.
We moderns are the most fanatical worshipers of the sun. Almost everyone across the entire planet kneels at the sacrificial sun altars, dreads even a momentary halt in the offerings, and pursues “burnables” with fanatical religious fervor. Earlier, “primitive” sun worshipers were but simple beginners. We are appalled by the images and repulsed at the thought of the human sacrifices of the Aztecs, standing at the top of their sun temple, holding up the beating heart of a just-sacrificed victim to the sun. Yet we are oblivious to the colossal planetary destruction of life, human and otherwise, wrought by our own, much more ruinous sun worship practices.
We are on a trajectory akin to that of Icarus, who was given wings of feathers and wax but warned not to fly too close to the sun. Icarus, however, did not heed the warning. He rose toward the sun, his wings melted—and he tumbled to his death.
We, too, are being warned of the self-immolating consequences of our “burnable” sacrifices; that we are “flying too close to the sun”. Will we heed the warning? Does Icarus’ fate await?3
Hello, sun in my face. Hello, you who make the morning and spread it over the fields and into the faces of the tulips and the nodding morning glories, and into the windows of, even, the miserable and crotchety–
best preacher that ever was, dear star, that just happens to be where you are in the universe to keep us from ever-darkness, to ease us with warm touching, to hold us in the great hands of light– good morning, good morning, good morning.
Watch, now, how I start the day in happiness, in kindness.
Child abuse did not exist before 1960.

You probably read that statement and thought, “That can’t be true.” You would be half right. The term “child abuse” was first used around 1960 and encompasses physical abuse, sexual abuse, emotional abuse, and neglect. These can occur not just with babies and young children, but at any point in childhood, which, in most states, extends to 18 years of age.
The behavioral characteristics of child abuse have likely been ever-present features of human life across time and cultures. So, in that regard, you would be correct. However, a second component of our contemporary understanding of child abuse is quite new: the identification and valuation of these behaviors as morally repugnant and legally punishable. The first appearance of organized concern for the physical maltreatment of children was the New York Society for the Prevention of Cruelty to Children, founded in 1874. It was an extension of the already established Society for the Prevention of Cruelty to Animals.1
The concern for cruelty to children emerged in concert with the appearance of other concerns about children’s welfare, including child labor laws. Prior to the 19th century, the economy of most societies consisted of agriculture and handcrafts. Children worked on family farms, as indentured servants, in homecrafts, and some were apprenticed to trade guilds, typically between 10 and 14 years of age. Children were critical to family finances and the labor force, employed like adult workers, and essential for survival in a time of struggle and hardship. Children lacked legal rights or any recourse. Cruelty to children, and child abuse, were behaviorally integral to the social order; simply a part of life. However, there was no identification or valuation of these behaviors as moral or legal injustices.
It is difficult to appreciate that what is now so obviously, and profoundly, morally repugnant could have been invisible; an accepted fact of life. Differentiating the behaviors from their moral valuation allows us to better grasp how the dramatic transformation in material circumstances, political and economic contexts, and cultural values can reconfigure our moral universe.
Racism did not exist before the 20th century.
You probably read that statement and thought, again, “That can’t be true.” You would be half right, again, for the same reasons just discussed for child abuse.
The word “racism” first appeared in 1902 to describe and condemn the practice of segregating a race of people from the rest. The current definition of racism, “a belief that race is a fundamental determinant of human traits and capabilities and that racial differences produce inherent superiority of a particular race,”2 gained its full meaning in the 1930s to describe the political ideology of the Nazis about Jews.
Although the term “racist” did not exist prior to the 20th century, what did exist was the abolitionist movement, propelled by the conviction that no race had the right to enslave another, and that freedom was a right due to everyone regardless of race. The abolitionist movement began in the 18th century, and the first abolitionist organization in the United States was founded in 1775.
Slavery has been an ever-present part of civilizations since the first settlements in Mesopotamia, beginning about 3500 BCE, and has flourished on all inhabited continents since that time. Slavery in many societies was not always, or often, based on race. Slavery in Colonial America and, later, the United States, however, was largely racial, as most slaves were brought from Africa by Europeans. Here, slavery and race were inseparably linked. The belief that one race, Whites, is superior to another, Blacks, was simply assumed (by Whites) as part of the natural hierarchical order. Behaviorally, slavery and racism were an ever-present given.
No systematic philosophical or religious arguments questioning the moral injustice of slavery existed in the West before the 18th century. It was simply a fact of life. So, what changed in the 18th century that led to a moral awakening to the evils of slavery and racism; to the appearance of the abolitionists?
The idea that all individuals have rights that are not conferred by a divine ruler, privileged personages, or an institutional decree, but are intrinsic to being human, first appeared in the 17th century and was most powerfully argued by John Locke. Locke’s arguments provided the foundation for the American Revolution; a revolution that forged a radically new political, social, and moral order.3
The Declaration of Independence, written by Thomas Jefferson, announces this new ordering in the very first lines: We hold these truths to be self-evident, that all men are created equal, and endowed by their Creator with unalienable rights, that among them are life, liberty and the pursuit of happiness. These truths were NOT self-evident at that time. They challenged the existing self-evident beliefs that all are NOT created equal; NOT endowed by the Creator with rights; and do NOT possess the right to life, liberty, or pursuit of happiness.4
The Declaration was a missile aimed directly at monarchy. Rights are not innate to a divinely sanctioned ruler; they are conferred by the Creator to all individuals, who establish a governmental order that derives its legitimacy from these individual rights. It inverts the social hierarchy. It shatters the political order.
We fail to appreciate the enormous risks taken by Jefferson and his colleagues. Their fortune, family, position, reputation, their very lives, were staked on a bold declaration to lead an insurrection against the world’s most powerful, wealthy, militarily strong country that controlled every corner of Colonial governments, every courthouse, every harbor, every form of authority and power in the land, and was supported by many passionate Colonialists. It was not only a declaration of war against British monarchial rule, it was a trigger for a civil war. These unlikely rebels had no army, no navy, no organized government, no currency, no allies. This was a wild, crazy, mad gamble.
Unappreciated by our country’s founders was how far-reaching “All are created equal” would become; not only a bugle call to overthrow monarchy, but a clarion call for a revolutionary moral order. It is not an accident that abolitionist movements first appeared in human history at the exact moment as the revolution against monarchial order.
The 250 years since the Declaration have been marked by a growing appreciation of the reach of the ethics of rights and equality. The abolitionist movement to end slavery has been joined by other liberation movements that continue to our day: civil rights, equality of women, the rights of children, of animals, gender equality, the rights of persons with disabilities. All these moral uprisings have involved struggle, hardship, sacrifice, and bloodshed. They continue. This history affords appreciation of the long, arduous, and tumultuous process required to wrench behaviors out of taken-for-granted darkness into the hot glare of moral injustice.
Our highest ideal, that all are created equal, is inherently disruptive, calling us to continually confront injustice and reaffirm our commitment to a more perfect union.
Jefferson a Racist?
Was Jefferson a racist? Of course. As was everyone else of his time—and before. The pointed critiques of how racism, sexism, and the other heretofore invisible injustices have shaped human history, and killed, maimed, and destroyed so many lives, are a necessary corrective to the blindness of the past. But simply condemning Jefferson and his brethren, who championed equal rights at great personal cost, without recognizing their contributions flattens the moral landscape; it affords easy self-righteous moralizing at the expense of understanding.
250 years from now, will any of us escape the moral condemnation certainly due us “fossilists”5 for our wanton, heedless destruction of the biosphere that may, perhaps, render human life extinct by then? How many of us have sworn off using fossil fuels; taken a Jeffersonian stance to create a revolutionary political, social, and moral order to save our species and our planet? Very, very few.
But damning the climate patriots among us as equally morally reprehensible as the CEOs of the oil and gas companies, and the commercial and political interests that denude, pollute, and destroy the planet’s forests, rivers, and oceans, flattens our moral landscape, obliterating the crucial differences between individuals that make a difference. This is how revolutions happen: Individuals, in the trenches, relentlessly, aggressively, tirelessly, in the face of long odds and bleak prospects, at great personal cost, confront and attempt to upend powerful vested interests. The future is forged by the efforts of courageous, imperfect individuals acting in the uncertain, messy present.
Jefferson and the other Declaration instigators were imperfect visionaries in their messy, uncertain time who championed ideals they were prepared to die for. They bequeathed to us a moral order that, ironically, we use to condemn them for their moral shortcomings. This paradox is their legacy. When we critique them, we need to do so with gratitude. And humility.
I had never given much care to genealogy, thinking it a form of navel gazing into a distant past that has no bearing on me, and my life, now. That changed after my mother died. My father’s family was Dutch and my mother’s was German and Scots-Irish—solidly Christian, White, and Northern European. Checks all the boxes. However, my mother’s father died in 1934, when she was 12, and she told us she thought he had a family secret, but didn’t know what it might be and didn’t want to find out. After my mother died, my sister did a genealogy of my mother’s family and discovered my grandfather’s secret—he was Jewish. He changed his name when he immigrated to the U.S. as a 16-year-old.
I was stunned. At a subterranean level, I felt very vulnerable. I certainly understood antisemitism and abhorred it, but from a position with my feet firmly planted on the “mainland”, waving with empathy at those on a close, but offshore, island (e.g., Jews and the other “outsiders”). Now I not only share a past with some of those offshore; given the long history of parsing ancestry to sniff out Jewish “blood” for extermination, my understanding of myself, my identity, has changed. The low rumble of antisemitism was now quite audible and personally menacing. The living power of the past, its relevance for my life in the present, and its possible consequences for my future—pile-driven home.
The seismic power of the past is also the engine of effective psychotherapy. We all construe stories about our identity; who we are, what formed us, who influenced us, what memorable events mark our lives. These stories compose our identity and are distilled into habits, assumptions, and reactions that reflexively govern much of what we believe, say, and how we act. The past is relived at the visceral level, guiding our present lives and our expectations for the future. When these guideposts falter or break down, we can find ourselves unmoored; anxiety, angst, anger, and despair become our companions. Therapy, in its various forms, focuses on changing the visceral assumptions, and this often involves reexamining the stories we tell ourselves about ourselves.
Therapy for trauma evinces this in its most stark form. Trauma therapy must directly confront shocking past experiences to forge new narratives, new habits, new reactions about the trauma’s meaning and its bearing on who we are. This therapy can be destabilizing, painful, and terrifying, requiring great courage. This is a major reason therapy is often avoided; the pain we know is less frightening than the destabilizing unknown pain that awaits. But the courage to confront the horrors of the past is rewarded with a reclaimed life and hope for the future. The only way out is through.
A similar process to therapy occurs at the national level. Our narratives about our past, the history we tell ourselves about ourselves, form our identity. These narratives are typically heroic: obstacles faced, challenges met, and adversity overcome against great odds. They are morality plays where, after grappling with demons for 40 days and 40 nights, virtue and righteousness triumph over evil. Our national pride is, after all, pride in our past, which defines who we are now and what we hope for our future; it shapes our political landscape and national conversations, our laws, institutions, legislation, and elections.
Slavery is a 400-year indelible stain on American history. It was integral to our country’s founding, essential to its economic viability and vitality, and intrinsic to its social structure. Unimaginable cruelty, brutality, suffering, and murder of slaves and their descendants have been routine in American life for centuries. The presence of the progeny of slaves in our midst—in their very appearance—is a stark reminder of the horrors of our past, evoking reflexive, habitual reactions conditioned by the longstanding narratives about race; about Black and White and what they mean.1
The current upheaval in our country about racism is a challenge to the dominant stories of our history, of our identity. As with therapy and genealogy, our past is complicated, filled with facts—known, unknown, avoided, denied, and ignored—selectively highlighted and given structure and meaning by our narratives. Was Robert E. Lee the hero of the “Lost Cause,” fighting for “States’ Rights,” a “Noble Defender of the South”? Or was he a traitor leading a rebellion to preserve slavery and destroy the Union? The struggle over the narratives about our past is not simply an esoteric debate among academics; it is a struggle for our nation’s soul: Who were we? Who are we? Who do we want to be?
The Abolitionist movement, the Civil War, the 13th, 14th, and 15th Constitutional Amendments, and the Civil Rights movement of the 1960s are all landmarks of progress toward equality; achievements certainly worthy of note. Our narratives, however, highlight only these accomplishments, avoiding the 400 years of murder and misery. We comfort ourselves with hagiography to avoid honest history.
The power of the trauma at the very heart of our national identity—slavery— threatens the foundations of our body politic, our civil life, our personal engagements. We, collectively, face the same onerous challenge as many combat vets: Do we choose to continue with the pain we know, or do we have the courage to face the searing hard truths of our past, our moral failings of great consequence, and endure the disturbing uncertainties and disruptive pain that will result?
Germany offers a model of what this might look like. Monuments to their Nazi past are scattered throughout Germany, with an especially dense concentration in its capital, Berlin, marking the sites of momentous dark happenings and egregious atrocities, and paying homage to the victims who were tortured and murdered. These monuments testify to this past and bear witness to grievous moral failings. They are also, however, bold statements of Germany’s values, now, and of its commitment to a future informed by this past. They display a unique kind of heroism worth emulating: moral courage.
National monuments to the past are values we hold, now, about ourselves, made visible.2
A Niagara of books has poured off the presses in recent years extolling the ever-expanding opportunities available to the elderly and retirees. “80 is the new 60,” we are told. “Start a new business.” “Follow your dreams.” The AARP magazine carries pictures of aging movie stars, in their 60s, 70s, and sometimes 80s, facelifts intact, smiling with perfect teeth, touting the virtues and pleasures of aging.
There is some truth to this. Americans are retiring with greater health, wealth, and opportunities than ever before. Rarely acknowledged, however, is the black-hooded figure, circulating through this party, striking down revelers, sometimes mid-sentence, to the horrified glances of other revelers. No one is exempt, and the most susceptible are those who cannot afford the books and magazines trumpeting geriatric wellbeing. Since the onset of the pandemic, however, much of the chatter about the joys of aging has subsided and the disparities and injustices in health, wealth, and happiness are now glaringly apparent.
Statistical Life Expectancy
We do, nonetheless, live in extraordinary times. Life expectancy, in its statistical sense, is the average number of years a newborn can expect to live. Life expectancy, for most of human history, has remained remarkably constant. Based on the best estimates from the historical record, life expectancy, across all civilizations, from ancient Greece and Rome, to the Inca and Teotihuacan empires, to Renaissance Italy and medieval Japan—indeed, up to the mid-19th century, was around 30 years.1 These data lead to the seemingly obvious conclusion that 30 years is the biological limit of life expectancy for the human species. Every species has its lifecycle, and this is ours.
Dramatic changes, however, occurred in the last 150 to 200 years. In 1850, in the United States, life expectancy was 38 years; in 1900, 48 years; in 1950, 66 years; in 2000, 77 years. The change in life expectancy for the world shows an even more startling increase. Prior to 1900, life expectancy was 30 years. By 2013, it had risen to 72 years. The global average today is higher than it was in any country in 1950. Life expectancy for the entire human population has doubled in 200 years! This is an astonishing improvement in human life.2
Why? Science. A radically new way to understand the material world, based on doubt, systematic methods of experimentation, and material explanations that can be objectively verified, was developed in the 16th and 17th centuries. When this method began to be applied to medical conditions in the 19th century, astonishing discoveries and advances occurred. We now have the power to intervene in the course of nature, able to change our biological destiny.
The full benefits of this power are not equally distributed in the United States. Communities of color who have been targets of systemic racism and injustice, and those living in poverty have significantly lower life expectancies. We must ask ourselves, “Why?”
Semantic Life Expectancy
Life expectancy also has a semantic meaning: What do we expect from life? What do we anticipate for our future? What possibilities are open to us? Semantic life expectancy also has remained relatively constant throughout most of human history. The answer to the question, “What do you expect from life?”, was simple: our fate was the same as our parents’, and their fate was the same as their parents’. Our birth determined our destiny, and little changed from generation to generation. And this fate was likely grim. Ninety-five percent of the population were laborers, serfs, peasants, and poor. Life was toil, suffering, degradation, and hardship. The most important semantic life expectancy in a nasty, brutish, and short life was the question of afterlife expectancy: what fate awaits beyond this mortal coil of suffering.
Now, in our current times, the semantic question, “What do we expect from life?”, extends beyond the confining straitjacket of our birth, embracing possibilities unimaginable to our ancestors. Our youth is shadowed by the question, “What do you want to be when you grow up?”; we experience “identity crises” and spend the first 18 years of our lives, and oftentimes much more, going to school; many of us spend almost the entire life expectancy of earlier times preparing for life.
We enjoy a surfeit of food, which is delivered to “super” markets in fleets of tractor trailers, miles of railroad cars, and a flotilla of ocean tankers from distant places, offering a cornucopia of choices. We travel at breakneck speeds across town, across the country, and across oceans. In our heated and air-conditioned homes, we turn faucets to shower and to drink purified water. Music, information, and entertainment to amuse and inform us are at our fingertips, 24/7. We enjoy comforts that would have made even royalty of bygone eras jealous. And after 30 or so years of work, many of us retire to enjoy 20 or more years of labor-free life, embrace “80 is the new 60,” and ponder: “What do I want to do with my life?”
This dreamscape is not evenly distributed. The semantic life expectancy of those outside the circumference of health, wealth, and opportunity is much grimmer and bleaker. The happy talk about the “Golden Years,” and the wrinkle-free faces and cheerleading smiles of aging celebrities, are marketed to a select group; the fate of the impoverished, and of communities of color who have been targets of racism, is airbrushed out of the picture. Again, we must ask ourselves, “Why such disparities?”
The answer to this troubling question about semantic life expectancy is, not surprisingly, the same as for statistical life expectancy: both result from the lack of good health care, nutrition, housing, education, opportunity, and employment.
The 19th century not only ushered in a dramatic rise in statistical life expectancy, but another startling growth: the size of the population.
This growth begins around 1850. We would expect that a population growth like this would usher in mass starvation and universal misery. It has not.
Costs and Consequences
The exponential increase in life expectancy and in population trace almost identical paths. These have been accompanied by a host of other exponential growths, referred to in scientific circles as the “Great Acceleration”.3
This graph captures the costs to the earth’s biosystems of our long life of luxury, and includes atmospheric composition, stratospheric ozone, the climate system, water and nitrogen cycles, marine ecosystems, land systems, tropical forests, and terrestrial biosphere degradation. You can see the initial jump out of the steady state beginning in 1850, about the same time as the jump in statistical life expectancy, and biosphere destruction picks up steam into the 20th century.
Great inequities lie behind these trends as well. Currently, 18% of the world’s population controls 75% of the world’s wealth and consumes much of the world’s resources. The biggest consumer of those resources is the United States.
This second graph traces trends in the socioeconomic factors contributing to our affluent life and includes economic growth, primary energy use, fertilizer use, large dams, water use, paper production, transportation, telecommunications, and international tourism. These make our remarkable lives possible.
The great acceleration of socioeconomic changes, as we can see in this graph, is much steeper than the prior one. It took a century of environmental exploitation to build the infrastructure that enabled the abrupt, explosive growth in socioeconomic benefits. The appearance of science and modern medicine, the dramatic rise in statistical life expectancy, the rapid industrialization, the steep acceleration of socioeconomic factors, and the transformation of semantic life expectancy are all interconnected, all parts of a single, profound, historically unprecedented alteration in human life.
This all came together around 1950, as this second graph indicates. After WWII, the United States emerged as the only combatant nation untouched by bombs or invasion, its economy humming at wartime production, possessing a GDP equal to that of the entire rest of the devastated world. Rarely, if ever, in human history has so much of the world’s wealth resided in one country.
We have been living in this historically anomalous time, in this historically anomalous place, where many of us have enjoyed the bountiful life expectancies of our unique and narrow place-time envelope that could not even have been dreamed of by most humans who have ever walked this planet.
And it is ending. The entire edifice that undergirds our privileged life is unsustainable. The bill has come due on the costs: the biosphere is being degraded beyond repair, species are being exterminated at a rate unseen for 65 million years, essential resources are being depleted, and our planet is irrevocably changed. We are in the midst of rapid and profound changes to the entire biosphere that no previous single generation in human history has experienced.
There will not be a return to “normal”, as normal was decidedly not normal.
The pandemic is simply a baby-step dress rehearsal for the cataclysmic changes rushing toward us. The pandemic provides a preview of how capable we are of responding effectively to a known, impending catastrophe. We, in the United States, have failed. Miserably. We can’t even get cooperation on the simple inconvenience of wearing a mask. This response is a sign of a deeper unraveling of American society.
Furthermore, the life expectancies for our children and grandchildren are being dramatically altered. So too, for those of us who have come of age in the midst of 9/11, the 2008 financial meltdown, in the shadow of global warming, and, now, the pandemic.
Ethical Life Expectancy
Embedded in the statistical and semantic meanings of life expectancy is a third meaning: Ethical.
We now have the power to intervene in the course of nature, able to change not only our biological destiny but that of the entire planet. It is a fearsome power with equally fearsome responsibilities. We live in an apocalyptic age. There are two meanings for this term. The one we are most familiar with is “the impending destruction of the world.” The second is the original Greek meaning: “a revealing of things not previously known.” This meaning beckons a response, poses a challenge to confront a new reality, to forge new paths, to plant seeds for new possibilities from the ashes of what has been lost.
Both definitions apply. We face the impending destruction of the world. We also are beckoned—courage, vision, and an unprecedented marshaling of the talents, energy, and collaboration of the entire human community are urgently needed.
Each of us must choose. We are at a high-leverage point in time where actions now will have huge consequences for the future—even whether there will be a future for our children and their children. The onrushing catastrophe of biosphere destruction, the appalling disparities and injustices between the wealthily privileged few and the impoverished many, and the societal unraveling pose a most dire moral challenge:
How should I live? What is the nature of the Good? How should we live together? By what authority? These life quandaries, often not explicitly stated, have haunted humans from the time of our cave-dwelling ancestors. Religion provides an explicit, sanctifying framework that situates our lives within a cosmic horizon, providing meaning, purpose, and moral grounding. Answers to fundamental moral quandaries are conferred by supernatural powers beyond the frail groping of humans—something clear, universal, unassailable, absolute.
Christianity and morality have been synonymous in the West for nearly two millennia, the Bible providing the moral pillar supporting church, state, and the grounding for adjudicating good and evil. The worst crime in Christendom was not murder (“Thou Shalt Not Kill”), as its punishment could be mitigated by circumstances1, but heresy, which usually could not be. Indeed, heretics received especially intense condemnation and persecution, and for good reason. Heresy doesn’t violate a commandment. It is much more dangerous—it challenges the legitimacy of the commandments.
Not surprisingly, one of the deepest divides in contemporary American life and politics is between moral absolutists and their rivals, often called “moral relativists” by the absolutists. Absolutists, led by the Christian Right2, claim the country was founded as a Christian nation and therefore should adhere to Christian moral dictates. The “relativists,” in stark contrast, allow for a multiplicity of moral codes and religious beliefs. Indeed, they argue for respecting diverse moral orientations, and strive to be open and non-judgmental, acknowledging the claims of legitimacy of many, often competing, moral frameworks. Debate and disagreement are to be expected, but no one approach is inherently superior to another, or should hold sway simply because those in power say so. What is valued is a plurality of voices and possibilities.
The absolutists raise challenging questions about this seemingly all-embracing doctrine of fairness and acceptance. How are we to arrive at any moral certainties, to find any moral basis on which to act, to discover the answers to: “How shall I live?” “What is the nature of the Good?” And how, specifically, are we to address foundational political questions: “What should be the rule of law?” “By what justification?” It is easy to conclude that the “relativists’” guiding moral principle is that none should hold sway, that morality is arbitrary, that “anything goes”—simply another name for amorality. It is also easy to understand the absolutists’ opposition, even militant resistance, to this apparent descent into the moral abyss.
The absolutist code, free of the confusions of mortals, offers the promise of clarity, safety, and security. As alluring as this is, it raises the question: Whose moral code? Christians were among the first settlers to arrive in America, en masse, from Europe. Most made the harrowing journey to this distant shore because they were persecuted minorities in their country of origin, heretics to the ruling orthodoxy. Methodists, Baptists, Congregationalists, Presbyterians, Lutherans, Quakers, Mennonites, Huguenots, Catholics, and Moravians all fled to the “New World” seeking freedom to practice their unique orthodoxies without persecution.3 European history, that is, the history of Christendom, is written in the blood of the vicious slaughter of millions over disagreements about orthodoxy. The lesson to be learned from over a millennium of Christendom’s history is that Christian absolutism leads to absolute chaos, wanton murder, and brutal persecution of individuals whose sole moral failing is to believe a different interpretation of biblical text.
The framers of the Constitution of the United States, having just won a war of independence from a despotic monarch who was also head of the state church, were acutely aware of this legacy of Christian absolutism. They also were acutely aware they were creating a new order, free of absolutism. Monarchy was countered by an elected president and a system of checks and balances. Christian absolutism was countered by the first Constitutional amendment guaranteeing freedom of religion. James Madison, the principal author of the Constitution, understood that government sanction of a religion is a threat to religion: “Who does not see that the same authority which can establish Christianity, in exclusion of all other Religions, may establish with the same ease any particular sect of Christians, in exclusion of all other Sects?”4 God is not mentioned once in the Constitution.
America was not founded as a Christian nation. It was founded as a nation defined by the Constitution, establishing a form of government unlike any in human history5; one that has become a beacon for many other peoples across the globe seeking liberty. It is a radical alternative to absolutism in its many forms. It is more than a political document. The Constitution is a Revolutionary Moral Order. It allows a multiplicity of moral codes and religious beliefs, respects diverse moral orientations, is open and non-judgmental, acknowledging the claims of legitimacy by many, often competing, moral frameworks. Debate and disagreement are to be expected, but no one approach is inherently superior to another, or should hold sway simply because those in power say so. What is valued is a plurality of voices and possibilities.
This moral order is not “moral relativism”. It embodies the values of democracy, explicitly crafted to avoid the plagues of moral absolutism, religious warfare, arbitrary justice, and the gross mistreatment of the many by the few. It is a statement of ethical principles of relationship, of respect for each person. It is the basis for justice and order of a different kind than offered by absolutists; it forbids as much as it allows. It also is not the opposite of absolutism—it is an alternative. The opposite of moral absolutism, as well as democratic morality, is anarchy, the true morality of “anything goes.”
We live in a large multicultural society with an untold number of congregations and believers subscribing to diverse, often absolutist, moral codes and commandments. We are confronted with the same urgent question as the American founders: How can we live together if there is NOT a superordinate moral and political framework that allows a multiplicity of moral codes and religious beliefs, respects diverse moral orientations, and acknowledges the claims of legitimacy by many, often competing, moral visions? Democratic morality allows each of us to live a moral life, free from persecution, and in doing so necessarily results in disagreement, confusion, and uncertainty. It also can be quite distressing and disturbing, requiring strength, fortitude, faith, and humility. Living a moral life is never easy. But it is necessary. It is what living in a democracy involves. Living together peacefully, but with considerable discord, in the 21st century requires that we embrace, with courage and conviction, the demands of democratic morality.
If a tree falls in the forest and nobody hears it, does it make a sound? This oft-cited philosophical question is an example of epistemology, which is the philosophical inquiry into the nature of knowledge, asking “What is true, what is real, and why can we say so?”
I first encountered this question as a freshman engineering major in an elective course I took on philosophy. I was utterly befuddled: “What are they talking about?” “What’s the point?” “Who cares?” The “tree-falling” query is also a source of ridicule about the absurdities generated in philosophy, so my befuddlement is widely shared.
I have since learned, however, the importance of questioning the truths we believe, and the reasons we believe them. The answer to the “tree-falling” question, the one that makes sense to me, is that when it falls, it creates vibrations in the air, but if nobody hears it, then it makes no sound. There are physical consequences that are independent of a human presence (vibrating air), and social-psychological consequences that require human presence (sound). So, the answer is “no”, there is no sound, but “yes,” it is an event with physical consequences.
The “no and yes” answer undermines the common assumption of a singular, imperial Truth. Rather, it points to different kinds of truth: a natural kind of truth, which is not dependent on human presence, and a human kind of truth, which arises from human presence, often within a social context.1 What is considered real, what is deemed truth, differs greatly between these kinds of truth. Important practical, even lifesaving consequences, can result from appreciating these differences.
Malaria, tuberculosis, smallpox, bubonic plague, cholera, and influenza are the deadliest diseases in human history, killing untold billions of people, bringing unimaginable suffering, and changing the course of human history. A host of other diseases, although not quite as lethal, have made their unique contributions to human misery and include AIDS, yellow fever, typhoid, tetanus, meningitis, diphtheria, measles, whooping cough, chicken pox, and polio.
All of these diseases have been cured, prevented, or their effects have been greatly mitigated and managed through vaccines, sanitation, and other medical treatments. The estimated lifespan for most of human history, up to the 19th century, was 30 years. In 1850, it was 38 years; in 1900, 48 years; in 1950, 66 years; in 2000, 77 years.2 Prior to the 19th century, presumed causes of disease included the visitation of malevolent spirits, retribution rained on humankind for their transgressions by angry gods, alignment of the stars, imbalance of bodily processes, and miasmas. Cures included charms, amulets, and chants; sacrifices, offerings, and prayer; smoke, nosegays, and herbs; potions, baths, and purgatives; self-mutilation, bloodletting, and witch-killing. Despite the many deeply believed causes and desperately sought cures, little worked. What did work did so by accident; the reasons were not related to the presumed cause.
What changed? Science. A radically new way to understand the material world, based on doubt, systematic methods of experimentation, and material explanations that can be objectively verified, was developed in the 16th and 17th centuries. When this method began to be applied to diseases in the 19th century, germs were identified as their cause. Despite great resistance, astonishing medical advances ensued. Human life expectancy has doubled in the last 150 years!
Germs are a natural kind of truth. They are indifferent to human beliefs, have consequences that derive only from their physical properties, and are real whether we think so or not. Plagues cannot be stopped or mitigated by chanting, appeals to supernatural powers, or bloodletting. Human actions can, possibly, change how natural kinds of truth impact humans, for better or worse, but cannot change their causal properties. The entire physical world is composed of natural kinds and causes.
Money is real and has very tangible consequences. If you do not have any, you are in very serious trouble, possibly life-threatening trouble. If you have a lot, then luxury, ease, and opportunity beckon. Money has a material presence, traditionally in the forms of coins and paper, and can be counted, calculated, and subjected to the most advanced, complex mathematical analysis. Nothing could be more real in our lives, more concretely manifested, with quantifiable properties and consequences.
But it is not a natural kind of truth. Its meaning, its value, its reality does not derive from its physical properties. What makes money money, what confers its value, what makes it real, derives from the communal belief that it is valuable. Money is money because we trust that others share our belief in its value, and trust the backing and assurances given by those who issue it. Rai stones, bottle caps, and shells are not money in our culture. They may, perhaps, be valued by some, and might be bartered, but they are not money.
During times of stability and general wellbeing, we go about our daily lives assuming the bedrock fiscal reality of money. The foundational nature of belief and trust as the source of value for money becomes distressingly apparent, however, during times of communal crises, such as war, fiscal collapse, and pandemics. Doubt about the value of money, and mistrust of the assurances and policies of those who issue it, can provoke hyperinflation or deflation, bank runs, stock crashes, hoarding, and the acquiring of gems, gold, and other precious metals.3 They also reveal why trust in the leader of a society, especially in times of crisis, is so important, for trust is, quite literally, the coin of the realm, critical for surviving times when confidence and trust have been undermined or lost.
Money is but one example of human kinds of truth arising within a social context that are real, have concrete consequences, yet derive from shared communal belief. Our life is structured, organized, and populated by human kinds of truth: stop lights, building and legal codes, jails and juries, tools and toys, voting rights and election outcomes, democracy and despots, corporations, ego, intelligence—the list is endless. Many, if not most, have material properties, but their reality arises from communal beliefs and agreement.
I am amused, and deeply disturbed, when I see reports of the most recent poll assessing whether Americans believe in global warming—as if the issue were a referendum! It is not a human kind of truth; not a phenomenon that is amenable to belief. Its causal properties are not determined by popular belief or majority rule, and government proclamations banning the term will not make it disappear.4 Unlike money, or corporations, or election results, global warming is a natural kind of truth. It is indifferent to human beliefs. Global warming is happening, with ever more likely catastrophic consequences. Human action can mitigate it, but only by taking the necessary steps that impact the material causal pathways governing it. Likewise, pandemics cannot be wished away, are not subject to politically opportunistic remarks, cannot be cured by beliefs in supernatural powers, or ended by leaders asserting that, “One day it’s like a miracle—it will disappear.”5
Failure to distinguish between natural and human kinds of truth can have dire consequences. Many deaths result when pandemics and global warming are treated as human kinds of truth. It could be called murder when leaders who know it is, in fact, a natural phenomenon nonetheless, because it is unprofitable or politically problematic, promote policies and practices that exacerbate and accelerate it.6 Leaders who understand the difference between natural and human kinds of truth and seek to use this understanding for the public good, not personal gain, save lives and rescue societies in crisis. Electing those who don’t can be fatal.
Crises expose. Some crises are personal; a traumatic event, life-threatening illness, death of a loved one. Some are communal; war, economic collapse, pandemic. Crises rupture our everyday life, upending what we have taken for granted, rendering what we thought essential as trivial, forcing us to confront stark realities about our life—and the impending end of our life.
Crises cloud our vision, as anxiety and dread about our future, our fate, our survival, disorient and overwhelm. The rupture of crises also exposes, perhaps for the first time, the hidden undergirding that gives our life structure and stability. It can be a clarifying moment, if we open our eyes, allowing us to understand and appreciate what we otherwise failed to see.
The pandemic affords us this opportunity. What is important? Basic needs: food, shelter, health, and safety. We discover that those who are essential for providing these are not sports “heroes”, media celebrities, or hedge fund managers. The heroes are nurses, health care workers, doctors, and hospital cleaning people; truckers, delivery people, mail carriers, and supermarket employees; police, firefighters, utility workers, and trash collectors. They risk their lives for the greater good, and have performed these tasks, every day, before the pandemic—and with few exceptions, have been invisible, unappreciated, and underpaid.
The previously simple act of driving to the market to buy supper is now a considered act of anticipation (when is the best time?), hope (have they run out of…?), and consternation (will I be able to feed myself and my family?). It is also an act made possible by a dense network of rules, regulations, policies, and fiscal commitments by the government. The extent of this is so enormous that only a small set of examples is needed to make the point.
Get in your car. There is standard, mandatory equipment, critical for travel, such as brakes, headlights and taillights, windshields, wipers, etc., etc. Driving requires everyone to follow an array of rules and regulations, otherwise commerce and travel would be nearly impossible: stay on the right-hand side of the road, “stop” on red, “go” on green, etc., etc. All drivers are required to be licensed to guarantee universal understanding and competence. Once we begin our drive, the hidden governmental structures making possible the roads we travel also fill volumes—and we have not even arrived at the market, which possesses its own vast edifice of governmental structures and supports. A simple drive to the market, so critical for sustenance and now a conscious focus of concern, is supported and made possible by the “deep state.” And this is only a single, simple act. Almost everything we do in our modern life would be impossible without government, which enables complex collaboration, commerce, and exchange among over 300 million citizens.
The past 40 years have been marked by the rise of a powerful political movement that has marched to the slogan, Government is not the answer, it is the problem. One of the aims of this movement has been to dismantle programs that arose during the New Deal, such as Social Security, and subsequent programs, such as Medicare. “Wasteful, inefficient, and unnecessary resources given to the undeserving” is chanted under this banner. It has been more than 75 years since the Great Depression and World War II, when we were last confronted with a global crisis like the pandemic1.
Individual initiative, drive, and intelligence, as well as corporate profit-making and market forces, are powerless to effectively confront the challenges posed in times of crisis. Government is not only the answer, it is the only answer. Massive marshaling of resources, creation and coordination of agencies to address a single aim, and new agencies and regulations for banking, commerce, financial markets, and corporate conduct are needed, as are deficit spending and government programs to provide food, shelter, health, and safety.
The pandemic, like previous global crises, requires massive government intervention, unprecedented deficit spending, and a new “rulebook,” which is being written as we go. Very few in either political party object, and there is near universal agreement on the necessity of multiplying the size of the national debt. And who is at the head of the line, hat in hand, demanding a governmental handout? Captains of industry, CEOs of investment firms, small business owners, and other ardent proponents of Government is the Problem.
Church and state—inseparable since the beginning of civilization, when Homo sapiens first developed agriculture, exchanging hunter-gathering for a settled life in large communities. All of the early, great civilizations, from China to Egypt, Inca to Sumer, as well as most others, were characterized by a typical organizational structure. Peasants, who cultivate the crops, tend the livestock, and provide the food for survival and the labor for buildings and projects. Soldiers, who collect taxes and tributes, enforce order, and protect the community from outside attack. Priests and spiritual leaders, who serve as emissaries to the sacred realm of gods and spirits, seeking their favor, divining their intentions, offering sacrifices to appease and please, and performing religious rites and ceremonies. And royalty, who rule with the authority to orchestrate, organize, and command, providing the leadership that enables a large community to function cohesively. Peasant, soldier, priest, and royalty—interlocking pieces, held together by their sacred beliefs, rituals, and practices.
Agrarian cultures, because they depend on the food provided by large, cultivated crops, and dwell in large settlements with people living in close proximity to one another and their domesticated animals, are especially vulnerable to droughts, fires, floods, earthquakes, storms, plagues, crop failure, illness, and disease, as well as attack by nomadic and warring neighbors, any of which can bring dire consequences. This chaotic, turbulent, calamitous cosmos is under the sway of the gods and spirits. The priestly class, imbued with powers that partake of the divine, is especially important for the survival of the community. Royalty, too, are divinely touched, conferring special powers, potency, and force. Priests and royalty—united by their shared privilege of being agents and messengers of the divine.1
Religious practices and cultural beliefs, everyday life and legal order, church and state, are of a single piece, woven seamlessly into a cultural tapestry. The hierarchical social order—that some humans have higher value, possess special powers and attributes, and should be conferred special positions and privileges as their birthright—is assumed; an obvious consequence of a divinely-given hierarchical cosmic order.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. This famous line in the Declaration of Independence is NOT self-evident, especially at the time it was penned. Indeed, it is a manifesto of revolution and rebellion, announcing its radical intent for a new order. And in this new order, each individual, not just special privileged classes, is divinely consecrated. Hierarchy and the privilege of rank are tossed overboard, replaced by equality and unalienable individual rights.2
The rending of Church from State was an especially important outcome of the Revolution in a new country composed of immigrants of various religious sects, many fleeing persecution for their beliefs. This separation is historic, and essential to establishing a new order based on secular, civil authority and laws.
Profits and Rebellion
In 1776, the same moment as the Declaration of Independence, a revolutionary book was published: The Wealth of Nations. This is no coincidence. The Industrial Revolution was creating upheavals in every corner of British culture. The Wealth of Nations, the first systematic treatise addressing economic philosophy and practices in a market economy, provided guidance toward prosperity in this bewildering time. It is still oft cited, chapter and verse, as if it were Biblical scripture, to justify and support contemporary economic policies.
America, and its Revolution, is a child from the loins of this new economic order; money, business, and profit shape our origins, our Revolution, our laws, our culture, our selves. The founding of New York City, the heart of the American colonies, and a place of primary influence to this day, bespeaks this history. Originally settled as New Amsterdam by the Dutch, it was founded not by the Dutch government but by the Dutch West India Company.3 New Amsterdam was a corporate trading outpost, and a very profitable one, that passed into the hands of the enterprising British, for whom the entire New World, including New York, became an engine of imperial wealth.
The American colonists were very British, and, in very British fashion, they protested when Parliament pinched their pocketbooks. The American Revolution was ignited by taxes—by injustices perceived by the merchant class, whose income was threatened by the tax levies. The signers of the Declaration of Independence were prominent, wealthy merchants and businessmen, and the Revolution was fueled by Britain’s violation of the rights of the enterprising class to earn the profit they thought their due.4
Church and state were separated after the Revolution. A new entanglement, however, perhaps as invisible to us as the Church-State entanglement was to earlier civilizations, has emerged. Money, business, and profit, the DNA of this new order, accelerated post-Revolution. Individual rights, a foundational belief of our democracy, were extended to African Americans by the 14th Amendment after the Civil War. Shortly thereafter, they were extended to…corporations!
What kind of twisted logic could render an Amendment for enfranchising former slaves to include conferring rights to a fiction that derives its existence from a legal contract? Answer: A logic that is “reasonable” within the cultural DNA of money, business, and profit. It is an analogous “logic” to that which gave coherence to early civilizations, whose foundations were tethered to the divine. The absurdity of our logic is less visible to us because we are less aware of our foundations.
The consequences of our cultural DNA, and its resulting “logic,” are far reaching, pervasive, and profound. Corporations possess freedom of speech and religion, and have the right to refuse goods or services on religious grounds. When they break the law, nobody goes to jail—because there is no body.
Unlimited “dark money” campaign funding by corporate-backed groups now fuels a politics that is unconstrained by libel, lies, or scruples. Lobbyists swarm legislators, plying them with milk and honey in exchange for legislative favors; there are 23 lobbyists for every member of Congress, spending over $3 billion. Lobbyists are routinely appointed to government posts that oversee the industries they lobbied for (the current administration has appointed 187 former lobbyists to government positions). And lobbying is the single most popular post-retirement career choice of Congress members. The revolving door spins, and the line between public and private interests fades.
Is bribery a crime if it is called lobbying? Is extortion criminal if it is called “soliciting campaign contributions”? Is a government auctioned to the highest bidder corrupt if it is called a democracy?
How easy the transformation from criminal to legal when meanings are reconfigured to align with axiomatic cultural logic. Church and State has been replaced by a new, double helix entanglement: Corp. and State.