December Songs

"As the days dwindle down to a precious few..."

Homo Economicus

We are no longer Homo sapiens—wise humans, which is what sapiens means. We have evolved into Homo economicus, a creature motivated by rational self-interest that seeks to maximize its wealth and the “utility,” or satisfaction, derived from the consumption of purchased goods. Homo economicus arose in the 17th and 18th centuries in response to a dramatically altered environment, in which capital replaced land as the basis of wealth and large-scale production of goods in an industrialized landscape changed the living conditions of Homo sapiens.

The Wealth of Nations, published by Adam Smith in 1776 when market economies were emerging, is considered by many to be the progenitor of Homo economicus. This revolutionary work, appearing at a tumultuous time when capital markets and democratic uprisings were transforming human life, offered a radically new way to understand wealth and government. Although Smith did not use the term Homo economicus, his analysis hinges on the revolutionary assumptions that characterize it: individuals, motivated by rational self-interest, seek to maximize their profit.1

The Wealth of Nations has become a sacred text, often quoted chapter and verse by contemporary economists whose dizzyingly complex, mathematically based analyses typically begin with the assumptions of Homo economicus. The reach of these assumptions is not confined to economic concerns; they are presumed to motivate all human actions, behaviors, and habits. Here is what Nobel Prize–winning economist Gary Becker has to say: “All human behavior can be viewed as involving participants who maximize their utility from a stable set of preferences and accumulate an optimal amount of information and other inputs in a variety of markets.”

Becker articulates, in the jargon of his profession, what many supporters of Homo economicus presume, and these assumptions have guided analysis of almost all areas of human life, from the most intimate matters of love and family to government policies and practices. His analysis of the family, for example, addresses “marital-specific capital” (i.e., children) and argues that child services are “the commodity that provides the utility a couple receives” from this “marital-specific capital.” And here is an example of an analysis at the institutional level: a World Health Organization initiative to end river blindness in Africa prevented hundreds of thousands of people from going blind. The World Bank’s cost analysis of the program, however, concluded that these benefits were not measurable; the benefits were conferred on people so poor that there was no measurable profit from the treatment.2

“Marital-Specific Capital”

These analytic, quantitative appraisals and balance-sheet conclusions of human endeavors clearly and unambiguously state the value, or profit, to be derived from them. Debate, then, centers on whether the profit justifies the cost. But is this a success, or a reason for concern? When children become “marital-specific capital,” and a wildly successful intervention that prevented blindness in hundreds of thousands of people is questioned because the recipients are poor, then maybe this is evidence that something is amiss. Perhaps Homo economicus is a mutation that needs to be modified or eliminated.

The central value of Homo economicus is profit. The balance sheet determines worth. But why, for example, was a major initiative, at considerable cost, undertaken to prevent river blindness in impoverished areas of Africa? Certainly not to gain a profit on investments. And children are more than capital goods for most parents. Other values are at play.

Adam Smith would certainly agree. He considered himself a moral philosopher, and in his first book, The Theory of Moral Sentiments, he argues that we are inherently social and moral beings. We care about others, and this social-moral sense is essential if we are to live together without destroying each other. Adam Smith’s self-interest is not greed. It is in everyone’s self-interest to conduct commerce and exchange within a moral framework of trust and honesty, which are essential for a successful society. The Theory of Moral Sentiments established the foundation for The Wealth of Nations; wealth is built within a social-moral framework.3

Economic, social, and moral values differ starkly in the kinds of exchanges and relationships they presume and the benefits they expect.4 Economic values are individual and impersonal, and economic analysis treats families, communities, and societies as collections of individuals. Exchange is contractual, and who the other person in the transaction is does not matter. Economic values are instrumental; you expect to get something of equal or greater value in return. If it has no profit, it has no value.

Social values are personal and depend on the relationships involved. Our relationships with our children, spouse, or friends may be the most important thing in our life. They have no economic value, as we cannot buy, sell, or trade them. And social relationships are reciprocal, not instrumental. Each participant gains from the relationship, but there is nothing definite about what we will gain, when we will gain it, or even whether we will get anything tangible from it.

Moral values are altruistic; things are done for their intrinsic worth with no expectation of getting anything in return. They are neither individual nor relational. They are universal. We do things for others because it is morally the right thing to do. These values are not proved or supported by evidence or analysis—they are axiomatic.

The lack of proof or evidential support for moral values does not diminish their importance. Quite the contrary. Their universality can give them the animating power of ideals. People commit their lives to fulfilling these values: they care for others at great personal cost, risk their safety and well-being, even give their lives for a just cause and, too, commit unspeakable atrocities in the name of the good.

The Declaration of Independence embodies foundational moral values upon which our nation stands. Axioms:
1. “All (men) people are created equal.”
2. “They are endowed with unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”5
No justifications or proofs are provided, as “these truths are self-evident.” Moral and social values provide the grounding assumptions for how we govern ourselves; they embody our sense of who we are, who we want to be, and how we define our relationships with each other. Commerce, economic transactions, and trade are conducted within the framework provided by the moral and social values that are the basis of our governance.

It is an inherently destructive act to treat all human life as some sort of financial transaction. Preventing river blindness expresses a moral and social value, not an economic one. Children, family, and friends are not commodities of exchange. Economic analysis should be used in the service of our social and moral values, not be the determiner of them. Homo economicus hollows us out. If we do not hearken to the call of our foundational social and moral values, if we allow Homo economicus to become who we are, we surrender our wisdom. And our humanity.

.

.

.

Free Will?

Do we have free will? This question has bedeviled Western thought for millennia, spanning the centuries from Aristotle and the Greeks, to Thomas Aquinas and medieval Christian theology, to Einstein and modern science. The power of the question derives from a primal concern about the source of our sense of agency and responsibility for our decisions and actions.

The question has practical implications that reach well beyond a debate among philosophers and scholars. So, for example, should those who are severely mentally ill or cognitively impaired be held responsible for their criminal actions? What about children? Or someone who is drugged? Or forced at gunpoint to comply with a command?

No surprise—I do not have the answer. But I offer thoughts that help me to make sense of this question and enable me to reach an understanding that has some practical value. I will confine my comments to the psychological domain (although I think they can be applied to the physical sciences as well).

Psychological Science

Provocative findings from two areas of contemporary psychological research challenge the belief in free will. One research thread suggests that automatic cognitive processes occurring out of awareness control our actions. Our sense of agency occurs after our choices have been made; choice is an illusion. The second thread offers evidence that the neural impetus to act is already under way before the conscious intention to act occurs. Again, choice is an illusion. These data and conclusions, not surprisingly, have generated pointed challenges and heated disputes among scientists.

This debate is not new to psychology, as behaviorist researchers of previous generations also argued against free will. The most notable was B. F. Skinner, who in his book Beyond Freedom and Dignity argued that we are the product of our reinforcement history and that, consequently, freedom, dignity, and all other terms describing human traits and ideals are empty fictions. Needless to say, a great uproar ensued.

I would like to focus on four conceptual issues that are often overlooked in these disputes about the methods and meaning of these research findings: science and truth; the nature of will; yin-yang; and the pragmatics of “I can”.

Science and Truth

M. C. Escher: Drawing Hands

Scientists are independent thinkers who consider a question, pose a hypothesis to answer it, devise experiments to test the hypothesis, and use the evidence derived from the experiments to determine whether the hypothesis is true. If the evidence supports the hypothesis, the scientists have empirical justification for drawing valid, if provisional, conclusions about the question.

Now, if we accept the scientific evidence from the recent psychological research suggesting we lack free will, surely this conclusion applies to the scientists as well: their hypotheses, decisions, tests, and conclusions are all determined.1 Validity and truth are, thus, comforting fictions, and the entire scientific enterprise of discovering an objective truth untainted by our personal opinions, subjective biases, and blinkered perspectives is a pointless exercise. What legitimate claim can scientists make that their findings should receive any more credibility than, say, astrology, reading palms, or divining from bird entrails? Paradoxically, if we accept the validity of the findings, we must conclude that the findings cannot be objectively true. Furthermore, if we accept the soundness of this paradoxical conclusion, we also must accept that this conclusion is not really the result of our judiciously considering the logic and meaning of the argument. Rather, it is just another round of self-delusion. And round and round it goes. Hmm…

What is Will?

Heave of Effort

It is typically assumed, both in common understanding and in the methods and interpretation of the research, that will is a single, conscious, heave of effort that leads to action. An alternative, more comprehensive understanding is that will is a “mental agency that transforms awareness into action, it is the bridge between desire and action.”2

Consider St. Paul’s definition of sin: “I do not understand my own actions. For I do not do what I want, but I do the very thing I hate.… I can will what is right, but I cannot do it.”3 I can will what is right, but I cannot do it. This is a situation that we all understand, and it is often a major focus of psychotherapy. A simple exhortation to engage will, to “get up off your butt and do it,” in these circumstances often fails. A conscious heave of effort fails because we harbor contradictory intentions and desires, some of which may be out of awareness. Will, the bridge between desire and action, is conflicted.

Psychotherapy is often focused on bringing hidden desires and conflicts into awareness, thus disencumbering will, enabling deeper conscious understanding of the conflicts, allowing deliberative consideration of the choices, and embracing responsibility for action. Cognitive science provides evidence for the power of unconscious factors, and neuroscience documents the neurological run-up to a conscious “heave of effort.” Will includes both these and much more.

Either-Or or Yin-Yang?

An assumption often made is that the question of free will is a dichotomous choice: Yes or no. It might be more fruitful to assume it to be a yin-yang relationship; freedom and determinism are interdependent, two sides of the same coin. So, for example, a piano has 88 keys. It is a deterministic structure that limits the possible notes that can be played. This relatively small set of keys, however, allows for the composition of an infinite number of songs. The 88 keys constrain or determine the possibilities, but within these constraints resides freedom.

Now consider the psychological and neuropsychological evidence of unconscious influences and the neurological readiness potential that precedes conscious intention. We are a biological piano. We are able to make music but are constrained in many ways by our anatomy. We do not have complete freedom, and much happens out of awareness (and be thankful that it does, for we would be overcome to the point of paralysis by the Niagara of sensations, perceptions, ideas, and choices flooding our every waking moment). We are bound by the constraints of our being, which provide us with the means to create our music, sing our (December) songs.

I Can

One of the most important steps in psychotherapy is believing “I can.” It is understandable why many who are in dire straits, who have endured harrowing trauma, personal loss and anguish, or whose lives have been blighted by misfortune, feel doomed, condemned, and believe “I can’t”. The often difficult first step of seeking therapy carries the tentative hope that, “Maybe I can”. It is the beginning of the journey from “I can,” to “I will,” to “I did.”

William James, who is (in my view) the greatest American psychologist and philosopher, experienced a bout of severe depression, unable to rise from his bed, convinced that he had no free will to alter his situation. He overcame his depression by deciding that regardless of the evidence and compelling arguments for determinism, he would act as if he had free will; believe “I can.” His “will to believe”4 created the reality of free will.

James was a leader of the philosophical school called Pragmatism, and he coined the term “Cash Value” to describe a criterion for assessing the merit and truth of an assertion or belief. Cash Value is used metaphorically, meaning: does the assertion have practical utility, does it have real-world consequences, or is it merely empty words? Cash Value can be applied to the question of free will.

Therapy is an act of courage, requiring effort, commitment, humility, resiliency, and honesty. All of these attributes, along with truth, justice, guilt, responsibility, dignity, etc., etc., etc., are self-delusional fictions if we seriously believe we are puppets of deterministic forces. The question of free will is a captivating puzzle and an impetus to interesting research and lively dispute. But, ultimately, only a conclusion of free will has any cash value. “I can” gives us our world. The choice between “I can” and “I can’t” is a hard-fought, life-or-death decision for many in anguish and distress, and scientific evidence and philosophical arguments are irrelevant. I take a stand for free will. I have no choice. . .5

.

.

.

What is Intelligence?

What is intelligence?

This has been among the most longstanding, fiercely debated questions across a wide range of disciplines, from philosophy and psychology, to sociology and education, to anthropology and comparative biology. The practical consequences of this debate touch every one of us. Intelligence tests mark us, route us, and shape the expectations of our teachers, parents, peers, and ourselves, which has a profound effect on who we are, who we think we are, and who we become. IQ score: a single figure, so much power.

Intelligence Tests

The choice of how we measure intelligence reveals what we think intelligence is, and the history of this testing exposes assumptions that underlie the testing. Knowing these assumptions allows us to critically examine what is typically understood to be intelligence.

The first tests of mental abilities were initiated at the end of the 19th century in the newly established scientific field of psychology. Unlike the philosophical speculation about the nature of mind that dominated Western thought for millennia, scientific psychology is rooted in laboratory experimentation. Psychophysics, the measurement of the physical properties of mental activity, was the predominant approach to the measurement of mind; it assessed reaction time, memory, and various measures of sensory acuity and discrimination (e.g., visual, auditory, touch).

A revolutionary study by Wissler in 1901 overturned this entire approach to measuring intelligence. He used the newly developed statistical measure, the correlation coefficient, to demonstrate that these tests were not correlated with school performance. This is very telling. The skills necessary for success in school are presumed to be essential to intelligence. This seems so transparently obvious that we fail to see the far-reaching implications.

This is understandable, as success in school is critically important in the modern world. Intelligence tests arose hand in glove with the emergence of industrialization, which required mandatory schooling to provide workers with the skills necessary for this new form of society.

The first practical intelligence test, the Binet-Simon, was published in 1905 and became the standard for assessing school children; a revised form is still used today. Binet’s aim was to help teachers identify children who struggled in traditional school settings and could profit from alternative ones. Binet thought that intelligence is flexible, influenced by motivation and personal circumstances, and that the test failed to assess other important traits, such as creativity and emotional intelligence. The measurement of intelligence quickly became swept up in the eugenics movement and engulfed in raging controversies about nature-nurture, race, cultural fairness, the treatment of those who score low, and a host of other emotionally charged issues; contentious issues that remain with us.

Environmental Fit

We think that these tests measure and invite debate about a universal capacity, but it is an intelligence that “fits” today’s techno-industrial world. Ours is a human-made environment; a very narrow, artificial environmental niche. Humans have adapted to the most diverse environments on the planet: deserts, jungles, Arctic regions, mountain tops, tropical islands, caves, savannas. Almost everywhere. Homo sapiens evolved approximately 300,000 years ago. Written language emerged about 5000 years ago, and the first educational system was created about 4000 years ago. Schools have been absent for 99% of humans’ time on the planet.1

If we think of intelligence as the ability of a species to adapt to its environment, and failure on this test has mortal consequences, then there must be much more to intelligence than school-related abilities. Consider, for example, the skills required of the early Pacific Islanders who, in small canoes, navigated the vast Pacific using stars, sun, moon, wind, clouds, ocean currents, fish, birds, waves, etc., etc., to reach destinations thousands of miles away (e.g., Hawaii). What kind of “intelligence test” would they develop?2 Certainly nothing like ours. And what would Eskimo or Pygmy “intelligence tests” consist of? Very different again.

Much speculation has been devoted to the abilities that underlie the human capacity to survive across these varied environments. They include bipedalism, opposable thumbs, a complex brain, language, tool use, genetic changes (e.g., genetic protection against malaria; the capacity to digest a variety of foods), which take generations, and non-genetic capacities to flexibly adjust to environmental changes (e.g., cultural practices, individual learning, transmission of skills across individuals). None of these require literacy or schooling.

Species’ Intelligence

Human intelligence has long been considered the pinnacle of a hierarchy of intelligence among species. More than a century of research has sought to examine the comparative intelligence of other species using, of course, human abilities as the yardstick. These abilities include tool use, language skills (those species able to learn analogues to human communication rank as the smartest), and self-reflective consciousness, which encompasses the ability to consider the mental states of others and is evidenced in deception, empathy, grief, envy, and cooperative action with others.

The species that we have long considered to occupy the rung just below human intelligence are primates: species that look like us. More recently, this has changed, as many other species have been identified that possess these capacities, including dolphins, whales, elephants, birds, and dogs. What has also changed, dramatically, is that species’ intelligence is no longer considered a totem pole but a bush. Each species possesses an intelligence, an ensemble of capacities that enable it to adapt and survive, often involving capacities humans do not possess, such as flight, echolocation, and the ability to perceive sensations invisible to humans (e.g., infrared light; high and low sound frequencies, etc., etc.). If a fly were to construct an intelligence test, how well do you think we would do?

Our Species Intelligence Test

Ironically, the type of intelligence that “fits” our human-made techno-industrial niche supersizes our ability to survive and thrive almost anywhere on the planet. And also, to dramatically alter the entire biosphere—so much so that it threatens our very existence.3 Is this a measure of our superior intelligence or proof that it is very limited? Have we out-smarted ourselves?

The scope of the climate catastrophe will require more than individual intelligence. Our techno-industrial world is the product of a collective, collaborative intelligence that can solve problems individuals could not even dream possible on their own. We each may be very smart, but alone we are incapable of providing the many essentials required for living in the modern world, from producing a simple screw that holds things together to the electricity that makes everything run. We are part of a “Hive Mind”.4

The products of this Mind are the source of our imperiled biosphere, as well as the wellspring of the cornucopia of riches enjoyed in our modern society. And, perhaps, they may be our salvation. Each of us is but a single “neuron” in this vast Mind. But we are networked with other “neurons”, and by influencing them, and they in turn influencing their connections, which then influence other networks, a cascade of changes can result that alters the workings of the Mind. Individually, we can recognize threats to our species’ survival and adaptively respond, but our actions must be part of a larger systemic response of the Mind if we are to survive.

How intelligent are we? Individually? Collectively? We will soon find out. We cannot afford to fail this intelligence test.

.

.

Atlantis

Remembrance of things past1 is the special province of the aged, for whom the long shadow of the past looms over the present and the ever-shortening future. Everything conspires to remind: face in the mirror, alien; daily routines, upended; places, bulldozed; useful things, obsolete; friends, family, and acquaintances who peopled our world, sick, dying, dead. Our attempts to freeze time—photographs, videos, recordings—fade, as do our memories. Despite our efforts to stop time, the loves and friendships, the hardships and tragedies, the triumphs and defeats, the jubilations and heartbreaks, the full pulsing, throbbing experiences of life—are gone.

We hold close our mementos and memories, trying to salvage fragments of a world being submerged by the in-rushing tide of time. We are denizens of Atlantis, that mythic lost world that sank beneath the sea, leaving no trace. But not yet. This moment, right now, is our time. Hold it close.

.

1000 Years of Joys and Sorrows

Of a thousand years of joys and sorrows
Not a trace can be found   

You who are living, live the best life you can
Don’t count on earth to preserve memory

Ai Qing2  

.

.

Where am I?

Where am I? To be lost, disoriented, confused about where we are situated in the world can evoke a profound sense of unmooring. We rarely experience this in the extreme; usually we are lost in a building, a mall, on a highway or, perhaps, hiking. These situations are relatively inconsequential because we are oriented within a much larger matrix of place. We possess background knowledge that serves as a “North Star” to keep us oriented.  Although the surroundings may be unfamiliar and even, perhaps, alien and strange, we still are able to place ourselves, know where we are and not be completely lost.

Place Names

What are the coordinates of place that we typically rely on to anchor ourselves? Names. Place names. For example, I grew up on Amsterdam Road in the city of Rochester, in the state of New York, in the country of the United States of America. Place names are imaginary constructions of collective agreement—a virtual reality—blanketing the entire landscape and providing a sense of stability and “home.” Failure to be reflexively situated within this “home” is to be disoriented and adrift.

Where do these signposts come from? History and power reside in the names. The names are tracings of the past, a residue given official designation by those with the authority to name. Look at a map—it is a crazy quilt of names. Each name possesses a history, not only of what is designated by the name, but, less visibly, who designated it. Let’s take a little road trip and pause to consider some of the road signs.

Countries and Continents

Most countries are named after one of four things:

Tribal names, which include Thailand (Thais), France (Franks), Russia (Rus), Italy (Vitali), Germany (Germanic tribes), England (Angles), and Switzerland (Schwyz).

Land features, for countries like Iceland, Greenland, Haiti (“mountainous”), Costa Rica (“rich coast”), Honduras (“deep water”) and Peru (“land of the river”).

Directional placement, which include Australia (“unknown southern land”), Norway (“northern way”), Japan (“Nippon, land of the rising sun”, i.e., east of China), Ecuador (“equator”) and Chile (“where the land ends”).

Important figures (almost always men), which include America (Amerigo Vespucci), Colombia (Columbus), Bolivia (Simón Bolívar), the Philippines (King Philip II), El Salvador (“The Savior”), and the Seychelles (Jean Moreau de Séchelles), among others.

Who names the names? World history is imprinted on the map. The “New World” was new to the European explorers (but not to the indigenous peoples) who were the colonialist scouts for European conquest. The continents of North and South America, as well as Central America, are named after an Italian adventurer who was the first to appreciate that these new lands were continents separate from Asia. The names of the countries, and the languages now spoken there, tell all: English-speaking in the North, with a French outpost in Quebec; Spanish-speaking in Central and South America, with a Portuguese outpost in Brazil.

Two other continents, Australia and Africa, also bear the imprint of colonizing Europe. Australia, of course, was named by the British and speaks English. Many African countries’ names are the residue of colonization, and often the spoken languages are a mix of indigenous and colonial tongues. Prior to colonization, the native peoples did not have maps and charts demarcating sharp (and sharp-angled) tribal boundaries. These came later, organized at the Berlin Conference of 1884, which drew territorial boundaries that were distributed among seven European countries in their “Scramble for Africa”.

Asia, with its long history of high civilization and bustling, populous cities, carries its own history in its country names and languages. The influence of European colonization, while late in arriving, is folded within the history of many of these countries; witness, for example, the histories of India and Pakistan, where English is widely spoken and the borders were defined by an unprecedented “redistricting”.

United States of America

Continuing our road trip closer to home, the map of the United States is a map of colonial exploration, claimed—and often fought over—ownership, and established settlements. New England is, of course, new England, and the thirteen Colonies are Britain transplanted: New York. New Jersey. New Hampshire. Delaware (after Lord De La Warr). Georgia (King George II). North and South Carolina (King Charles II). And so forth. Many cities and towns of this region also mirror the “Mother Country”, including my home town, Rochester, in the state of New York; and my home on Amsterdam Road echoes the earlier Dutch colonists who preceded the English.

The French plied their trade along the Mississippi and Missouri Rivers, and French names signpost this trail. Louisiana (King Louis XIV) and the French trading posts of New Orleans, St. Louis, St. Paul, and Sault Ste. Marie are among the more notable. The Southwest and Florida carry the imprint of the Spanish: California (named after a queen in a popular 16th-century Spanish novel). New Mexico. Colorado (“colored red,” for the color of the Colorado River). Florida (“full of flowers”). Traveling through the Southwest, the names encountered at every turn could easily cause a naïve traveler to think they were in Mexico…or Spain.

This naïve traveler, touring the length and breadth of our country, could also easily think they were visiting territory settled by tribal peoples. Half of the state names derive from Native American languages: Alabama. Alaska. Arizona. Arkansas. Hawaii. Illinois. Iowa. Kansas. Kentucky. Massachusetts. Michigan. Minnesota. Mississippi. Missouri. Nebraska. New Mexico. North and South Dakota. Ohio. Oklahoma. Tennessee. Texas. Utah. Wisconsin. Wyoming. Not to be left out are the names of the Great Lakes: Ontario. Erie. Michigan. Huron. These names are gravestones; markers of peoples and cultures, mostly gone, except for a few small, flickering outposts in the most remote and forbidding corners of this land.

Crazy Places

All the states have towns with bizarre, humorous, crazy names that make you wonder, “Wherever did this name come from?!” We have spent some time in Arizona and visited several towns I find amusing: Why (“Why?”, I ask). Tombstone (obvious). Show Low (a card-game hand that won ownership of the town). Arizona towns we have missed include Nothing (a good name for a ghost town), Three Way (use your imagination), and Skull Valley (yup, skulls found in a valley).

Head-scratch-and-chuckle names abound across our country’s broad landscape. Here are a few of my favorites: Knockemstiff, OH; Hell for Certain, KY; Satan’s Kingdom, MA; Cut and Shoot, TX; North Carolina’s answer to Why, AZ—Whynot. There are also several surprising “sister cities”: Cannon Beach, OR and Cannon Ball, ND; Gunbarrel, CO and Gun Barrel City, TX; Rifle, CO and Point Blank, TX; Boogertown, NC and Booger Hole, WV; No Name, CO and Nameless, TX. Of course, I must also mention Truth or Consequences, NM, named after a radio game show. Only in America. Indeed, these and many other crazy, colorful, irreverent names reflect our crazy, colorful, irreverent history.

Where am I?

So, “Where am I?” Surprisingly, I find myself in the land of the dead. I speak the language of the conquerors and am oriented by historical markers and gravestones, oblivious that the past is my North Star, fixing me in place as surely as geographic coordinates, themselves but imaginary lines etched by history. The past surrounds me and speaks through me as I navigate the present and motor into the future. My journey leaves its own faint trail on the human landscape that, I hope, may help orient a few travelers who pass by this way after me.

Aliens Have Landed

The cosmos is a source of wonderment and deep mystery that rouses visions of possible celestial beings. Humans have gazed at the heavens and envisioned life beyond our earthly horizon since the first stirrings of civilization in Mesopotamia. The earliest form of this was the worship of deities: sun gods, planet goddesses, heavenly personages, and divine beings. We are familiar with “the man in the moon”, green Martians, and a host of space aliens portrayed in movies, books, comics, and television programs.

Allen Telescope Array

Space Aliens

Recently, in the last 100 years or so, we have become concerned with detecting, confirming, and contacting alien life in space. UFO sightings, a subject of considerable controversy, might suggest they are already here, among us. Major efforts costing millions of dollars, including the Allen Telescope Array, search the cosmos for signals of extraterrestrial intelligence. Space probes have been sent to the moon, Mars, Venus, as well as the moons of Jupiter and Saturn, scouring for the slightest hints of life. We have even sent a spacecraft bearing images and presumed universal codes about our earthly existence into the deep unknown beyond our solar system; a message in a bottle tossed into the vast sea, announcing: “We are here!”

The chill of cosmic isolation and the yearning for companionship amidst the vast darkness of infinite space motivate the question that often accompanies these efforts: “Are we alone in the cosmos?” Our anxiety about being alone is accompanied by fear of what we might find, or what might find us. We have only developed the scientific capacities to seriously pursue these inquiries in the last 50 years or so; a mere eye-blink in cosmological time. Any space alien capable of discovering us, communicating with us, or visiting us will likely be much more technologically advanced.

Will they be nice to us? Might we become their pet? Will we be exterminated as a troublesome varmint? Devoured for a snack? Hunted for sport? Killed for kicks? Our imaginings of space aliens are filled with these fears, and with the amusing premise that our cosmic Stone Age weapons and technology can defend us against aliens who have traversed unfathomable distances using unimaginably sophisticated means.

Obviously, we have no idea what space aliens might look like, should they exist, so they provide a blank screen onto which we project our primal anxieties. The images and stories we conjure bring us face to face, not with aliens, but with ourselves. We can learn much about ourselves through these projections. The most common images are mutant humanoids: ET and Yoda-like figures that can be wise and friendly, or more disturbingly formed humanoids that aim to destroy humanity. The default image is “something like us”, because, after all, this is what intelligent life must look like.

And then there are the horror films and stories seeking to evoke our deepest terror. These are not humanoids; they are creatures of our nightmares, often insectoids, with little outward resemblance to ourselves. Their mere form announces their malevolence, for anything this utterly foreign can only be a mortal threat.

Aliens Have Landed

We scour the sky, expectant and fearful, searching for intelligent life to assuage our cosmic loneliness. Ironically, we fail to lower our gaze to our own planet. Intelligent life has landed. Millions of years ago. It swarms in our midst. Perhaps the most alien life forms among us are cephalopods: octopuses, cuttlefish, and squid. No other complex animals diverged as early from the rest of the Animal Kingdom as these creatures. Our common ancestor with cephalopods is the flatworm, from which we diverged over 600 million years ago. Unlike most other complex animals, octopuses do not have a centralized nervous system (CNS); their information network is distributed along their eight arms. Alien, indeed. And, by all evidence, very intelligent.

A crude way to assess how alien other earthlings are to us is by how distant our evolutionary ancestors are and by how they gather, organize, and use information critical to their survival. Insects are also very evolutionarily distant from us; our last common ancestor lived about 400 million years ago. Insects comprise the largest biomass of terrestrial animals, with estimates of over 30 million species. While they do have a CNS, it is quite primitive, giving rise to strange forms that can easily evoke horror.

A host of other dazzlingly bizarre creatures, also evolutionarily distant, throng our planet. The oceans, lakes, ponds, and waterways surge with over 30,000 species of fish, many exotic and other-worldly. Birds, descendants of dinosaurs, fill the sky with their dazzling array of colors and songs.1 Fish, and especially birds, have sophisticated central nervous systems, and our shared ancestry with both goes back about 300 million years. While strange, they possess a vague resemblance to us that octopuses and insects do not.

Closer to home, closer to us, less evolutionarily alien, are mammals. While mammalian forms can be very different, from giraffes to elephants to tigers, we share much in common with them; we are in the same biological class. We ride them for pleasure and transport, harness them to accomplish difficult tasks, hunt them for sport, food, and trophies attesting to our power, harvest them for food, wear their skins and furs for protection and fashion, train them to be our eyes and ears, even invite a select few into our homes for companionship, claiming some to be “our best friend.”

Intelligent humanoid life forms also share our terrestrial home. We have biological brethren, the primates, which consist of around 200 species, including gorillas, orangutans, bonobos, and gibbons. We share 98% of our DNA with our closest relative, the chimpanzee. Chimpanzees look much more like us than most alien humanoid images we have contrived, act uncannily like us, and we have even been able to communicate with some—they learned forms of human communication (e.g., sign languages); we have not learned theirs (who is more intelligent here?). We have hunted and hounded almost all to near extinction. And, to save them from ourselves, we have caged them in zoos; jail-mates with a host of others in the Animal Kingdom we have pushed to near extinction.

Frightful Alien

If space aliens were to visit our planet, there is no reason why they would be more interested in us than, say, elephants, horses, crabs, or octopuses. If they examined earth from a dispassionate perspective, why shouldn’t they view us as a pestilence; a plague that has visited death, torture, and destruction on all the other earthly inhabitants? We have, after all, caused the extinction of many hundreds of species. Currently one in four mammals, one in eight birds, one in three amphibians, and one in three conifers and similar plants are at risk for extinction because of our actions. Almost no forms of plants or animals have escaped our ruinous hand.2

We fear space aliens might try to exterminate us, hunt us, devour us, “domesticate” us, murder us for profit or thrill. These fears spring from a deeper fear: that they might be too much like us. If aliens are “intelligent” in the same manner we credit ourselves, then they might well treat us precisely as we have treated the fellow beings with whom we share this planet. Blind, groping creatures, we experience existential angst about our presumed solitary existence in the cosmos, while annihilating untold billions of fellow earthlings. A mirror, not a telescope, is needed to discover alien visitors: We are the aliens on this planet. Frightful aliens. The monsters that haunt our worst nightmares are us.

Play with Your Digits

You are about to enter a Twilight Zone of contradiction and paradox, where the boundaries between “real” and “not real” are blurred. We will travel to strange, seemingly incomprehensible places, only to discover we have arrived home. I begin our journey with an invitation: “Let’s Play!”

Digital Logic

Let’s begin with the digits that compose our modern world.

Our virtual world, a dreamscape projected through a screen, is anchored in electrical processes and material hardware that follow the dictates of the physical world and logical laws. Cyberspace rests on microchips containing a staggering number of electrical “switches” knitted together in quickly changing patterns of electrical circuits. Each of these switches, or bits, possesses only two possibilities: either “on” or “off”, typically assigned a digital value of 1 or 0.

Pictures, music, images, broadcasts, streaming, data—just about all forms of information and electronic communication—are now digitized. This is accomplished by segmenting analog signals, such as sound or light waves, which are continuous, into the tiniest of discrete bits of information, and by converting letters, numbers, and other forms of information into bit strings. The binary format, either 1 or 0, allows for easy transmission, manipulation, and storage of information. Our digitized world is organized by machines that can, at lightning speed, perform huge calculations with bits composed of the binary, either 1 or 0.
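For the curious reader, the conversion of letters into bit strings can be sketched in a few lines of Python. This is a deliberately simplified illustration, not how media formats actually work in practice: each character is mapped to a number (here, its byte value in the standard UTF-8 encoding) and then written out as eight binary digits.

```python
def to_bits(text):
    """Encode each character of `text` as an 8-bit string of 1s and 0s."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

# The word "Hi" becomes two bytes, each rendered as eight bits.
print(to_bits("Hi"))  # -> 01001000 01101001
```

Every photograph, song, and streamed broadcast ultimately reduces, by far more elaborate versions of this same move, to such strings of 1s and 0s.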

This binary, either-or structure rests on the assumption that a bit cannot be “on” and “off”, “0” and “1”, simultaneously. This assumption is so obvious, so basic to our understanding of how things exist in the world, that it constitutes a fundamental law of logic: the Law of Non-Contradiction. This Law states that it is impossible for something to be A (i.e., “on” or “1”) and not-A (i.e., “off” or “0”) at the same time. It is considered a first principle of logic upon which all else rests.

Cosmic Logic

The cosmos, surprisingly, defies our logical sensibilities. Quantum mechanics, which describes the fundamental physical processes constituting everything in the universe, is based on the seeming impossibility that something can simultaneously be A AND not-A. In quantum mechanics, for example, light can be both a wave AND a particle; we just don’t know which until we look.

Efforts are underway to construct a computer with exponentially greater power than current computers, one that exploits the logic-befuddling quantum property that something can be simultaneously A and not-A. A related property is quantum entanglement. When particles are quantum entangled, a measurement of one particle will instantaneously affect the others—even if they are located at opposite ends of the cosmos. These instantaneous effects over such great distances cannot be explained by our current understanding of the physical universe; “spooky action at a distance” is the phrase used to describe this.1

Quantum mechanics not only is revolutionizing computing, it has also shattered our assumptions about reality. Obvious principles of logic—overturned. Entanglement of particles with mutual, instantaneous influence across unimaginable distances—inexplicable. The world hangs suspended, particle AND wave, “0” AND “1”, awaiting interaction with a macroscopic system—such as us—to give it form. We cannot assume a “god’s-eye” position of observing, measuring, and understanding the workings of the universe independent of our own place, time, and actions in it. The ground beneath our feet is unstable.

Playful Logic

Another AND subverts the seemingly obvious Law of Non-Contradiction: play. Playing is no simple matter, as it requires sophisticated communication abilities. A dog that playfully nips another must communicate, “This nip is not really a nip”, otherwise the nip could precipitate a fight rather than a playful response. This is typically communicated by exaggerated movements of some kind, such as paws out, head lowered, tail wagging, and galumphing about. These are metacommunicative messages that signal: The nip (A) is not really a nip (not-A).2

These metacommunications entail distancing from the ongoing stream of messages; stepping outside the stream to signal how those messages should be interpreted. Play requires self-referential awareness of being “the one doing the signaling,” as well as awareness of the other as “the one to whom the signals are meant” (and who, also, is “one who signals”). The player acts “as if” the bite is real and must have “double knowledge”: the ability to tell the difference between the actual and the pretend bite at the same time. It is both “real” and “not real.”

Play engenders a deeper kind of knowing; both a cognitive double knowledge of “as if,” and a social entanglement with another, forged in mutually sharing that a “bite is not a bite.” For these reasons, play is often used as a marker for assessing the relative cognitive and social capacities among species. It is also considered a nascent form of a much more sophisticated mental ability: representation.

Re-Present

“This is not a pipe.”

Representation consists of an object or referent (A) that is re-presented by a sign or symbol (not-A). The re-presentation is both A, as it acts “as if” it were A, AND not-A, because it stands in place of A. Representations can take many forms: images, words, numbers, sounds. Untethered from reality, existing in its own created reality, a representation affords leeway from the represented, allowing us to play with it in symbolic ways, exploring novel, strange, fantastic, comical possibilities.

We easily can, and do, become swept into the represented reality without realizing that it is an “as if” one. We lead a “double life.” Magritte, a 20th century Belgian painter, humorously points this out in his painting, “This is not a pipe.” We initially are confused by the painting because we immediately assume it is a pipe. But it is, AND is not, a pipe. The painting amuses because it playfully exposes our taken-for-granted double life. We dwell within a representational world as strange, peculiar, and bizarre as quantum mechanics.3

Paradoxical Power of AND

While representation is not the real thing itself, it does, paradoxically, enable us to grasp reality in a more fundamental way. Quantum mechanics, for example, affords an exceedingly precise understanding of the cosmos, an understanding reached through mathematics, perhaps the most abstract and detached of all forms of representation. Indeed, quantum mechanics is an abstract mathematical reality whose representational fidelity to reality is tested, empirically, in extraordinarily complex and abstract ways. We are like a blind man, probing the world with a mathematical stick. The same can be said for language, art, and music—all of which, in themselves, are not “real”—but it is precisely their “as if” quality that affords us the ability to “see” with greater depth and clarity.

Logical contradictions and paradox, ironically, empower us. Thus do we catch a glimmer of the dizzying hall of mirrors that comprise ourselves and the cosmos that we inhabit.

.

.

.

Attention Shoppers!

Restlessness

Attentiveness is the natural prayer of the soul.1

Attentiveness is not alertness, which is a restless scanning. Attentiveness requires time, intention, and focus. It is a meditative thoughtfulness about something. It is in this meditative thoughtfulness that we encounter, and discover, our soul; the core of our being.

Attentiveness is in short supply. We live in a restless time, poised to check and respond to the many digital alerts, messages, and prompts that ping, ting, and ding on our phones, watches, Fitbits, computer screens, televisions, and dashboards. Our waking hours are saturated with the presence of a virtual world, beckoning us to something more urgent, more important, more vital than our being in the present moment. Some even place their phone on their nightstand, just in case…

Sin and Shopping

If attentiveness is a form of prayer, then what is sin? St. Paul describes the experience of sin: “For the good that I would, I do not: but the evil which I would not, that I do.”2 Why do we feel compelled to violate our highest ideals? Religious traditions answer: The Devil. And how does the Devil manage this? By appealing to the desires of the flesh, which are sins against the commitment and command to live a life of the spirit. Sin offers the allure of euphoric oblivion, of being transported beyond the drab humdrum of daily concerns. No wonder sin is so appealing.

Secular humanist traditions do not share the metaphysical beliefs of most religions, but much wisdom resides in religious concepts that can be meaningfully translated into a humanist framework.3 St. Paul’s description of sin provides a succinct description of the experience of addiction: We feel compelled to do what we know we shouldn’t. The secular devil, like the religious Devil, tempts us by appealing to desires, needs, and interests that run against our best interests. This can take many forms for different people. One source, however, invades the lives of nearly all of us: cyberspace.

Cyberspace is not inherently devilish. The digital world is seamlessly woven into almost every aspect of our lives, a reach magnified during the pandemic. Workplaces for many have shifted from office to home. In-person contact with friends and colleagues, church services and schooling, business meetings, social gatherings, meet-ups, and babysitting, with people near and far, are now conducted through Zoom and FaceTime. Limitless entertainment, recreational, and educational activities are available in podcasts, e-books, videos, streaming, gaming, news reports, weather updates, etc., etc., etc., all at our fingertips. And shopping. We can buy almost anything, at any time, with a simple “click”. It is hard not to delight in this dazzling world.

Our ubiquitous virtual experience, however, is largely governed and orchestrated by corporations whose only aim is profit. Many of these corporations offer what appear to be free services: email, doc sharing and storage, search engines, community messaging, etc., etc. They seem free because we fail to appreciate that we are paying with our most precious resource: our attention.

Others within the virtual world also seek our attention; for them, profit is measured in influence that can be leveraged for financial, political, and personal gain. The internet is an attentional economy. This virtual world is a modern midway with carnival barkers employing “clickbait” to snare our attention. The bait beckons us, flashing punchy, brief phrases, provocative imagery, and eye-catching banners that promise to fulfill desires we often didn’t know we had; to enlarge us through participation in groups of “friends” who “like” us; to change our life with a purchase—to transport us, however briefly, beyond mundane daily life. The momentary hit afforded by a simple “click” triggers the desire for more. And more. The aim is to create addiction.

ATT *#/ EN +@% TIO &^+ N !

The alluring payoffs of the attention economy carry hidden costs. We check our phones, on average, every 10 minutes, spend less than 2 minutes on 70% of these sessions, and our webpage visits typically last less than 10 to 20 seconds. Phone prompts, even if we ignore them, can disrupt attention and degrade performance on ongoing mental and physical tasks. Even the mere presence of a phone can lead to poorer performance. Media multitasking is a common feature of web browsing and, surprisingly, can lead to a decrease in effective multitasking in other contexts, as individuals become more prone to distraction by irrelevant material.4 Restless, unsettled, alert, we scan the world but have difficulty giving anything our full attention. Perhaps most importantly, our attention is in danger of being hijacked, as our minds are lured by the addictive tug of “more.”

Greatest Hazard

We cannot escape the cyber world we live in; cannot hold our breath, plug our ears, or close our eyes and hope it will go away. Even if we could, most of us would not want to.

Attentiveness is what has been eroded and attentiveness is what needs to be reclaimed. Attentiveness, not to a virtual world, but to the one we live our lives in, the one pressing on our skin, the one with cries and laughter, with flowers and weeds, with loves and friendships. These may lack the immediate buzz-hit of the fast-moving virtual stream, but that stream threatens to sweep us away from life, away from ourselves, numbing us to the awe and mystery of being here, now, alive. We are imperiled and much is at stake. Kierkegaard, that early Cassandra of our modern age, warns:

“The greatest hazard of all, losing the self, can occur very quietly in the world, as if it were nothing at all. No other loss can occur so quietly; any other loss—an arm, a leg, five dollars, a wife, etc.—is sure to be noticed.”

May you find peace in this frenzied holiday season.

.

.

.

Old Dog, New Tricks?

“You can’t teach an old dog new tricks.”                     

Old Dog

My first, almost reflexive, old-doggie response to this cliché is to disagree. “What do you mean!? How old is old? 50? 65? 80? Some can be demented at 60 years and others, at 80, can be sharper than many 20-year-olds.” Once my indignation subsides, I am painfully aware of the daily reminders of the inescapable reality that the body ages, the mind slackens, and energy wanes. Information goes in, but doesn’t stick. Recall is diminished as I root around for a familiar name, word, or past event. Attention wanders. Complex reasoning is a trial with failure a common outcome; even not-so-complex reasoning sometimes befuddles. Focus is fragmented: “What was I just doing?” “Why was I doing it?” “Where did I put it?”1

Dementia. I have witnessed family members, friends, and acquaintances fall to this dreaded outcome. The tragedy is twofold, afflicting both the sufferer and their family and caretakers. Dementia is one of my biggest fears and, I know, also is for many others of my generation. Diminishing capacities are not only a source of concern in themselves; another, more general anxiety pervades them: “Is this a sign of dementia?”

Research on the normative patterns of cognitive decline with age helps, somewhat, to normalize what I am experiencing. Normal declines include: information processing speed and reaction times; keeping multiple thoughts in mind at the same time; memory of recent events; remembering to do things in the future; screening out distracting information; multitasking; sustaining attention; recalling words, names, and places; thinking abstractly; making complex decisions; reasoning; solving new problems; and regulating behavior.2

Quite a comprehensive list! Should I be relieved that these are “normal?” It certainly suggests that you cannot teach an old dog new tricks. Well, let’s look at some things that remain relatively stable or improve with age: Vocabulary and comprehension of language remain stable. Procedural memory, the knowledge of how to do things like ride a bike, ice skate, swim, or drive a car, also remains stable. Older adults have a lifetime of stored information to draw on when making decisions and responding to problems and situations. They also are quicker to recover from negative experiences, less reactive to stressful interpersonal encounters, more positive in their outlook, and, surprisingly, happier.3

So, when the elderly get in a car accident, which is much more likely, they will be able to talk about it fluently and be less emotionally reactive. They retain the ability to drive, but their mental and physical reaction times are slower; they forget where they are going, get confused and distracted, lose their attentional focus, and have diminished perceptual acuity, making driving at night, in rain or snow, challenging. This is hardly reassuring. And it does not make me happier…

Old Tricks, New Tricks

The diminishment of my faculties poses a new challenge: How do I manage to continue effectively performing old, necessary tasks? Answer: New tricks must be learned. Most advice about coping with aging focuses on diet, exercise, and activities. Much less is offered about specific behavioral strategies to deal with everyday tasks that have become more challenging because of cognitive decline. Here are some that I have discovered.

“Houston, we have a problem.” The first, necessary, and most important step is to admit there is a problem. Nothing can happen until this admission is made. This can be difficult, for it means admitting that I am old, vulnerable, no longer as capable, and on the noticeable downward slide into the grave. Yikes!4

Speed Kills. This phrase has been used in reference to auto safety. It also applies, metaphorically, to the elderly. Diminished processing speed and attentional focus, greater distractibility, attenuated multitasking capabilities, etc., etc., can create serious problems when performing even routine tasks at formerly habitual speed. I chant to myself, “Speed Kills”, to slow myself down when, for example, paying the bills, collecting my belongings at the gym as I prepare to leave, or trying to save time by doing two things at once. Chanting is especially important while driving, when “Speed Kills” is not metaphorical. The challenge, of course, is to remember to chant…

Remember You Won’t Remember. I know I will not remember. When I do remember that I must mail a letter, pay a bill, etc., I arrange the environment to remind me: The letter goes in front of the door exiting the house, the checkbook is placed in my shoes, etc. Yellow sticky notes also go in unusual and unavoidable spots. “In place of memories, memorandums.”5

New Tricks for New Tricks

Believe that old dogs can do new tricks. It is easy to get discouraged in the face of ever-diminishing abilities and conclude, “I’m too old” or “Why bother.” One of the biggest obstacles to learning new tricks is thinking, “You can’t teach an old dog new tricks.” You can. It just takes more time: more repetition, more written notes to aid memory and focus, more practice, and more patience.

Adjust expectations. I will not write a novel in a new language, become a concert pianist, or an IT expert. Time and ability impose limits. But I can, if I choose, learn a new language, acquire skills at the piano, or set up my own blog site. New tricks are possible within the compass of my energy, abilities, and interests.

Seek joy. I have less energy to sustain prolonged, intense tasks. This means it is absolutely essential that the new tricks I undertake, tricks not necessary for my welfare and health, bring me joy. Joy is my “Geiger counter” to detect those tricks that are worthy of my precious, dwindling, and ever more limited energy. Joy creates energy, passion, engagement in the world. Joy is what keeps me motivated to keep at it, to seek mastery—but at a level that does not seriously diminish joy.

Coordinate activities to maximize joy. I simply cannot sustain focused energy as I once could. Furthermore, if I attempt complex tasks when my energy is low, problems, frustration, and mistakes abound. Not much joy in that. So, I try to match my tricks to my schedule: difficult ones in the morning, reserving the afternoons and evenings for less taxing and more entertaining activities, like reading novels, listening to baseball games, and talking with friends.

One Big New Trick for an Old Dog

The new tricks above involve developing specific cognitive and behavioral strategies to cope with the loss of old tricks. A completely new challenge also arises that requires a new trick: The trick of making peace with old age. This is unlike other tricks; it does not entail learning new skills, acquiring more know-how, or improving proficiency. It requires a new attitude towards oneself, one’s life, one’s past, one’s future. It involves acceptance.

Acceptance is not surrender. Surrender is passive. Acceptance is much harder—it calls us to actively embrace our plight. This challenge encompasses humility, forgiveness, and gratitude; to live vitally in the face of impending demise. I have witnessed family, friends, and acquaintances grow old and have learned from each. Some I admire greatly, and they have tutored me in how to approach my end. Others offer a cautionary note about the dangers, traps, and pitfalls to which I could so easily succumb.

So, can you teach an old dog new tricks?  Depends on the dog.

.

.

Forgetfulness

The name of the author is the first to go
followed obediently by the title, the plot,
the heartbreaking conclusion, the entire novel
which suddenly becomes one you have never read,
never even heard of,

as if, one by one, the memories you used to harbor
decided to retire to the southern hemisphere of the brain,
to a little fishing village where there are no phones.

Long ago you kissed the names of the nine Muses goodbye
and watched the quadratic equation pack its bag,
and even now as you memorize the order of the planets,

something else is slipping away, a state flower perhaps,
the address of an uncle, the capital of Paraguay.

Whatever it is you are struggling to remember
it is not poised on the tip of your tongue,
not even lurking in some obscure corner of your spleen.

It has floated away down a dark mythological river
whose name begins with an L as far as you can recall,
well on your own way to oblivion where you will join those
who have even forgotten how to swim and how to ride a bicycle.

No wonder you rise in the middle of the night
to look up the date of a famous battle in a book on war.
No wonder the moon in the window seems to have drifted
out of a love poem that you used to know by heart.
—Billy Collins

Mind and Body Problems

Mental illness has ravaged many a life, ruined many a family, and has dramatically increased during the pandemic. Most of us probably know someone—a loved one, family member, friend, or colleague—who has been afflicted.  

Hidden within all this suffering is a long-standing, seemingly intractable philosophical quandary that has resided at the center of Western philosophy since the 17th century: the mind-body problem. This Gordian knot was prompted by the rise of science, which permits only physical causes and effects. The problem is this: How can a material, biological entity that obeys the laws of the physical world (our body) give rise to our rich, subjective, deeply personal experience of the world (our mind)? The debate ranges and rages across a wide landscape. Are we, for example, only body, our mind at best an epiphenomenon? Or, alternatively, can we only be sure of our immediate phenomenological experience—our mind—while our body, as a physical entity, is a mirage?

This debate appears to have little to do with our everyday lives; simply another pointless intellectual game played by philosophers.1 It cannot, however, be so easily dismissed. Many of us will make decisions with life consequences, sometimes life-and-death consequences, without realizing that those decisions are shaped by unexamined assumptions about the mind-body problem.

Mental Illness

Mental illness. The mind-body problem inheres in its very name. The term, mental illness, is a concatenation of mind (mental) and body (illness). The term also encodes a presumed resolution to the mind-body problem. Illness is the subject, the noun. Mental is the descriptor of the subject, an adjective. It identifies the type of illness. To reverse the order, “illness mental”, makes no sense. Illness is not an adjective. Because human distress, anguish, and dysfunction are defined as an illness, a physical affliction of the body, the identification, diagnosis, cause, and cure must be treated as they are for all illnesses: as a medical condition.

Pills and Power

The illness approach to mental suffering gives “ownership” to MDs; to psychiatrists. The manual for identifying and diagnosing mental illness (the DSM) is issued, and frequently updated, by psychiatrists; new illnesses are identified, some are deleted, and diagnostic criteria are added and amended.

Understanding and treating mental illness underwent a revolution in the 1960s. Prior to this time, identification, diagnosis, cause, and cure were predominantly determined by Freudian-trained psychiatrists. The revolution was sparked by the discovery of medications that ameliorate symptoms associated with profoundly disabling types of mental illness: neuroleptics to treat psychosis, antidepressants for depression, and mood stabilizers for bipolar disorder.

The chemical actions of these drugs were investigated with the aim of uncovering the pathophysiology responsible for the illnesses the medications address. This research led to “chemical imbalance” causal models of mental illness. So, for example, neuroleptics, which block dopamine receptors, were hypothesized to offset an excess of dopamine presumed to cause psychosis. Similarly, antidepressants, which inhibit the reuptake of serotonin, were hypothesized to correct a deficit of that neurotransmitter presumed to cause depression.

Given the severe impairment these illnesses cause, and the prolonged Freudian treatments that used medically dubious means (talk) to uncertain effect, these discoveries flashed on the scene as miracle drugs. Psychiatry was upended. The Freudian-based approach was replaced by a medical model anchored in physical causality, symptom-based diagnosis, and treatment by medication.

In addition to the obvious and profound benefit of relieving suffering, other important consequences followed from positioning mental illness within the body-causes-illness medical model. Insurance coverage is now routine, lifting the huge financial burden of treating these illnesses, and billions of federal dollars are now dedicated to research investigating the biomedical causes of, and cures for, mental illness. Medication is now the default treatment for mental illness.2 While philosophers debate the mind-body problem, it appears to have been solved in the laboratory.

Flies in the Ointment

Despite the success of medication and its widespread use, however, a host of troubling difficulties plague foundational aspects of the medical model of mental illness. Three are especially noteworthy.

Causality. Decades of research have failed to support the premise that specific biochemical deviations in the brain correspond to particular mental illnesses. Furthermore, symptom amelioration can be accomplished with drugs that do not act on the hypothesized chemical cause.3

It also has become clear that psychological stress and trauma can precipitate all manner of mental illness, from PTSD and anxiety disorders to schizophrenia and depression. Both body (biology) and mind (psycho-social factors) can predispose, prompt, or give rise to mental illness. Causality is a complex entanglement of body-to-mind and mind-to-body interactions. The nature of these entanglements is fuzzy, varies among mental disorders, and even varies among individuals sharing the same illness.

Diagnosis and Cure. Mental illness does not share the essential diagnostic attributes of physical illness. Question: How many mental illnesses can be diagnosed using biomedical indices (brain scans, genetic markers, blood tests, body scans, laboratory assays of bodily fluids and functions, identification of pathogens)? Answer: None. Not one. Furthermore, the illnesses themselves are not identified by medical or biological symptoms; the symptoms are all psycho-behavioral. And cures consist of the reduction of the listed psycho-behavioral symptoms. Diagnosis and cure fall entirely within the circumference of “mind,” not body.

Treatment. The treatment of mental illness is a much-researched area, and the complications are head-spinning. The best-practice, empirically supported treatments vary by type and severity of illness, and can differ between individuals and circumstances. Medications can certainly be effective, but they are not a panacea. Indeed, for many, if not most, of the 157 mental disorders listed in the current diagnostic manual, medications are not the most effective treatment. Behavioral and psychological interventions often are, and they come with the added benefit of no long-term side effects.4

Mind-Body and You

The debate over the mind-body problem as it bears on mental illness ranges widely and rages on. It has not been solved. Confusion, dispute, and debate about every aspect of mental illness, from identification, to diagnosis, to cause, to cure, indeed, to the very legitimacy of the term “mental illness” itself, highlight the intractable nature of the issue. This is not simply a philosophical debate, and it cannot be waved off as irrelevant to our daily lives. It is a dispute with life-altering consequences. Should you find yourself in need of treatment, you must choose your approach, and the stakes couldn’t be higher. Be informed. Choose wisely.


© 2022 December Songs
