"As the days dwindle down to a precious few..."

Category: Psycho-Excursions

Homo Economicus

We are no longer Homo sapiens—wise humans, which is what sapiens means. We have evolved into Homo economicus. These creatures, motivated by rational self-interest, seek to maximize their wealth and the “utility,” or satisfaction, derived from the consumption of purchased goods. Homo economicus arose in the 17th-18th centuries in response to a dramatically altered environment, where capital replaced land as the basis for wealth and large-scale production of goods in an industrialized landscape changed the living conditions of Homo sapiens.

The Wealth of Nations, published by Adam Smith in 1776 when market economies were emerging, is considered by many to be the progenitor of Homo economicus. This revolutionary work, written at a tumultuous time when capital markets and democratic uprisings were transforming human life, offered a radically new way to understand wealth and government. Although Smith did not use the term Homo economicus, his analysis hinges on the revolutionary assumptions that characterize Homo economicus: individuals, motivated by rational self-interest, seek to maximize their profit.1

The Wealth of Nations has become a sacred text, often quoted chapter and verse by contemporary economists whose dizzyingly complex, mathematically based analyses typically begin with the assumptions of Homo economicus. The reach of these assumptions is not confined to economic concerns; they are presumed to motivate all human actions, behaviors, and habits. Here is what Nobel Prize-winning economist Gary Becker has to say: “All human behavior can be regarded as participants who maximize their utility from a stable set of preferences and accumulate an optimal amount of information and other inputs in a variety of markets.”

Becker articulates, in the jargon of his profession, what is often presumed by many Homo economicus supporters, and these assumptions have guided analysis of almost all areas of human life, from the most intimate matters of love and family to government policies and practices. His analysis of the family, for example, addresses “marital-specific capital” (i.e., children) and argues that child services are “the commodity that provides the utility a couple receives” from this “marital-specific capital”. And here is an example of an analysis at the institutional level: A World Health Organization initiative to end river blindness in Africa prevented hundreds of thousands of people from going blind. The World Bank’s cost analysis of the program, however, concluded that these benefits were not measurable; the benefits were conferred on people so poor that there was no measurable profit from the treatment.2

“Marital-Specific Capital”

These analytic, quantitative appraisals and balance-sheet conclusions of human endeavors clearly and unambiguously state the value, or profit, to be derived from them. Debate, then, centers on whether the profit justifies the cost. But is this a success, or a reason for concern? When children become “marital-specific capital,” and a wildly successful intervention that prevented blindness in hundreds of thousands of people is questioned because the recipients are poor, then maybe this is evidence that something is amiss. Perhaps Homo economicus is a mutation that needs to be modified or eliminated.

The central value of Homo economicus is profit. The balance-sheet determines worth. But why, for example, was a major initiative, at considerable cost, undertaken to prevent river blindness in impoverished areas of Africa? Certainly not to gain a profit on investments. And children are more than capital goods for most parents. Other values are at play.

Adam Smith would certainly agree. He considered himself a moral philosopher, and in his first book, The Theory of Moral Sentiments, he argues that we are inherently social and moral beings. We care about others, and this social-moral sense is essential if we are to live together without destroying each other. Adam Smith’s self-interest is not greed. It is in everyone’s self-interest to conduct commerce and exchange within a moral framework of trust and honesty, which are essential for a successful society. The Theory of Moral Sentiments established the foundation for The Wealth of Nations; wealth is built within a social-moral framework.3

Economic, social, and moral values differ starkly in the kind of exchanges and relationships presumed, and the expected benefits.4 Economic values are individual and impersonal, and economic analysis treats families, communities, and societies as a collection of individuals. Exchange is contractual and who the other person is in the transaction does not matter. Economic values are instrumental; you expect to get something of equal or greater value in return. If it has no profit, it has no value.

Social values are personal and depend on the relationships involved. Our relationships with our children, spouse, or friends may be the most important thing in our lives. They have no economic value, as we cannot buy, sell, or trade them. And social relationships are reciprocal, not instrumental. Each participant gains from them, but there is nothing definite about what we will gain, when we will gain it, or even if we will get anything tangible from them.

Moral values are altruistic; things are done for their intrinsic worth with no expectation of getting anything in return. They are neither individual nor relational. They are universal. We do things for others because it is morally the right thing to do. These values are not proved or supported by evidence or analysis—they are axiomatic.

The lack of proof or evidential support for moral values does not diminish their importance. Quite the contrary. Their universality can give them the animating power of ideals. People commit their lives to fulfilling these values; care for others at great personal cost, risk their safety and well-being, even give their lives for a just cause and, too, commit unspeakable atrocities in the name of the good.

The Declaration of Independence embodies foundational moral values upon which our nation stands. Axioms:
1. “All (men) people are created equal.”
2. “They are endowed with unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”5
No justifications or proofs are provided, as “these truths are self-evident.” Moral and social values provide the grounding assumptions for how we govern ourselves; they embody our values of who we are, who we want to be, and how we define our relationships with each other. Commerce, economic transactions, and trade are conducted within the framework provided by the moral and social values that are the basis of our governance.

It is an inherently destructive act to treat all human life as some sort of financial transaction. Preventing river blindness is a moral and social value, not an economic one. Children, family, and friends are not commodities of exchange. Economic analysis should be used in the service of our social and moral values, not the determiner of them. Homo economicus hollows us out. If we do not hearken to the call of our foundational social and moral values, if we allow Homo economicus to become who we are, we surrender our wisdom. And our humanity.

.

.

.

Free Will?

Do we have free will? This question has bedeviled Western thought for millennia, spanning the centuries from Aristotle and the Greeks, to Thomas Aquinas and medieval Christian theology, to Einstein and modern science. The power of the question derives from a primal concern about the source of our sense of agency and responsibility for our decisions and actions.

The question has practical implications that reach well beyond a debate among philosophers and scholars. So, for example, should those who are severely mentally ill or cognitively impaired be held responsible for their criminal actions? What about children? Or someone who is drugged? Or forced at gunpoint to comply with a command?

No surprise—I do not have the answer. But I offer thoughts that help me to make sense of this question and enable me to reach an understanding that has some practical value. I will confine my comments to the psychological domain (although I think they can be applied to the physical sciences as well).

Psychological Science

Provocative findings from two areas of contemporary psychological research challenge the belief in free will. One research thread suggests that automatic cognitive processes occurring out of awareness control our actions. Our sense of agency occurs after our choices have been made; choice is an illusion. The second thread offers evidence that the neural impetus to act is already under way before the conscious intention to act occurs. Again, choice is an illusion. These data and conclusions, not surprisingly, have generated pointed challenges and heated disputes among scientists.

This debate is not new to psychology, as behaviorist researchers of previous generations also argued against free will. The most notable was B. F. Skinner, who in his book, “Beyond Freedom and Dignity,” argued that we are the product of our reinforcement history and, consequently, freedom, dignity, as well as all other terms describing human traits and ideals are empty fictions. Needless to say, a great uproar ensued.

I would like to focus on four conceptual issues that are often overlooked in these disputes about the methods and meaning of these research findings: science and truth; the nature of will; yin-yang; and the pragmatics of “I can”.

Science and Truth

M. C. Escher: Drawing Hands

Scientists are independent thinkers who consider a question, pose a hypothesis to answer the question, devise experiments to test the hypothesis, and use the evidence derived from the experiments to determine if the hypothesis is true. If the evidence supports the hypothesis, the scientists have empirical justification for drawing valid, if provisional, conclusions about the question.

Now, if we accept the scientific evidence from the recent psychological research suggesting we lack free will, surely this conclusion applies to the scientists as well; that their hypotheses, decisions, tests, and conclusions are all determined.1 Validity and truth are, thus, comforting fictions, and the entire scientific enterprise of discovering an objective truth untainted by our personal opinions, subjective biases, and blinkered perspectives is a pointless exercise. What legitimate claim can scientists make that their findings should receive any more credibility than, say, astrology, reading palms, or divining from bird entrails? Paradoxically, if we accept the validity of the findings we must conclude that the findings cannot be objectively true. Furthermore, if we accept the soundness of this paradoxical conclusion, we also must, then, accept that this conclusion is not really the result of our judiciously considering the logic and meaning of the argument. Rather, it is just another round of self-delusion. And round and round it goes. Hmm…

What is Will?

Heave of Effort

It is typically assumed, both in common understanding and in the methods and interpretation of the research, that will is a single, conscious, heave of effort that leads to action. An alternative, more comprehensive understanding is that will is a “mental agency that transforms awareness into action, it is the bridge between desire and action.”2

Consider St. Paul’s definition of sin: “I do not understand my own actions. For I do not do what I want, but I do the very thing I hate.… I can will what is right, but I cannot do it.”3 I can will what is right, but I cannot do it. This is a situation that we all understand, and one that is often a major focus of psychotherapy. A simple exhortation to engage will, to “get up off your butt and do it,” often fails in these circumstances. A conscious heave of effort fails because we harbor contradictory intentions and desires, some of which may be out of awareness. Will, the bridge between desire and action, is conflicted.

Psychotherapy is often focused on bringing hidden desires and conflicts into awareness, thus disencumbering will, enabling deeper conscious understanding of the conflicts, allowing deliberative consideration of the choices, and embracing responsibility for action. Cognitive science provides evidence for the power of unconscious factors, and neuroscience documents the neurological run-up to a conscious “heave of effort.” Will includes both these and much more.

Either-Or or Yin-Yang?

An assumption often made is that the question of free will is a dichotomous choice: Yes or no. It might be more fruitful to assume it to be a yin-yang relationship; freedom and determinism are interdependent, two sides of the same coin. So, for example, a piano has 88 keys. It is a deterministic structure that limits the possible notes that can be played. This relatively small set of keys, however, allows for the composition of an infinite number of songs. The 88 keys constrain or determine the possibilities, but within these constraints resides freedom.

Now consider the psychological and neuropsychological evidence of unconscious influences and neurological readiness potential that precedes conscious intention. We are a biological piano. We are able to make music, but are constrained in many ways by our anatomy. We do not have complete freedom, and much happens out of awareness (and be thankful that it does, for we would be overcome to the point of paralysis by the Niagara of sensations, perceptions, ideas, and choices flooding our every waking moment). We are bound by the constraints of our being, which provide us with the means to create our music, sing our (December) songs.

I Can

One of the most important steps in psychotherapy is believing “I can.” It is understandable why many who are in dire straits, who have endured harrowing trauma, personal loss and anguish, or whose lives have been blighted by misfortune, feel doomed, condemned, and believe “I can’t”. The often difficult first step of seeking therapy carries the tentative hope that, “Maybe I can”. It is the beginning of the journey from “I can,” to “I will,” to “I did.”

William James, who is (in my view) the greatest American psychologist and philosopher, experienced a bout of severe depression, unable to rise from his bed, convinced that he had no free will to alter his situation. He overcame his depression by deciding that regardless of the evidence and compelling arguments for determinism, he would act as if he had free will; believe “I can.” His “will to believe”4 created the reality of free will.

James was a leader of the philosophical school called Pragmatism and he coined the term “Cash Value” to describe criteria to assess the merit and truth of an assertion or belief. Cash Value is used metaphorically, meaning “does the assertion have practical utility; does it have real-world consequences or is it merely empty words”. Cash Value can be applied to the question of free will.

Therapy is an act of courage, requiring effort, commitment, humility, resiliency, and honesty. All of these attributes, along with truth, justice, guilt, responsibility, dignity, etc., etc., etc., are self-delusional fictions if we seriously believe we are puppets of deterministic forces. The question of free will is a captivating puzzle and an impetus to interesting research and lively dispute. But, ultimately, only a conclusion of free will has any cash value. “I can” gives us our world. The choice between “I can” or “I can’t” is a hard-fought, life or death decision for many in anguish and distress, and scientific evidence and philosophical arguments are irrelevant. I take a stand for free will. I have no choice. . .5

.

.

.

What is Intelligence?

What is intelligence?

This has been among the most longstanding, fiercely debated questions across a wide range of disciplines, from philosophy and psychology, to sociology and education, to anthropology and comparative biology. The practical consequences of this debate touch every one of us. Intelligence tests mark us, route us, and shape the expectations of our teachers, parents, peers and ourselves, which have a profound effect on who we are, who we think we are, and who we become. IQ score: A single figure, so much power.

Intelligence Tests

The choice of how we measure intelligence reveals what we think intelligence is, and the history of this testing exposes assumptions that underlie the testing. Knowing these assumptions allows us to critically examine what is typically understood to be intelligence.

The first tests of mental abilities were initiated at the end of the 19th century in the newly established scientific field of psychology. Unlike the philosophical speculation about the nature of mind that dominated Western thought for millennia, scientific psychology is rooted in laboratory experimentation. Psychophysics, the measurement of the physical properties of mental activity, was the predominant approach to the measurement of mind, which assessed reaction time, memory, and various measures of sensory acuity and discrimination (e.g. visual, auditory, touch).

A revolutionary study by Wissler in 1901 overturned this entire approach to measuring intelligence. He used the newly developed statistical measure, the correlation coefficient, to demonstrate that these tests were not correlated with school performance. This is very telling. The skills necessary for success in school are presumed to be essential to intelligence. This seems so transparently obvious that we fail to see the far-reaching implications.

This is understandable, as success in school is critically important in today’s world. Intelligence tests arose hand-in-glove with the emergence of industrialization, which required mandatory schooling to provide workers with the skills necessary for this new form of society.

The first practical intelligence test, the Binet-Simon, was published in 1905 and became the standard for assessing school children; a revised form is still used today. Binet’s aim was to help teachers identify children who struggled in traditional school settings and could profit from alternative settings. Binet thought that intelligence is flexible, influenced by motivational issues and personal circumstances, and that the test failed to assess other important traits, such as creativity and emotional intelligence. The measurement of intelligence quickly became swept up in the eugenics movement and engulfed in raging controversies about nature-nurture, race, cultural fairness, treatment of those who score low, and a host of other emotionally charged issues; contentious issues that remain with us.

Environmental Fit

We think that these tests measure a universal capacity, but it is an intelligence that “fits” today’s techno-industrial world. Ours is a human-made environment; a very narrow, artificial environmental niche. Humans have adapted to the most diverse environments on the planet: deserts, jungles, Arctic regions, mountain tops, tropical islands, caves, savannas. Almost everywhere. Homo sapiens evolved approximately 300,000 years ago. Written language emerged about 5000 years ago, and the first educational system was created about 4000 years ago. Schools have been absent for 99% of humans’ time on the planet.1

If we think of intelligence as the ability of a species to adapt to its environment, where failure has mortal consequences, then there must be much more to intelligence than school-related abilities. Consider, for example, the skills required of early Pacific Islanders who, in small canoes, navigated the vast Pacific using stars, sun, moon, wind, clouds, ocean currents, fish, birds, waves, etc., etc., to reach a destination thousands of miles away (e.g., Hawaii). What kind of “intelligence test” would they develop?2 Certainly nothing like ours. And what would Eskimo or Pygmy “intelligence tests” consist of? Very different again.

Much speculation has been given to the abilities that underlie the human capacity to survive across these varied environments. These include bipedalism, opposable thumbs, a complex brain, language, tool-use, genetic changes (e.g., genetic protection against malaria; the capacity to digest a variety of foods), which take generations, and non-genetic capacities to flexibly adjust to environmental changes (e.g., cultural practices, individual learning, transmission of skills across individuals). None of these require literacy or schooling.

Species’ Intelligence

Human intelligence has long been considered the pinnacle of a hierarchy of intelligence among species. More than a century of research has sought to examine the comparative intelligence of other species using, of course, human abilities as the yardstick. These abilities include tool use, language skills (those species able to learn analogues to human communication ranked as the smartest), and self-reflective consciousness, which encompasses the ability to consider the mental states of others and is evidenced in deception, empathy, grief, envy, and cooperative action with others.

The species that we have long considered to occupy the next rung below human intelligence are primates; species that look like us. More recently, this has changed, as many other species have been identified that possess these capacities, including dolphins, whales, elephants, birds and dogs. What has also changed, dramatically, is that species’ intelligence is no longer considered a totem pole but a bush. Each species possesses an intelligence, an ensemble of capacities that enable it to adapt and survive, often involving capacities humans do not possess, such as flight, echolocation, and the ability to perceive sensations invisible to humans (e.g., infra-red light; high and low sound frequencies, etc., etc.). If a fly were to construct an intelligence test, how well do you think we would do?

Our Species Intelligence Test

Ironically, the type of intelligence that “fits” our human-made techno-industrial niche supersizes our ability to survive and thrive almost anywhere on the planet. And also, to dramatically alter the entire biosphere—so much so that it threatens our very existence.3 Is this a measure of our superior intelligence or proof that it is very limited? Have we out-smarted ourselves?

The scope of the climate catastrophe will require more than individual intelligence. Our techno-industrial world is the product of a collective, collaborative intelligence that can solve problems individuals could not even dream possible on their own. We each may be very smart, but alone we are incapable of providing the many essentials required for living in the modern world, from producing a simple screw that holds things together, to electricity that makes everything run. We are part of a “Hive Mind”.4

The products of this Mind are the source of our imperiled biosphere, as well as the wellspring of the cornucopia of riches enjoyed in our modern society. And, perhaps, it may be our salvation. Each of us is but a single “neuron” in this vast Mind. But we are in networks with other “neurons”, and by influencing them, who in turn influence their connections, which then influence other networks, we can set off a cascade of changes that alters the workings of the Mind. Individually, we can recognize threats to our species’ survival and adaptively respond, but our actions must be part of a larger systemic response of the Mind if we are to survive.

How intelligent are we? Individually? Collectively? We will soon find out. We cannot afford to fail this intelligence test.

.

.

Play with Your Digits

You are about to enter a Twilight Zone of contradiction and paradox, where the boundaries between “real” and “not real” are blurred. We will travel to strange, seemingly incomprehensible places, only to discover we have arrived home. I begin our journey with an invitation: “Let’s Play!”

Digital Logic

Let’s begin with the digits that compose our modern world.

Our virtual world, a dreamscape projected through a screen, is anchored in electrical processes and material hardware that follow the dictates of the physical world and logical laws. Cyberspace rests on microchips containing a staggering number of electrical “switches” knitted together in quickly changing patterns of electrical circuits. Each of these switches, or bits, has only two possible states: either “on” or “off”, which typically are assigned a digital value of 1 or 0.

Pictures, music, images, broadcasts, streaming, data—just about all forms of information and electronic communication—are now digitized. This process is accomplished by segmenting analog signals, such as sound or light waves, which are continuous, into the tiniest of discrete bits of information, and converting letters, numbers, and other forms of information into a bit string. The binary format, either 1 or 0, allows for easy transmission, manipulation, and storage of information. Our digitized world is organized by machines that can, at lightning speed, conduct huge calculations with bits composed of the binary, either 1 or 0.
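The conversion of letters into a bit string can be sketched in a few lines of code. The example below is a simple illustration of the idea (the function names are my own, not drawn from the essay): each character of a text becomes eight binary digits, and the resulting string of 1s and 0s can be transmitted, stored, and converted back without loss.

```python
# A minimal sketch of digitization: text in, bit string out, and back again.
# Sampled audio or images follow the same principle, with numeric samples
# in place of characters.

def to_bits(text: str) -> str:
    """Encode text as a string of 0s and 1s (one 8-bit byte per ASCII char)."""
    return "".join(format(byte, "08b") for byte in text.encode("ascii"))

def from_bits(bits: str) -> str:
    """Decode a bit string back into the original text."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return bytes(int(chunk, 2) for chunk in chunks).decode("ascii")

bits = to_bits("Hi")
print(bits)             # 0100100001101001
print(from_bits(bits))  # Hi
```

The round trip is exact: nothing of the original survives in the wire except 1s and 0s, yet the message is perfectly recoverable.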

This binary, either-or structure rests on the assumption that a bit cannot be “on” and “off”, “0” and “1”, simultaneously. This assumption is so obvious, so basic to our understanding of how things exist in the world, that it constitutes a fundamental law of logic: the Law of Non-Contradiction. This Law states that it is impossible for something to be A (i.e., “on” or “1”) and not-A (i.e., “off” or “0”) at the same time. This is considered a first principle of logic upon which all else rests.

Cosmic Logic

The cosmos, surprisingly, defies our logical sensibilities. Quantum mechanics, which is the fundamental physical process that constitutes everything in the universe, is based on the seeming impossibility that something can simultaneously be A AND not-A. So, for example, in quantum mechanics, light can be both a wave AND a particle; we just don’t know which until we look.

Efforts are underway to construct computers, exponentially more powerful than current machines, that exploit the logic-befuddling quantum property that something can be simultaneously A and not-A. These properties include quantum entanglement and spooky action at a distance. When particles are quantum entangled, the effects on one particle will instantaneously affect the others—even if they are located at opposite ends of the cosmos. These instantaneous effects over such great distances cannot be explained by our current understanding of the physical universe; spooky action at a distance is the phrase used to describe this.1
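The simultaneous "0 AND 1" at the heart of this can be loosely sketched in code. The toy class below is illustrative only, not a real quantum simulation: it stores two complex amplitudes, one for each classical value, and only on "measurement" collapses to a definite bit, with probabilities given by the squared magnitudes of the amplitudes (the Born rule).

```python
import random

class Qubit:
    """Toy model of a qubit: two amplitudes held at once, until measured."""

    def __init__(self, amp0: complex, amp1: complex):
        # Normalize so the two outcome probabilities sum to 1.
        norm = (abs(amp0) ** 2 + abs(amp1) ** 2) ** 0.5
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm

    def probabilities(self) -> tuple:
        """Probability of observing 0 or 1: squared amplitude magnitudes."""
        return abs(self.amp0) ** 2, abs(self.amp1) ** 2

    def measure(self) -> int:
        """Collapse the superposition to a definite classical bit."""
        p0, _ = self.probabilities()
        return 0 if random.random() < p0 else 1

# An equal superposition: before measurement, neither 0 nor 1 alone.
q = Qubit(1 + 0j, 1 + 0j)
print(q.probabilities())  # approximately (0.5, 0.5)
print(q.measure())        # a definite 0 or 1, only now
```

Before `measure()` is called, the object genuinely carries both possibilities at once; afterward, only one remains, which is the A-AND-not-A situation the Law of Non-Contradiction forbids in classical bits.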

Quantum mechanics not only is revolutionizing computing, it has also shattered our assumptions about reality. Obvious principles of logic—overturned. Entanglement of particles with mutual, instantaneous influence across unimaginable distances—inexplicable. The world hangs suspended, particle AND wave, “0” AND “1”, awaiting interaction with a macroscopic system—such as us—to give it form. We cannot assume a “god’s-eye” position of observing, measuring, and understanding the workings of the universe independent of our own place, time, and actions in it. The ground beneath our feet is unstable.

Playful Logic

Another AND subverts the seemingly obvious Law of Non-Contradiction: play. Playing is no simple matter, as it requires sophisticated communication abilities. A dog that playfully nips another must communicate, “This nip is not really a nip”, otherwise the nip could precipitate a fight rather than a playful response. This is typically communicated by exaggerated movements of some kind, such as paws out, head lowered, tail wagging, and galumphing about. These are metacommunicative messages that signal: The nip (A) is not really a nip (not-A).2

These metacommunications entail distancing from the ongoing stream of messages; stepping outside the stream to signal how these messages should be interpreted. Play requires self-referential awareness of being “the one doing the signaling,” as well as awareness of the other as the “one for whom the signals are meant” (and who, also, is “one who signals”). The player acts “as if” the bite is real and must have “double knowledge”: being able to tell the difference between the actual and pretend bite at the same time. It is both “real” and “not real.”

Play engenders a deeper kind of knowing; both a cognitive double knowledge of “as if,” and a social entanglement with another, forged in mutually sharing that a “bite is not a bite.” For these reasons, play is often used as a marker for assessing the relative cognitive and social capacities among species. It is also considered a nascent form of a much more sophisticated mental ability: representation.

Re-Present

“This is not a pipe.”

Representation consists of an object or referent (A) that is re-presented by a sign or symbol (not-A). The re-presentation is both A, as it acts “as if” it were A, AND not-A, because it stands in place of A. Representations can take many forms, from images, to words, numbers, and sounds. Untethered from reality, existing in its own created reality, a representation affords leeway from the represented, allowing us to play with it in symbolic ways, exploring novel, strange, fantastic, comical possibilities.

We easily can, and do, become swept into the represented reality without realizing that it is an “as if” one. We lead a “double life.” Magritte, a 20th-century Belgian painter, humorously points this out in his painting, “This is not a pipe.” We initially are confused by the painting because we immediately assume it is a pipe. But it is, AND is not, a pipe. The painting amuses because it playfully exposes our taken-for-granted double life. We dwell within a representational world as strange, peculiar, and bizarre as quantum mechanics.3

Paradoxical Power of AND

While representation is not the real thing itself, it does, paradoxically, enable us to grasp reality in a more fundamental way. Quantum mechanics, for example, affords an exceedingly precise understanding of the cosmos, which is understood through mathematics, perhaps the most abstract and detached of all forms of representation. Indeed, quantum mechanics is an abstract mathematical reality whose representational fidelity to reality is tested, empirically, in extraordinarily complex and abstract ways. We are like a blind man, probing the world with a mathematical stick. The same can be said for language, art, and music—all of which, in themselves, are not “real”—but it is specifically their “as if” quality that affords us the ability to “see” with greater depth and clarity.

Logical contradictions and paradox, ironically, empower us. Thus do we catch a glimmer of the dizzying hall of mirrors that comprises ourselves and the cosmos we inhabit.

.

.

.

Attention Shoppers!

Restlessness

Attentiveness is the natural prayer of the soul.1

Attentiveness is not alertness, which is a restless scanning. Attentiveness requires time, intention, and focus. It is a meditative thoughtfulness about something. It is in this meditative thoughtfulness that we encounter, and discover, our soul; the core of our being.

Attentiveness is in short supply. We live in a restless time, poised to check and respond to the many digital alerts, messages, and prompts that ping, ting, and ding on our phones, watches, Fitbits, computer screens, televisions, and dashboards. Our waking hours are saturated with the presence of a virtual world, beckoning us to something more urgent, more important, more vital than our being in the present moment. Some even place their phone on their nightstand, just in case…

Sin and Shopping

If attentiveness is a form of prayer, then what is sin? St. Paul describes the experience of sin: “For the good that I would, I do not: but the evil which I would not, that I do.”2 Why do we feel compelled to violate our highest ideals? Religious spiritual traditions answer: The Devil. And how does the Devil manage this? By appealing to the desires of the flesh, which are sins against the commitment and command to live a life of the spirit. Sin offers the allure of euphoric oblivion, of being transported beyond the drab humdrum of daily concerns. No wonder sin is so appealing.

Secular humanist traditions do not share the metaphysical beliefs of most religions, but much wisdom resides in religious concepts that can be meaningfully translated into a humanist framework.3 St. Paul’s description of sin provides a succinct description of the experience of addiction: We feel compelled to do what we know we shouldn’t. The secular devil, like the religious Devil, tempts us by appealing to desires that run against our best interests. This can take many forms for different people. One source, however, invades the lives of nearly all of us: cyberspace.

Cyberspace is not inherently devilish. The digital world is seamlessly woven into almost every aspect of our lives, and its presence has only been magnified during the pandemic. Workplaces for many have shifted from office to home. Contact with friends and colleagues, church services and schooling, business meetings, social gatherings, meet-ups, and babysitting, with people near and far, are now conducted through Zoom and FaceTime. Limitless entertainment, recreational, and educational activities are available in podcasts, e-books, videos, streaming, gaming, news reports, weather updates, etc., etc., etc., all at our fingertips. And shopping. We can buy almost anything, at any time, with a simple “click”. It is hard not to delight in this dazzling world.

Our ubiquitous virtual experience, however, is largely governed and orchestrated by corporations whose only aim is profit. Many of these corporations offer what appear to be free services: email, doc sharing and storage, search engines, community messaging, etc., etc. They seem free because we fail to appreciate that we are paying with our most precious resource: our attention.

Others within the virtual world also seek our attention; here, profit is measured in influence that can be leveraged for financial, political, and personal gain. The internet is an attention economy. This virtual world is a modern midway with carnival barkers employing “clickbait” to snare our attention. The bait beckons us, flashing punchy, brief phrases, provocative imagery, and eye-catching banners promising to fulfill desires we often didn’t know we had; to be enlarged by our participation in groups of “friends” who “like” us; to have our life changed by a purchase—to be transported, however briefly, beyond mundane daily life. The momentary hit afforded by a simple “click” triggers the desire for more. And more. The aim is to create addiction.

ATT *#/ EN +@% TIO &^+ N !

The alluring payoffs of the attention economy carry hidden costs. We check our phones, on average, every 10 minutes, spend less than 2 minutes on 70% of these sessions, and our webpage visits typically last less than 10 to 20 seconds. Phone prompts, even if we ignore them, can disrupt attention and negatively impact performance on ongoing mental and physical tasks. Even the mere presence of a phone can lead to poorer performance. Media multitasking is a common feature of web browsing and, surprisingly, can lead to a decrease in effective multitasking in other contexts, as individuals become more prone to distraction by irrelevant material.4 Restless, unsettled, alert, we scan the world, but have difficulty giving anything our full attention. Perhaps most importantly, our attention is in danger of being hijacked, as our minds are lured to the addictive tug of “more.”

Greatest Hazard

We cannot escape the cyber world we live in; cannot hold our breath, plug our ears, or close our eyes and hope it will go away. Even if we could, most of us would not want to.

Attentiveness is what has been eroded and attentiveness is what needs to be reclaimed. Attentiveness, not to a virtual world, but to the one we live our lives in, the one pressing on our skin, the one with cries and laughter, with flowers and weeds, with loves and friendships. These may lack the immediate buzz-hit of the fast-moving virtual stream, but that stream threatens to sweep us away from life, away from ourselves, numbing us to the awe and mystery of being here, now, alive. We are imperiled and much is at stake. Kierkegaard, that early Cassandra of our modern age, warns:

“The greatest hazard of all, losing the self, can occur very quietly in the world, as if it were nothing at all. No other loss can occur so quietly; any other loss—an arm, a leg, five dollars, a wife, etc.—is sure to be noticed.”

May you find peace in this frenzied holiday season.

.

.

.

Mind and Body Problems

Mental illness has ravaged many a life, ruined many a family, and has dramatically increased during the pandemic. Most of us probably know someone—a loved one, family member, friend, or colleague—who has been afflicted.  

Hidden within all this suffering is a long-standing, seemingly intractable philosophical quandary that has resided at the center of Western philosophy since the 17th century: the mind-body problem. This Gordian knot was prompted by the rise of science, which permits only physical causes and effects. The problem is this: How can a material, biological entity that obeys the laws of the physical world (our body) give rise to our rich, subjective, deeply personal experience of the world (our mind)? The debate ranges and rages across a wide landscape. Are we, for example, only body, and our mind is, at best, an epiphenomenon? Or, alternatively, can we only be sure of our immediate phenomenological experience—our mind—while our body, as a physical entity, is a mirage?

This debate appears to have little to do with our everyday lives; simply another pointless intellectual game played by philosophers.1 This debate, however, cannot be so easily dismissed. Many of us will make decisions that have life consequences, sometimes life-and-death consequences, without realizing that those decisions are determined by unexamined assumptions about the mind-body problem.

Mental Illness

Mental illness. The mind-body problem inheres in its very name. The term, mental illness, is a concatenation of mind (mental) and body (illness). The term also encodes a presumed resolution to the mind-body problem. Illness is the subject, the noun. Mental is the descriptor of the subject, an adjective. It identifies the type of illness. To reverse the order, “illness mental”, makes no sense. Illness is not an adjective. Because human distress, anguish, and dysfunction are defined as an illness, a physical affliction of the body, the identification, diagnosis, cause, and cure must be treated as all illnesses are: as a medical condition.

Pills and Power

The illness approach to mental suffering gives “ownership” to MDs—to psychiatrists. The manual for identifying and diagnosing mental illness (the DSM) is issued, and frequently updated, by psychiatrists: new illnesses are identified, some deleted, and diagnostic criteria added and amended.

Understanding and treating mental illness underwent a revolution in the 1960s. Prior to this time, identification, diagnosis, cause, and cure were predominantly determined by Freudian-trained psychiatrists. The revolution was sparked by the discovery of medications that ameliorate symptoms associated with profoundly disabling types of mental illness: neuroleptics to treat psychosis, antidepressants for depression, and mood stabilizers for bipolar disorder.

The chemical actions of these drugs were investigated with the aim of uncovering the pathophysiology responsible for the illnesses the medications address. This research led to “chemical imbalance” causal models of mental illness. Neuroleptics, for example, which block dopamine receptors, were hypothesized to offset an excess of dopamine presumed to cause psychosis. Similarly, antidepressants, which inhibit the reuptake of serotonin, were hypothesized to correct a deficit of that neurotransmitter presumed to cause depression.

Given the severity of impairment of these illnesses, and the prolonged Freudian treatments using medically dubious means (talk) with uncertain results, these discoveries flashed on the scene as miracle drugs. Psychiatry was upended. The Freudian-based approach was replaced by a medical model anchored in physical causality, symptom-based diagnosis, and treatment by medication.

In addition to the obvious and profound benefits of relief of suffering, other important consequences followed from the positioning of mental illness within the body-causes-illness medical model. Insurance coverage is now routine, lifting the huge financial burden of treating these illnesses, and billions of federal dollars are now dedicated to research investigating the bio-medical causes and cures of mental illness. Medication is the default option typically used to treat mental illness.2 While philosophers debate the mind-body problem, it appears to have been solved in the laboratory.

Flies in the Ointment

Despite the success of medication and its widespread use, however, a host of troubling difficulties plague foundational aspects of the medical model of mental illness. Three are especially noteworthy.

Causality. Decades of research have failed to support the premise that specific biochemical deviations in the brain correspond to particular mental illnesses. Furthermore, symptom amelioration can be accomplished with drugs that do not act on the hypothesized chemical cause.3

It also has become clear that psychological stress and trauma can precipitate all manner of mental illness, from PTSD and anxiety disorders to schizophrenia and depression. Both body (biology) and mind (psycho-social) can predispose, prompt, or give rise to mental illness. Causality is a complex entanglement of body-to-mind and mind-to-body interactions. The nature of the entanglements is fuzzy, varies among mental disorders, and even varies among individuals sharing the same illness.

Diagnosis and Cure. Mental illness does not share essential diagnostic attributes of physical illness. Question: How many mental illnesses can be diagnosed using biomedical indices, including brain scans, genetic markers, blood tests, body scans, laboratory assays of bodily fluids and functions, identification of pathogens, etc.? Answer: None. Not one. Furthermore, the illnesses themselves are not identified by medical or biological symptoms. The symptoms are all psycho-behavioral. And cures consist of the reduction of the listed psycho-behavioral symptoms. Diagnosis and cure fall entirely within the circumference of “mind,” not body.

Treatment. The treatment of mental illness is a much-researched area and the complications are head-spinning. The best-practice, empirically supported treatments vary by type and severity of illness, and can differ between individuals and circumstances. Medications can certainly be effective, but they are not a panacea. Indeed, for many, if not most, of the 157 mental disorders listed in the recent diagnostic manual, medications are not the most effective treatment. Instead, behavioral and psychological interventions often are, and they come with the added benefit of no long-term side effects.4

Mind-Body and You

The debate about the mind-body solution to mental illness ranges widely and rages on. It has not been solved. Confusion, dispute, and debate about all aspects of mental illness, from identification, to diagnosis, to cause, to cure, indeed, to the very legitimacy of the term “mental illness” itself, highlight the intractable nature of the issue. This is not simply a philosophical debate, and cannot be waved off as irrelevant to our daily lives. It is a dispute with life-altering consequences. Should you find yourself in need of treatment, you must choose your solution, and the stakes couldn’t be higher. Be informed. Choose wisely.

Abracadabra!

Consider. . . . . . . . . a tiger.

If I were speaking this, you would experience a silence after “Consider”. And when I eventually spoke again, you would then think of a tiger. Out of the anticipatory silence, the word conjured this remarkable creature—so utterly removed from your immediate experience—into awareness. The word “tiger” is “magic”. It “tricks” you into inhabiting an imaginary experience that overshadows the concrete reality that directly presents itself to your senses.

Language untethers us, liberates us from the immediate, allows us to reference distant things beyond our immediate experience; time travel forward to eternity or back to the Big Bang; contemplate zero and infinity, tennis and tigers, unicorns and utility bills; think about, and conduct, a “dialog” (!) with ourselves. Words have the power to charm, to evoke, to cast a spell. Words—puffs of air—are chants that enchant.1

We dwell within a reality that is conjured into existence through words. We typically think of enchantment as the experience of being transported into a state of wonderment apart from the mundane reality we navigate in our everyday lives. But our everyday world is defined, orchestrated, governed by words, and our body is habitually attuned to respond to these air-puffs. We dwell within the word meanings; they are grafted onto our brain and viscera evoking immediate, sometimes strong responses usually so automatic we are unaware of how we are being organically influenced and swayed. We live this enchanted life, walking, talking, thinking, feeling it—mesmerized. We experience it so directly and unquestionably that we think those who might suggest we live in a dream world slightly mad.

Collective Spell-Casting

This conjured reality is essential for our individual and collective survival—so essential that our brains are hard-wired to acquire and use language. We are, fundamentally, a social species, born so helpless that we must rely, for many years, on the intense care and attention of adults if we are to survive. As adults, we also cannot survive on our own; we must communicate and collaborate with others. Language makes this possible. Miraculously, our biology primes us, in our early years, to rapidly acquire words, regardless of the language.

Words are cultural covenants that exist before and beyond any individual, enabling those who share the word to inhabit a common intention. We speak through these covenants, breathing life into these ready-made forms, using them to express our personal meanings and intentions. When we “give our word” we make a promise to honor the shared intention of the word. And we “give our word” whenever we converse.

Power of Nonsense

Words are nonsense. Say a word over and over, fast. Doing so strips it of its power to charm, draining its meaning, exposing its sonic absurdity. Learning a foreign language has the same effect, only in reverse. It is an act of faith that the nonsense sounds we try so hard to master will, in the right linguistic context, exert their magic. I spent many months laboring to learn basic French in preparation for a two-month visit to France. Once there, I finally was able to utter the rehearsed sounds. . . and others responded—as promised! I experienced toddlerhood, saying sounds and being startled and delighted to see others respond to me. It was an “abracadabra” moment.

Words are power. My French gave me the power to participate with others in a meaningful way and to navigate the simple cultural pathways of everyday engagements. My thrill derived from the remarkable power of words to reach beyond myself and animate reactions in others. I was initially elated: “I can speak French!” Over time, however, I became aware of just how limited I was. Culturally, I was a babe. As I fumbled my way through exchanges, I was met with kindness—not unlike that accorded a young child. I became aware of my vulnerability. I could not command adult authority, could not understand much of what was said or happening around me. I became concerned about being cheated, duped, misled, and worried that if I were to find myself in a disagreement or a situation of some personal consequence, I was a toddler. I discovered what it is like to be a “foreigner,” a migrant, an immigrant, an “alien”. Powerless.

Writing

Now consider writing. The sound-stream of words is frozen into visual images no less absurd than the sounds themselves. Time is stopped. The author is disembodied, a non-present presence made manifest. The words of those not only absent but long-ago dead can now be conjured, speaking to those now-present and to those yet-to-be-born. Immortality. Unimagined places, ideas, experiences, lives, and so much more, whispered to us across the chasms of time, space, and death. Writing is magic. And those who can read and those who can write—conjurers. No wonder the written word has oft been considered sacred.

So, dear reader, as you read this you bring me to life in your life. Words and writing, conjuring, allowing us to share these personal, private moments together. You—alive. Me—who knows?  Abracadabra!

Holey-Moley—Pointless Absurdity!

“What is the meaning of life?
Golf.”
“Golf—That’s absurd!”
“Precisely.”

Golf

Golf is the most gravely serious of sports. Tournaments occur in very exclusive private clubs typically off-limits to the general public, attire is country-club conservative, caddies carry the golfer’s clubs and serve up the desired club for each shot, the crowds are silent during play, commentators offer their remarks in whispered voices, and players adhere to strict codes of honor and etiquette.1 Each golf course is unique, specially designed with obstacles of various types to challenge the golfer: narrow fairways, strategically placed bunkers and sand traps, trees and water hazards, fringes of tall grasses and plants to make it “rough” on a player who strays there, and tilted, undulating putting greens with difficult cup placements that prevent simple straight-line putts to the hole.

Miniature

Golf is a sport ripe for parody. And miniature golf delivers, upending the serious solemnity of golf. It is played on very short holes, requiring only a putter, using a colored ball, and real-world course challenges are replaced by manufactured funhouse obstacles. You can putt your way through a Victorian house and partake of microbrewery refreshments while navigating holes based on a labyrinth board game and a duck shooting gallery; play in the basement of a funeral home with death-themed holes including a coffin, of course, and guillotine hazards; or stroke away on a glow-in-the-dark course with underwater themes of mermaids, sharks, and giant clams.

Holey Moley!

Rotating Hotdog
(with Mustard)

A good friend of mine who is a golf enthusiast told me about a series he watches, “Holey Moley.” It is a very bizarre show involving “golf.” Participants compete on a course with a variety of unusual “super-sized” obstacles similar to those found in miniature golf. What makes “Holey Moley” unique, however, is that the players must run a gauntlet of crazy obstacles after putting, such as crossing a narrow plank over a pool of water while jets of water are sprayed across the plank to knock the player off, or mounting a giant rotating plastic hot dog. All the while, commentators make jokes about the competitors and about each other, and poke fun at the entire enterprise. Fans are loud and raucous, players come from all walks of life, prizes include a dorky, checkered green jacket (a wink at the Masters Tournament’s Green Jacket), and sports celebrities encourage contestants by, among other things, playing a tuba. It is riotous fun.

Playful Upending

Humor, jokes, puns, satire, and parody derive their force by upending our assumptive frame of expectations. We are surprised, delighted and, perhaps, shocked because our anticipations about how things are, or should be, are overturned—in a playful way. The playful upending occurs because we are “just kidding,” “don’t really mean it,” which allows for transgressing without consequence.

Deep truths can be stated or hinted at while “just kidding.” Indeed, these playful romps are subversive, allowing us to explore alternative possibilities and truths, safely. Laughter is the joyful delight of “playing with the world.” Jokes, satire, and parody unmask the taken-for-granted. Humorists often are the ones who offer the most biting and penetrating observations of contemporary morals and mores, politics and religion. They are our modern-day tricksters and fools who, throughout the ages, have poked fun at kings and princes, mocked the sacred and the cherished, sneered at tradition and convention.

“Holey Moley” is a parody of miniature golf, which is, itself, a parody, multiplying the absurdity quotient and leaving in its frothing wake a wink at the absurdity of golf itself: “All this serious effort just to hit a ball into a hole! Why bother playing golf—such a pointless enterprise?!”

Pointless Absurdity and Despair

The absurdist mask and garb of the trickster and clown can be frightening if you fail to see the humor; that is, if you see the reality the mask is unveiling—that the assumptive ground beneath our feet is unstable.2 Life is golf writ large: “All this serious effort to what end? Why bother playing life—such a pointless enterprise?!”

Many of us have experienced bouts of pointless absurdity, the sense that it is all a silly waste of time. Although this experience is quite common and usually brief, it can become, in the extreme, a heavy, pressing weight of angst, depression, and despair. Ironically, there also is an appeal to this stance toward life—it affords a measure of detachment from the injuries, cares, turmoil, and disappointments of life. “If it doesn’t matter, it can’t hurt.” “If I don’t try, I can’t fail.” “If I don’t love, I won’t grieve.”

Play The Game

Golf affords a path out of this existential despair. In my experience, those who are most likely to say “All this serious pursuit just to hit a ball into a hole! Why bother playing golf—such a pointless enterprise!?” are those who do not play golf. That is because once you play the game and open yourself to the divots, roughs, and hazards, you also experience the joys of the game: the satisfying smack of the club-to-ball, the shots out-of-the-rough-onto-the-green, the putts-into-the-hole.3

“Play the game” is the answer to the question, “Why bother to play the game?”—whether the game is golf or life. The challenge, of course, is why, and how, should you begin to play a game that is obviously pointless? This self-perpetuating spiral offers no apparent exit. But if you have no games in your life, no loves, no joys of even the smallest kind, this, too, is a self-perpetuating spiral—of despair.

Overcoming pointless absurdity is the same in both golf and life. The first step is the biggest—getting up off the couch; a willingness to “give it a try.” It is important to seek momentary joys, not life-changing transformations; to take pleasure in the simple act of taking action, not criticize all efforts as failures; to embrace divots and flubs as part of the game, not proof of inadequacy; and to find an understanding and knowledgeable coach who can teach how to “get a grip,” “take a stance,” and “get into the swing” in ways that increase the joys of playing.4

The answer to pointless absurdity—and despair: “Tee it up.”

.

.

.

Creative Genius

Leonardo. Michelangelo. Newton. Mozart. Einstein. They have altered the shape of Western civilization; some of them we know by their first names though they have been dead for over 500 years. We venerate them, hold them in awe—we have even pickled Einstein’s brain in the hope of someday grasping the source of his brilliance. Their contributions are so extraordinary that when we hear the term “creative genius,” our first association is likely to be one of them. Indeed, one of the synonyms for genius listed in the thesaurus is “Einstein.”

The term “creative genius” is freighted with intimations of the divine. Creation is a sacred act. All religions have creation stories that explain the miracle of our existence and answer the most haunting existential question: “Why is there something rather than nothing?” Something appears, inexplicably, beyond our imagination, and we are left only to wonder. So, too, do we wonder about the unfathomable acts of creation by people like Leonardo and Einstein. The word “genius” also intimates wonderment and suggests supernatural origins: the Latin origin of the word refers to an “inborn deity or spirit that guides a person through life.”1 And we do confer a quasi-deity status on them.

This deification, however, misleads. Although, indeed, remarkable, these individuals could not have conjured their creations without the benefit of being in a very special place and time, afforded opportunities, and being embedded in a deep and extensive socio-cultural network that not only gives rise to the occasion for their creative acts, but also provides the means for their execution.

Newton’s Law of Gravity

Newton is reported to have discovered the law of gravity after observing an apple fall from a tree. Miraculous, indeed. But if Newton had been an Eskimo, I can confidently say: no Newtonian discovery. Same for Mozart and the rest of the pantheon of creative geniuses. Not simply because the Arctic lacks apple trees or pianos, and certainly not because Eskimos lack the capacity for such genius. Consider Newton’s law of gravity—F = Gm₁m₂/r²—with its bizarre jumble of letters and numbers. What the hell does this mean? You need many years of very intense, specialized schooling to grasp its meaning. Or take a score of Mozart’s music. Again, impenetrable, except with years of specialized study.

Each of these geniuses profited from the good fortune of opportunity and circumstance. Newton was educated—an extremely rare and privileged opportunity—and was a professor at Cambridge University in the 17th century. At the time, England was the driving force of the scientific revolution, and Cambridge was the white-hot center of English science. A similar pattern of good fortune and opportunity is found for the rest of the quintet of geniuses.

We venerate these individuals but fail to venerate the remarkable cultural networks, contexts, and opportunities that provided the soil for their individual genius to arise and flourish.

These notable individuals are not apart from the rest of us, but are the apex of the creativity that is our human birthright. Our survival as a species depends on our inborn capacity to create and share ingenious discoveries that are drawn from and, in return, profit the larger community. Genetic mutation enables biological adaptation, but the process is very slow; many generations long. The creation of new, adaptive innovations, in contrast, can occur in leaps and gain widespread use within a generation. Human adaptation occurs primarily through memes, not genes.

Our lives are densely packed with the gifts of the long history of human ingenuity, almost all by nameless individuals. The simple screw affords an example. Who invented the screw? We don’t know. When was it invented? Again, not sure, maybe ancient Greece, maybe ancient Egypt. Our world is literally held together by screws; by this original innovation created by unknown, ingenious individuals passed to us from long ago.

What is true for screws holds for sewing needles, bread, chocolate, and most of the rest of what composes our modern lives—all products of nameless creative geniuses. We are part of this ongoing history, making our contributions, some far-reaching, most more local. We each contribute to the network in our own, often humble way. It takes a village for a Newton to thrive. For a screw to be invented. And to create a village.

Robots, Hives and Heroes

Robots

We are now able to engineer machines that perform feats which, only a few short years ago, were thought to be very distant possibilities in an imagined future. Self-driving vehicles, medical advances that outstrip the diagnostic abilities of the most able and experienced physicians, and robots capable of accomplishing tasks of great complexity are some examples. These futuristic achievements resulted from a breakthrough in how computers are programmed to perform tasks.

Computers are programmed using algorithms, which are simply formulas that define the organization and order for systematically performing the operations needed to execute a task. These, of course, can be very complex as the tasks become more complex, but they are typically also rigid; once programmed, the sequencing cannot be changed unless the programmers make modifications. The learning—altering the organization based on feedback from the results—is done by the programmers.

All this changed with the advent of “machine intelligence”, where learning occurs within the machine itself. The algorithms responsible for machine learning are not completely rigid. They can self-modify based on the results of their actions. Exposure to many, many situations creates many, many different outcomes that provide feedback, generating iterative adjustments (learning) that refine and perfect performance. Machines become experts, capable of discriminations and decisions that can surpass the best human experts.1
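This feedback loop can be sketched in a few lines of code. What follows is my own minimal illustration, not drawn from any particular system: a single artificial “on-off switch” (a perceptron, the simplest unit of machine learning) whose connection weights are nudged whenever its response differs from the feedback it receives, until its responses match the examples it is shown. All the names here (learn, predict, AND) are my own inventions for the sketch.

```python
def predict(weights, bias, inputs):
    """Fire (1) if the weighted sum crosses the threshold, else stay off (0)."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > 0 else 0

def learn(examples, rate=0.1, epochs=50):
    """Iteratively adjust weights whenever the response differs from feedback."""
    weights = [0.0] * len(examples[0][0])
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            error = target - predict(weights, bias, inputs)
            # The feedback ("error") drives the iterative adjustment:
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# Teach the switch the logical AND function from examples alone:
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = learn(AND)
print([predict(w, b, x) for x, _ in AND])  # → [0, 0, 0, 1]
```

No programmer specifies the rule; the unit arrives at it by repeated exposure and correction, which is the whole point of the passage above.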

Brain

The human brain is the model for how machine learning is programmed. The brain is composed of billions of neurons knitted together in complex networks. Each neuron operates like an on-off switch; it is either “on”—firing an electrical impulse—or “off”. A neuron fires when its electrical potential reaches a critical threshold, sending a signal across the gaps that connect it to other neurons. Each firing becomes a link in a neural pathway that is part of an incomprehensibly vast web of networks. The networks are constantly changing as circumstances change. Habits are established neural pathways, activated when we confront familiar circumstances. Learning occurs when feedback from a familiar situation differs sufficiently from what was expected, prompting alteration of the response, changing the electrical potentials between neurons, and thus changing the neural networks.

Machine learning relies on silicon, rather than neural, on-off switches, and its networks are very simple, not infinitely complex, but in both cases feedback changes the firing potentials between switches, which alters the networks, which alters the responses.2 Simple, individual components capable of only the most elementary and inflexible on-off responses, when combined into complex networks of coordinated action, give rise to a system capable of solving impossibly complex tasks, and of self-correcting as it goes. Thus occurs a promethean leap from silicon and neurons to intelligence and mind.3

Hive

What is this? It is not a sand hill. It is a termite mound. It is also housing for, and integral to, a mind; a hive mind. Each individual termite can perform simple functions, certainly more complex than an on-off switch, but quite limited in flexibility and function. The biology of the termite (reflexes, nervous system, exoskeleton, etc.) constrains the scope and functioning of individuals but, most importantly, also encompasses the ability to communicate and cooperate with other termites. This is a critical component of survival, for like the on-off switches of computers and brains, individuals become part of networks of collaborative action, which give rise to a hive mind.

This mind is capable of intelligent actions, as evidenced in the termite mound itself. The structure is among the largest constructed by any non-human species. It acts as a huge lung, allowing the entire colony to inhale oxygen and exhale carbon dioxide; houses underground cultivated gardens and specialized chambers; and is continually altered in response to changes in weather and humidity to keep a constant, livable environment for the inhabitants. Individual termites are incapable of learning and lack memory. A hive is capable of both, in very complex ways: foraging widely for food and bringing it back to the hive, adjusting to changes in the environment, developing creative solutions to the problems encountered.4 The hive is made possible by the individual’s biological capacity to establish collaborative networks. The survival of the individual depends on the survival of the hive.

Humans

What is this? It is not a metal and glass hill. It is a human mound. It is also housing for, and integral to, a mind; a hive mind. Each individual is certainly more complex than an on-off switch or a termite. Each possesses a mind capable of intelligent, creative actions and adaptive responses. Despite this individual sophistication, however, humans cannot survive independent of the hive.5 Biology (reflexes, nervous system, endoskeleton, etc.) constrains the scope and adaptability of individuals but, most importantly, also encompasses the ability to communicate and cooperate with others. This is critical for survival, and, like the on-off "switches" of computers and brains and the biology of termites, it allows individuals to become part of networks of collaborative action that give rise to a hive mind. The survival of the individual is dependent on the survival of the hive. One becomes the many. The many protect the one.

The Screw

The hive mind, that is, the collective capacity to understand and undertake projects that allow the human hive to adapt to demands and changing conditions and help ensure the welfare of the collective, is capable of feats beyond what any one of us could possibly conceive or execute. These workings also typically remain hidden from view, in the background, as we attend to the foreground that preoccupies our daily lives. We drive to the market, unaware and unappreciative that every single act along the way is made possible through the hive mind.6 What single individual could build a car from scratch, "scratch" here meaning producing even the simple screw needed for the task? Indeed, the human hive mind not only encompasses the hum and buzz of the living, but also resonates with the deeper register of the hum and buzz of the long past: those who learned to make metal from dirt, the physics of the screw, and the machine tools to make a screw, for example.

The Heroic Individual

We Americans are especially blind to the humming significance of the hive mind, as our model of the heroic individual pervades all aspects of our lives, from economics, to politics, to psychotherapy. Certainly, individual initiative, determination, intelligence, and adaptability are important attributes that can contribute to our individual accomplishments and fate. Often, however, the model also assumes that the individual is pitted against the world, the collective "they"; that our fate is totally in our hands and we are solely responsible for our success or failure; and that the collective is a barrier to achieving success.7

The Heroic Ones and the Many

Rescue

Crises that threaten the hive, such as pandemics, most forcefully reveal the limitations of the individual, however able, to survive on their own. Our collective welfare and survival, and our individual welfare and survival, are inseparable. And the most heroic individuals are those who are ready to sacrifice their welfare, even their lives, for the collective: health care workers, police, and firefighters, to name but a few. We use the term "heroic" only for those who sacrifice themselves for the greater good. We understand, at a primitive level, that sacrifice that benefits only ourselves is not heroic. It may be admirable, encompassing individual pluck and initiative, but it is not "heroic". One becomes the many. The many protect the one. The heroic ones protect the many.


© 2022 December Songs
