Reflection 60: The (Not So) Mysterious Absence of Public Role Models

If we hope to craft more effective change strategies, we need to come to grips with the dynamism of the predominant culture. A marvelously intricate and evolving system, it perpetuates and entrenches itself in so many ways.

Some of these processes are obvious: The aggressive, bullying, and self-aggrandizing attitudes and behaviors that pervade our culture. But many others are hidden and subtle, and we need to come to grips with these processes as well. Why? Because, when we fail to do so, they operate unseen and without restraint in our lives, defeating by indirection our efforts to create a more decent life and world.

A number of these phenomena are discussed in earlier Reflections: #8, Why We Aren’t Good Students; Why It Matters (the decline of critical thinking); #22, Consumerism – and the Passivity it Breeds; #29, Losing Our Communal Roots; #31, Perfectionism; and #51, Monumental Self-Absorption (our culturally distorted view of history).

In this Reflection, I discuss another of these processes: The ways in which we are deprived of public role models to guide and inspire us. In this area, as in so many others, there are multiple, mutually reinforcing cultural forces that lead to this result. Key aspects of this phenomenon are discussed below.

  1. Disqualifying potential leaders and role models.

This process flows directly out of the fact that we live in a culture permeated by a competitive, win/lose mindset: If someone else is up, I must be down.

Because we habitually view the world from this perspective – because we are in competition with everyone else – we reflexively judge others, looking for weaknesses and shortcomings. See Reflection 16, Mainstream Thinking – The Tyranny of Opinion and Judgment. As a result, we are experts, not at identifying and nurturing leaders, but at tearing them down.

When a person emerges as a potential leader, the mainstream media’s coverage is not saturated with stories that explore his or her strengths. Instead the hunt is on for disqualifying flaws and “gotcha” moments: Sarah Palin’s “I can see Russia from my house;” Howard Dean’s scream; Bill Clinton’s sex life; the “you’re no Jack Kennedy” rebuke that deflated Dan Quayle; Gary Hart’s illicit romp on the Monkey Business; Edmund Muskie’s tears in the snows of New Hampshire; and so on.

The result of this process is a debasement of the entire process of finding leaders and role models. Many of our best people avoid the public arena entirely. And those who don’t – and survive this cultural witch hunt – are, typically, cautious and deeply conventional people who have long since learned to hide, rather than share, their true humanity; hardly the sort of people who are capable of leading and inspiring by their example.

  2. Our confused understanding of the leaders we do have.

A second reason for the absence of inspiring role models lies in our confusion about the qualities we are looking for. We may think that we are seeking wise and decent leaders, but the truth is far more complicated. Over the last 40 years, a number of Presidents – Gerald Ford, Jimmy Carter, and George H.W. Bush, for example – were seemingly decent men attempting to make thoughtful and responsible decisions.

The fact that Ford, Carter, and the elder Bush each failed to get re-elected is not, it seems to me, a coincidence. Why? Because, in a culture that puts its highest priority on winning, moderation, reflection, and decency are associated with weakness and the lack of a killer instinct. The result? We have visceral doubts about leaders who exhibit these qualities.

Note, importantly, that the need to feel like a winner – and, with it, the tendency to associate decency with weakness – deeply infiltrates the worldview even of people who view themselves as progressive. It is not just conservatives who view Jimmy Carter as a failed President. And the reason, I think, has less to do with what he did or didn’t do and more to do with the fact that he “lost.”

Progressives may say they want leaders and role models who transcend the mainstream culture’s values. But, then, they judge our leaders by the very win/lose values they purport to condemn. So, for example, Obama was negatively judged for persisting in his efforts to nurture a fruitful dialogue during the budget crises that have marked his years in office. Why? Because he failed to dominate, control, and “win.” And yet – granting that the compromises he agreed to had real consequences – isn’t the pursuit of a civil dialogue as important as, or more important than, Congress’ vote on the issue du jour?

Progressives seemed far more comfortable with Bill Clinton, who “won” by triangulating the opposition – code for embracing the dismantling of the welfare system and financial deregulation. Thus, while he may have given away the store substantively, he allowed mainstream progressives to feel like “winners” in their competition against the right.

  3. Domesticating and marginalizing our heroes.

When a leader who is the real deal does actually emerge, the mainstream culture’s first line of defense is the tearing down process described above. But when that fails, a more subtle process takes hold. The leader is “embraced” by the mainstream culture but is, in the process, transformed into a pale, domesticated version of himself. Over time, as increasingly mainstream stories are told and re-told about him, he is absorbed into a larger cultural narrative that supports and reinforces the very mainstream ways of operating he worked so hard to change.

The most vivid, recent example is Martin Luther King. Here is a man who was committed to fundamental change. He fought against inequity and injustice wherever he saw it; fearlessly risking his life and freedom for the cause; dying as he lived, working to bring economic justice to Memphis’ sanitation workers. His activism, tireless organizing, and nonviolent tactics offered a vivid roadmap for more effectively confronting entrenched privilege and power.

But, now, 40 years after his death, we are left with a safely domesticated, hollowed out version of the man. In our collective, mainstream memory he is remembered, and celebrated, as the leader of the movement – now a fading historical artifact – to end de jure segregation in the South.

De-emphasized to the point of invisibility are the broader, more enduring aspects of his legacy: His campaigns against systemic racism, economic injustice, and American imperialism, as well as his legacy of activism, organizing, and nonviolent confrontation. In other words, the culture has obscured the very things that could make him a vital role model for those of us who long to create a better world.

Historically, the most significant example of this domestication process is Jesus. In The First Coming, a book that exhaustively teases out the known details of his life, the philosopher Thomas Sheehan describes a man who was wholly committed to challenging power and fundamentally changing the world in which he lived. But Sheehan then describes a process that, within 60 years of his death, relegated his radical “here and now” vision to the relative margins of the movement created in his name.

In Sheehan’s telling, as each gospel was written, Jesus was progressively transformed into a messiah who, instead of challenging us to create God’s kingdom in this world, promised salvation in the next. And so, for the last two thousand years, his presence in our lives as a role model for activism and change has been largely superseded by the vision of a transcendent, otherworldly messiah who, solely by his grace, bestows salvation; a vision that – not at all accidentally – condones and encourages passivity in the face of systemic injustice.


Radical Decency offers a roadmap that, by counteracting the processes described above, can support us in naming and reclaiming our role models and heroes.

It supports us in viewing others with respect, understanding, and empathy. And, as that mindset becomes habitual, we will become far more curious about what our leaders have to offer and far less willing to engage in the mainstream culture’s “gotcha” game of judgment and dismissal.

In addition, our ability to identify worthy leaders will increase as we evaluate them according to Radical Decency’s values, asking over and over: Are they actively looking for ways to be decent to themselves, others, and the world? Doing so, we will be much less susceptible to seduction by leaders who “talk the talk” but, then, compromise their goals – and ours – in order to provide the mainstream drug of “winning.”

Finally, Radical Decency will support us in the continuing effort to reclaim the public stories of Jesus, Martin Luther King, and other authentic heroes, past and present; infusing them with the vision, activism, commitment, and fearlessness that made them great; reclaiming them as teachers and vital sources of inspiration.

Reflection 58: Infiltration and Co-Optation — The Disease That Ails Us

One of our biggest challenges, as we seek to craft more effective strategies for living more decently, is to understand the precise nature of the problem that makes this seemingly straightforward goal so difficult. For starters, we need to understand that compete and win, dominate and control – the values that are so wildly over emphasized in our culture and so frequently referred to in these Reflections – are not the fundamental problem.

To the contrary, properly managed, these qualities are helpful aspects of our overall human arsenal. In appropriate situations, a competitive spirit sharpens our wits, motivates us to higher levels of performance, and creates an intimate bond with co-competitors. And far from being wrong, lying to a would-be rapist or the Gestapo – control by deception – is an invaluable skill. See Reflection # 30, In Defense of Our Troubling Values.

In a similar way, focusing our reform energy on specific attributes of the culture also misses the mark. Efforts to reform the financial system or clean up the environment – while vitally important – will never lead to a fundamental alteration in the ways in which we live.

Instead, the last 40 years have taught us that, for example, if we limit the flow of money in one area of the political process, it will almost immediately be redirected into other channels, defeating efforts at campaign finance reform. And if an impeccably humanistic education became the official norm – and nothing else changed – the great bulk of us would simply tolerate this impractical, airy-fairy curriculum, finding other venues in which to focus on the art of competing and winning.

So if the fundamental issue isn’t specific aspects of the culture or the values it promotes, what is the crux of the problem? It is the process by which these values infiltrate into virtually every area of our lives. This process is like a giant, voracious amoeba that, silently and unseen, oozes into – and co-opts to its competitive, acquisitive outlook – virtually every institution, movement, relationship, and way of operating in the world.


This Reflection offers examples of how deeply this process infects two of our most private and seemingly benign human activities: Humor and reason. By focusing on these less obvious examples, I hope to persuasively illustrate how shockingly deep and widespread this phenomenon really is.

Doing so, I am not suggesting that humor and reason are bad. To the contrary, logical thought, and the ideas and theories it fosters, are indispensable tools as we seek to create better lives and a better world. See Reflection 21, Theory Matters. And humor, done well, can offer highly effective, cut-to-the-bone social commentary (as well as good fun!). But because humor and reason are such critical tools in our effort to make things different and better, we need to be especially alert to the mainstream culture’s remarkable ability to twist – even them – into mechanisms that perpetuate and expand its vise-like grip on our lives.

  1. Humor

Jokes, quick quips, irony, and sarcasm are deeply woven into the fabric of our lives. The little jolt of pleasure that a funny remark provokes is a constant, very welcome companion as we tend to our day-by-day chores. But if we hope to be a force for change, we cannot uncritically give ourselves over to our instinct for teasing and sarcasm. Why? Because of the (largely unacknowledged) role humor plays in reinforcing and perpetuating the mainstream culture’s dominant values.

Anger is an integral part of our fight or flight brain and is specifically designed to overpower someone else’s will. Given the culture’s emphasis on domination and control, it is no surprise that anger and aggression are endemic. But explicit anger risks unwanted consequences: Alienation of an important person, social stigmatization and, of course, retaliation.

So one of humor’s unstated but very important roles is to offer an acceptable social cover for anger. A joke can be utterly benign – even warm and loving. But the same joke, told with different intent and timing, can also be a searing putdown.

In this way, humor provides a double cloak of non-accountability for anger. First, it is often difficult to gauge the joke teller’s intent. Is this a manipulative act of aggression? It certainly feels that way, but how can I be sure? In addition, even when the intent is clear, effective counter-measures are almost impossible. Making the effort, the victim is likely to be greeted with one of these all too familiar, accountability-denying responses: “Just kidding!” or “What’s the matter, can’t you take a joke?”

Humor is also a very important bullying tactic in the context of a debate or dialogue. When I was a practicing lawyer, a smart aphorism I frequently heard was this: “The first person to get angry, loses.” So a very common, but unacknowledged, tactic of a smart attorney is to needle an opponent into an angry outburst that makes the other participants uncomfortable.

And, of course, when humor is employed as a more direct mode of attack – as ridicule – it can be an enormously effective tool of domination and control. One dismissive comment, provided it is funny and well-timed, can be a devastatingly effective way of disqualifying the position of the person on the receiving end.

This phenomenon may seem relatively benign, but it isn’t. We are a culture that has largely lost its ability to engage in civil dialogue; one that acknowledges and respects difference and looks for common ground. So if we are serious about counteracting the massive infiltration of the mainstream culture’s values into our lives, we cannot engage in indecent humor just because we enjoy its emotional “hit” and are susceptible to its disarming charm.

  2. Reason

Many of us think of reason as an unalloyed good. While our emotions often seem unreliable and potentially damaging, we view our ability to think calmly and logically as a mature and stabilizing force.

The problem with this view is that it ignores the reality of our biology. Our emotional brain is, actually, far more powerful than our thinking brain. In fact, all data initially enters our brain through its emotional side. Why? So that before anything else happens we can determine whether something is highly pleasurable – to be pursued – or dangerous – thereby triggering our fight or flight system. Only then does the data migrate into our thinking/reasoning brain.

Thus, while the mainstream view is that the rational brain limits and controls the emotional brain, the opposite is closer to the truth. It is the emotional brain that, far more typically, harnesses the thinking brain to its purposes.

As Jonathan Haidt describes it, our thinking brain is predominantly a lawyer, advocating for the things our emotional brain impels us toward. And, as Edward O. Wilson notes, “we make decisions for reasons we often sense only vaguely, and seldom if ever understand fully.”

Because we trust our reasoning abilities as cool and objective – when, in fact, they are anything but – they are ripe for infiltration and co-optation by the culture’s mainstream values. All too often, we weave webs of logic that are, unknown to our thinking brain, a cover for emotional drives that are – given the culture we live in – aggressive, controlling, and manipulative.

In this chilling quote, the psychologist and social theorist Jordan Peterson describes the deadly extremes to which this process can go:

I understand and having understood, I impose order on reality. That’s what every ideologue and utopian does. It’s convincing and, I think, the reason people do this is partly because they want an explanation for their being. More important than that, however, is that they want a mask that covers up their tendency to atrocity with the appearance of virtue. Most utopian thinking is of that sort even though the mask can be very well argued.

The consequences of this process can wreak havoc in our lives, at both a personal and political level. Operating unseen and unacknowledged, this process has led, over and over, to murderous rampages by political and religious zealots. Equally, it has more quietly shredded one intimate relationship after another as the parties battle about who is “right,” certain that their problems would be solved – if only the other person could understand.


If we hope to create better lives and a better world, the fullest possible understanding of this process of infiltration and co-optation is vitally important. Why? Because, failing to understand its breadth and depth, we will never be able to craft strategies that are equal to the challenge we face.

Absent this understanding, the best of us – those who actually care – will continue to be channeled into activities that seek to soften our indecent system’s excesses: Elections, legislation, lawsuits and, of course, a myriad of (shamefully underfunded) services to the culture’s endless victims. And with our good energy and attention diverted away from the disease that really ails us, the mainstream culture’s headlong pursuit of private wealth and power will continue unabated.

Radical Decency, by offering an alternative set of values – applicable in all areas of living – offers a way to deal with this core issue. It is not designed to supplant the very useful, but more limited, reform efforts that are our current focus. Instead, it offers a more comprehensive context in which each of these activities can be pursued.

In this way, the good people who promote current reform efforts can expand their potential impact and, crucially, understand how deeply interrelated and mutually reinforcing their seemingly separate pursuits really are. Then, hopefully, they can be knitted together into a unified and far more effective movement for change. See Reflection # 45, Re-visioning Social Change Work, and Reflection 56, Religion – Debasement, Inspiration, Lessons Learned.

Reflection 56: Religion: Debasement, Inspiration, Lessons Learned

The philosopher Charles Taylor provided this insight that has deeply affected my view of the world: Just because we are continually confronted with debased versions of an idea doesn’t mean the idea itself is necessarily debased. It may be but, then again, it may not. As I look back on my personal journey with religion, this concept seems particularly apt: A rich mix of debasement and inspiration.

In this Reflection, I offer my experiences with this compelling area of living and seek to draw some lessons about how religion can be more effectively translated into a force for positive change.


The son of secular parents, a Protestant and a Jew, I grew up indifferently associated with the First Congregational Church of Scarsdale, New York. One clear memory from those years is leaving services with this thought: They told me to love my neighbor. But it’s now 11:30 a.m. on Sunday and I won’t get another word of guidance until next Sunday at 10 a.m. So what am I supposed to do?

Another memory: A “charming” anecdote about the minister who, in response to a prospective member’s concern about hypocrites in the congregation, responded by saying, “we can always use another.” No inspiration there – for an earnest teenager.

With this tepid introduction, I have, as an adult, strived to maintain openness and curiosity about religion. After all, billions of people across thousands of years have been deeply attached to it. Who am I to dismiss it? However, I have continually been brought up short by the staggeringly debased versions I see all around me.

An obvious example is religion’s lethality. When Moses discovered the Hebrews worshipping a golden calf, he had 3,000 of his people massacred (Exodus 32:28). And their triumphal entry into the holy land was an unprovoked attack on a people whose cardinal sin was worshipping gods other than Yahweh.

Then there is the last 2,000 years of history, a period riddled with Christian, Islamic, and other religiously motivated crusades, jihads, wars of aggression, and massacres. And the religious carnage continues: Jews and Muslims killing each other in the Middle East; Protestants and Catholics in Northern Ireland; Hindus and Muslims in Kashmir.

This murderous aspect of so many religions is not some weird coincidence. One of the prime lessons of history is that entrenched power co-opts movements that have the ability to move people and, thus, to challenge its authority. So, it is utterly predictable that the great religious traditions, whatever their original intent, have been repeatedly co-opted; enlisted as apologists for those in power. In this domesticated state, their prime function – the rationale for their privileged existence – is the “divinely inspired” moral rationale they provide for the ruling class’ relentless push for more and more power, by whatever means necessary.

This co-opted version of religion is how I remember the Church of my childhood: Holding its expressed values lightly; soft-soaping – with an easy quip, as above – hypocrisy and other deeply consequential moral issues; sending the message, in large ways and small, that wealth and power excuse all but the most aberrant and blatant ethical lapses; offering programs and messages that felt good but made no uncomfortable demands. So too, in the Jewish world – my religious community of choice for the last 40 years – where we lavish praise on the biggest donors, quietly overlooking the problematic choices that, in so many instances, allowed their outsized private fortunes to accumulate.


Another area where religion’s message is endemically debased is in the intellectual sphere. As Howard Lesnick points out in Listening for God, religious stories are meant to inspire. At their best, they are poetry, touching our hearts in ways that a carefully reasoned ethical treatise never can.

But when the intent of these religious texts is misunderstood, the damage is incalculable: Condemning birth control as our population approaches 7 billion; denying social and, often, political legitimacy to dissenters and nonbelievers; teaching young people that masturbation, sexual fantasies and premarital sex are sinful; provoking murderous attacks on Shi’ite neighbors, abortion doctors, and so many other demonized individuals and groups.

Much of this intellectual confusion results from religion’s excessive pre-occupation with speculative thinking, ungrounded in empirical evidence. “Miracles happen.” “We can speak with God or commune with the one-ness of the universe through prayer, meditation, or altered states of consciousness.” “Ours is the path to everlasting life.”

There is nothing wrong with this sort of thinking. To the contrary, for a self-conscious species, speculation beyond the four walls of our perceptual capacities allows us to more fully explore our potential. But our mainstream religious traditions have extended this sort of thinking far beyond its appropriate boundaries. Far too often, it has become a replacement for critical thinking instead of an important complement to it.

The result? Far too many of us slip into a place of conformance with one set of spiritual beliefs or another. And, with our ideas continually reinforced by co-believers, we wind up believing that we have found the ultimate answer. Then, preoccupied and distracted by our chosen sect’s answers, we fail to adequately focus on life’s most important questions:

  • Who are we? What are our capabilities and limitations?
  • What choices can we make that will allow us to live more nourishing lives and contribute to a better world?

For compelling evidence of this process at work, one need only look at the dismal state of our efforts to change our habitually indecent ways of living. Is there any doubt that religion, in this debased form, plays a key role?


On the other hand . . .

Religious rituals, as I have experienced them in my affluent suburban community, have always seemed mechanical and uninspired. But, then, my wife and I attended 6 a.m. mass in a one-room, cinder block church in one of the poorest neighborhoods in Port-au-Prince, Haiti. Watching the nuns and lay Catholic workers take communion before they left for their work at a nearby orphanage, the idea of taking in the blood and body of Christ suddenly seemed powerful, real, and inspirational. And I couldn’t help but notice that most of our fellow service workers were religious, either Catholic or evangelical Christian.

Several years before that, I was a key attorney in the bankruptcy of a $500 million Ponzi scheme that began in the evangelical community and, ultimately, swallowed up a significant number of secular nonprofit organizations as well. The fuzzy religious thinking I described earlier fueled the scheme. Believing in miracles – that 2 plus 2 could equal 5, if God willed it – many Evangelical groups were particularly susceptible to the “too good to be true” scheme that the promoter, speaking their language, proposed.

But what was remarkable was the response of my evangelical clients. Two days after the bankruptcy filing, Steve Douglas of Campus Crusade for Christ convened 50 of his community’s leaders and, quoting principles taken from scripture, proposed a cooperative approach to the workout.

Then, over the next 4 years, a coalition of 800 Evangelical groups did something truly unique in the bankruptcy world. Pouring their time, money, and inspirational leadership into the effort, they crafted a plan that was premised, not on everyone grabbing what they could, but on fairness. The “winners” (those who took more out of the Ponzi scheme than they put in) voluntarily returned a percentage of their winnings; the losers divided the resulting pool of money equally; and small, endangered nonprofits were able to file for hardship exceptions.

Then, finally, there is the example of my half-sister, Judy, and Delle McCormick.

Judy, 10 years my senior, became a nun while I was still in junior high school. I didn’t understand the choice at the time. But over the years I have been struck by her clarity of purpose, devotion to service, and ease and zest in living.

Delle is a woman I met on a service trip about 10 years ago. Inspired by her faith, she left a comfortable suburban life to devote herself to social justice work. She too is suffused with clarity of purpose and a passionate sense of mission.

By their example, Judy and Delle have deeply affected my outlook and choices. The fact that they were both inspired by their religious beliefs is, I believe, no coincidence.

Lessons Learned

I draw two primary lessons from my journey with religion.

The first is positive. At its aspirational best, religion aims high, seeking to make sense out of our existence.

Focused on this really big issue, it has produced great wisdom and inspiring role models. Moreover, the language, rituals, and traditions that are deeply interwoven into our religious traditions offer enormous comfort and inspiration. If we turn our backs on this legacy we will be immeasurably diminished.

Radical Decency, with its focus on respect, understanding, empathy, acceptance, and appreciation, guides us away from dismissive judgment and toward a deep and abiding curiosity. As I see it, we are far better served if we view our religious traditions through this lens; gleaning the best, not just from our own tradition but from other traditions as well.

A recent conversation with a Catholic brother illustrates the rewards of this approach. Visiting a disturbed young man at his home in the middle of a workday, the brother was asked how he could take the time out of his busy schedule. His response: My vows – poverty, chastity, and obedience – free me to tend to life’s truly important tasks.

Bringing Radical Decency’s attitude of openness and curiosity to our discussion, what flashed for me was how I, too, could find inspiration and wisdom in his vows. My version of “chastity” – a committed marriage – frees me from an excessive preoccupation with sex. And I can infuse the spirit of “poverty” into my life, not by giving my possessions away, but by turning more and more fully away from the (false) belief that my well-being depends upon them. Finally, if I am fully “obedient” to my core values – Radical Decency – I will be freed from the selfish and grasping values that dominate our culture and so powerfully distract me from my larger life goals.


The second lesson I draw from my journey with religion is cautionary. Even as they offer inspiration and wisdom, our religious traditions are – with depressing regularity – co-opted by those in power. Sometimes the examples are spectacularly obvious to all but the truest of believers. But far more often they are quite subtle and, for this reason, more insidious and pernicious. So, even as we embrace the nourishment and guidance religion can offer, we need – always – to be vigilant. We must never temporize on the crucial task of exploring the implications of “this attitude” or “that choice.”

Over the years, I have discussed Radical Decency with a significant number of religiously committed people, from a wide variety of traditions. And as these experiences have accumulated so too has my confidence that the philosophy can provide an important anchor in this vital process. Decency to self, others, and the world, at all times, in every context, and without exception – this approach to living distills, I believe, what is best in our religious traditions.

Fully committed to Radical Decency’s values, my hope is this: Each of us will embody the best in our chosen religious tradition and, crucially, be a clear voice, within that tradition, for resisting the ever present temptation to compromise these ideals for the sake of money, members, and power. Then (to complete my dream), these like-minded religious people, and their secular sisters and brothers – with a growing recognition of their common purposes – will knit together into a powerful, perhaps even irresistible force for creating better lives and a more humane and decent world.

One can only hope . . . and have faith.

Reflection 51: Monumental Self-Absorption

As we got acquainted with our Novgorod guide, during our trip to Russia a few years ago, she mentioned that she taught world history. Right away I knew what she meant. Her history course went all the way back to “the beginning,” to the “dawn of civilization” about 7,000 years ago. This is what “world history” meant when I was in high school in New York, in the 1960s, and what it means today, halfway around the world, two generations later.

Most of us never give this definition a second thought. But when we do, its weirdness is impossible to avoid. The “world” of which it purports to be a history has actually existed, not for 7,000 years, but for about 4 billion years. Moreover, we have existed as Homo sapiens for 300,000 years and as a distinct line of primates for another 6 million years. So even if we accept the idea that “world” history is legitimately limited to “human” history, the mainstream definition is still woefully incomplete, ignoring all but a small fraction of our species’ history.

What is going on?

As I see it, three fundamental factors are at work.


The first is fairly apparent, once you begin to reflect on the mainstream culture’s wildly distorted vision of world history: Our breathtaking self-absorption.

World history is about “us,” and us alone. Other species that coexist with us or preceded us – even the dinosaurs that dominated the world far longer than we have – are written out of world history. Equally absent, with the sweep of our conceptual pen, is any physical phenomenon that is not directly implicated in “our” dramas.

Moreover, the “us” we are talking about isn’t even all humans. History only begins when people like us first appeared: modern folks who live in sedentary communities, have a written language, and organize themselves in hierarchical/authoritarian patterns. Everyone who lived before then is consigned to “pre-history,” the implicit message being that – having nothing to teach us – these people can and should be ignored.

Notice, also, that world history is further limited to a very distinct subgroup within this already limited group. Virtually every society and ideology that earns history’s attention has one key element in common: Its ability to dominate large numbers of people during the time in which it is of historical interest. That is the common thread that draws into a coherent story characters as diverse as the Egyptian pharaohs, the ancient Greeks, Roman emperors, Christian and Muslim thinkers and rulers, Napoleon, the British Empire, Hitler, Stalin, and the United States.

In other words, history is about winners; the people who best exemplify a dominant culture in which competition, dominance and control are valued above all else. In this myopic view, everyone else is either a foil in the winners’ drama or a non-entity, literally ignored out of existence.


The second factor that the mainstream definition of world history highlights is the extent to which our extreme self-absorption goes unnoticed. How is it that so many teachers, students, textbook writers, and professional historians can so easily and comfortably accept such an obviously distorted definition of world history?

The answer is not stupidity. It lies instead in the fact that from birth – from all sides – and, literally, for millennia – we have been massively brainwashed to think in this way. And because we are continually bombarded with myopic, self-absorbed ways of thinking, we exist in a context in which our distorted definition of “world history” is commonplace – unremarkable and, thus, seldom noticed or commented upon.

Examples of this taken for granted self-absorption are everywhere. Serious historians, for example, continue to argue the merits of American exceptionalism; the view that our country is different and unique.

Really? Seriously? Exceptionalism has been the cry of every empire and petty despot since, well, the dawn of world history. In fact, the only thing that is exceptional about the claim of American exceptionalism is how truly unexceptional it is.

Similarly, every generation’s financial bubble, including the run-up in housing prices that preceded 2008’s financial meltdown, has been extolled as an exception to the hitherto normal rules of economics. Every 20 years or so, we are told – and millions believe – that our current investment strategies are somehow different and special.

Another rather stunning example is intelligent design; the idea that only a being with a brain like ours could have possibly created the world. Here again, massive self-absorption is at work.

Physicists, systems theorists, and students of ants have all persuasively demonstrated that intelligence need not be housed within a single skull.

In addition, contemporary neuroscientists, such as Daniel Siegel, point out that human intelligence does not arise out of a single brain in isolation, but instead results from the ongoing communion of one brain with others.

Nevertheless, intelligent design, in a classic example of blind egotism, simply asserts that “of course” our brain – that is, intelligence residing within a single human skull – is the highest expression of intelligence and, as such, is the only form of intelligence that could have possibly created such a complex universe.


The final factor that our weird definition of “world history” points to is the extent to which our massive self-absorption is viewed as someone else’s problem. So, in writing about the ego-centrism of intelligent design, I confidently imagine the head-nodding agreement of my more secular readers. And yet, how many of these readers fall into the equally myopic trap of dismissing non-scientific thought as something from a primitive and outmoded past; a past that has been thoroughly superseded by civilization’s “progress” to its current “superior” state?

As this example illustrates, while I may see – and judge – the myopia and self-absorption in your way of viewing the world, I seldom see it in mine. Thus:

  • Religious fundamentalists believe they have found the way – and reject any history that contradicts their sacred texts.
  • Secularists view pre-scientific thought as primitive and intellectually bankrupt.
  • My country/culture/sect is unique and special.
  • Women judge men as “less than” even as men judge women as overly emotional.
  • My school/job/neighborhood/car/handbag sets me apart.

The list is endless but the common thread is this: We – that is, I and people like me – are different and better.


One very fair response to this rant about self-absorption is to ask why it is so objectionable. Can’t a passionately partisan love of country – or group – or family be an effective and fulfilling approach to living? My answer is no.

While the immediate psychic pay-offs are real, these self-absorbed approaches to living are, in the end, self-defeating strategies. When primary loyalty is to a group, it too easily puts important areas of our psyche at risk, suppressing the nonconforming ideas, temperaments, emotions, and drives that inevitably exist within our endlessly complex psyches.

In addition, it ignores the fact that we humans are deeply creatures of habit. For this reason, a split approach to living – being judgmental and dismissive of “others,” even as we seek to create an island of empathy and understanding in our smaller, self-selected group – can never work. Inevitably, the attitudes we habitually practice, out there in the larger world, will infiltrate and infect the ways in which we deal with members of our group and, sadly, with our selves as well.

The proof? Living in a world where a split approach is the norm has produced just such a dismal outcome: A culture in which injustice and inequity – together with anxiety, depression, and a wide variety of other addictive and self-destructive behaviors – are rampant.


Radical Decency offers a more hopeful alternative, in two fundamental ways.

First, it is based on behaviors – being decent – and not a set of beliefs. As a result, it avoids the trap of confusing and compromising our vocation of decency with a priori notions about who we’re supposed to be.

In addition, it is inclusive. By challenging us to be just as attentive to others and the world as we are to our selves, it specifically excludes the possibility of privileging one group over another – of making “world history” only about “us.”

Reflection 42: It’s Not As Bad As You Think – It’s Worse

I thought that – after six plus decades of living, two careers, and innumerable experiences in both the public and private sectors – my capacity for surprise at the depth of our moral and intellectual corruption had played itself out. Then, I read Michael Lewis’ book, The Big Short: Inside the Doomsday Machine.

Lewis tells a story that we know all too well: The explosive growth and ultimate collapse of the subprime mortgage market in 2008. When I started the book, I was reasonably well informed about most of the moving parts that led to this historic economic debacle. I also had a sense of the greed, herd mentality, and short-term mindsets that fueled it.

But Lewis provides an unusually vivid and detailed roadmap for how it all worked and, equally, for the attitudes and taken-for-granted ways of operating that made it possible. Knowing in a general way that something is corrupt and unseemly is one thing. Getting a blow by blow description of the many, many wildly corrupt choices that so many made, at so many different levels, is quite another.

One of the story’s most powerful lessons is the sheer depth and virulence of the manipulation and self-aggrandizement that seemed to be the unquestioned mindset of virtually every participant. This was no benign financial bubble, where a product (“dot com” companies, silver, tulips in 17th century Holland) caught fire and had its price driven up by irrational optimism and the market’s herd mentality.

In this case, really clever people used their enormous economic power, and an unbounded lust for outsized profits, to create a highly suspect product: Subprime mortgages. They then transformed them, through financial sleight-of-hand, into seemingly blue chip investments – Triple A rated bonds – to be sold (and re-sold) to unsuspecting investors. The pay-off: Enormous fees for the corporate originators of the mortgages and bonds, and multi-million dollar salaries and bonuses.

Lewis’ story vividly illustrates the extent to which we have devolved into an atomistic, every person for himself society. The fate of our fellow citizens, the financial system, and the country – all of these are someone else’s problem. Indeed, even Lewis’ “heroes” – a handful of people who saw what was coming – were focused, not on its social consequences, but on how to “short” this ill conceived market in order to make their own financial killing (hence, the book’s title).


An aspect of Lewis’ narrative that graphically illustrates this larger pattern of pervasive, systemically engrained corruption involves the role of the rating agencies – Moody’s, Standard & Poor’s, and Fitch. These companies provide risk-assessing “grades” for bonds and other financial products. And their importance is unquestioned. Indeed, many pension funds and other investors are limited by law or internal guidelines to “safe” Triple A rated investments.

One glaring problem with the system – one no one hides – is the fact that the investment banks pay the rating agencies to grade their bonds. A reasonably intelligent investor would, you would think, be concerned that the agencies might go easy on the people who pay their bills. But as Lewis explains, the structural problems go much deeper. And this is where his story – here and elsewhere – becomes revelatory.

In many respects, the mistakes of the rating agencies and banks were identical. Wanting to flog the money machine – rather than slow it down – no one, seemingly, thought to systematically examine the underlying mortgages. Instead, the prevailing belief was that the bonds’ diversity (the mortgages were drawn from all over the country), an ever-rising housing market, and other macro factors ensured their safety.

Because their sole reason for being is to assess risk, you would think the rating agencies would have gone farther. And, in fact, they did have “secret” formulas for assessing each offering’s risk. But, as Lewis points out, rating agencies are populated with people who can’t get jobs at Goldman Sachs and the other, sexier banks and hedge funds; people who, in terms of intelligence and drive, are typically overmatched.

So when it came to ensuring the quality of the bonds backed by subprime mortgage pools, here is what we, the public, were left with: Secret formulas crafted by relative lightweights – whose dedication was already compromised by their firms’ dependence on fees received from the very companies whose products they were rating.

And – no surprise here – the financial heavyweights at the investment banks systematically gamed the rating agencies’ so-called secret formulas. They quickly figured out their weaknesses, exploiting them so that lousy products could still get a Triple A rating.

In other words, bald cheating was routine and, indeed, was seen as smart, aggressive business. Never mind that the junk that flooded the system was sold to the investment banks’ own customers; people to whom, you would think, they felt at least some duty of loyalty and fair dealing.

To illustrate how this gaming process worked, Lewis describes one aspect of the rating agencies’ formula: Their use of FICO scores to measure the creditworthiness of the borrowers who held the underlying mortgages. To earn a Triple A rating, the FICO scores of the borrowers, in the pool of mortgages being rated, had to average out to at least 515.

Quickly figuring this out, the investment bankers realized that no distinction was being made between a “thick” FICO score – based on years of credit history – and a “thin” one. So, they would bring the overall average up by finding borrowers with high FICO scores but no reliable credit history; a person, for example, who once got a credit card, paid the bill, and never bought on credit again. And rather than craft a portfolio of 515s, they further gamed the system by balancing the 400s (almost certain to default) with an equal number of 650s – thin or otherwise.
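Using the figures quoted above, the barbell trick reduces to simple arithmetic. The sketch below (a toy illustration, not code from Lewis’ book) shows why a pool built half from likely defaulters and half from thin-file borrowers still clears an average-score test:

```python
def pool_average(fico_scores):
    """Average FICO score of a mortgage pool -- the only number the test checks."""
    return sum(fico_scores) / len(fico_scores)

# Half near-certain defaults, half thin-file borrowers with high scores.
barbell_pool = [400] * 100 + [650] * 100

print(pool_average(barbell_pool))         # 525.0 -- comfortably above the 515 threshold
print(pool_average(barbell_pool) >= 515)  # True: the pool passes anyway
```

The point, of course, is that an average hides the distribution: a pool guaranteed to suffer massive defaults looks, to the formula, no different from a pool of steady mid-500s borrowers.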

This cynical manipulation of the rating agencies and, in turn, the purchasers who relied upon them, is just one of many stories that Lewis tells. We also learn about CEOs who didn’t understand the markets their most profitable products were traded in; reckless and sociopathic traders who were rewarded with multi-million dollar bonuses; and a system where virtually every major player’s reflexive response to the market’s looming collapse was to hide the truth as long as possible – so they could sell as many of their bad investments to others, including (with no apparent compunction) their own customers.

In short, we are given an in-depth X-ray of a system where lying is routine, loyalty nonexistent, and profits the only measure of success. And, sad to say, the tepid reforms that have been passed in the aftermath of the market’s meltdown have done far too little to alter this culture.


Lewis’ story, by driving home the depth and pervasiveness of these behaviors, reminds us that reform efforts are far more challenging, today, than they were even 40 years ago. When a behavior becomes the norm, we lose our ability to view it as dysfunctional. That is why entire populations can embrace fascism (as in Germany and Italy); genocide (as in Rwanda or the Balkans); and countless, senseless wars throughout history.

My sense is that we have reached that point in business. Many smaller businesses continue to operate in the old fashioned way – offering good products at a fair price; treating employees and others with some modicum of respect. But as you move up the pyramid in terms of size, the qualities that Lewis describes are, increasingly, the unquestioned norm.

We live, after all, in a world where Donald Trump is a celebrated media figure even as he sells his name to unscrupulous developers and a bogus university. So one very serious challenge we face, if we hope to make things better, is to remold our collective consciousness so that, once again, fraud, recklessness, negligence, self-dealing, price gouging, and so on are viewed as disreputable – and not as business as usual.

Lewis’ narrative is also a dramatic reminder that, as ordinary citizens, there is so much we don’t know. With that in mind, another important takeaway is the huge price we pay when our leaders temporize in their critiques – as they habitually do – when it is politically expedient.

When Bush invaded Iraq, for example, virtually every political leader went along with it because, given the country’s prevailing mood, it was the “smart” move. In that case, however, since the issues were clear and the arguments against readily available, the consequences were somewhat contained.

But the subprime mortgage crisis stands in stark contrast to Iraq. As Lewis’ detailed accounting vividly demonstrates, we ordinary citizens had virtually no ability to understand the crisis as it unfolded – or, even now, after the fact. In this case, our willingness to tolerate endemically cautious, politically driven leaders – leaders who refuse to lead – is even more dangerous. Given the unavoidable and increasing complexity of the world in which we live, we desperately need leaders who will actively identify and explain problems – and, crucially, speak aggressively and fearlessly to power.

Reflection 40: Size Matters

  • In 1965, Joe Namath signed a $400,000 contract. It was huge news. Today, $100 million plus contracts, for second tier sports stars, are commonplace.
  • In 1960, America’s 5 largest companies had, on average, $498 million in profits. By 2010, that number had grown to $12.2 billion.
  • In 1982 – its first year – the average net worth of Forbes’ list of the 400 wealthiest Americans was $285 million. By 2008: Almost $4 billion.

Wrapping our brains around the true dimensions of this explosion of private wealth is an extraordinarily difficult task.

Equally hard to understand is a similar explosion in the size and reach of the mainstream culture’s propaganda and reality molding machine; an apt term for the de-centralized but highly coherent set of values-based messages and cultural cues – compete and win, dominate and control – in which we are immersed.

Coming to grips with these seismic shifts in the context within which we live is vitally important. Failing to do so, we will never grasp the enormity of the challenge we face as we seek to meaningfully contribute to a different and better world. We will too easily settle for change strategies that are far too tepid and limited in scope.

This is the issue I discuss below.


Understanding this vast shift in wealth is, at bottom, an order of magnitude problem. A billion isn’t just bigger than a million. It’s a lot bigger. And a trillion is way, way bigger than a billion.

Here’s one way to look at it. Suppose you had decided to count your money, dollar by dollar, with each dollar counted consuming one second. Also assume that the goal was to finish the job just as we reached the year 2000. If you had $1 million, your count would have to start the morning of December 20, 1999. If you had $1 billion, you would start in April 1968. And if you had $1 trillion, your starting point would have been in 29,710 BCE – more than 20,000 years before we humans developed our first written numbering systems.
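These start dates can be checked with a few lines of Python (a minimal sketch of the dollar-per-second arithmetic; the year 2000 finish line is taken from the text):

```python
from datetime import datetime, timedelta

FINISH = datetime(2000, 1, 1)  # the count ends just as the year 2000 arrives

def count_start(dollars):
    """Date the count must begin if each dollar takes one second to count."""
    return FINISH - timedelta(seconds=dollars)

print(count_start(1_000_000))      # about 11.6 days of counting
print(count_start(1_000_000_000))  # almost 32 years of counting
# One trillion seconds falls far outside datetime's range (year 1 at the
# earliest), so compute the span in years directly:
print(1_000_000_000_000 / 31_557_600)  # roughly 31,688 Julian years
```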

Going back to the numbers quoted earlier: In 1960 America’s 5 largest companies would have started to count their profits, on average, in March 1984. By 2008, however, their counts would have started in March 1614 (two years before Shakespeare’s death). And the counting time for the net worth of the Forbes 400 would have been pushed back from January 1991 (in 1982) to June 1875 (in 2008).

Notice, also, the “plight” of our best professional athletes who make a lot of money but who are, we need to remember, hired employees and not owners or investors. Thus, while they make enormous sums of money, their slice of the pie is, in relative terms, chump change – and increasingly so. While Joe Namath would have started his count around noon on December 27, 1999, today’s $100 million athlete would only be pushed back to October 1996, a graphic reminder of where true economic power lies.


This order of magnitude analysis provides a hard dose of financial reality as we assess the effectiveness of conventional change efforts. Increasingly, the nonprofit sector is being asked to fill the void created by the steady erosion of the government’s social safety net. And yet, in contrast with the exponential growth in private wealth, the increase in charitable giving has been tepid – from $55 billion in 1980 to $217 billion in 2010.

In comparative terms, while – in the early 1980s – the net worth of America’s 400 richest people outstripped the accumulated wealth of the entire nonprofit sector by a factor of 5 to 1, this differential had grown to 20 to 1 by 2010.

In short, an always-present fiscal mismatch has turned into a rout. Our current reality is this: Massively outgunned in terms of lobbyists, lawyers, political contributions, and advertising budgets, we find the possibility of effecting meaningful reform through traditional political processes more and more implausible.


These same years have also experienced a comparable, explosive growth in the mainstream culture’s propaganda/reality molding machine. But because its emergence has been gradual, it is difficult to fully grasp its scope. And in contrast to the shifts in private wealth, our understandings in this area are further complicated by the fact that the change is so diffuse and difficult to quantify. For these reasons, its effects are even more pernicious.

This sort of cultural brainwashing is, needless to say, not new. Embedded cultural cues that make people “wrong” when they don’t do what their “betters” expect have always been with us. Indeed, George Bernard Shaw’s iconic example – Eliza Doolittle, the poor flower girl who could pass for a duchess but only after she learned the “right” way to talk, walk, and dress – was created over 100 years ago.

The last half-century, however, has been different. The culture’s reality molding machine has expanded to unprecedented levels, driven by two key factors:

  1. The enormous increase in wealth wielded by the individuals and institutions with the greatest stake in reinforcing and intensifying our mainstream ways of operating; and
  2. The vast array of technological advances that have so greatly expanded the intensity, persistence, and reach of their messages.

To begin to appreciate this seismic growth, it is useful to compare the 1950s – when I came of age – with today’s world.

Back then there were just a handful of TV stations – which stopped broadcasting at midnight – a couple of local newspapers, and a handful of weekly and monthly magazines. So each day offered any number of taken-for-granted places of refuge from the messages of the mainstream culture:

  • Late at night when there was, literally, nothing to watch;
  • In the evening hours between your favorite TV shows;
  • On weekend mornings when all that TV offered was Sunrise Semester and cartoons;
  • On your daily drive to and from work;
  • During the natural lulls that occurred at work, because letters took days to arrive.

In this pre-computer/Xbox world, leisure activities were also, far more commonly, our own creations: Card and board games, playing catch with the kids, riding a bike, reading a book. It was also a time when having friends and family over to your house for drinks and dinner – a taken for granted activity in 19th century novels – was still a regular part of life’s routine.

All of that is now gone or strikingly diminished. We are plugged in all the time.

  • Our computers and smart phones are our constant companions;
  • Texting, Facebook, and email saturate our lives with instantaneous communication;
  • The TV is a nonstop source of whatever entertainment suits our fancy – news, sports, shopping, movies, even pornography.

And it’s all available – or inconveniently present – on demand: In the car, at the beach, even in the bathroom.


While these new toys are delightfully distracting, they exact a heavy price. Why? Because the subtext of so much of what they offer embodies and reinforces the corrosive values that dominate our culture: Compete and win, dominate and control.

We are awash in nonstop messages that push us to want more, to buy more and, in general, to be perfect and invulnerable: Poised and articulate; youthful, thin, and attractive; hard working, successful, and rich; winners in whatever we do.

At times these messages are explicit, offered as product ads or commentary. But far more pervasive and influential are their implicit expressions: The story lines and characters in the shows we watch; and, equally, the ways in which our celebrities – actors, entertainers, TV hosts, reporters, commentators, and politicians – present themselves and conduct their lives.

For me, the depth to which these messages have taken root is exemplified by NPR’s routine editing of interviews to eliminate every “ah,” “umm,” and other verbal stumble. Even at NPR, apparently, we are not ok – not publicly presentable – until every pimple and unseemly bulge has been made to disappear.


These pervasive messages deeply impact our effort to create better lives and a better world. To begin with, we cannot avoid them. We are all in the dirty bathtub. And in the last 50 years, the bathtub has gotten a lot dirtier.

In addition, it has become more and more difficult to find kindred spirits with whom we can align in our effort to create better lives and a better world.

When it comes to the culture’s predominant values, we are literally drenched in cues that define us. Our jobs and schools – where we live – how we dress and accessorize – how we talk – what we eat and drink – they all point to where (and how well) we fit in, in the mainstream culture.

But what are the reliable indicators of a person who consistently seeks to be decent to themselves, others and the world? While these people do exist, the catalogue of social cues that allow you to identify them is strikingly thin. Such a person could be sitting across from you at lunch or be working in the office down the hall, and you might never have a clue.

And, unfortunately, we live in a world in which expressions of concern – a potentially important marker in our search for kindred spirits – have been co-opted by the mainstream culture. Empathic words and symbolic acts of charity have become a kind of affective camouflage, used to make our competitive, self-aggrandizing pre-occupations more acceptable to others – and to ourselves. In this environment, how do you tease out the genuine article, your real allies, from this endless stream of faux reformers?


With these examples, I hope to demonstrate how important orders of magnitude are in understanding the enormous impact that the mainstream culture’s predominant values have on our lives.

But as much as size matters in understanding the dimensions of the challenge, it matters even more as we craft our responses. We need to conceive of change strategies that, as they take root, can become comparable in scope and impact to the problems they seek to address. In other Reflections, I seek to make a creative contribution to that effort. See, for example, Reflection #15 (identifying business as a key strategic focus); and Reflection #45 (describing a more deeply collaborative approach to social change).

Reflection 36: Indecency – A Historical Overview

Through virtually all of our 6 million years of existence as a distinct line of primates and 300,000 years as Homo sapiens, the rhythm of our lives was dictated by the physical world. We foraged and hunted; in the winter we sought warmth and shelter and, in the summer, shade. Daily chores started at sun up and ended when the sun went down.

As Jared Diamond points out, however, a dramatic turning point occurred about 10,000 years ago with the domestication of crops and animals. What we call civilization – the history of the last 5,000 years or so – is a direct outgrowth of the exponential increase in the food supply and population that these innovations made possible.

Two powerful trends – which continue into the present – were unleashed by these events:

  1. The ability of one group of people to dominate another through control of the food supply and, with it, the growth of nations, empires, religious movements, and other complex hierarchical and – more typically than not – authoritarian organizations; and
  2. An accelerating ability to harness nature to our purposes.

Given these extraordinary developments, major shifts in our traditional ways of being were inevitable. But because the catalyst for change was technological – and not moral or spiritual – there was nothing to guarantee that these cultural adjustments would be wise and humane.

In fact, they have been anything but. Instead of using these evolving technologies to meet our emotional and spiritual needs, we have moved in the opposite direction: We have subordinated our needs to the demands of the increasingly powerful authoritarian organizations that the technological advances have spawned. And those organizations have, in turn, spurred additional technological advances used to further entrench their authority.

A prime example is our response to innovations that improve productivity. While they could be used to reduce our workload – thus freeing time for family and leisure – they almost never are. Instead, the time they free up is used to work even harder in service of our culture’s singular obsession with more and more productivity and material wealth. We have, in short, been indoctrinated into a way of living that makes us cogs in an enormous, endlessly voracious “productivity machine.”

The system’s self-perpetuating momentum is then sealed by our induction into the culture’s equally voracious “consuming machine.” Conditioned to always want more, we are driven in our jobs to produce (and earn) more, which in turn feeds our addiction to wanting more, and so on, in an endless cycle that chews up our days and leaves less and less room for the expression of other aspects of our humanity.


While this trend has been gathering steam for thousands of years, I want to call special attention to the last two centuries. As recently as 200 years ago, our lives were still largely rooted in the rhythms of nature.

Then, our accumulating technologies reached critical mass. Massive reality-altering change swept the world:

  • Electricity eliminated night as a meaningful limit on our activities.
  • Central heat and air conditioning eliminated summer and winter.
  • With the advent of modern travel and instantaneous communication, time and distance – to a hitherto unimaginable degree – ceased to be limiting factors.

The result? The physical environment is no longer a defining factor in our lives. We can now work and consume day and night, 365 days a year. Remote locations and private moments – something we used to take for granted – are rapidly disappearing. The Internet instantaneously connects a missionary in Borneo with his or her family in Phoenix, and computers and smart phones keep us fully connected during the morning commute – as we sit on the beach – even when we go to the bathroom.

The scope and magnitude of these changes is, of course, very important. But so too is the speed with which they have occurred. In my lifetime, for example, the implications of the telephone, car, radio, and television were barely digested when jet travel was introduced, followed by the pill. These changes were then followed by a revolution in office technology (Xerox machines, word processors, email), and the arrival of instantaneous access – to virtually everything – via computers and smart phones.

Why is this acceleration in the speed of change so important? Because it hampers our ability to craft reasoned and humane responses. We scarcely digest and adjust to one seismic change when another and, then, another is upon us.

As the scope and pace of change has accelerated, so too has the corrosive impact of our obsessive, work and consume habits of living. In earlier Reflections, I discuss some of their consequences:

  • A massive decline in communal connections (#29 Losing/Revitalizing Our Communal Roots) and intellectual vitality (#21 Theory Matters);
  • The pain that comes from perfectionism (#31 Perfectionism);
  • A denial of vulnerability (#14 Dying – and Our Epidemic of Immortality);
  • A marked shrinking of the intimate connections we share with one another (#22 Consumerism — and the Passivity it Breeds).

But these examples do not tell the full story. The cultural adaptations of the last 200 years have also fundamentally distorted our most basic neurobiological wiring.


Across millions of years, we humans have evolved as profoundly affiliative beings, the result being that our emotional and intellectual growth – and continued vitality – depends upon ongoing, intimate contact with one another.

According to Daniel Siegel, one of our leading neuroscientists, the brain is a complex nonlinear system that exists within a larger complex nonlinear system consisting of it and other brains. In other words, thinking about a single brain – a single person – makes no sense. We only exist in connection with others.

But nature has also provided us with an auxiliary fight or flight brain. Designed to deal with danger, it’s fast – 10 times faster than our thinking brain – and powerful in its effects. Energy chemicals (cortisol and adrenaline) are pumped into our system, blood rushes to our large muscle groups, and the activity of the thinking brain shrinks – in order to avoid indecision at a time of crisis. Faced with a potentially life-threatening emergency, we are ready to act quickly, forcefully, and instinctually.

When the natural world dictated the rhythm of our lives, a natural balance was maintained between our fight/flight and thinking/affiliative brains. Most of our hours and days were spent in a nonreactive emotional state as we went about the highly routinized chores of daily living. Then, occasionally, there would be flashes of danger – a predatory animal, enemy, or natural disaster – that would activate our fight or flight brain. When the crisis ended, we would return to our normal, more relaxed state of mind.

But in today’s world – after 200 years of momentous change – everything is different.

Groomed to be competitors and “winners,” we are “on,” more or less constantly – both because we can be and because an endless stream of cultural cues, incentives, and sanctions tell us that that is what successful people do.

To get ahead, we move through our days anticipating danger; striving for a competitive edge; viewing setbacks as unacceptable and traumatic; exhausting ourselves, physically and emotionally. In other words, we have taken fight or flight – an auxiliary system designed to deal with isolated moments of danger – and, to truly unprecedented levels, made it our baseline operating system.

Some of the fallout from this seismic shift in consciousness is easy to identify: Heightened levels of stress and anxiety, drug abuse and alcoholism, verbal and physical abuse. But the damage goes further.

Fight or flight is specifically designed to neutralize or “annihilate” the will of the other – either through aggressive force (fight) or withdrawal (flight). These choices are, however, the antithesis of intimacy, a pattern of interaction that requires a willingness to engage others with empathy and curiosity.

So, it is no accident that so many couples and families are locked in an endless cycle of criticism, counter criticism and withdrawal – or that self-criticism and judgment (indicating a fight/flight stance with our self) are so pervasive – or that combative/attacking behaviors have become ever more dominant in our politics. The disquieting reality is that the cultural choices of the last 10,000 – and, in particular, the last 200 – years have led to a marked deterioration in our intimacy instincts and skills.

Compounding the problem is the fact that fight or flight is highly infectious, with attacks provoking counter attacks even from ordinarily more conciliatory people. For this reason as well, overcoming this “new normal” state of consciousness is a huge challenge.


Radical Decency – decency to self, others, and the world; practiced at all times, in every area of living, and without exception – is an approach to living that, at a personal level, can make a real difference as we seek to diverge from these increasingly engrained, fight/flight habits of living.

At a societal level, a perceptible shift in ways of operating that have their roots in 10,000 years’ worth of history is a long shot, to say the least. But the future is inherently uncertain. And the hopeful thought, implicit in this analysis, is this: Because our current situation is the result of historical choice – and not the inevitable product of our inherent human nature – it can also be undone by the choices we make going forward.

Reflection 31: Perfectionism

One troubling aspect of psychotherapy is its focus on symptoms, rather than causes. Depression and related conditions, for example, span 9 different DSM categories and more than 30 subcategories. And while many clients are chemically prone to depression – so that symptom alleviation is, in fact, a key issue – the great majority are dealing with non-organic issues as well.

Symptom relief is, without question, an urgent goal. But the growing tendency is to stop there; to see psychotropic medications and cognitive/behavioral interventions, not as important tactics in a larger fight, but as ends in themselves.

Today, more than 90% of psychiatrists – the most educated and highly compensated clinicians – prescribe drugs and do nothing more. In addition, more and more “talk based” clinicians have adopted short-term approaches to therapy, driven by insurance companies’ demands for “measurable” success toward “concrete” goals.

The reason for this trend is, to me, self-evident. The mental health establishment, like every other industry of any size and persistence, is not interested in pursuing problems to their root causes. Why? Because so many of the real culprits, lurking behind our emotional issues, are the unexamined values that keep us locked into our roles as compliant workers and consumers. Implicitly recognizing this reality, the mainstream culture – with its genius for self-perpetuation – will financially starve and marginalize healing strategies that seriously challenge its central outlooks and beliefs.

Salvador Minuchin’s career is an object lesson in this phenomenon. His systems approach to family therapy was widely recognized and became a generative force in the profession. But his later work – applying these same ideas to larger social structures – was mostly ignored. Why? Because it challenged our mainstream ways of operating.


In this Reflection, I deal with one of the root, non-organic causes of so much of our psychic dysfunction: Perfectionism. This mindset – an almost impossible to resist byproduct of our obsession with competition, dominance and control – is one of the more obvious causes, not just of depression, but also of our epidemic of anxiety, shame, and self-judgment.

Notwithstanding this reality, perfectionism is not a condition that is dealt with in the DSM. Indeed, the culture’s ability to deflect attention from the real drivers of our pain is exemplified by this remarkable fact: Far from being seen as a problem, perfectionism – dressed up in more acceptable language – is widely seen as a positive value, to be celebrated and encouraged.

This rhetoric is so pervasive that we scarcely notice, and rarely comment on, its perversity.

For me, the archetypal example is the culture’s constant reminder that “we can do anything we want, if we just try hard enough.” What is so chilling about this pervasive cultural rallying cry is this: It studiously omits the aphorism’s inescapable second clause: “And if you don’t accomplish your goals, there is something wrong with you.”

This statement is, of course, demonstrably false. The odds of a poor African American child going to an Ivy League college, after 12 years at a ghetto-based public school, are astronomically small. Similarly, if you work in a dying industry or seek a job in a saturated market, you may not find any work at all, let alone the position of your dreams.

Notwithstanding the mainstream culture’s perfectionist rhetoric – exemplified by this phrase – the primary reason for these and most other “failures” is not a lack of effort. To the contrary,

  1. The game is fixed. Those with money and connections have a long head start; and
  2. It is arbitrary. Determined or not, we will fail if – for whatever reason, good or bad – we get on the wrong side of the boss; and,
  3. Like it or not, we are all limited by our human frailties.


What is interesting is that we know all this. And yet, at a personal level, utterly fail to follow through on its implications.

For most of us – when it comes to our situation – there are no excuses. Falling short, my automatic response is that “I” am the problem. Pointing to external causes feels wimpy and shameful. I need to “man up,” take responsibility, redouble my efforts to do better the next time. Never mind:

  • That there were massive layoffs (I, somehow, should be the exception); or
  • That I was sick – or distracted by my child’s crisis at school; or
  • That I am not, and never will be, a good public speaker.

None of these things matter. My presentation should have been crisp, tight, and compelling.

It is as though we walk around with a measuring stick in our heads, remorselessly assessing our value, judging any outcome that doesn’t approach 100% as a failure. And in this unforgiving landscape, “wins” – for most of us – become fleeting visits to an all but impossible to attain mountaintop; moments of surcease in a larger system in which losing is the norm.

Thus, for example, I vividly remember a friend’s powerful feelings of failure when, as one of four finalists for a position sought by over 300 applicants, she failed to get the job. And, too, a client’s intense feelings of shame because his boss – a man he didn’t like or respect – told him he wasn’t measuring up.

These sorts of deeply engrained, automatic responses breed a wide variety of psychically discouraging mindsets:

  • Ashamed of our failure, we isolate.
  • Reflexively judging and doubting ourselves, we become cautious, indecisive, and defensive.
  • Unable to shake the sense that we are “defective,” “less than,” “a fraud,” we stop trying, content to go through the motions.

Thus, while the mainstream rhetoric is about achieving great things, our perfectionist mindsets actually move most of us in the opposite direction. This outcome is wonderfully effective – if the goal is to create a pandemic of spirit-sapping mindsets, a result that – not coincidentally – deeply discourages efforts to challenge and change our current, mainstream ways of operating.


Note, importantly, that our obsession with individual perfection deeply obscures the systemic factors that contribute to what ails us – reinforcing our status quo ways of operating in this way as well.

Thus, millions of people, financially leveled by the economic downturn that occurred after the 2008 housing and financial meltdown, took second jobs and economized to a point of real pain. And yet, remarkably, there was no perceptible movement to reform our patently corrupt financial system. Similarly, a handful of “bad actors” were prosecuted for torturing prisoners in Iraq while the policies they carried out, and the people who created them, were ignored.

The implicit message in each of these examples? Bad policies and malevolent systems don’t matter. “Good” people should just know what the right thing to do is – and have the will to do it.


Because perfectionism is a byproduct of the culture’s deeply engrained, win/lose values system, programs such as Radical Decency – that seek to systematically implement more humane ways of operating – are the most strategically viable response. These comprehensive, values-based strategies are the strong medicine we need to deal with this virulent cultural disease.

As we re-orient our energy toward the consuming task of being decent in all that we do, perfectionism will increasingly be seen as an unwanted distraction; an attention and energy draining habit of mind that diverts us from our more ennobling goal. With time, it will wither and recede.

Getting from “here” to “there” is, however, an enormous challenge. Being creatures of habit, there is no easy way to wean ourselves from our perfectionist mindsets. But while the work is hard, the pay-offs are, potentially, life changing. While it is a long shot, to be sure, it is – as I see it – the most realistic path toward creating a more nourishing life and meaningfully contributing to a more decent and humane world.

Reflection 30: In Defense of Our Troubling Values

Central to Radical Decency is the belief that:

  1. A specific set of values – compete and win, dominate and control – is pre-eminent in our culture and, thus, wildly over-emphasized in our day by day choices;
  2. The result is incalculable damage to ourselves, others, and the world; and,
  3. If we hope to live differently and better, we need to wean ourselves from the corrosive habits of living, spawned by the relentless emphasis on these values, replacing them with more decent ways of being.

Repeating this formulation over and over, it is easy to create a pantheon of good and bad values: Respect, understanding and empathy, acceptance and appreciation, fairness and justice – good; compete and win, dominate and control – bad.

Doing so, however, misses the point. The problem is not inherent in the values themselves. It lies, instead, in their over-emphasis and the relentless, culturally based pressure to conform to their strictures.

Radical Decency puts its priority on modeling and promoting virtues that are, in our culture, chronically neglected: Attending to the well being of the socially and economically disenfranchised; treating others with respect; being empathic and fair even when it draws energy from our competitive aspirations; focusing – with the seriousness it deserves – on our need for rest, reflection, novelty, and play.

But promoting these neglected values is not the full story. We are multi-faceted beings, with a wide range of dispositions – from the most loving and affiliative to highly aggressive and dominating. We also operate in diverse and, all too frequently, indifferent and unforgiving environments.

So even as we pursue our aspirational “decency” goals, we need to constructively employ and manage our diverse biological instincts, and realistically come to grips with these harsh cultural realities that surround us. For these reasons, the culture’s predominant “compete and win” values have an important – though far more limited – role to play in our lives.


Take competition, for example. We are socialized in schools where the emphasis on testing, grades, and achievement is pervasive; the goal being to create successful adult competitors; “winners” in life. Sadly – inevitably – this has led: (1) to an epidemic of self-judgment, anxiety, and depression as we strive, in vain, for unending success and perfection; and (2) to a myriad of self-medicating strategies (work, sex, alcohol) as we seek to maintain this psychically compromised approach to living.

Given these disheartening realities, it is easy to lose sight of the fact that a competitive spirit, properly used, sharpens our wits, motivates us to higher levels of performance and, at its best, creates an intimate bond with co-competitors. An innate part of our nature, it can add its own unique zest to the fabric of our lives.

In other words, competitiveness is not the problem. It is, instead, the grim, “winning is the only point” attitude that threatens to entirely eclipse its nourishing aspects.

How far gone are we? Pretty far – and, I am afraid, farther than we think.

As things stand now, the coaches and parents of 10 year olds, who scream at referees – and at kids who don’t play well – are a cultural commonplace. And our “normal” expectation is that businesses will distort the truth, skimp on quality, and overreach on pricing, all to improve profitability; that is, to win.

Contrast these attitudes with the Talmud’s injunction that a losing litigant should thank the judge for enlightening him as to the correct behavior. Reading that as a young attorney, I was brought up short. It seemed so sensible and appealing – and so utterly foreign to the world in which I operated.

Now, 30 years later, that sensibility seems even more farfetched. But imagine how different things would be if an attitude of curiosity, possibility, openness, and ease were more present in our approach as lawyers and litigants – and in other competitions as well.


We also need to look beyond the inhumane versions of domination and control that are rampant in our culture. Like competitiveness, these are aspects of our psychic make-up that, used judiciously, are useful and, at times, indispensable.

Every day, and in virtually every area of living, we are surrounded by people who operate by the culture’s mainstream values. As a result, we continually confront this dilemma: How can I be appropriately self-protective – decency to self – without sacrificing decency to others and the world?

In many instances, the best approach is to create a firm boundary – a form of control.

As I often remind clients, sharing your anger with a total stranger – the guy who shoves his way to the front of the line, for example – is an act of intimacy. You are disclosing, to him, exactly how you feel.

With that, your vulnerability increases and an emotional connection is created with a person with whom you actually want no connection at all. Better to let his behavior pass without comment, managing your feelings either alone or with the support of someone you trust.

But sometimes this option is not viable. The bully persists. Or the bully is your boss or your child’s teacher. Or you are dealing with a person who seems intent on harming you. In these situations, other acts of control or domination may be called for.

Thus, far from being wrong, lying to a would-be rapist – control by deception – is an invaluable skill. And, after exhausting more respectful options, appropriately modulated counter aggression may be the best option when confronted with an implacable foe, intent on dominating and controlling you. Indeed, even a physical attack may be appropriate when the only other option is serious injury or death from an unprovoked attack.


A final thought: While understanding the “good” side of these mainstream values is an important exercise, so too is an openness and curiosity about why these values developed in the first place and, with that, the role they play in our lives. While the primary goal is, without question, to limit their outsized influence, we should strive not to throw the baby out with the bathwater.

Our traditional gender roles offer a good example. A passive/placating woman and unemotional/unresponsive/work-first man – these patriarchal archetypes are poster children for our pattern of dominance and control and the incalculable pain it causes. But we need to understand why patriarchy evolved in the first place: Its role in our evolutionary history.

Women evolved, across our 300,000-year history as Homo sapiens, to be our early warning system; the folks who scan for danger. And since duplicating this process made no sense, men evolved as reactors – not to the environment – but to women’s emotions.

Given this evolutionary division of labor, men and women developed different emotional sensitivities. Women – wired to react to danger – are especially sensitive to safety issues. Men, on the other hand – wired to react to their women – strive to be good providers, protectors, and lovers and, for that reason, are more susceptible to shame.

These emotional predispositions, deeply embedded in our psyches through millennia of evolution, continue to influence our behaviors. Understanding this, the behavior of a placating woman is much more understandable.

Her steady message to her mate – that he is a good provider, protector and lover – minimizes his shame and frees him to play his traditional role more effectively. In an analogous way, a stoic man – keeping his fears and anxieties to himself – is better able to attend to his spouse’s immediate, potentially safety-threatening concerns.

Since we no longer live as hunter/gatherers, these restricted gender roles no longer serve us. However, teasing out these sorts of behavioral nuggets in patriarchy’s otherwise highly destructive pattern of dominance and control allows us to make smarter, more modulated choices; choices that are egalitarian but, at the same time, attend (for example) to “her” sensitivity to safety issues and “his” susceptibility to shame.

Reflection 25: The Vise of Money

Money is rivetingly important. What topic is more shrouded in secrecy, or more fraught with emotion? Some years ago, Madonna had a filmmaker record her life. She told him everything was fair game. He could film her having sex. He could film her going to the bathroom. But when she met with her financial advisors, no cameras.

In our tell-all world, ask yourself this question: How many people tell-all about their finances?

One experiment I used to run with groups was to ask them to reflect silently on two sets of questions.

  • The first: Who do you have sex with? How often? How do you do it?
  • The second: How much money do you make? How much do you spend? What is your net worth?

After silently contemplating their answers for a couple of minutes, I would then ask which set of questions caused greater anxiety as they thought about sharing their responses. Typically, 80-90% of participants chose the money questions.

The obvious lesson? Our most pervasive and powerful taboos are around money.

Introducing this exercise to one of my men’s groups, events took an unexpected turn when one of the participants, unprompted, simply answered the money questions. Influenced by his example, the others followed suit. The conversation that followed was fascinating. Financially undressed, every man confessed to an area of marked shame or fear: One about over spending; another about his income; still another about unwise investment decisions.


None of this is happenstance. The values that predominate in our culture are compete and win, dominate and control – and money is their single most compelling measure. Why? To begin with, it is so quantifiable. An $80,000 income is, unquestionably, more than a $60,000 income.

Money is, moreover, wonderfully fungible, providing a universally applicable measuring stick that judges all of us without regard to our interests, passions, or disposition. Artists, academics, and religious leaders – just like business people – are typically honored in proportion to their ability to sell the “product” (books, paintings, etc.) and to command large audiences and fees. And stay-at-home moms – who offer leadership in raising our children and organizing our family and social lives, but don’t make money – struggle with issues of self-worth far more than, say, accountants and lawyers.

In the movie Inside Job, academic department heads at Columbia and Harvard were asked if they perceived any conflict of interest in the extraordinary fees they and their colleagues received from the industries they studied. Their almost identical response was a disingenuous “no.” The lesson? Even in this supposedly more principled world, making money trumps other competing values – even academic integrity.


This obsession with money is chillingly effective in locking us into lives that condone and promote the culture’s mainstream values. The prospect of economic instability pushes the vast majority of us into a lifetime of indenture to mainstream jobs for which we feel little or no passion.

Indeed, in my psychotherapy practice, I am shocked at the number of clients who don’t even dream of something better. Even contemplating a choice that might place the mortgage and health insurance at risk – and, possibly, consign them to society’s bin of financial losers – is, it seems, too scary or discouraging. Since there is no way out, why even try? Just play the game and do the best you can to make peace with it.

The sad part in all of this is that almost no one wins the money game. Since it is a comparative sport, someone is always doing better. And today’s “winner” will, in the great majority of cases, be tomorrow’s loser.

Moreover, even when the comparative aspect of the money game is ignored, there are few winners. The person who said “we live up to our means” was right. In Bonfire of the Vanities (written over 20 years ago), Tom Wolfe explained how a bond broker making $900,000 a year was just getting by, what with the expense of private schools, a Park Avenue condo, and a summer home in the Hamptons.

Closer to home, I will always remember a young law partner in the 1990s – with a wife, kids, and house in the suburbs – who explained to me that he could give nothing to the United Way because he was “broke.” His annual income: $125,000.


An important first step in coming to grips with money’s vise-like hold on our lives is to challenge the culture’s conspiracy of silence. We need to move beyond the idea that it is unseemly or impolite to talk openly about what we, and others, make and accumulate, and how these assets are used.

This social taboo has a serious purpose, and it is not good manners. To the contrary, it is designed to shield all of us – but particularly the wealthy – from virtually any personal responsibility around money. Who in our midst is committing meaningful resources to the needs of the disadvantaged? And who is doing nothing? Beyond occasional bits of information – usually volunteered for a self-interested purpose and seldom critically examined – we simply don’t know.

This silence spares all of us from any extrinsic pressure to examine our behavior when it comes to money. And while it is easy to take comfort in this escape from responsibility, the price we pay, individually and as a society, is far too great.

At a macro level, here is the shell game that this conspiracy of silence has made possible: First, we have progressively privatized the programs that support the disadvantaged; starving governmental programs and, then, relying on donor-supported nonprofits to fill the void. Celebrating the virtues of volunteerism and individual initiative, we leave the financing of vast parts of the safety net to the whims of individuals who, cloaked in anonymity, feel virtually no social pressure to step up to the plate.

The results are utterly predictable. Wealthy people – with statistically insignificant exceptions – invest either nothing or a grotesquely tiny proportion of their resources in programs for the needy.

In truth, rich people have been given license – even encouragement – to abdicate any sense of social responsibility even as, in their quest for ever greater wealth, they tighten their grip on the levers of power. Unchecked, this is a prescription for an unraveling of society. Lacking a larger sense of responsibility, what is to stop them from relocating their assets overseas, to better maximize profits? And, indeed, this is happening every day, at an accelerating pace.

Our secrecy around money also inflicts an unacceptably high price in our personal lives. If we hope to live differently and better, we need the support of intimate communities that can help to move us through and beyond our paralyzing fears. But doing so is an impossibility so long as money and the pressures and fears that surround it – the very issues that lead to so many of our sleepless nights – are enveloped in a cone of silence. Thus, at a personal level as well, a frank and open discussion about money is vital.


Another key step toward improving our unhealthy relationship with money is to ease its hold on our sense of well-being. We think that we will be safe if only we have “enough” money. And yet, the opposite is actually much closer to the truth: No amount of money, reasonably within our grasp, will ever make everything ok. Given the risks and uncertainties that are at the very center of our competitive economic system, almost no one is immune from financial peril.

Embracing this hard reality can, in fact, be empowering and life changing. Doing so, we are in a much better position, psychologically, to wean ourselves from the reflexive tendency to view financial security as life’s unquestioned priority.

And what should replace it? An approach to living that, while tending to financial realities, makes our hopes and dreams the central focus.

Beyond that, we need to persistently experiment at the edge of our fears around money: Foregoing a work opportunity to attend our daughter’s swim meet; increasing our charitable commitments beyond a place of comfortable tokenism; considering a new, lower paying job that more closely reflects our life’s passion.

The work is hard but, with focus and persistence, it has the potential to make us far more effective agents for change – in our lives and in the world.

Reflection 22: Consumerism — and the Passivity it Breeds

The predominant culture relentlessly promotes two things. One is economic success. Endless cues, incentives, and sanctions push us to prepare for a career as we grow up and to devote enormous amounts of energy to it when we come of age.

The other is consumerism. Our children are so inundated with toys that skipping a rock, kicking a can down the street, and tree climbing are becoming lost arts. And throughout our lives, there are endless opportunities to shop – promoted by nonstop ads and our uncritical celebration of the latest ingenious gadgets, and the newest and fanciest clothes, cars, and houses.

In this Reflection, I describe the ways in which this consumer mindset infiltrates our lives and hamstrings our efforts to live differently and better.


Several years ago I participated in a service trip to Mexico. Early one morning, our hosts took us, in open-air trucks, to work on an organic farm. We returned to our guesthouse to a lunch of macaroni salad and bologna and cheese sandwiches. As I ate my lunch and talked with my companions, I noticed how good I felt. My body had a wonderful ache from the work. My spirit felt energized from the shared experience and solidarity I felt with my companions. Even my bologna sandwich seemed tasty.

Our group consisted of people like me, privileged North Americans thoroughly habituated to a consumer-oriented way of living. So as eager and expert consumers, we planned a dinner, that night, at one of the fanciest restaurants in Cuernavaca.

Drinks were served on a gorgeous lawn where peacocks quietly grazed. When there was a sudden downpour, waiters with oversized umbrellas appeared, in an instant, to escort us to our tables. The place settings were elegant in every detail, the food perfectly presented and delicious.

The stark juxtaposition of lunch and dinner stunned me. Sitting at dinner I realized that the seductive beauty of what others had created had lulled me into a state of passivity. That morning and at lunch, I was an active participant in creating my experience. At dinner, I reverted to the habitual consumer posture that I know so well. In that role, I was the passive recipient of someone else’s creation. I was inert, infantilized.

This posture of passivity flows inevitably out of our engrained consumer habits. Our implicit expectation is that most everything we need has been prepared by others and can be purchased. Our only job is to choose this product or that one.

And what makes this mindset so problematic is that it extends far beyond clothes, cars, and electronics, permeating virtually every area of our lives.


Take intimate romantic relationships, for example. Properly conceived, a relationship is a journey. We are drawn to a partner by our back-of-the-brain “love” chemicals. Then, as the relationship evolves, its success is measured by the partners’ ability to heal and grow together; to share themselves and express their needs in contactful ways; to see the other and stretch to meet that person’s needs.

The norm that exists in our consumer-oriented culture is, however, very different. Making no distinction between people and things, it encourages us to evaluate both solely in terms of what they can do for us. So choosing a partner becomes an exercise in comparison shopping, not very different from the search for the right car or laundry detergent. If a partner meets (and continues to meet) our criteria, we keep her. If he falls short, he is replaced. And, sadly, this outlook often persists even after children are added to the equation.

Habitually adopting this approach, we pay an incalculable price.

Our neurobiology makes intimate connection an indispensable part of our self-regulatory structures, both physically and emotionally. For that reason, we need to persevere in our relationships, not only with our intimate partners but also with family, friends, and others with whom we share our lives. There is no other path if we hope to develop the intimacy that sustains us.

But with our consumer-oriented focus on “what can you do for me,” we squander opportunities for intimacy. Instead of doing the hard work of relationship, we move on. In the end, our relationships are, far too often, limited and transient. The close, mutually cooperative, and enduring connections with others – so essential to our emotional well-being – perpetually elude us.

People who take this approach to relationship often think they are taking charge of their lives. But this belief is illusory. Like me, sitting at a banquet prepared by others, their stance is passive. As consumers, their options are actually limited and constricted: Either take what is being offered – or leave it. There is no opportunity to struggle, learn and grow in the crucible of relationship; to be an active participant in the creation of the relationship.


This same process – at work in our intimate relationships – has massively infected our larger communities as well. In Bowling Alone, Robert Putnam documents a steep decline in our communal involvements in the last half of the 20th century. And our consumer mindset is a prime cause.

When it comes to our communal organizations, most of us are like shoppers pushing a cart down the Acme aisle. The question we instinctually ask is this: What can this organization do for me?

What is lost in the process is a sense of involvement and ownership; an instinct to contribute to the organization’s growth and effectiveness. Instead, we join to get something and feel little, if any, obligation to volunteer for the many necessary but thankless jobs that keep the organization alive and vibrant. And, of course, we are all too ready to leave when difficulties arise (as they inevitably must) and the fun part of our participation is compromised.


This same process shows up in the workplace. While management’s lack of loyalty to workers is no surprise, the extent to which workers, themselves, passively accept this attitude of casual indifference is truly astonishing.

Unions have been in decline for 50 years or more and our pervasive consumerism is one of the less appreciated causes. We now live in a world in which the prevailing attitude is that workers – like virtually everything else in our culture – are commodities, to be bought and sold. Implicitly accepting this perspective, most workers take for granted management’s unfettered right to treat them in any way they see fit. The idea of resisting management’s dictates – or, even more farfetched, organizing in opposition – seems beyond most workers’ imagination.


This consumer-oriented mindset also defines our politics. Instead of being active participants in co-creating our public policies, we look for a magic candidate – still another type of product – to cure our ills.

Barack Obama’s 2008 election is a perfect example of this process. As Peter Gabel pointed out in a 2010 article in Tikkun magazine:

“A major weakness with that 2008 moment is that it was constituted by 6 months of watching Obama on television, by an overreliance by each of us in our separate space on watching that remarkable smile and listening to that sometimes-transcendent oratory. It was not constituted out of our own social movements, emerging from our own idealistic actions over time through which we stitched ourselves together in real social relations. It was mainly a cheer led by one person through TV. Without his ‘mediation,’ we didn’t exist.”


If we hope to deal effectively with consumerism’s pervasive influence, we need to understand its breadth, as well as its debilitating effect on our ability to be active agents in our lives.

Beyond that, we need to understand that we are in a war of attrition. The only way to wean ourselves from this engrained, self-defeating consumer mindset is to systematically practice new habits of living that more effectively serve our purposes. And that, of course, is what Radical Decency seeks to provide.

Reflection 21: Theory Matters

We live in a world where theory has a bad name. In business, the mainstream rhetoric emphasizes decisive action: “Lead, follow, or get out of the way.” A one-page summary is the preferred method of communication while a lengthier analysis, offering context or complex causation, is commonly greeted with impatience and, frequently, suspicion about the author’s clarity and decisiveness.

Theory also has a bad name in many personal growth and spiritual circles. People who claim to be in touch with a unifying spiritual force, when asked to explain what they mean, frequently say, “I just know.” And when the conversation in support and therapy groups turns to theory, it is likely to be cut off with the critical directive to “talk about your feelings.”

This theory-less approach to living comes at a high price. According to Irvin Yalom, one of our most important contemporary psychoanalytic theorists, a thought in therapy, unattached to an emotional experience, has little lasting impact. But, as Yalom makes clear, the converse is also true. An emotional experience that isn’t anchored in a coherent theoretical frame is equally short-lived. Both are required if we hope to maximize our healing and growth.

In addition, our widespread disdain for theory is still another way in which the values of the predominant culture are reinforced and perpetuated. That is the point Vikki Reynolds makes when she speaks of her more conventional office mate’s request that she remove her peace sign, gay rights poster and other “political” material from their shared office. When her response – I will, if you do the same – was greeted with incomprehension, she pointed to his wedding ring, the photo of his wife and kids in front of their suburban home, and his framed diplomas.

To the same effect is Meryl Streep/Miranda Priestley’s withering speech to her young assistant in the movie, The Devil Wears Prada:

“Oh, I see, you think this has nothing to do with you; that you selected that lumpy blue sweater because you’re too serious to care about fashion. But what you don’t know is that the sweater isn’t blue, it’s cerulean. You’re also blithely unaware of the fact that Oscar de la Renta did a collection of cerulean gowns in 2002, that Yves Saint Laurent then showed cerulean military jackets, and that it quickly showed up in 8 different designer collections. Thereafter, it filtered through the department stores into some tragic casual corner where you no doubt fished it out of a clearance bin. It’s comical. You think you’re exempt from the fashion industry when in fact you’re wearing a sweater that was selected for you.”

As Vikki and Miranda point out, an apolitical, non-ideological position – about fashion, social justice or, indeed, any issue of significance – is an illusion. Like it or not, our choices have consequences in the world. What we think of as neutral or apolitical is really a stance of passivity; a failure to formulate an informing theory of our own.

The results are unfortunate. Failing to cultivate our own perspective, we, like Vikki’s office mate and Miranda’s assistant, easily confuse the culture’s “default settings” – that is, its prevailing attitudes – with issue neutrality. At that point, these mainstream perspectives – and the theoretical underpinnings out of which they arise – become invisible; part of the air we breathe. And being invisible, they are able to operate in, through and around us with impunity.

So how do we cultivate a new, more engaged relationship with theory? Here are a few thoughts.

First, we need to accept the fact that all theory distorts. The world provides virtually endless data to our senses and theory attempts to make this data more understandable, by identifying patterns. Doing so, some facts and factual patterns are emphasized while others are minimized or ignored. Distortion, hopefully helpful distortion, is the essence of theory.

With this in mind, we should not be asking whether a theory is “true.” No theory can be. But that does not mean that careful attention to facts isn’t important. To the contrary, we live in a world where a debased version of relativism – “every thought is as good as any other” – is rampant. In this context especially, we need theories that strive to be congruent with the facts, as they are currently known. Equally, we need theories that can evolve and change as the discovery process adds new facts and, at times, unravels what once appeared to be inarguable truths.

This threshold factual question is key because theories – particularly those that persist over time – can so easily become dogma. At that point, facts are made to fit theory rather than vice versa. As this process accelerates, the theory’s continuing value becomes increasingly suspect even as its potential to harm increases.

Examples of this phenomenon abound. Some are blatant – a refusal to recognize evolution. But others are less obvious and, for that reason, more pernicious.

Take mental health, for example. Current evidence leaves little doubt that healing occurs through the emotional brain (psychodynamic theory), thinking brain (cognitive/behavioral theory), brain chemistry (psycho-pharmacology), and the body (acupuncture, yoga, etc.). Equally important are our intimate relationships, support communities, and engagements with the larger culture (a particular concern of Radical Decency).

Unfortunately, our theories endemically privilege one set of facts over others. Mainstream cognitive/behavioral theories are dismissive of empirically unverifiable psychodynamic approaches. And body work and creative engagements with the larger culture are, in the great majority of cases, effectively ignored by both.

What mental health exemplifies is endemic in our culture. Manipulation of facts to fit theory — ignoring or rejecting other possibilities in the process — infects our economic, political, religious, and philosophical theories as well. If we hope to use theory effectively, we need to be vigilant in recognizing this process and attentive to finding theories that resist it.

This does not mean, however, that old theories should be discarded because far more facts are available today. To the contrary, people who lived 2,500 years ago were every bit as smart as we are. The insights of Jesus, the Buddha, and the Greek philosophers need to be cherished. Moreover, enduring ideas in their teaching – because they are affiliated with institutions and historical traditions – can, if used well, have enormous positive impact. But if we choose that path, we cannot temporize with the very real dangers of dogma and, with it, co-optation by status quo interests.

Once this crucial threshold issue of credibility has been dealt with, the questions we need to ask about theory are practical.

  • What does it seek to explain and how compelling are its explanations?
  • What are its limits, intended or unintended?
  • Do its explanations fit with what I know of the world and how it operates?
  • Does it expand or further invigorate those understandings?
  • Does it clarify my choices and improve my decision-making?

Equally, the question we need to avoid is this: Does the theory represent the “truth”? Why? First, because, as the post-modernists persuasively argue, the very notion of an objective truth, “out there” waiting to be discovered, is illusory. And, even if it did exist, the idea that our neurologically limited brains could possibly perceive all relevant data and, then, mold it into an accurate description of that reality is wildly implausible. Finally, at a more practical level, our preoccupation with this ultimate question is a massive and historically tragic distraction from the more pertinent – and important – “how we live” questions, listed above.

Most of us have a “home base,” a theory or theories that serve as our baseline point of departure. For me, it’s Radical Decency. For others it may be Christianity, Judaism, Buddhism, or a more personal spiritual or ethical code. And this, I think, makes sense.

But the world is far too complex, and the challenges in living well too great, to stop there. We need to cultivate an active engagement with theory, without regard to source. Doing so will enrich and transform our lives, as these examples from my life attest:

  • Jared Diamond and others have expanded my historical perspective to include 300,000 years of Homo sapiens history, 7 million years of distinct primate history, and 3 billion years of life.
  • Daniel Siegel, Henri Nouwen and others have helped me understand our biologically wired affiliative nature and its implications for living well.
  • Paulo Freire and Philip Lichtenberg have explained the psychological mechanisms that play such an important role in perpetuating injustice and exploitation, in the world and in our intimate relationships.

All theories distort — including the ones we use to define who we are. Remembering that, we need to seek out, embrace, and incorporate into our larger world-view the creative insights of others, regardless of source. If our goal is to create better lives and a better world, it is an indispensable part of the process.

Reflection 16: Mainstream Thinking – The Tyranny of Opinion and Judgment

One key area we tend to gloss over as we seek to craft more nourishing and generative ways of operating in the world is how we think.  This may seem like a theoretical issue, but it isn’t.  Our habitual, culturally conditioned ways of thinking vitally affect our outlook and choices in life.

What are these habitual ways of thinking?  Put simply, we live in a world where opinions and judgments are all important.  Lacking them or, even worse, expressing tentativeness or confusion, we are likely to be judged as indecisive and wishy-washy. 

Opinions and judgments are, of course, important.  But what is troubling is the central role they play in our conversations and ways of thinking.  Far too often, they are substitutes for, rather than conclusions drawn from, a careful marshaling of evidence and sustained reflection.

Where does this opinion-based thinking show up?  Everywhere. In politics, for example, most of us are wedded to a belief in our “extraordinary” experiment in “democracy” and “free-market capitalism.”  But what is obvious, when you stop and think about it, is that these are simply statements of faith.  Over the years, there have been dramatic shifts in our system of governance and ways of managing the economy.  But our belief in the unique virtues of our system – however it happens to look at the moment – remains.

The result?  Even as evidence of the system’s inefficiencies, indecencies, and inequities accumulates, we maintain our belief in it.  Whether conservative or liberal, we persist in believing that our problems can be solved by working the system rather than changing it; by electing new and better leaders.

Maybe this confidence is well placed and maybe it isn’t.  But what is clear – my essential point – is that we are treating an opinion as fact.  And what atrophies in the process are our critical faculties:  Our ability to absorb new information; to integrate it into our pre-existing notions of how things are; and to allow new, more discerning understandings to emerge.

In our personal lives, a similar dynamic is at work.  When people fail to meet our expectations, we don’t instinctually become curious – sifting the evidence, attempting to understand how they are different and why they act the way they do.  Instead, we judge and dismiss. They are insensitive – or selfish – or lazy – or (the ultimate judgment) an asshole.  And this pattern applies even when the other person is our spouse or child.


Why are these habits of thinking so pervasive?  Because they so effectively promote and reinforce the culture’s predominant values: Compete and win, dominate and control. 

Thinking in this way, the goal – in perfect alignment with these values – is not to engage with and persuade others but to overpower their will.  How does this work?  A firm opinion becomes our chosen instrument of aggression.  Then, reflexively judging people who don’t share that opinion, we push for dominance and control; saying, implicitly or explicitly, either agree with me or be pushed aside.

Notice too that the opposite approach – openness to differing points of view and a careful weighing of evidence – cultivates curiosity, reflection, dialogue, respect, and appreciation; all deeply relational qualities.  And being relational, it is utterly inconsistent with the mainstream culture’s (non-relational) “certainty/judgment/dominate and control” mindset.

So, intent on getting ahead in the world as it is, we instinctually de-emphasize this approach, understanding that – whatever its substantive merits – the far more pressing concern is to avoid being labeled as weak, wishy-washy, and indecisive.


The good news in all of this is that the habits of thought we are seeking to undo are not the result of “stupidity;” of an innate inability to engage in reasoned thought and analysis.  Indeed, jumping to this all too easy conclusion is itself just another manifestation of the judgmental and dismissive mindset we are seeking to overcome.

But the fact that we are not dealing with an unalterable biological defect does not mean the pattern is easily changed.  To the contrary, we are dealing with mindsets that are deeply embedded in our habitual, mainstream ways of operating.

So how do we begin to undo them?  A good starting place is to identify the common conceptual pitfalls that allow these habitual ways of thinking to infiltrate and colonize our psyches.

Here are some key examples.

Assuming the best about “us”

One particularly corrosive example is our tendency to assume the best about members of our group.  Thus, I vividly recall an episode of the Daily Show, a few years ago, in which Jon Stewart presented side-by-side videos of Barack Obama and George W. Bush saying the exact same things on a series of foreign policy issues.  The show’s “reporter” reacted with mock exasperation, saying that Obama is “different.”  Why? “Because he doesn’t mean it.”

Stewart’s point is, of course, a serious one.  Our tendency to assume the best about people like us is chronic – and seldom acknowledged. So, as discussed above, most of us refuse to connect the negative dots about America’s system of government, seeing repeated examples of cruelty and injustice as unfortunate exceptions in an overall landscape of fairness, decency, and justice.

Assuming the worst about “them”

The converse is also true.  We instinctually judge others by their worst examples, a tendency made more virulent by the media’s eagerness to amplify the shrillest voices; those that promote the most strident and debased versions of the communities they represent.  

This point was driven home for me in the 1990s when I became deeply immersed, as an attorney, in the evangelical world.  Prior to that experience I judged that community by its worst examples – the Jimmy Swaggarts and Tammy Faye Bakkers.  Being exposed to many thoughtful and dedicated evangelical leaders, however, laid bare my reflexively dismissive attitude and guided me toward a more nuanced and respectful view.

That experience was a stark reminder of how easily I slip into a judgmental frame of mind.  Unless I am vigilant, my habitual, gut response – when presented with people, groups and ideas that are different – is to judge them as “less than,” suspect in their motives, and “wrong.”  “Not knowing” and curiosity are not my instinctual vocabulary.  Compounding the problem is the striking absence of any meaningful social norms, cues, and sanctions to steer me away from this judgmental and dismissive mainstream mindset.

Looking for a single cause

Another equally pervasive pitfall is to look for a singular, value-laden cause. Working with couples is a continual reminder of how widespread this pattern is.  A typical couple will come to counseling with her saying (for example) that “the” problem is that he doesn’t share his feelings and he, in turn, identifying her critical ways as “the” problem.

The reality?  There is no single cause and, typically, no fault.  Instead, there is a series of mutually reinforcing acts, all taken in good faith, that lead to unfortunate results.  He feels anxious and protects himself by going silent.  Sensing that, she responds with her own protective behavior – a complaint – which triggers a renewed, more escalated response from him; and so on.  Just two good people doing the best they can.

What is true in our intimate relationships is also true in every other area of living.  That malevolent boss or co-worker is almost never the singular cause of our woes at work.  And Wall Street – or Big Government – or Trump – or Clinton (choose your villain) is not “the cause” of our political woes. However, our tendency, over and over again, is to oversimplify and demonize; to feed the certainty/judgment machine.

Excessive faith in our own instincts and beliefs

The final conceptual pitfall I want to highlight is what Francis Bacon calls the “idiosyncrasies of individual belief and passion” and identifies as one of the key “distorting prisms of human nature.” 

We live in a world that celebrates individualism and, as a corollary, promotes a debased version of relativism: the notion that whatever anyone thinks is fine.  The result is that when we “feel” something or have a “spiritual experience” we all too easily assign a sweeping meaning to it.

My problem is not with the experience itself but with the uncritical nature of this meaning-making process.  Wouldn’t we be better served if we were more cautious about labeling things as messages from God or the universe?  Wouldn’t we also be better served if we felt culturally empowered to critically question our friends and acquaintances when they offer these sorts of explanations?


Needless to say, there are many other ways in which the mainstream culture’s habitual ways of thinking insinuate themselves into our lives.  Hopefully, a deeper understanding of these processes – and the intent behind them – will allow us to cultivate more curious, accepting, and reflective habits of mind.

These are, it seems to me, essential building blocks if we hope to create more nourishing lives and a more decent world.

Reflection 14: Dying – and Our Epidemic of Immortality

The goal of Radical Decency is to be decent to our self, others, and the world, at all times, in every context, and without exception. But across-the-board decency – as opposed to pick-and-choose decency – is impossible if our habitual beliefs and behaviors are not in tune with our biological realities.

When such a disconnection occurs, the physical realities that define us – and limit our possibilities – inevitably emerge. And the conflict between our biology and these “unnatural” thoughts and actions brings with it a high risk of pain for our self and others.

One obvious example of this phenomenon is the suppression of female sexuality at so many points in our history. Think of the incalculable damage it has caused in the lives of countless generations of women.

In this Reflection, I discuss another pervasive and deeply consequential distortion of our innate biology: The way in which we view dying and incorporate that reality into our lives.


There are two events that define us more than any others: Birth and death.

The first just happens, with no awareness or anticipation on our part.

Dying, however, is different. An awareness of our mortality is inescapably with us throughout our lives, and how we deal with it is vital to our quality of life. As Irvin Yalom, one of our foremost psychotherapeutic theorists, flatly states: Whether acknowledged or not, mortality is a key issue in every clinical relationship – every one.

Unfortunately, the values that drive our culture, and mold our choices, deeply marginalize this reality. If asked, we agree that death is inevitable. But the ways in which we compose our lives speak to a very different, if unspoken, operative reality.

We live in a world where the fantasy of dominance and control is pre-eminent. We can do anything if we try hard enough – and are “less than,” losers, if we don’t.

Thoroughly interwoven into this larger message is the implicit belief that, through shrewd choices and sheer force of will, we can make ourselves invulnerable to the effects of time. The right combination of food, vitamins, supplements, exercise, and stretching will allow us to always feel great and never get sick.

And we supplement this fantasy of actual invincibility with an increasingly mainstream regimen of artifice. We dye our hair; surgically alter our faces, breasts, and thighs; inject Botox; and consume Viagra – all strategies designed to maintain the illusion of perpetual youth, not only for others but for ourselves as well.

Moreover, the mainstream medical profession is fully complicit in promoting this illusion of immortality.

  • We will find a cure for cancer, heart disease, and Alzheimer’s – indeed for every malady that can kill us; and
  • Patients in their 80s and 90s – in the last stages of their biologically programmed deterioration – are put on experimental drugs.

Death isn’t the natural endpoint of life. It is an enemy to be defeated.


Regular exercise, sensible diets and good medical care are, of course, positive things.  But this motivating mindset is not. The unstated goal is never to get old, never to die.  Our idealized 40-year-old feels 25. Our 60-year-old role model looks and acts 40.

In this way, the reality of dying never arrives. It is always out there in the future – 10 years further down the road from wherever we are now.  Somewhere in this process, of course, we die. But by virtue of this cognitive sleight of hand, it is always premature – an unfortunate stroke of bad fortune.


We pay a high price for this chronic state of denial. A natural rhythm of living is built into our nature. Fully embraced, each stage of life has its own special challenges and rewards. But all that is swept aside when we reflexively seek to freeze our outlook and choices, struggling to maintain the ambition and sexual allure of a 35-year-old into our 60s and beyond.

Chronic denial of aging also leaves us unprepared when life’s natural end point becomes imminent. We typically react to a terminal diagnosis with disbelief which, when you think about it, is truly funny. Did we really think it wasn’t going to happen to us?

What is less funny is the fact that we then face this final challenge with little or no psychic preparation. The result? Too many of us die badly, railing against our fate and filled with complaints because our bodies no longer work as they’re “supposed to.”


The more sensible approach is to embrace death and dying in ways that empower us to live more fully and vibrantly. My particular take on how to do this is framed by two stories.

Not long ago I listened to an interview with Nuala O’Faolain, the Irish memoirist, who, living with a terminal diagnosis, was struggling with the fact that all of her wisdom would die with her. Hearing her anguish, I remembered a second story, of a woman whose Berkeley Hills house burned down in the 1980s, destroying all of her possessions.

Shortly after this event, people started contacting her. Years earlier she had copied her favorite recipes and sent them to a friend. That friend called to say that she was re-copying the recipes and sending them back to her. Her children also called to say they were making copies of the family photos she had faithfully sent to them over the years.  As these calls continued, the woman realized this: The only thing that was safely hers was what she had given away.

So here, it seems to me, is the answer to O’Faolain’s dilemma. One way to look at the rhythm of our years is to think of it as consisting of two interwoven but distinct paths.

The first – an acquisitive one – starts at a high level, exemplified by the infant who is constantly exploring, touching, experimenting, testing, and learning. This remains our dominant preoccupation into young adulthood as we hone our social and romantic skills, build careers and establish homes, families, and places in the community.

The second path – of giving it away – is always there as well. Indeed, Radical Decency teaches that, in healthy intimate relationships, loving and being loved are completely intertwined. Thus, effective giving is a skill we need to master as we emerge as seasoned adults. But while giving away is an important subtext in the earlier years of life, there comes a point when it needs to become our central focus.

Even into our 60s and 70s, the culture invites us to continue our acquisitive ways: To go on striving in our careers; to gorge ourselves on trips, games and new toys; to remain competitive with younger people, both professionally and socially.

The obvious problem with this approach is that it’s doomed to failure. Even Hugh Hefner eventually becomes a pathetic and laughable caricature: A doddering old man in pajamas.  The less obvious – and more serious – problem is that it crowds out the more nourishing promise of our later years. Properly conceived, these years are an incredibly sweet race against time: To give away as much as we can while we can; my answer to O’Faolain’s dilemma.

In making this our priority, we replace the doomed goal of “staying in the race” with a more realistic purpose. Here is a goal for our final years that offers an ennobling, life-affirming challenge; one that requires wisdom, sensitivity, imagination, patience, and persistence.

A serious commitment to “giving it away” also invites us to die well. While it is seldom acknowledged, everyone who loves us will be exquisitely aware — as we grow old — of  our death’s inevitable approach and deeply attentive when it finally arrives. Thus, if we are serious about our vocation of loving and nourishing our loved ones, our death is an absolutely vital and formative moment; our final, really big challenge – and opportunity.

What greater gift can we give to those we love than to handle this last and greatest of life’s mysteries with equanimity, acceptance and, even, curiosity and anticipation? Dying well, we give them an invaluable role model that, hopefully, will help to nourish and sustain them as they face their own decline and death.

As I write this Reflection I am 68, ridiculously healthy, feeling great. Knowing that dying can be really tough, I worry that I am being glib and Pollyanna-ish. When my own decline and death arrives, I may not live up to these brave words. But I also know that giving away what I have – now – is not just a nourishing way to spend my next years.  It is also the best way I know to prepare for my last, really big moment – when it arrives.

When I die, I hope to kick some serious butt!

Reflection 8: Why We Aren’t Good Students; Why It Matters

When I went back to social work school in 2000, it had been 32 years since my college graduation. One of the first articles I read discussed social construction as an analytic tool. I found its approach fresh and exciting. Then I learned that the article was a classic, written in 1971, 3 years after I graduated.

What hit me, at that moment, was that my intellectual growth had declined precipitously the moment I left college. My interest in learning didn’t die. I continued to read books (mostly history, biography, and politics), the New York Times, Newsweek, and the New York Review of Books. I went to plays and movies. I listened to NPR. But while I was an above-average adult learner, my efforts were, by any fair measure, inadequate — and utterly typical.

Why, for most of us, does serious study die when college ends?

The answer lies in the values that drive our educational system and the world of work. In theory, our colleges and secondary schools encourage students to ask the next question, to be aggressively curious, and to see learning as an endless, ever deepening, powerfully rewarding journey. But the deeper reality is that our schools faithfully reproduce the predominant culture’s competitive, win/lose values, making the competition for grades their operative priority.

Students, adapting to this imperative, become experts, not in learning, but in memorization and regurgitation. They graduate with neither the skills nor motivation to be effective learners. Instead, they are trained to be competitors: Experts at getting the best possible grades; prepped for the next competitive challenge – work and career.

In the world of work, the incentives once again pull us away from serious scholarship. In virtually every profession, specialization is the surest path to career advancement. In my years as an attorney, my serious study – seminars, research, attention to new developments – was focused on my specialty: Bankruptcy law. In like manner, computer programmers and doctors are typically students, not of their professions, but of their specialty within that profession.

In Consilience (1998), Edward O. Wilson points to this same phenomenon in academia. To build their careers, our budding scholars become economists, or political scientists, or biologists – and play by the rules of their chosen discipline. Then, to get ahead, they find a specific niche within their chosen field, a specialization within a specialization. So even our professional thinkers are pulled away from the “big questions” that should, one would think, be the central focus for a conscious, self-aware species:

  • Who are we, biologically and psychologically?
  • How is our world structured and how does that affect our lives?
  • Given these realities, what are our best choices for living well?

For most of us, the idea of serious and sustained focus on these issues is a nonstarter. Instead, preoccupied with other priorities, we embrace easy, superficial answers to life’s big questions; answers whose primary virtue is their ability to advance our political, professional and/or emotional agendas. Moreover, since we have so little exposure to the habits of scholarship, we fail to notice its absence. The result? We mistake what we believe for what is true.

But as Wilson notes:

“Most people believe they know how they themselves think, how others think too, and even how institutions evolve. But they are wrong. Their understanding is based on folk psychology, the grasp of human nature by common sense – defined (by Einstein) as everything learned to the age of eighteen – shot through with misconceptions. [Even] advanced social theorists, including those who spin out sophisticated mathematical models, are happy with folk psychology.”

The downside of this phenomenon is easy to name: Habitual, unreflective thinking that leads to excesses: endemic and murderous tribal exceptionalism (Rome, the Crusades, British and American imperialism, etc., etc.); self-immolating beliefs such as radical jihadism and the rapture; and so on.

The upside benefits of a serious commitment to lifelong learning are far less obvious. Does such a commitment really make a difference?

My answer is an emphatic yes.

If we hope to craft the best possible answers to life’s big questions, we need to become skilled and dedicated students: Grounding ourselves in the best available research; allowing that data to guide us in formulating answers; always remaining open to new or revised answers as our empirical knowledge and conceptual understandings evolve.

Note, importantly, that my enthusiasm for this enterprise is not some generalized “this is good for you” platitude.  To the contrary, the new understandings that result can literally change how we see the world and, with it, how we think, act, and feel.

So, for example, Daniel Siegel and others have taught me about the neurobiological mechanisms that make our brains habit-forming machines – reacting to new stimuli in the same way they reacted to similar stimuli in the past, and increasing the likelihood of that response with each repetition. I also learned that our fight-or-flight mechanism for dealing with imminent danger reacts 10 times faster than our thinking brain, pumps cortisol and adrenaline into our system, pushes blood into our large muscle groups, and shrinks the activity of our thinking brain.

From Steven Stosny I learned as well that the jolt of energy and (false) sense of clarity that fight-or-flight’s physiological changes evoke is deeply addictive at an interpersonal level: That, when attacked, we are biologically wired to respond in kind, with either a counterattack (fight) or withdrawal (flight).

These understandings have changed my life.

Because my mother was a rager, I grew up with a hair-trigger temper. The result? For most of my life, I judged myself for my outbursts; coped with the shame that grew out of my inability to control my emotions; and suffered in silence, certain in the knowledge that there was something profoundly wrong with me.

But no more.

Understanding the biological and psychological realities described above, I now make complete sense to myself. Confronted with anger from an early age, I learned to counterattack. And because the pattern kept repeating itself, that response became a deeply ingrained habit, reinforced through the years by the jolt of energy its activation provided. I wasn’t wrong. I was human.

The result has been an easing of my shame and the defensive crouch it provoked; states of mind that, for years, limited my efforts to tame my emotional demons. Armed with a better understanding of the rage cycle, I was able to craft strategies to prevent its activation or, failing that, to interrupt it. Knowing that our brains are habit-forming machines, I also embraced a more realistic vision of the change process – seeing it as a war of attrition, requiring a steady and open-ended commitment to my new ways of thinking, acting, and feeling.

Jared Diamond’s Guns, Germs, and Steel offers another good example of the transformative power of serious study. That book persuasively argues that the historic dominance of Middle Eastern and European cultures resulted from geographic and climatic factors: the early development and spread of plant and animal domestication in those areas. Diamond and others also describe the seismic impact of this development on human history, setting the stage for exponential population growth and – through the ability to control the food supply – the emergence of the hierarchical, authoritarian cultures that have dominated the last 4,000 years of human history.

With these understandings, any residual attachment I might have had to the mainstream cultural notion of Western superiority is gone, as is the mainstream view of history as a journey toward modernization and progress.

Our history is not preordained and is not shaped primarily, or even substantially, by the intrigues of the kings and generals that fill our history books. Who we are and how we live is, most fundamentally, the result of the interplay of biology, environment and natural selection. And history’s appropriate time frame is not the 5,000 years of “civilization” covered in our history books. It is instead 300,000 years of Homo sapiens history, our 7 million years as a distinct primate subgroup, 3 billion years of life on earth, and 13 billion years of cosmic evolution.

I could cite many other examples in which scholarship has profoundly changed my thoughts and outlook: Paulo Freire and Philip Lichtenberg’s dissection of the psychology of authoritarian relationships; Carol Gilligan and Terence Real’s insights into the different ways in which men and women are acculturated; and so on. Hopefully, however, the examples described above make my point: Serious, careful and sustained study and reflection can change our lives. And, more fully assimilated into our mainstream ways of living, it can change the world as well.

Reflection 4: Perspectives on Morals and Ethics

I have always been troubled by what passes for moral and ethical guidance in our culture. I remember being in Church, as a 15-year-old, and hearing the minister say “love thy fellow man.” I also remember thinking: It’s now 11:30 a.m., and he hasn’t said a single really useful thing about how to do that between now and next Sunday, when Church reconvenes.

In my 20s I joined a profession with an elaborate Code of Ethics – the law. And to this day I attend ethics seminars to maintain my license. These classes are deeply demoralizing. The standard approach is to tell us what the rule is and how close to the line we can get without risking sanctions or a malpractice lawsuit.

The approach is deeply cynical and misguided, though it is difficult to find attorneys who question it. Preet Bharara, the current U.S. Attorney for the Southern District of New York, is a refreshing exception. Attorneys, he points out, would never ask their law partners to identify the minimum amount of work needed to maintain profitability. To the contrary, they would eagerly seek new and creative ways to make more and more money — no questions asked. So shouldn’t the same mindset apply to our moral and ethical choices? Shouldn’t we strive with equal vigor to find new and creative ways to express our ethical ideals?


I have no problem with a socially agreed upon set of moral standards. Some actions need to be encouraged; others socially proscribed. But moral and ethical guidelines need to be rooted in a larger, coherent vision of how we should live. Absent such a vision to inform their creation and application, moral and ethical guidelines will inexorably morph into tools that promote the values that pervade our culture – control, domination, and material self-aggrandizement.

Here is one (of many possible) examples from the legal profession.

A cardinal – and very sensible – rule of the profession is to avoid conflicts of interest. Since one defendant could seek to assign blame to another defendant, a single attorney should not represent both defendants. But to truly guide attorneys to a more ethical vision of their work, we need to come to grips with all of the implications inherent in this rule.

One of its inevitable consequences is multiplying lawyers’ fees: Two attorneys, not one, at every deposition and hearing. And since most lawsuits are about money (the standard recompense in civil lawsuits), you would think that the Code of Ethics would deal with the financial implications of this dual-representation rule.

But, it doesn’t.

Why? Because the result is wonderfully convenient for attorneys: More lawyers employed, more fees generated.

Not surprisingly, this particular “unintended consequence” is all too common in the profession’s Code of Ethics. To cite just one other example, the injunction to “represent your client zealously” is an open invitation for lawyers, billing on an hourly basis, to pad their fees by filing marginally useful motions and fighting the other side on every issue.

What makes it worse is that the Code of Ethics could easily deal with this financial issue. Suppose hourly billing, without adequate safeguards, is deemed to be unethical — since it very clearly puts the attorney’s and client’s economic self-interest at odds. Impractical? Impossible? Not at all. One possible safeguard would be to require attorneys to estimate overall cost in advance and, if that number is reached, to reduce their future hourly billing rate to an amount that just covers their costs (usually about 65% of normal fees).

If an intent to grapple with this fee exploitation issue existed, guidelines such as this one could be easily crafted. But don’t expect the ABA’s Board of Governors to take this issue on any time soon. The true bottom line of the legal profession’s Code of Ethics is not legal ethics.


This same self-interested theme exists in the code of ethics that governs my new profession, social work. For example, clinicians are enjoined not to share information about themselves with clients. Like the legal example just discussed, this is an important area in which to offer ethical guidance. But a simple “rule against” falls far short, since it fails to account for the times when self-disclosure can be a powerful tool of healing and growth. Once again, the deeper, unspoken theme is to protect the professionals — in this case by giving them license to avoid emotionally challenging engagements with their clients, without regard to the positive or negative effect on the therapeutic process.


Finally, I want to focus on adultery as still another area where the mainstream approach to morality, by failing to offer a larger vision of right and wrong, exacts a heavy price.

A very typical example is an intimate partner who, after 20 years of fidelity, has an affair.  Our cultural norm is to condemn the partner who engages in the affair as a cheater; a liar; a bad guy. So when the hypothetical couple comes to a marital counselor, such as me, the straying partner is typically wracked with guilt and the other partner deeply aggrieved.

My point is not to judge these reactions. They are sensible and expectable.  But our simplistic and unthoughtful approach to morality – sex outside the marriage equals adultery equals bad – obscures so much else. Sadly, it is an invitation for the couple to stay stuck in their pain.

One very important reality that the couple, in my example, can easily lose sight of is that the affair partner is actually a good person, highly responsible and committed to his or her partner. Why do I say this? Because (in our hypothetical) the affair was preceded by 20 years of commitment and fidelity.

This does not negate the fact that the affair partner’s behavior grievously damaged the couple’s intimacy and trust. But their healing would be better served if they could fearlessly judge the act, separate and apart from the actor. Unfortunately, our received moral precepts obscure this vital distinction. (Recall President Bush condemning “evil-doers” rather than acts of terrorism.)

Another crucial issue, obscured by the couple’s “good guy/bad guy” mindset, is what motivated the straying partner. In our hypothetical, that partner did not enter into the extra-marital relationship lightly. To the contrary, his or her behavior was driven by compelling, though dimly understood, emotional forces.

Life is complicated, and living intimately with someone else multiplies those complications. Indeed, it is the rare (maybe nonexistent) couple that doesn’t accumulate hurts and unexpressed needs and frustrations as the years go by. Often, an affair is an inept and ill-advised attempt to break out of a painful and deeply entrenched pattern of behavior. And since a relationship is a system, the great likelihood is that both partners – in the time leading up to the affair – were coping with unresolved pain.

Given this reality, going back to the way things were is not a good choice. Better to look at the affair as a potential turning point – a time when long-standing issues can surface and be dealt with in a more satisfactory way. Once again, however, our standard moral precepts do not lead the couple in this direction. The common outcomes are either (1) a divorce (get rid of the cheating bum), or (2) an extended period of remorse followed, as the pain recedes, by the re-emergence of their old ways of doing things; that is, the very patterns that led to the affair in the first place.


Radical Decency – by focusing inclusively on decency to self, others, and the world – is designed to offer precisely the kind of larger vision of how to live that can lead to more just, equitable and humane moral standards. Applied to professional ethics it focuses on the full range of collateral consequences for all parties.

When it comes to deepening our ethical insights, and crafting wiser choices, Radical Decency can support us in doing better – a lot better.

Reflection 1: Our Propaganda Saturated Culture

This Reflection series began with a movie I saw, while on vacation in Maine, several years ago: Extraordinary Measures, starring Brendan Fraser, Keri Russell and Harrison Ford.

At the end I had a sudden sense of clarity about what had just happened to me. I would sum it up as being seduced – and appalled at my own easy seduction.

The movie is about a father whose two children both suffer from a debilitating disease certain to kill them by the time they are 10. He is our hero: A Harvard MBA, a rising executive at Bristol-Myers, AND a patient and devoted husband and father who makes it to every recital.

Just for starters, how is that for a glib, unrealistic role model? The implication of this – and many other popular culture models like it – is that this is the standard for which we must strive: A hard charging professional who, by necessary implication, invests the enormous psychic energy and long hours needed to be a “winner” in that arena and, at the same time, is a devoted family person.

Since this ideal is so difficult to achieve, and even more difficult to maintain over time, it is not the positive, inspirational ideal it purports to be. Instead, in the real lives of real people, it is a prescription for frustration, shame, and a sense of failure. We are constantly measuring ourselves against impossible to achieve standards and – surprise, surprise – coming up short. Or, for the “lucky” minority that can maintain this juggling act, we exhaust ourselves and neglect more “optional” endeavors, such as community, leisure, study, speculative reflection, and simple down time.

But for me, the real kicker of the movie was its more specific messages. And again, they are messages that saturate our culture.

The first is that you can do anything if you try hard enough.

Our hero finds THE scientist who is on to a cure for his children’s “incurable” illness. He then quits his high-paying corporate job, forms a startup to perfect this groundbreaking new medicine, sells the startup to corporate America to keep the project going, and then defies the corporation in order to give the miraculous cure to his two kids. And, of course, the cure works!

Wow, what a message! Notwithstanding the enormous number of stories that permeate our culture, glorifying the heroic individual who defies impossible odds to “make it happen,” this is in fact a pernicious distortion of real life. In all but a statistically minute number of cases, terminally ill children die. Also, most startups fail. And most executives who heroically and emotionally stand up to their bosses get fired – never to be heard from again.

Which brings me to the second pernicious message of the movie: While corporate bosses may seem to be heartless and bottom-line oriented, in the end, they have hearts of gold. So, in this case, when faced with the father’s heroism and passion, the CEO’s essential humanity breaks through. Ignoring corporate rules and procedures, he allows our hero’s children to be part of the initial test for the new wonder drug.

The problem with this message? The great majority of corporations are not run by “good” people who, when faced with real-life moral choices, are willing to sacrifice their profit-driven bottom line to “do the right thing.” To the contrary, the overwhelming majority of corporations fire people without remorse and, far more often than we care to admit, condone environmental and employment practices, and public policy choices, that lead to injury, disease, and death.

The final message that jumped out at me is that disease, disability, and injustice all come dressed up in pretty, socially acceptable little packages.

The dying children in this movie are adorable, feisty, funny, and charming. And so is the dad, the agent of change. When I worked as a consultant for the Variety Club, years ago, I was struck by the staff member who complained bitterly about donors who wanted “pretty little white girls in wheelchairs.”

The reality: Disability and injustice are inflicted on real people and, disproportionately on the poor and uneducated. Often anger, ugliness, emotional imbalance, selfishness, etc., etc. are part of the package. And except in the rarest of cases, the people who seek real change are not saints either. So do we ignore “ugly” injustice and stop listening to obnoxious agents of change? That is, I submit, one of the implicit messages of this movie and so many other pieces of popular culture like it.

One final thought. In the moment, as I watched this movie, I was totally seduced:

  • By our hero;
  • By his family;
  • By the curmudgeonly, unemotional, but ultimately soft-hearted CEO; and
  • By the story itself.

In other words, this is not just propaganda. It is, if my instinctual reaction is typical (and I think it is), highly effective propaganda, with important consequences at both an individual and societal level.

It is humbling to think that it has taken me six-plus decades of living to work through the obscuring and dense haze of this feel-good propaganda to a deeper understanding of its pernicious effects. The work before us, if we hope to understand the many subtle forces that mold our lives – and to take effective steps to counteract them – is immense.

That is the challenge that Radical Decency seeks to address.