Joy of Work

“By doing things badly we make ourselves less real.” — Thomas Merton, The New Man

One of the joys of teaching is that you have an excuse to learn full-time. Given that humans, even under the worst of circumstances, are compulsive learners, it’s great that some of us actually get paid to do it. Having just finished a week of diving deep into the ethics of genetic engineering, the delights and mysteries of the Tao Te Ching, and the value and meaning of good work—all of this with three different groups of thoughtful adult students—I am realizing the deepening satisfaction of emergent ideas. These are ideas that are a long time coming, formed through the confluence of different streams of thought.

One of the ways I learn is to write my way through a problem, turning it this way and that, until I find some place to stand, if only for a moment. That’s the pleasure in writing this blog—to see what I think and to gain some clarity in the process. These are always snapshots of a process, a momentary glimpse of a mind in pursuit of . . . something. At times my train of thought is no more than a couple of boxcars on a siding; the fun is in seeing if the engine in the far distance, visible only as a plume of drifting smoke, will shunt through the right switches to arrive and connect.

So all week I’ve had the sense of an idea just under the surface. I am remembering David Crosby’s song: “Just beneath the surface of the mud/There’s more mud here/Surprise!” 

It takes the shape of trying to understand—and articulate in a class on world religions—what Lao Tzu is saying in his incomparable 81 passages of wisdom in the Tao Te Ching. He begins with a warning: “The Tao that can be told is not the eternal Tao,” and follows up later with, “Those who speak don’t know and those who know don’t speak,” a joke of sly proportions considering that he’s been going on about this page after page. It is, in the words of one commentator on the Tao, like “trying to unscrew the inscrutable.”

But later in the week, as a group of us discuss readings on work and leisure in a class on Humanity and Culture, other passages sparkle in their directness: 

“When you are content to be simply yourself
and don’t compare or compete,
everybody will respect you.”

“Do your work, then step back.
The only path to serenity.” 

“Do you have the patience to wait
till your mud settles and the water is clear?
Can you remain unmoving
till the right action arises by itself?”

Taoism holds as a central virtue the notion of wu-wei, sometimes explained as ‘actionless action.’ It’s about working with natural elements in the world instead of forcing one’s way through life. In an aggressive and competitive culture like ours these profoundly simple ideas may sound naive at best and subversive at their heart. But it is often the case that when the needle on the gauge bends against the extreme there arises an alternative. Thus, when we long to do good work and to enjoy it, to speak and to be understood, to receive without fear, we may arrive at the desire for integrity. 

I’ve been struggling to come up with a word that surges up from the depths, parts the waters cleanly, and glints in the sunlight before dropping back. All week the word has been honesty, though I’ve resisted its popular distortion as cruelty disguised as truth—jus’ keeping it real. . .  But it’s a brave word and it stands closer to that shadowy wisp of meaning I’m trying to grasp than any other for now. 

I’m reminded of an enigmatic phrase of A. N. Whitehead: ‘Religion is what the individual does with his own solitariness.’ I’ve been turning that over and over in my head like a rough-edged pebble, trying to smooth it out by constant contact. I don’t know where I found it and I can’t determine the context, but the shape of it looks like honesty—what we truly are when we are truly ourselves, alone before the Divine. In that state we know in our marrow when we are slip sliding away, avoiding the truth, trying to put one over. When it’s just us and God, no one else to impress, what reason do we have to lie? Why not just come clean for once, be courageous enough to laugh at our pretensions to courage, and be still?

Thomas Merton, who had a gift for simplicity of expression, says, “A multitude of badly performed actions and of experiences only half-lived exhausts and depletes our being. By doing things badly we make ourselves less real.” We winch up the tension, speed up the production line, churn out more and more things. “Our malformed conscience can think of nothing better to tell us than to multiply the quantity of our acts, without perfecting their quality. And so we go from bad to worse. . . .”

Let us have the humility to begin again, to do our work well for the beauty of it, for what we might learn in the doing. Teasing out these strands of thoughts, looking around us and finding a way to bind them gently together, we can at last use them thankfully and well. 

“Behold the world fresh—as it is, on its own terms—through the eyes of a beginner,” urges Epictetus. “There is no such thing as conclusive, once-and-for-all knowledge. . . . Spirited curiosity is an emblem of the flourishing life.”

And so to work . . . .

For What It’s Worth

“There’s something happening here/What it is ain’t exactly clear . . . .” — Stephen Stills, Buffalo Springfield


If you were of age in the late 60s you can probably hear that song in your head, the ringing notes of the opening bars suggestive both of hope and apprehension as Stephen Stills’ voice, bluesy with a mocking edge to it, drew us into the images. It became something of an anthem as the mass protests against the Vietnam War spread from city to city across America. 

I always associate that song with the 4th of July, perhaps because, inevitably, the 4th is about massive crowds—at least it is where I live, near Washington, D.C.—and because if there’s a protest to be writ large it will happen on the national stage of the Mall in the heart of D.C. 

It’s been years since I actually went down to the Mall for the Fourth of July celebrations. When I first moved out here from California in 1981, fresh out of graduate school, to take a teaching position, I did the usual things for a newcomer, which included joining 250,000 people, the National Symphony Orchestra, fireworks (‘bombs bursting in air’ and ‘the rockets’ red glare’), sometimes the Beach Boys, parades, and speeches by the usual suspects — all of us basted and cooked to perfection in D.C.’s 100+ degree fire pits. If you do that a couple of seasons you develop a lingering suspicion that you’re just one of thousands of extras in an apocalyptic thriller movie. The thin veneer of civilization peels back in your waking nightmare as you imagine the ultimate fireworks of nuclear holocaust opening above the Washington Monument.

So you stay home the following year and find you do not miss the hour-long wait at the subway station, moving a foot at a time toward the abattoir deep underground, all in lock-step with the thousands of sodden, hungry, and beaten citizens on this holiest of civic holidays. I exaggerate, of course, but only slightly: you may enjoy conjecturing on which parts of the narrative cross the line of sensible imagination.

But whether I stay or go to Fourth of July public rituals my dilemma remains the same: I do not know how to act patriotically on that day or any other day. I left Canada at the age of five in the company of grandparents who were headed for teaching positions in California. My memories of Canada are pleasant but my knowledge of its politics and culture is slight. Most of my life has been spent in America, with a year and some summers in Britain, and another year in British Columbia. And yet I remain a Canadian citizen and have never voted in this country. 

Assimilated to Northern California culture at an impressionable age, I nevertheless found no ground upon which to stand, and thus I remain oddly suspended, neither fully Canadian by geographical and cultural immersion nor American by citizenship and pride. When I traveled in Europe on my Canadian passport in the early 70s, a maple leaf stitched proudly on my backpack, I received encouraging glances and the offer of conversation. When it was discovered that I was Canadian but lived in America, curiosity turned to something close to envy, although a lecture on the failings of American foreign policy was sure to follow.

I was never sure how to respond. Like many middle-class American kids I had my views on the war, which ran the gamut from naive to ignorant. But of the moral darkness of the venture I was deeply convinced and have found no reason since to revise that view. What I was naive about were the reasons why boys my age were drafted and why some even volunteered. As the war dragged on it became clearer that disproportionate numbers of poor whites, blacks, and Hispanics were being drafted. That seemed wrong to me, but I don’t think I could have explained why at the time. What genuinely puzzled me was why anyone would volunteer. In the years that followed I spoke to some who had stepped up with pride, served as officers, and returned home feeling betrayed by the American public. They had been told they were fighting for freedom. A lot of Americans saw them as baby-killers.

When we stood for the pledge of allegiance in school I did not recite it nor place my hand over my heart. Dimly, I understood that would be somehow wrong, although my reticence was sometimes taken for defiance. When I saw the flag unfurled, waving in the breeze, or heard the national anthem, I did not tear up nor bow my head in gratitude. “The Star-Spangled Banner” seemed simply unfortunate, the lamest excuse for a call to patriotism that I could imagine. Nobody could sing it well and the only version I could stand was Jimi Hendrix’s fuzzed-up and melancholy riff.

Yet, the first time I landed on British soil during a torrential downpour at Gatwick Airport in 1971, I felt like I’d finally come home. Raised by British grandparents, reading Churchill’s History of the English-Speaking Peoples under the covers at night, and marching with the hobbits through Middle-earth had conditioned me for a kind of naturalized citizenship. I slipped into it easily and naturally, feeling less the outsider than the country cousin come to visit. As for national anthems, I found “Rule Britannia” quaintly endearing, “God Save the Queen” serious and moving, but it was “Jerusalem,” sung at football matches and other public gatherings, that brought a tear to my eyes. Whether rendered by the pure voices of English choirboys or thundered through by English rockers Emerson, Lake, and Palmer, Blake’s vivid verses and imagery brought me close to pride of country.

That may be the closest I get emotionally to patriotism, an anomaly that I have to chuckle over. Born in Canada, raised in America, with my heart attached to a misty Avalon, I realize that I am everywhere and nowhere. Perhaps the difficulty lies in the fact that patriotism, almost anywhere in the world—and especially, it seems, in America—is joined at the hip with war. Thus, to question American actions in the world is to dishonor the sacrifice of American soldiers. Since wars these days are marketed and sold through sophisticated advertising campaigns, and military objectives are subordinated to political imperatives, patriotism becomes an accessory worn on the sleeve, designed to quickly identify whose side we’re on. Remember when everyone clipped American flags to their cars after 9/11? There was a certain amount of nervousness if you were the first on your block to take your flag off. 

I don’t have any reservations about what this country has done for me. I admire American energy and imagination, its willingness to thumb its nose at centuries of aristocracy and privilege of lineage. Most of all, I love the straightforward, clear-eyed pragmatism that so often gets things done. But America, historically speaking, is a teenager—impetuous, brash, arrogant, and ignorant of many things. It is quick to seize on the latest fashion, be it technology, religion, or idiom. There are times when you think, “I can’t take this kid anywhere!” It has the attention span of a squirrel, the narcissism of a Chihuahua, and the gratitude of a cat.

That being said, it also has the best mission statement and corporate vision in the world. Alongside the fact that the country was founded on the economic necessities of slavery, the men who built the Constitution out of parts they’d filched from all over created the motherboard of freedom. It works well, really well, when we keep the hardware clean and the software—this wonderful spirit of inventiveness—free of ideological viruses. 

So I’ve found a type of ethical patriotism within which I can live. According to the Stanford Encyclopedia of Philosophy this is one in which “the patriot would take pride when the country does what is right. But her patriotism would be expressed, above all, in a critical approach to her country and compatriots: she would feel entitled, and indeed called, to submit them to critical moral scrutiny, and to do so qua patriot.”

It’s been tried before by many people, some of whom died because of it and others who simply and quietly live it out every day. But as Tennyson said in Ulysses, “Some work of noble note, may yet be done/Not unbecoming men that strove with Gods.”

The Health of the Body Politic

“When people accept futility and the absurd as normal, the culture is decadent. The term is not a slur; it is a technical label.” — Jacques Barzun, From Dawn to Decadence

To many of us who count ourselves in the liberal —sorry, progressive—tradition, the last week of June 2012 will be regarded as a peak among valleys, an historic moment. That was the week that the Supreme Court, by a 5-4 vote, upheld the healthcare law put forward by Congress and President Obama. It is legislation, incomplete though it is, that will make life better for millions of people. You might even say it was the right thing to do: that a government of the people, by the people, and for the people ought to care about the welfare of its people. You could say that, but having said it you ought not to be surprised at the outraged reaction of many who will not see this as fairness and equitable treatment for most, but a terrible imposition against the few.

I try not to let the paralyzing complexity of the process blind me to the straightforward belief that universal health care should be available to all. That the United States is the only rich country in the world without such a plan is shameful. It is shameful in the old-fashioned sense that we ought to be ashamed of ourselves, feel the burden of guilt on our shoulders, and be determined to do the right thing. The “right thing,” of course, is the point at which the debate splinters into a thousand yelps of “who is to say what is the right thing for everyone? What’s right for you might not be right for me!”

Well, yes, that might be true if you believe in turning individual preferences, whims, fancies, fads, and choices into general rules for human society. But serious discourse on the moral rights and obligations involved in this issue often stalls over this wild-eyed belief that we are, every one of us, so different that we share nothing in common, not even our needs as human beings.

In the inevitable tension between the individual and the community, we generally find creative ways to fulfill most of our wants as individuals without disabling the needs of our communities. We’d feel cheated if it were otherwise because we expect the community to honor our individuality. That’s the American way. But why is it so hard to wrap our heads around the fact that we wouldn’t have individual rights if all of us, as a community, hadn’t made it so? As individuals, we only have two options for asserting our rights: either we blow everyone else away or we work together to create a society that protects all of us individuals together. 

Hobbes thought only the Leviathan of absolute monarchy could keep us from each other’s throats. Otherwise, our natural state would result in lives that were “nasty, brutish, and short.” Others, including Jefferson, thought that reasonable people could freely make decisions together that would benefit the one and the many. We have the freedom to voice our opinions, even our unreasonable ones, because of that deeply held belief.

Americans are nothing if not pragmatic; we usually take the cheapest, fastest, and most effective route to the solution. But we rarely speak the language of duty, especially about those we dislike or fear. Instead, we speak the language of economy, the lingua franca that unites us all in the glib glossolalia of capitalism, marketing, and public relations. If you really want to make the point about the need for ethics you must show why it is profitable to do the right thing. Doing the right thing simply because we’re convinced it’s the right thing will often draw blank stares, because as a society we’ve lost the capacity to imagine that two or more people could agree on a moral duty. But if you hint at loss of profits or a public shaming at the hands of the media you’ll be making sense.

So the argument for universal health care on that ground would take into account that we’re already paying the bills for the poor who are without health insurance. Requiring everyone to have health insurance that they can afford will lower the overall costs to our society; gradually building an emphasis on preventive medicine will lower the costs too. 

In the months leading up to the election we’ll be bombarded with propaganda from both sides. In order to divert the missiles and drones a lot of chaff will be blown into the air by publicists, lobbyists, campaign aides, and the candidates themselves. The Republicans are developing battle plans for a final assault on the Death Star of Obamacare. 

I’m going to lock on to a guidance system that allows two perspectives on the target: one is that people have basic rights as human beings—and adequate health care is one of them. The other is that the greatest benefit to all of us accrues when we all share the burdens. 

Marching to a Different Drummer

“Real generosity toward the future lies in giving all to the present.” — Albert Camus, The Rebel

When we are young we cannot see the point in moderation. It strikes us as timid, cautious, perhaps toadying to the powers that be; in any case, if we pull back or withhold we risk the derision of the socially graceful, those young gods whose spectacular failures are even more to be envied than their modest and expected successes. Thus, if you grew up in the Sixties in an evangelical community you were bound to hear the thrilling stories of prodigality, the dissolute life in a far country, the moment of coming to oneself among the pigs, and then the trembling but resolute return to the family. 
 
Those of us who listened to these stories, who never left home, found ourselves split unequally in three ways: we were in part rejoicing with the father that the prodigal had returned, we were wistfully longing to be the prodigal himself, and we were, in some measure, the resentful older brother, dutiful and dull, in whose constricted craw the younger brother’s tempestuous travels stuck like a bone. 
 
It wasn’t so much—at least not in my case—that we wished to actually smoke the holy weed that breathless news stories assured us was being consumed all around us, but that we lacked the cojones to step off the well-lit path and into the shadows. I had no hunger for drugs or alcohol—a deficiency I am now grateful for—mostly because I believed I had no brain cells to waste. Sometimes, with a tinge of envy, I listened to friends describe their trips, but for the most part my adventures were of the literary sort. Albert Camus, George Orwell, Tolkien, C. S. Lewis, Thoreau—these were my mentors. I had a poster on my bedroom wall with a quote from Thoreau: “If a man does not keep pace with his companions, perhaps it is because he marches to a different drummer.” I was pretty sure my drummer was different from the rest, although musically speaking he went by the names of Ringo or Ginger Baker or Crosby, Stills, and Nash, Joni Mitchell, James Taylor, Gordon Lightfoot and Carole King. 
 
But with the Sixties exploding all around us, and living almost within sight of San Francisco, Berkeley, and all things cool, I think those of us who had been raised Christians and at some point consciously chose to be Christians had to learn to listen to some drummers and not to others. I’ve always been grateful I grew up in the Sixties for it was one of those disjunctive moments in history that shakes everything up—art, music, politics, religion, mores, self-identity, national consciousness. Endless are the books on the impact of that era and fascinating the commentary on the persons who lived under the hot glare of the spotlights. Now, as many of those artists, musicians, and writers enter their 70s, we begin to understand their legacies. The body of work that many of them accomplished—those who did not burn themselves up in the process—now becomes visible. The pioneers of rock are now the old masters, even farther back in time than the Big Band era musicians were for those of us who came of age in the Sixties. 
 
Every generation has to leave home—sometimes in anger, sometimes with many a backward glance—but leave it must. It’s not for nothing that the central metaphor in most wisdom traditions is the Path or the Way; the idea of life as a journey is so self-evident as almost to be trite. Yet, in looking back we believe we see a pattern to our wanderings that gives us comfort while it still surprises us. “You can’t connect the dots looking forward,” said Steve Jobs in his now-famous commencement address at Stanford. “You can only connect them looking backward.”
 
Can we choose our rebellions when we’re young? I’d like to think we have the perspicacity to sign up for the ones that have the longest half-life, but I doubt we can see that far. “It is perfectly true, as philosophers say,” remarked Kierkegaard, “that life must be understood backwards. But they forget the other proposition, that it must be lived forwards.” But while we may not have clearly seen the road ahead, there was still something that was drawing us on to take this path and not that one. Sometimes we acted with conviction and urgency, other times with a sense that we had no other choice but to follow this particular track. Only as we got far enough down the road that we could look back did we begin to make some sense of it. And by then, of course, it was too late. . . So while we’d like to be able to say to the young just starting out, “Try to live in such a way that you don’t have to lie about where you’ve been,” it’s probably not going to be heard. We learn best by doing, not by memorizing, which is why history is still an important subject: it’s a way to connect the dots for those who are busy leaning forward.
 
In 1951 Albert Camus published his seminal essay, The Rebel. Reviewers called it a “piece of reasoning in the great tradition of French logic,” and noted that, “here is the voice of a man of unshakable decency.” In a shattered Europe after WW II Camus had the courage to ask, “What is a rebel? A man who says no, but whose refusal does not imply a renunciation. He is also a man who says yes, from the moment he makes his first gesture of rebellion.” The dilemma I saw was twofold: unthinkingly joining up in a mass movement can lead to tragic results, while refusing to extend oneself can lead to moral and creative paralysis. But on the other hand, the only way we come to know who we truly are is to put ourselves in situations where we are tested. Some tests we will fail, and we can only hope that we will fail upwards and not fall by the way in the process.
 
We may not, with clarity, be able to choose our rebellions, but we can choose to rebel against injustice, despair, fear. In the closing pages of The Rebel Camus’ voice rises in eloquence, leaving behind the cool cadences of his logic and sounding a note prophetic and courageous.
 
“For twenty centuries the sum total of evil has not diminished in the world,” he says. We might be tempted to turn away then and cultivate our narcissism. There are plenty who stand ready to help us indulge ourselves for a price. But rebellion, says Camus, can’t exist without “a strange form of love.” It is a love that does not calculate and is prodigal in its gifts to those yet to come. “Real generosity toward the future,” says Camus, “lies in giving all to the present.”
 
There is no future in the politics of resentment or retribution; to put aside the murderous impulses of power and history, he says, “a new rebellion is consecrated in the name of moderation and of life.” Camus could not believe in the Church’s kingdom to come nor could he devote himself to a secular utopia purchased through the blood of millions. “It is time to forsake our age and its adolescent furies,” he said.
 
As a Christian, I couldn’t agree more. This is a rebellion wide enough to embrace this Earth, our home, while choosing to rebel, in a thousand ways each day, against injustice in the name of courage and decency. “Do not be conformed to this world, but be transformed by the renewing of your mind,” said St. Paul, a rebel drummer worth marching with. 
 

Eric Hoffer, Talent Scout

“What we know with certainty is not that talent and genius are rare exceptions but that all through history talent and genius have gone to waste on a vast scale.” — Eric Hoffer, The Temper of Our Time

Eric Hoffer was a San Francisco longshoreman and something of a social commentator and philosopher. In 1951 he wrote a book, The True Believer, which became a bestseller. In his rough-hewn political sensibility and solid, linear style he was a folk hero to many.

I was in my teens when I first came across The True Believer, a book on fanaticism and mass movements. The fact that Hoffer was working down on the docks in San Francisco, only 75 miles from where I grew up, and writing books like that and The Temper of Our Time gave him a credibility that, by my lights, could only be matched by C. S. Lewis, Tolkien, and Edith Hamilton. I wasn’t all that discriminating in my reading, being subject to a syndrome that compelled my eyes to stray to print when not otherwise occupied. But Hoffer’s words rang true to me and he hit home with many a sentence laid down with deliberate care and an icy honesty.

His personal history was the stuff of a Dickens novel. Born in 1902 in the Bronx, he suffered blindness from the age of seven, two years after he and his mother fell down a flight of stairs. She never recovered and died the same year he lost his sight. Mysteriously, his vision returned when he was fifteen and he began reading voraciously, afraid that his blindness might return. By the time he was a young man his father, a cabinetmaker, had also died. With the $300 the cabinetmaker’s union gave him after his father’s funeral, Hoffer took a bus to Los Angeles, where he kicked around on Skid Row for ten years, failed at a suicide attempt, and became a migrant worker up and down California and other Western states.

Trapped in the Sierra Nevada Mountains for a winter season, he read a book he’d picked up on impulse before making the trek into mining country. Montaigne’s Essays opened his eyes to the possibilities of writing and learning. Over the course of a long and vigorous life as a longshoreman on The Embarcadero in San Francisco, he wrote 11 books, was the subject of a 12-part interview with radio station KQED in San Francisco, was interviewed twice by Eric Sevareid, and in 1983, three months before he died, was awarded the Presidential Medal of Freedom by Ronald Reagan. He never attended college or received any formal education beyond high school: he was a self-taught, self-made man through the acts of reading and writing.

One doesn’t have to agree with Hoffer’s sometimes stringent opinions to relish the way he can reframe an entire intellectual perspective. In a section of The True Believer he notes that, “The more selfish a person, the more poignant his disappointments. It is the inordinately selfish, therefore, who are likely to be the most persuasive champions of selflessness.” In a section on the poor as particularly susceptible to mass movements, Hoffer says, “Our frustration is greater when we have much and want more than when we have nothing and want some.”

Somebody once said, Nietzsche probably, that all philosophy is autobiography, and in Hoffer’s case that certainly seems to be true. Having pulled himself up by his own bootstraps and worked with calloused hands and a bent back most of his life, Hoffer had an abiding contempt for “intellectuals” and a stalwart admiration for “the masses.” He considered intellectuals effete, useless, and power-hungry. Most of them, in his view, were foreign; it was almost un-American to be an intellectual.

He despaired that the age of men of action was fading as around the world intellectuals pried power away from them. “By intellectual I mean a literate person who feels himself a member of the educated minority. It is not actual intellectual superiority which makes the intellectual but the feeling of belonging to an intellectual elite,” he said.

“One cannot escape the impression that the intellectual’s most fundamental incompatibility is with the masses,” he says. “In every age since the invention of writing he has given words to his loathing of the common man.” For Hoffer, the foreign intellectual is simply stymied at American resourcefulness. It’s not the intellectuals who built the dams, highways, skyscrapers, factories, cars, and airplanes in America. It was the solid, down-to-earth masses who showed what they could do without masters to shove them around. They built this country but somewhere along the way they lost it.

Hoffer was writing in the 1960s for Harper’s, The New York Times Magazine, Holiday, The Saturday Review, and Cavalier. Perhaps in those days, when students were occupying the president’s office at Columbia, striking classes at Berkeley, and demanding curricular changes at other campuses, he was understood as an oracle of freedom, a working-class hero who championed freedom of thought over against intellectuals who would stifle creativity.

Elsewhere in the world Hoffer saw the intelligentsia in Communist countries, as well as Asian and African nations, as the new colonialists. Having jettisoned Western masters, these new countries now found themselves ruled by native intellectuals, people trained in Western ways of thinking, who bent the thin reed of nascent freedom to their own advantage at the expense of their own people.  

Oddly, for someone firmly planted in the working class, Hoffer believed automation would free up the masses for more erudite pursuits. He also believed that intellectuals, with their heads in the clouds and their hands in the till, didn’t want working people to become affluent. Perhaps tongue in cheek, he envisioned a time when most manual labor had been turned over to machines and the people could finally educate themselves. 

Reading him today is a lesson in cultural metamorphosis and historical interpretation. Automation has accelerated production and trade, driven thousands out of work, and given millions access to devices Hoffer could not have foreseen. The intellectual elite, such as they are, now gamble with other people’s money on Wall Street, decide which new reality shows will draw the most eyeballs and occasionally figure out ways to make the world a better place for the masses. 

In today’s milieu much of what Hoffer said would gladden the hearts of Tea Partiers, deeply suspicious as they are of the liberal Eastern establishment. They might bristle, though, at his statement that “where a mass movement can either persuade or coerce, it usually chooses the latter.” 

But Hoffer’s enduring theme—and his signature contribution to American social and political thought—is his steadfast belief that ordinary Americans are capable of producing great things. “The American intellectual rejects the idea that our ability to do things with little tutelage and leadership is a mark of social vigor. He would gauge the vigor of a society by its ability to produce great leaders,” he says at the end of The Temper of Our Time. “Yet it is precisely an America that in normal times can function well without outstanding leaders that so readily throws up outstanding individuals.” 

He may be right, but the great conundrum facing us these days is that those who desperately want to be our leaders probably shouldn’t be there, while the ones who could do the job aren’t electable under the current system. 

My guess is that the country will be alright. No one is counted a great leader until after they’re dead. In the meantime, we’ll make do with people who have enough hope to try for the ideal and the courage it takes to achieve the possible. 

Skill Sets for Hire

“In a culture of disrespect, education suffers the worst possible fate—it becomes banal. When nothing is sacred, deemed worthy of respect, banality is the best we can do.” — Parker Palmer, The Courage to Teach

I stumbled across a blog this week by an investment advisor, Mike Shedlock, entitled “Mish’s Global Economic Trend Analysis.” He had written a post which revealed that for the first time ever a majority of the unemployed have some college education. Shedlock called the price of an education preposterous and gave us five solutions to dramatically lower the cost. These included killing the student loan program, cutting all state aid to colleges, increasing competition by accrediting more online universities, and busting the teachers’ union once and for all. But the one that really caught my eye was this:

“High school counselors and parents must educate kids that there simply are no realistic chances for those graduating with degrees in political science, history, English, art, and literally dozens of other useless or nearly-useless majors.”

After I stopped fuming and running through a long list of ad hominem arguments against this blinkered Philistine, I tried soberly, reflectively, and sympathetically to think like him. I didn’t get very far. As far as I can see he believes in education solely for its instrumental value in getting a job. After that . . . what? But education and learning are as different as a job and a vocation. 

I’ll give him this: the cost of education is scandalous, no question about it. The value of a college education these days is certainly disputable, and the efficacy of four to six years of the college experience toward getting a job is harder and harder to justify. But I balk at eliminating most majors in the humanities and social sciences. That they may not draw a straight line between the subject studied and the job landed is no reason to dump them. More often than not they become portals to many other opportunities.

I’ve often told my students that the grand purpose of college is to learn how to learn. Content and subject matter are certainly important, but what matters more is the ability to take in new information and make something of it. That’s what we should be teaching as students are learning English, history, art, political science, biology, accounting, and philosophy. Any of those subjects affords us the possibility of learning what it means to be human, how to adapt to changing circumstances, and what to live for. Do they lead to jobs? Of course they do. Nothing we learn is wasted if we know how to use what we’ve learned. 

But people like Shedlock are hammers looking for nails; they seem to believe that if you’re not supplying then the only alternative is to demand. And the Demanders, as we’ve so clearly seen recently, are the losers, the muppets, the dimwits who deserve to be ripped off by the smartest guys in the room.

Trying to imagine a curriculum built around Shedlock’s restrictions, all I could come up with was math, science, and business. Those would be the majors that would lead to jobs in health care, industry, investment banking, and insurance. Since there would be no community colleges, there would be no computer technicians, security or law enforcement officers, paramedics, plumbers, carpenters, electricians, nurses, or cyber-security specialists. State universities, many of them major research centers, would wither away, taking with them a plethora of important and necessary disciplines. And of course, there would be no designers, advertisers, or journalists. 

But there could be doctors trained online by the University of Phoenix or physicists with certificates from one of the many Massive Online Open Courses (MOOCs) recently touted by Thomas Friedman in The New York Times, a scenario in which 100,000 students take a course from, say, Stanford or an accredited for-profit university based in the Cayman Islands. Multiple-choice software-graded exams do the heavy lifting and students around the world can help each other when the teacher is asleep. 

I’m still trying to figure all this out. I have no doubt that online universities will continue to have an important place in the education of millions. They may even come to be the norm. And the industry that is Higher Education will need to re-vision its mission for learning instead of trying to become the Disneyland of Skill-Set Training. Above all, we need to remember that the unexamined life, as Socrates said, is not worth living. According to some, those who examine life are not worth hiring.

Are We Evolving Yet?

“All kinds of images forever float  
About us everywhere, and some are born
Of their own generation in the air

And some have more substantial origin
And some are compounds of two things or more . . . .” 
— Lucretius, The Way Things Are (trans. by Rolfe Humphries)

Lucretius was marveling, in the context surrounding the passage above, at the many inventions of the mind—Centaurs, Scyllas, hounds of Hell—and reminding us, rather archly, that these things don’t exist except in our minds. And the mind is, in his words, “very delicate and sensitive.”

Lucretius was referring to illusions and our endless capacity to make bogeymen out of a few threads of this and that, sewn together with fear and animated out of dreams when reason sleeps. 

Such has been the bogeyman of same-sex marriage for politicians who, putting reason aside, must accede to what their loudest constituents denounce. And then there’s Joe Biden. 

In a move which must have shaken the White House and its staffers, Biden said on ‘Meet the Press’ that he was “absolutely comfortable with the fact that men marrying men, women marrying women, and heterosexual men and women marrying one another are entitled to the same exact rights, all the civil rights, all the civil liberties.” That’s not an outright endorsement of gay marriage; instead, it’s an absolutely clear statement that the real issue is over denying a group of people their civil rights. 

The President’s people were quick to put some air between Biden’s remarks and the President’s ‘evolving’ position, but by the middle of the week the President had declared himself supportive of same-sex marriages. Was it a historic announcement, akin to FDR signing the National Labor Relations Act in 1935 and Lyndon Johnson clearing the way for the Civil Rights Act of 1964?

It’s too soon to tell, but some public policy experts and presidential historians are hailing it as a major step in civil rights. 

Peter Dreier, E. P. Clapp Distinguished Professor of Politics at Occidental College, believes that such changes are inevitable. In a blog post for the Huffington Post, Dreier cites poll after poll showing increasing support for gay rights, among them the right to marriage. We’ve come a long way, says Dreier, in overcoming prejudice and fear in this area. 

According to surveys conducted by the Washington Post and ABC News, a majority of Americans, 52 percent, now say marriage should be legal for gay and lesbian couples. For those born between 1965 and 1980, 50 percent believe gay marriages should be legal. For Americans born after 1981, fully 63 percent support the legalization of gay marriage.

Was the President courageous in taking such a stand? Most of the reaction, it seems, was about the effect his statement would have on his chances of reelection. Whether he boldly went where no president had gone before or whether he prudently stated the obvious, the fact is that he threw the Romney camp—and Republicans in general—on the defensive. By recognizing it as a moral decision, not just a political calculation, Obama put the issue on firmer ground than mere ideology. In admitting his own gradual process he gave Americans a reasoned model for a significant change in one’s thinking. 

There are at least two ways to regard this as more than a political IED. One is to claim it as an abomination from a religious perspective, a view based on a handful of texts in the Old and New Testaments. From this point of view the issue of gay marriage isn’t the problem, homosexuality is. The argument wouldn’t even get as far as civil rights denied. Since homosexuals have given over their humanity by committing such unnatural acts, the question of human rights is a moot point. Such ‘people’ do not qualify for equal protection under the law. 

Another perspective is to separate it from its religious bindings and to regard it in a civil and secular light. If looked at in this way it is a moral issue, the denial of significant civil rights to a segment of the population that has been demonized and derided for decades. 

For many thoughtful Christians this might appear as an ethical dilemma, a troubling choice between two apparent goods: the authority of the Bible vs fairness and justice for all. While this is not the venue for a Biblical exegesis on the subject, it is clear that theologians and Biblical scholars do not have a consensus on the Bible’s teaching about homosexuality. The word never appears in the Bible, for one thing, but more significantly, where the practice is condemned it’s usually in the context either of God’s command to populate the land the Hebrews had taken from the inhabitants or it’s a reaction to the degrading practice of pederasty in Greek and Roman cultures. Nowhere in the Bible is a monogamous, committed, and loving relationship between two people of the same gender ever portrayed. There are a number of reasons for this, first among them that the cultures would not have permitted it, and they did not permit it because it had no utility for the propagation of the species and the life of the community. Where survival and cultural identity are threatened such relationships are viewed with suspicion and fear. 

But morality and ethics are fluid elements in human history. Once it was considered right and proper to stone people to death for religious infractions; now most cultures find that repugnant. There was a time when white Christians found Biblical support for owning slaves. That support was refuted and the larger issue of the dignity of persons and love for other persons won the day. 

When religions clash with the historical evolution toward fuller and deeper human rights we should err on the side of human rights. I say this because I believe that true religion is, as the Bible puts it, ‘to care for the widows and orphans.’ That’s not all religion is good for by any means, but it’s certainly the point at which all of us could do better. 

In the late 1970s, when I was in graduate school at the School of Theology at Claremont, a question about gay marriage was put to our teacher during class. The professor, the son of Methodist missionaries to China, a man who was a minister and a theologian through and through, a philosopher who was a leading exponent of process theology, an activist who was a pioneer in a biblically-based environmentalism, thought for a moment and then said words to the effect that, “I believe God wants us to experience the joy of a deep, committed relationship within marriage. Why should gay or lesbian couples be denied that kind of relationship?”

The question startled me then for I could see no argument against it. All these years later, having known and admired such couples, having seen their struggles and their triumphs in married life, I still can’t. There is beauty and strength in the quiet return to each other at end of day.

Miracle doesn’t lie only in the amazing
living through and defeat of danger;
miracles become miracles in the clear
achievement that is earned. 
— Rainer Rilke, from Just as the Winged Energy of Delight

Death of an Uncommon Man

“And so I appeal to a voice, to something shadowy,
a remote important region in all who talk;
though we could fool each other, we should consider—
lest the parade of our mutual life get lost in the dark.” — William Stafford, A Ritual to Read to Each Other

When Michel de Montaigne (1533-1592), Renaissance statesman and the father of the modern essay, was thirty-six, he had a near-death experience. He was riding in the forest with three or four companions, servants in his household, musing over something intriguing to him, when suddenly he took a tremendous blow to his back, was flung from his horse, and landed ten yards away, unconscious. It seems that one of his men, a burly fellow, had spurred his horse to full gallop to impress his friends, and had misjudged the distance between himself and his master, inadvertently knocking Montaigne and his little horse off the path. 

Sarah Bakewell tells the story in her book, How to Live: Or A Life of Montaigne. At the time, Montaigne felt himself to be drifting peacefully toward eternal sleep, although he was actually retching up blood and tearing at his belly as though to claw it open for release. For days he lay in bed recovering, full of aches and grievous pains, marveling at the experience he’d had and trying to recall every moment of it. It changed his life, which, until then, had been dedicated to learning how to die with equanimity and grace. 

In an essay on death, written some years after the incident, Montaigne rather offhandedly sums up the lesson, “If you don’t know how to die, don’t worry. Nature will tell you what to do on the spot, fully and adequately. She will do this job perfectly for you; don’t bother your head about it.” 

Bakewell notes that this became Montaigne’s answer to the question of how to live. In fact, not worrying about death made it possible to really live. In an era in which a man of thirty-six could, by the limits of those times, see himself on the verge of getting old, the contemplation of death had been refined to a high art. Montaigne picked this up from his voluminous study of the Greek and Roman classics, his admiration for the Stoics, like Seneca, and the Roman orator, statesman and philosopher, Cicero, who famously wrote, “To philosophize is to learn how to die.”

Death was an obsession for Montaigne when he was in his twenties and early thirties. In succession, his best friend died of the plague in 1563, his father died in 1568, and in 1569 his younger brother died in a freak sporting accident. In that same year Montaigne got married; his first child, born that same year, lived only two months. Montaigne lost four more children, only one of six living to adulthood. Yet, in spite of all that early sorrowful practice, he had grown no easier with death. 

It wasn’t until his near-fatal accident that he began to understand how little our own death need affect us. His experience of it was one of peaceful release; he had almost kissed Death on the lips. From that experience he gradually migrated out of fear of dying to being engaged in living and learning how to live. 

Some of this came to mind today while I was immersed in thought at the funeral of a friend, a man well-respected in my community, who was Chair of the Department of Anthropology at the Smithsonian, author of over 125 scientific articles and books, and once voted by the Washington Post and Washingtonian Magazine as one of the 25 smartest people in Washington, DC. 

He had balanced a life as a scientist in constant discovery-mode with being a husband, a father, a member of a church, and chairman of the local school board. In his sudden death, we mourned the loss of a man who made life look effortless, achievement and highest honors a matter of diligence, whose passing left a body of work and a legacy to be admired. 

I remembered him as being kind, forthright, clear-eyed, and honest, a man who generously took the time to ask questions of himself and to probe for answers together. 

Our friend understood, said the minister in his homily, that we do not travel this life alone. As a scientist he worked with others; as a member of a faith community he struggled with matters of conviction and truth; as a man he knew that we do not grieve alone. Neither a sentimentalist nor given to emotional displays, he made honesty and integrity his benchmarks for a life with others.

So little time in life. . . so much to live into! Montaigne turns from preparing for death to living a conscious life in a way that remarks upon itself. In the lens of his self-reflection he gives us a mirror for ourselves. In his boundless curiosity about life our friend, Don Ortner, rendered Death almost an afterthought. Be honest, live simply, trust fully, do good work: it’s essential, these men said, to stand for life in the midst of death. 

Just so, the William Stafford poem quoted above ends with this stanza:

For it is important that awake people be awake,
or a breaking line may discourage them back to sleep;
the signals we give—yes or no, or maybe—
should be clear: the darkness around us is deep.

Economethics or Can’t Buy Me Love

“The most fateful change that unfolded during the past three decades was not an increase in greed. It was the expansion of markets, and of market values, into spheres of life where they don’t belong.” — Michael J. Sandel, What Money Can’t Buy: The Moral Limits of Markets

Once in a while a book hits a resonant tone within one’s life, enough so that you want to exclaim, “Yes! This is what I’m saying.” Such a book is Michael Sandel’s What Money Can’t Buy, and the tone he hits is that we live with the assumption that everything has its price and there is nothing that money can’t buy. Examples abound: the Dallas school district that pays second graders $2 for every book they read; the practice of paying Indian surrogate mothers upwards of $6,250 to carry a pregnancy; the right to shoot an endangered black rhino on a game farm for $150,000, and on and on. 

Sandel’s argument, carefully considered and reasoned, is that utilitarian arguments for letting the market dictate the most efficient way to fulfill our wants lead to inequality and corruption. Inequality, because if everything is for sale, those without the means end up suffering even more. And corruption because pricing certain goods in life changes and distorts our perspective on the value of those goods.

If all he had done was to point out such instances, that would be interesting enough: there is apparently no limit to the imagination of people bent on making a buck, no matter the moral cost. But what Sandel has done is to question the assumption that powers the engine of capitalism and that shapes our culture to such an extent that we even subject our relationships with others to a cost-benefit analysis. Moreover, such an analysis is our unthinking default position. You know we’ve succumbed to a virulent ideology when we struggle to feel outrage at the fact that corporations pay to be allowed to continue polluting or that the unflinchingly arrogant can hire someone to do their apologizing for them. By today’s cultural values that’s known as a ‘win-win’ situation. You have a need and a fistful of cash; I have the answer and a need for your cash. We exchange—and everybody wins.

But in that transaction, so transparently justifiable these days, is a tiny pellet of cynicism about the moral meaning of values. To change the metaphor slightly, we drop our values into a volatile bath of corrosive chemicals that leave them leached and useless. 

“We corrupt a good, an activity, or a social practice,” says Sandel, “whenever we treat it according to a lower norm than is appropriate to it.” Thus an organization based in North Carolina, called Project Prevention, will pay $300 to drug-addicted women to be sterilized. The founder, Barbara Harris, says, “I’ll do anything I have to do to prevent babies from suffering. I don’t believe that anybody has the right to force their addiction on another human being.” 

According to market logic the transaction increases the social utility for all parties: the addict gets $300 for giving up her ability to have children, and the organization has the satisfaction that one more drug-addicted baby will not be born into the world. What’s not to like? 

Sandel points up two objections. The first is the criticism that this constitutes a form of coercion: offering $300 to a drug addict is an offer she can’t refuse and thus she is not acting freely. 

The other objection centers on this as a form of bribery. Public officials who accept bribes demean and degrade their office by applying a lower norm to it than is appropriate. 

Whether or not this deal is coercive, say critics, it is corrupt because both the seller (the addict) and the buyer (Harris) “value the good being sold (the childbearing capacity of the seller) in the wrong way.” Sandel continues: “Harris treats drug-addicted and HIV-positive women as damaged baby-making machines that can be switched off for a fee. Those who accept her offer acquiesce in this degrading view of themselves. This is the moral force of the bribery charge.”

Behind these examples lies the real heart of Sandel’s argument with economists: their claim that they only explain behavior but don’t judge it simply cannot be supported. Whether they like it or not, they are entangled in moral decisions constantly. The market is not value-neutral but is shaped and influenced by cultural norms. If that were not the case we’d still be buying and selling slaves, since on a purely utilitarian basis slavery increases efficiency for both the buyer and seller. But for the slave it is a horrible and undeserved punishment because it deprives that person of the respect and freedom due to human beings. If the utilitarian approach aims at the greatest good for the greatest number, then it hits a wall on this one and many others like it.

In considering this I’d like to coin a new word: economethics—the discipline that studies the ethical implications of economic theories. If ours is a market-driven culture, as Sandel and many others claim, then such a study would be essential. It might keep us questioning whether we really want to gauge the worth of actions and relations and people solely by their pecuniary value (from the Latin pecu, meaning ‘flock or herd or cattle’). 

But we don’t have to wait for the formal recognition of this field. We can begin the resistance to the reigning ideology by simply practicing the Golden Rule, a form of which has been around in all the major religious faiths since the Axial Age began circa 500 BCE. ‘Do unto others as you would have them do unto you.’ Priceless!

Seeing the Whole Together

“Teaching is an art, and an art, though it has a variety of practical devices to choose from, cannot be reduced to a science.” — Jacques Barzun, Begin Here: The Forgotten Conditions of Teaching and Learning

“On the face of things, there is no art of teaching. Teaching is, rather, an aspect of all arts; as a division of each art, it cannot be considered an art itself.” — Robert Grudin, The Grace of Great Things: Creativity and Innovation

This is the season when alumni from all walks of life wend their ways back to campuses around the country. They will be variously shocked, discouraged, amused, and maybe intrigued by the changes they see in the old school. 

One thing has not changed, however, and that is the constant question about the value of education in America. Alumni, students, parents, legislators, and teachers ask the question, over and over again in a myriad of ways. The anxiety betrayed by the asking suggests that we have no clear idea what we want out of education. The fact that American students consistently do not place in the top ten worldwide in any subject category is a cause for consternation.

Meanwhile, according to the New York Times, Finland’s students continue to place near the top in international tests of math, science, and reading while the US ranks 27th in science, 19th in math, and 15th in reading. Handwringing and derision are indulged in, and delegations of American educationistas make the trek to Finnish schools to learn their secrets. Finland has fewer students nationwide than New York City has—600,000 to New York’s 1.1 million—much more homogeneity, far less poverty, and the average resident checks out 17 books a year from the library. These are disparate facts; jumbled together they create a somewhat misleading portrait of both countries. 

To read educational surveys and official reports—and they are Legion—is to enter Alice’s Wonderland, minus the humor and heavy on the politicization. To recall the basics about teaching and learning I often read Jacques Barzun and Robert Grudin, the two quoted above. These particular sentences, plucked from their context, make it seem that agreement cannot be found among teachers, especially about the nature of teaching. 

But while they may differ on the details they are both, I believe, homing in on a key point: that teaching in its most fundamental and noblest form is about confronting students with what lies outside their narrow concerns. Grudin says, “To learn is not merely to accumulate data; it is to rebuild one’s world,” and Barzun, whose contempt for the latest methods in teaching is unreserved, speaks of the “difficulties,” not the “problems” of teaching. “It will always be difficult to teach well, to learn accurately; to read, write, and count readily and competently; to acquire a sense of history and develop a taste for literature and the arts—in short, to instruct and start one’s education or another’s.” 

Grudin adds a nuance to this by noting that teaching is intended, when done well, to shock the learner with a sudden juxtaposition of the new alongside the familiar. “True teachers,” he says, “all seem to practice, in many ways and under many guises, this form of shock. . . . Good teaching develops students’ creative abilities by unlocking their sense of wonder. Students learn creativity not directly from the teacher but from the cathartic self-revelation that the teacher inspires.”

When was the last time you felt the ‘shock of the new,’ that bolt of excitement when you realized you understood something that had seemed impenetrable only moments before? Did that happen in a classroom? If so, you are blessed with a rare experience. 

Both Grudin and Barzun recognize that teachers of this sort are few. It is a convenient truth that many students come to college woefully unprepared, some without any apparent study skills and most without any curiosity about the way the world works. But that alone is not reason enough for mediocrity in teaching. 

There are other reasons why teachers might not give their best day after day in the classroom. One is the sheer size of some classes, when sections of a single course can number 500 or more. Another is the fact that over 60 percent of college teachers these days are adjuncts, a peculiar existence in which one dashes from campus to campus for classes, has no office, and is paid, without any benefits, a fraction of what full-time teachers make. Still another reason is that studying is just one of many activities in a student’s life. Most of them work, some full-time, and squeeze classes in around their work schedules. A good number are student athletes, another form of work which requires long hours of practice, road trips during the semester, and days missed for injuries. 

Yet another reason is that most students regard a college education as the means to a job, the collecting of ‘skill sets’ which will fit them nicely to step into harness at a variety of locations throughout a lifetime of work. Education, then, is a series of hoops to hop through, obstacles to avoid, and a system to game with the least amount of mental effort. They have been encouraged in this by business leaders, by family members, and by educational administrators who regard them as ‘customers.’

The natural alternative to this way of thinking is to see education as an end in itself, something done without any regard for practicalities. This viewpoint rightly draws heavy fire from almost everyone who has ever paid bills, managed a household, and held a job. But if formal learning is more than training for a job or personal indulgence then what is it?

Robert Grudin draws the contrast between the Sophists and the Socratics. The Sophists disdained any learning that did not lead to the specific and the practical. They would have felt right at home with students who are training to hold a specific occupation. The Socratics, on the other hand, believed in a liberal education that could transcend the specific and the merely practical. Only by gaining a wider and broader perspective, they argued, could a person become truly practical. Life demands of us the ability to see the forest and the trees, indeed, the tree and the leaf. A liberal education gives us the ability, they thought, to understand why the big picture is made up of many pixels, to use a contemporary metaphor. It is an interdisciplinary body with curiosity at its heart and enthusiasm right out to its fingertips. It is literally a vision or visualization (the Greek word is ‘synoptic,’ meaning ‘to see the whole together’) of the world from diverse points of view. 

“Forget Education,” says Barzun, ironically and good-naturedly. “Education is a result, a slow growth, and hard to judge. Let us talk rather about Teaching and Learning, a joint activity that can be provided for, though as a nation we have lost the knack of it.” 

Wise words for those who would cast themselves as life-long learners. In this season of alumni reunions, find a teacher who opened the world to you and thank him or her.

That’s the kind of teacher I’ve yearned to be. After three decades of teaching, I’m still learning. To lift a phrase from St. Paul, “It is not to be thought that I have already achieved all this . . . but I press towards the goal.”