Monday, March 4, 2013

The Bishop of Rome



I am not Catholic.

I don't see anything wrong with Catholicism. Many of my friends are Catholic. I have attended Catholic services. I was even a pallbearer at a Catholic funeral once.

I say all that merely to establish the fact — beyond any doubt — that I have virtually no credibility when it comes to saying anything about the pope. None. Consider yourself warned. Take anything I say on this subject with the proverbial grain of salt.

Especially if you're Catholic.

See, on this matter, I feel a lot like Frasier Crane must have felt when he found himself at a Jewish shiva. One of the guests — well, several, actually, but I'm thinking of one in particular — observed, "You're not Jewish, are you?" when it was clear from his unfamiliarity with Jewish mourning customs that he was not.

Frasier replied, "Well, my ex–wife is Jewish, which means our son is half Jewish, which makes me — no, I'm not Jewish."

So I guess I'm sort of cutting to the chase by acknowledging up front that I am not Catholic.

I was raised in the Methodist church, but I suppose I have had more exposure in my life to a greater range of religious faiths than most people. My father was a religion and philosophy professor, just like his father before him, and, when I was a child, my family often attended religious services in other faiths. My father knew most of the religious leaders in the area, and we attended services at least once in every faith that was represented in central Arkansas in those days — regardless of the size of the congregation.

In spite of all that — or, perhaps, because of it — I am not especially religious today. I'm not really sure why that is so. Deep down, I think I believe that there is some sort of greater power, but my interpretation of God and the afterlife seems to be quite different from that of most people.

It is not my intention to persuade anyone that he or she is wrong about any spiritual matter since I don't know for certain what lies beyond. Never has been. Isn't now. And, while I cannot see into the future, my guess is it will continue to be that way.

But even if I have little or no credibility on religious issues, that doesn't stop me from having an opinion on Pope Benedict's decision to step down.

I thought it was a courageous decision — and yet another example of what a pope can teach us.

Thanks to the nearly three–decade papacy of John Paul II, there haven't been many popes in my lifetime. In fact, the upcoming conclave, in which the next pope will be chosen, will be only the fourth in my memory.

But it will be the first of its kind in the memories of all living people, no matter what their faiths may be.

The last time a pope resigned, William Shakespeare hadn't even been born. For 600 years, popes have left office only through death. In most families, you would have to go back a dozen generations — if not more — to find the ancestors who were living when a pope resigned.

But Benedict has shown Catholics and non–Catholics that it is all right — even preferable — for a pope to accept the fact that he is not infallible when it comes to the natural aging process, that while he may be seen as infallible when it comes to matters of faith, he is not immune to matters of the flesh. When that process interferes with a pope's ability to face the challenges confronting his church (which has more than 1 billion members worldwide), a wise pope needs to step aside and let someone else do the heavy lifting.

Nearly eight years ago, when John Paul II died after a long, painful and extremely public physical deterioration, it was often said that he showed everyone — Catholic and non–Catholic alike — how to die with dignity. I felt at the time that there was much truth in that, but I also felt that he had forced his church to function without an effective leader while it waited for him to die.

I recall thinking — a year or two before he died, perhaps longer — that the Catholic church needed to have some sort of mechanism through which a pope whose physical or mental capabilities were in the inevitable decline of old age could step aside.

I didn't realize it was possible for a pope to resign. No pope had resigned since Gregory XII in the 15th century. I had always assumed that was part of the deal. When a man became pope, I thought, it was with the understanding that he could not become a pope emeritus.

But Benedict has shown that it is possible for a pope to become a pope emeritus — and put the interests of the church above his own.

It is a wise man who recognizes when it is time to go, to hand the torch to the next one in line. Benedict is to be commended for his selfless act.

As the papal conclave begins, I hope — for the sake of all my friends who are Catholic — that the cardinals will choose a pope with the vitality and the strength to lead his church into the challenges of the 21st century — and to deal with the unfinished, sometimes messy, business from the 20th century.

Friday, July 20, 2012

The Tragedy in Colorado



It is still Friday, July 20, 2012.

The smoke (please excuse the pun) is still clearing in Colorado following the shooting at the midnight showing of the new Batman movie.

In the coming days, I am sure, more will be known about the gunman, what led him to his heinous act and the eventual death toll than is known now. Please keep that in mind as you read this because, if you are reading this sometime in the future, some of the facts are sure to have changed.

There are bound to be things that are thought to be true as I write this but will later be proven false. Already today, the death toll has been revised downward, but it may well rise in the coming days. There are still people fighting for their lives, and some may lose that fight. The police are trying to figure out how to get into the shooting suspect's apartment and disarm the booby traps the suspect left behind, and I've heard some people say that process could take days or weeks.

But something that will not change is the fact that, once again, the relative peace of daily life for most Americans was shattered.

This morning, after I had first heard of the shootings, I went on Facebook, where I found that one of my former journalism students — now the executive editor of his hometown newspaper — had shared his paper's Associated Press account of what had happened in those early morning hours when most of us were sleeping in blissful ignorance.

And, in the comments section that accompanies just about every article that is published online these days, a young person, identified as a student at the local high school, commented, "I don't understand why the world has to be like this sometimes."

And I was reminded of a question I was asked two decades ago, when that editor and I were on a college campus together. One of his classmates had observed the number of religious leaders who had taken exception to various accounts of Bill Clinton's ethics and aligned themselves with the likes of Patrick Buchanan, who had delivered a speech that was extremely long on intolerance at that summer's Republican convention, and he asked me about the logic behind their position.

"I thought ministers were supposed to be about love and forgiveness," he said to me.

The same thought that crossed my mind on that occasion crossed my mind this morning when I read that young man's comment — the naivete of youth.

I guess we all start out that way. It reminds me of a conversation Daphne had with her father on the TV show Frasier. Her father told her that he was splitting up with Daphne's mother for good, and Daphne, disillusioned and disappointed, said she had always believed that love conquered all.

"We all believe that when we're young," her father replied, "but then life beats us around a bit, and you learn to dream a little smaller."

There may be a lot of truth in that statement, but July 20 has always struck me as a date when the stakes have been even greater than usual — and, consequently, the hopes have been a bit grander, too.

July 20 often seems to bring memorable events. When I was a child, men walked on the moon for the first time on a July 20. When my parents were still young, a plot to assassinate Adolf Hitler failed on this date — an event that wasn't nearly as positive as the moon walk but, in some ways, was more significant.

(Hitler would be dead within a year, anyway, but, if he had been killed on this date in 1944, it is probable that many lives would have been spared. How many? No one can say.)

When my grandparents were children, the Ford Motor Company shipped its first car on this date. Henry Ford's assembly line concept radically transformed the 20th century.

There are other dates like that on the calendar, dates when great strides were made in medicine, manufacturing, agriculture, whatever.

But sprinkled among them — and sometimes, as is the case today, coinciding with them — are days of unspeakable and unexpected terror and anguish.

Such occasions do not always feature a lone gunman. Sometimes it is other things.

But no generation is immune to shocking reminders that life is not fair.

It hasn't been fair when young people have died in what should be one of the safest settings outside their homes — their schools.

It wasn't fair that a crew of astronauts that included the woman who was slated to be the first teacher in space died in a fiery explosion less than two minutes after liftoff.

Nor was life fair when prominent people were being gunned down and race riots were breaking out in the 1960s.

I have often pondered why it is that some people die so young and others live into their 80s or 90s. There must be more to it than the cliche "the good die young."

The more religious among us will tell you that it is all part of God's plan and that we are not intended to understand God's reasoning.

That's just as well, I suppose, because I used to get headaches trying to figure out God's reason for allowing babies to perish in the Oklahoma City bombing.

The only reason I can think of is sheer randomness. I'm sure there were people at that movie who were there only at the request of others; maybe some of those people were hurt or killed — perhaps only because they were being polite to someone else.

I've heard of unaccounted–for servicemen who were undoubtedly prepared for the possibility of dying in a foreign land but more than likely never gave it a thought while standing in line at a movie theater.

The fact that this kind of thing has the power to paralyze virtually the entire country with fear tells you how rare — comparatively — such a thing really is here.

This summer, I've been re–reading Truman Capote's brilliant nonfiction novel "In Cold Blood" about the massacre of a Kansas farm family in 1959. "[D]rama, in the shape of exceptional happenings, had never stopped there," Capote wrote.

(Aurora, Colo., is considerably larger than Holcomb, Kan., was in 1959, but I suspect that the same thing could be said of Aurora.)

At one point, Capote observed that visitors to Holcomb noticed that almost all the lights in town were on late into the evening.

Capote asked, "Of what were they frightened?" and supplied the answer he had received over and over: "It might happen again."

That is an irrational fear, of course, but it's one that some people do have in these situations. I saw an online poll this morning asking people if they were more or less likely to patronize a movie theater this weekend. Thousands of people responded that they were less likely.

People living in Israel have long been accustomed to the idea that the store where they were shopping or the restaurant in which they were eating or even the road upon which they were traveling could erupt in violence at any minute.

When that happens, they mourn their dead, too, but they move on much faster than we do.

Here in America, I expect our national conversation to focus on Aurora for weeks — in spite of the Olympics and, perhaps, in spite of the conventions.

That will be a good thing if it leads to constructive conversations about what can be done to minimize the risk of such a thing happening again without trampling on constitutional rights.

But already today I have heard people, on both sides of the divide, arguing that the gunman had a political agenda.

Such talk can have no purpose except to contribute to what is already shaping up to be the dirtiest presidential campaign in my memory.

And that we do not need.

What we need is a discussion about how to reduce the possibility of violence intruding on our daily lives.

It probably cannot be eliminated.

But maybe it can be curtailed.

Tuesday, August 2, 2011

Do You Really Want This Job?

That's what I want to ask anyone who runs for president.

I mean, I know full–time jobs are hard to come by these days — and that one does pay well. It also comes with a well–furnished, rent–free home in which to live for four years and an airplane that will take you anywhere you want to go.

But, other than that, I really would like to know why anyone would want to be president.

It wasn't that way when I was a kid.

I was always interested in the presidents when I was small. Looked up to them, I did. When I was little, I committed to memory all the presidents in chronological order — don't know why I did that, couldn't even hazard a guess. Nevertheless, I did that when I was in first grade. True story. I had been given a set of cards with each president's portrait on one side and brief biographical data on the other. Sort of like baseball cards — but with commanders in chief.

Somewhere I got the idea that serving as president was the greatest, noblest thing to which a person could aspire. My parents and my friends would tease me about coming to visit me in the White House. There was even a time when I believed I would be president one day.

But I gave up on that idea a long time ago.

To seek the presidency, I believe, requires an unholy alliance of selflessness and egotism. It is a combination one rarely finds in garden–variety occupations (outside of politics). The successful application of those two personality traits is rarer still.

A president must be incredibly selfless, willing to accept the nearly constant scrutiny, the almost total lack of a private life that comes with the territory — and, simultaneously, he must possess an enormous ego to think that he can wear all the hats one must wear as president.

The expectations really are incredible. No human being could possibly live up to all of them — but that hasn't kept some from trying.

Some, of course, haven't even tried.

But most who have tried to be all things to all people — and most who have not tried to be anything — have not succeeded in the Oval Office.

The successful presidents, the ones who are remembered by history, typically are remembered for their strengths in spite of their weaknesses. They carved out their niches. You know what I mean — Lincoln, for example, is remembered as "Honest Abe" and "The Great Emancipator" (even though abolishing slavery was not his initial goal when the Civil War began).

The unsuccessful ones tend to obtain less flattering nicknames.

In the first 2½ years of his presidency, Barack Obama has shown time and time again that among his top priorities is a desire for bipartisanship — preferably while maintaining a certain amount of distance, which gives the appearance of elitism to some.

(Sometimes he reminds me of Frasier Crane, who was once the subject of an unflattering limerick that was scrawled on the men's room wall at work and who sought to prove he was just one of the guys by inviting all of his colleagues, most of whom he did not know, to a party at his place.

(But his quest for acceptance backfired on him. His colleagues embraced him a little too warmly, and Frasier lamented the end of the days when he was "unapproachable" to most of the people in the office. "Couldn't they have sent just one representative?" he asked.

(After weeks of dealing with the likes of John Boehner and Mitch McConnell and their minions, I have to think Obama would sympathize.)

Maybe it was something else. I suppose, when Obama's presidency has been over for several years and historians have had the opportunity to assess every angle of every action and the long–term consequences, there will be an answer of some kind.

But what seems clear to me, at this point, is that Obama squandered much of his political capital on inconsequential fights early in his term when his party handily controlled both chambers of Congress, leaving him with little in reserve when he really needed it — on this squabble over the debt ceiling (which still isn't resolved as I write this, by the way).

On the surface, one can say that Obama probably did about as well as he could have hoped for. Neither side was going to get everything it wanted, but the catastrophe that he and others feared probably has been avoided.

If the crisis really has been resolved, it can't be seen as a triumph for either side. Both sides will spin it to their best advantage, but the truth is that it never should have come to this in the first place.

The real "winners" — if it can be said that there were winners — were the millions of Americans whose existences may have been made much more difficult if nothing had been done.

Some people will say that was leadership — although, publicly at least, Obama remained at arm's length of the debate and let others do the heavy lifting. Meanwhile, the rhetoric from both sides remained quite partisan, suggesting that even more intense debates lie ahead in the 2012 election cycle.

Given the partisan tone of this debate, Obama couldn't be surprised at the criticism that has come his way from the right and the center. But he might be somewhat taken aback by the wrath that has come from the left.

Many Democrats in Congress aren't happy, and neither are columnists who are usually supportive of this president.

There was a double whammy for Obama in the almost always supportive New York Times.

Paul Krugman wrote that the president surrendered under pressure and there is more to come. A precedent has been set, he says, that will endure beyond the Obama presidency.

"[H]ow can American democracy work if whichever party is most prepared to be ruthless, to threaten the nation's economic security, gets to dictate policy?" he asked.

Another of Obama's followers, Maureen Dowd, may have been even more damning. She wrote that Obama — in the eyes of an unnamed Democratic senator who, presumably, will be among those to vote on the deal today — is "turn[ing] into Jimmy Carter right before our eyes."

Such is the fickle nature of American politics. When Obama was elected and about to take the oath of office, he was compared to Lincoln and FDR. As his first year in office dragged on, the comparisons dropped to less historically impressive (flawed but successful) predecessors.

When the discussion deteriorates to comparisons with one–term presidents, I would suggest that you investigate the source. It's usually someone with an axe to grind. But Dowd was on board the Obama bandwagon before there was a bandwagon. She was an Obamaphile before Obama was cool.

And, over at the Washington Post, which has been nearly as supportive of Obama as the Times, Greg Sargent reminds those in power that, once the latest distraction is behind us, it's time for that long–promised "pivot to jobs."

That makes me think there may be some dissension within the ranks. At best, with his approval rating languishing in the 40s, Obama must realize that the 2012 election will be much closer than the one in 2008. States like Indiana, North Carolina and Colorado — which narrowly voted for Obama but rarely vote for Democrats — are highly unlikely to vote for him again, and it will be touch and go in a lot of other places, too.

When liberals start saying a liberal president surrendered and is taking on the look of the Democrats' last one–term president, it could well lead to a challenge from within Obama's party (it isn't too late for a challenger to emerge, however unlikely he/she would be to succeed) and a possible victory for the other party in the general election — even if the opposition nominates someone generally seen as an extremist.

That was what happened to Carter — who entered the presidency on a wave of popularity that was similar to Obama's but is remembered by some as "President Malaise."

It isn't a fair characterization. As I have written here before, Carter never used the word malaise in his now–infamous speech from July 1979. But these things take on lives of their own.

It really makes me wonder why Obama would want to spend another four years in the White House — or why any of the Republicans who are challenging him would want to take his place.

Monday, May 11, 2009

The Journey to Acceptance



Forty years ago, Dr. Elisabeth Kübler–Ross published a book titled "On Death and Dying," which identified the five stages a person goes through in coming to terms with death (either that person's impending death or the death of someone he/she is close to) or some other catastrophic loss (job, divorce, etc.).

Those stages are:
  1. Denial

  2. Anger

  3. Bargaining

  4. Depression

  5. Acceptance

The book elaborates on each stage. They're mostly self–explanatory, I think, although, if you aren't familiar with these stages of grief, it might help to add a little explanation about "bargaining." In that stage, a person "bargains" with a higher power to spare his/her life or the life of a terminally ill person.

Kübler–Ross said these stages do not necessarily come in this order, and not everyone experiences all of them. But she said most people go through at least two. Presumably, a person can make the transition to the acceptance stage more smoothly if there is someone around to help.

Back in 1998, unemployment was not nearly as extensive as it is today, but the "Frasier" show used Kübler–Ross' five stages of grief as the basis for an episode about Frasier losing his job. Although it was presented in the context of a situation comedy, there are probably many people today who could relate to what Frasier went through.

Some of the jobless folks — especially the ones who have recently lost their jobs — may not realize that adjusting to the loss of a job is like adjusting to a terminal illness. The "Frasier" sixth season premiere (you can see the final one–third of the episode in the clip that is attached to this post) should be required viewing for all who lose their jobs. I think it would speed up the acceptance process.

As it so often did, "Frasier" provided real wisdom at the same time that it entertained.

But the truth is that there's nothing funny about the journey to acceptance. It can be tough. Unfortunately, not all of the millions of people who have lost their jobs in this recession are as fortunate as Frasier, with loving friends and family who stand ready to help.

And that's one of the most important factors.

Monday, April 13, 2009

Young at Heart



Last fall, I started writing two blogs in addition to this one. It seemed to me that there were too many topics I wanted to write about, and I figured two additional blogs would help me to be more organized in my thoughts.

It's worked for me, too. But there are still times when the topics I want to write about overlap. This is one of those times.

I guess, in part, it's due to the Easter season, with its emphasis on life and death and resurrection. How can one not think of death when the crucifixion is such a prominent part of the season?

And, as I mentioned yesterday, Easter was the last time I saw my mother. She was her vibrant, healthy self when I saw her, robust at the age of 63, but a flash flood took her life in May 1995. It has been inevitable, I suppose, that I have thought of her on every Easter since then.

And, in the days leading up to Easter, Americans were shaken by the highly publicized deaths of two young people — a rookie pitcher for the Los Angeles Angels, who died in a car crash at the age of 22, and an 8–year–old girl, who was murdered and stuffed in a suitcase that was submerged in a pond.

And, here in Dallas, it seems we're always hearing reports of young people who have died. Case in point — 16–year–old Kimberly Martinez, a sophomore at W.T. White High School, died in a car crash early Sunday. Her boyfriend and his brother–in–law picked her up at a party, and their vehicle struck a utility pole. Speed and alcohol appear to have played a role.

The unspoken assumption is always that one will live to a ripe old age, but that is not the case for everyone. We manage to put that unpleasant thought out of our heads until we are confronted with another example of how brutally unfair life can be.

Sometimes life can put us in a funk. I've been thinking today of an episode of "Frasier" that seems appropriate, and I've posted a clip from it with this post. That's where the overlap in this comes in. Typically, I would post something like that on my entertainment blog, but it seems to me that "Frasier" often strikes just the right note and transcends my feeble attempts to categorize things.

Maybe that is because Frasier is a psychiatrist. True, he's somewhat self–absorbed and his stories are entertaining, but he often manages to come up with the answers to the questions we all face. In the clip I've attached to this post, Eddie the dog was despondent and, in an attempt to discover the reason and restore him to his perky self, a dog psychiatrist was brought in.

Frasier and his brother, both of whom are psychiatrists, resisted the idea. They believed that an animal psychiatrist was a quack.

The episode examined some other points about the relationships between people and their pets, often in a humorous way. I've always enjoyed the part of the episode where the dog psychiatrist asks the members of the family what Eddie would do as a human. He wanted to know what Eddie the human would serve at a dinner party — Martin thought he would serve meatloaf and Daphne speculated it would be poached salmon, but skeptical Niles insisted those entrées "might be underdone" because Eddie couldn't reach the knobs on the oven. When the psychiatrist wanted to know what Eddie the human's first words would be, cynical Frasier suggested, "Give me a breath mint!"

Then, when the dog psychiatrist wanted to know what kind of cologne Eddie the human would wear, Martin figured it would be Aqua Velva, but Frasier said it would be "toilet water." Niles chimed in, "Same answer for 'favorite beverage!' "

The story also gave the characters a chance to explore the debate one often hears between dog owners and others, in which dog owners insist that dogs understand what is said to them. Watch this clip. It's short, but it manages to pack a lot into a brief visual moment.

Sometimes, though, the conflicts in life can be resolved easily. As it turned out, Eddie's problem was that his favorite toy was buried beyond his reach under the sofa cushions, prompting Frasier to advise his caller to "take a tip from our dog friends and treat yourself to your favorite toy."

Or, perhaps the words from a Frank Sinatra song express it just as well:
"Don't you know that it's worth
Every treasure on earth
To be young at heart
For as rich as you are
It's much better by far
To be young at heart
And if you should survive to 105
Look at all you'll derive
Out of being alive."

Whatever your age may be, it's good advice to be young at heart.

I don't mean to be blasé about this, but the Grim Reaper will come whenever he chooses. There's nothing to be gained from hastening his arrival.

In the meantime, treat yourself to your favorite toy.

Wednesday, September 24, 2008

The Dean of Ballparks


The exterior of Fenway Park in 1914.



Now that Yankee Stadium has ended its 85-year tenure as the home of the New York Yankees, which stadium holds the title of dean of America's ballparks?

Actually, that much hasn't changed.

Yankee Stadium opened its doors in 1923. But the ballpark that has been open longer than any other was — and still is — Boston's Fenway Park, which, as baseball historians will tell you, was Babe Ruth's home for many years before the House That Ruth Built opened in the Bronx.

In fact, Fenway Park opened in 1912. Ground was broken on the construction site 97 years ago tomorrow, on Sept. 25, 1911, and the first baseball game was played there the following April — a few days after the "unsinkable" Titanic sank in the North Atlantic.

And Ruth, who began his baseball career as a pitcher, played his first major league game with Boston in 1914.

Nearly every other ballpark in the major leagues came into existence after Yankee Stadium. The oldest baseball stadium in the National League is Chicago's Wrigley Field, which opened in 1914.

With the noteworthy exception of Wrigley Field, all the other ballparks in the majors are mere infants by comparison.

In the National League, for example, the next-oldest ballpark after Wrigley Field is Los Angeles' Dodger Stadium, which opened in 1962.

Most of the other ballparks in the American League are still practically waiting for the paint to dry when compared to Fenway Park. And the park that launched the current wave of stadium construction — Baltimore's Camden Yards — has been open only since 1992.

Almost every major league baseball team plays in a ballpark that was built in the last 15 years. And almost all of those newer ballparks were designed to evoke the features that distinguished the ballparks of yesteryear.

Why not renovate those ballparks instead of starting over from scratch? That seems like a fair question to ask. But the answer is different in each city. Maybe the location of the ballpark creates parking problems that can't be resolved or land can't be acquired to expand the seating sufficiently.

Maybe the stadium's location interferes with commerce. Maybe the financial operation has been poor and the owners had to sell to someone who doesn't care about history, just how much money there is to be made from buying and selling the land.

Sometimes, the structure is simply antiquated, unstable, and needs to be replaced.

Whatever the reason — and, in spite of the fact that the new stadium will be called Yankee Stadium and won't be given one of those hideous names that incorporates the sponsor's name (which could prove to be embarrassing if the sponsor goes out of business or has to endure a humiliating public scandal) — another monument to the past is coming down.

And we are the poorer for it.

It reminds me a bit of an episode of "Frasier," in which Frasier and his brother decide to buy and restore a restaurant that was significant in the family's history but is about to shut down permanently.

They decide to do this while having a one-last-time family dinner at the restaurant, after Daphne observes that, in her native England, they cherish their "antiquities," but Americans can't wait to tear them down.

It's not quite that simple. But it does strike me as ironic.

Sunday, July 27, 2008

The 'Left at the Altar' Syndrome

One of the most popular TV characters of the last quarter of a century was Dr. Frasier Crane, portrayed by Kelsey Grammer, first as a supporting character on "Cheers" and then as the lead character in his own series.

An element of Frasier's character was his ongoing difficulty with women — epitomized in part by his experience of having been "left at the altar" by the supposed woman of his dreams.

I've never been the groom in a wedding ceremony. I can only imagine how it must feel to be left at the altar. In an episode of his TV series, Frasier once described the experience as having left him with a "sucking chest wound."

But "left at the altar" is the phrase I've heard political analysts use to describe the final step in the transition that voters go through when they're making the decision whether to support the nominee of the party that is out of power.

Normally it happens in the closing days of a campaign. Call it a leap of faith, if you will.

If the voters decide not to take the alternative that is being offered to them, they will leave that nominee at the altar — even if that candidate was perceived to be ahead of the opposition earlier in the campaign.

And, then, presumably, that candidate experiences what Frasier experienced.

In a lifetime of watching presidential politics, I have never seen circumstances that seemed so favorable for the party that has been out of power to capture the White House. The president is very unpopular, the war he started is very unpopular, and the economy seems to be lurching toward a recession (if it isn't there already).

Some might say that the 1980 campaign was an example of a year in which the incumbent party faced impossible odds like the ones I've described. I would point out, however, that the United States was not involved in a war that year.

And another way in which 1980 differed from 2008 is that the incumbent president ran for re-election in 1980. In 2008, the incumbent president is barred by law from seeking a third term, and the vice president declined to run for the presidency.

So the Republican nominee is the proxy who must take the abuse that is really directed at the administration.

Nevertheless, I first heard the "left at the altar" analogy used in media discussions during the 1980 campaign, when Ronald Reagan was challenging incumbent President Jimmy Carter.

The consensus since that time is that Reagan reassured skeptical voters with his performance in his debate with Carter in the last week before the election — and went on to be elected in a landslide.

I heard the phrase used again 12 years later, when Bill Clinton was running against incumbent President George H.W. Bush.

In spite of Republican efforts to make issues of Clinton's lack of military service during Vietnam, his experimentation with marijuana and rumors of his womanizing, Clinton prevailed.

(I even heard a few pundits mention the "left at the altar" syndrome as an explanation for why Michael Dukakis wasn't able to follow through on his apparent leads over then-Vice President George H.W. Bush in the polls in the summer of 1988.

(But I never thought the voters left Dukakis at the altar as much as they were driven away by the image of him riding around in a tank and the viciousness of the Bush campaign's "Willie Horton," "Boston Harbor" and the prison "revolving door" TV commercials.)

I've been thinking about the "left at the altar" syndrome while reading an article that was co-written by Larry Sabato of the University of Virginia, Alan Abramowitz of Emory University and Thomas Mann of the Brookings Institution, headlined "The Myth of a Toss-Up Election."

"While no election outcome is guaranteed ... virtually all of the evidence that we have reviewed — historical patterns, structural features of this election cycle, and national and state polls conducted over the last several months — point to a comfortable Obama/Democratic party victory in November," they write.

"[M]aybe conditions will change ... and if they do, they should also be accurately described by the media. But current data do not justify calling this election a toss-up."

The authors also reflect on the 1980 campaign in making their argument.

"[T]hese June and July polls may well understate Obama's eventual margin," they write. "Ronald Reagan did not capitalize on the huge structural advantage Republicans enjoyed in 1980 until after the party conventions and presidential debate. It took a while and a sufficient level of comfort with the challenger for anti-Carter votes to translate into support for Reagan."

That's really the point of the "left at the altar" syndrome. The voters need to reach that final "level of comfort" to justify leaving the party in power.

If they reach that comfort level, they proceed with the change. If they don't, they fall back on the familiar.

That's the challenge facing Obama — helping the voters reach that comfort level.

Earlier, I mentioned the combination of factors that makes it look like this should be the Democrats' year. Sabato, Abramowitz and Mann make a similar observation.

"You have to go all the way back to 1952 to find an election involving the combination of an unpopular president, an unpopular war, and an economy teetering on the brink of recession," they observe.

"1952 was also the last time the party in power wasn't represented by either the incumbent president or the incumbent vice president. But the fact that Democrat Harry Truman wasn't on the ballot didn't stop Republican Dwight Eisenhower from inflicting a crushing defeat on Truman's would-be successor, Adlai Stevenson.

"Barack Obama is not a national hero like Dwight Eisenhower, and George Bush is no Harry Truman. But if history is any guide, and absent a dramatic change in election fundamentals or an utter collapse of the Obama candidacy, John McCain is likely to suffer the same fate as Adlai Stevenson."


Perhaps. But I still feel race is the obstacle that the electorate must leap over before it will proceed with voting for a black man for president.

Whether voters admit it or not, whether it's politically correct to acknowledge it or not, I believe race remains a barrier, albeit a psychological one, for many voters. They may want change, but they may not be ready for this particular change.

I mentioned yesterday that the Democrats already enjoy nearly unanimous support in the black community. What Obama needs to do is reassure members of groups that haven't been as supportive of Democrats in the past.

And he needs to close the deal with these groups.

In 2004, for example:
  • John Kerry won the voters who were under 30 — but those voters represented only 17% of the participants in the election. George W. Bush, meanwhile, won a majority of the voters who were 30 or older. Obama needs to reassure older voters, who have proven to be more reliable election participants, while encouraging his energetic young supporters to show up at the polls.
  • It has been suggested that Obama's presence on the ticket will energize blacks in the South and lead to a massive increase in black participation in that region. In 2004, whites were the only racial group that gave a majority of its vote to the Republicans — but whites represented 77% of the electorate, and they gave 58% of their vote to Bush (a margin of about 16 million votes).

    There aren't many black votes left for Democrats to win, but there apparently are many white votes to be won.
  • Meanwhile, the South produced 32% of the 2004 vote — and the Republicans cruised to victory in the South, 58% to 42%. That's a margin of more than 7 million.

    (I've heard it said that Bob Barr may be in a position to influence the outcome of the race — particularly in some Southern states, especially his home state of Georgia — by siphoning off votes from McCain. But Steve Kornacki says, in the New York Observer, that "it is highly, highly unlikely that Barr will be a consequential player" in the election.)
  • Because of the animosity of the primary campaign, rumors persist that many of Hillary Clinton's female supporters (and possibly some of her male supporters) will either support McCain or choose not to vote at all.

    That would be bad news for Obama. Democrats won the female vote against Bush in 2004, 51% to 48%, but they haven't won the male vote since 1992.

    They need to follow a strategy that will retain their female supporters while gaining ground among male supporters.
  • Remember Obama's remark about people who cling to guns and religion? It might be wise to avoid that kind of remark in the future.

    In 2004, 54% of voters who participated in the election were Protestants — nearly 60% of those voters supported Bush. And 27% of the voters were Catholic — but Kerry, who is also Catholic, lost that demographic to Bush, 52% to 47%.

    Gun owners were a minority in the 2004 electorate — 41% of participating voters said there was at least one gun owner in the house, and 63% of those voters supported Bush.

There are many demographic groups that are capable of swinging a close election to one side or the other.

It is not wise for a campaign to take victory — or defeat — in any group for granted.

Saturday, June 7, 2008

Fears About the Future

Kurt Andersen has written an interesting article in New York Magazine about the Barack Obama campaign's chances against John McCain.

Andersen is an Obama supporter, but he has a list of 10 factors that, far from making Obama seem invincible, clearly make the presumptive Democratic nominee appear vulnerable in November.

I don't intend to recite all of his concerns here. I will merely discuss a few of them. But I encourage you to read the article in its entirety.

I don't think Andersen's intention is to discourage Obama's supporters. I believe he wants Obama to win the election.

But, with a pedigree that goes back to his work for the George McGovern campaign in 1972, Andersen has some insights to share. To ignore them is to ignore the lessons of history -- and human nature.

One of Andersen's concerns is stated this way: "Presidential elections are Civil War re-enactments -- in which the North can lose."

I grew up at the same time as Obama did, in the 1960s, when blacks -- and white civil rights activists -- were murdered in the South and in bordering states.

The mentalities of the people who committed those acts had their roots in the Reconstruction period a century earlier, and that thinking still lives in many of their descendants. It isn't overt, but it's there.

Today's South isn't the South of 40 years ago, when segregation continued to exist in spite of federal law and disciples of Jim Crow perpetrated acts of homegrown terrorism (acts for which most of the perpetrators were later acquitted by all-white juries). Today's South has assumed a different shape, one that may appear reasonable, even pleasing, to the naked eye.

The proof is in the pudding, the saying goes. If Obama is able to carry even one or two Southern states in the fall, I will begin to believe the region is truly being transformed. But history tells me that calling the struggle to win Southern states an "uphill battle" is being kind.

Moreover ...

Lyndon Johnson believed the Civil Rights Act and the Voting Rights Act were essential pieces of legislation. But the Democrats' support of those bills, he also believed, would deliver the Southern states to the Republican Party for decades to come. He confided as much to his closest advisers.

Johnson died about four years after leaving office. He didn't live to see how prophetic his words were. But, in the more than 30 years that have passed since Jimmy Carter was elected in 1976, Southern states have repeatedly voted for the Republican tickets, many of them continuing to do so when a Southerner named Bill Clinton was at the top of the Democratic ticket and the economic conditions were far from favorable for the Republicans.

There have been times when that tendency has truly contradicted logic -- as in 2000, when Vice President Al Gore failed to carry his home state of Tennessee, making Gore the only major party presidential nominee since McGovern to lose his home state.

(I expect Obama to carry Illinois and McCain to carry Arizona, so I believe Gore will continue to hold that distinction after this election is over.)

Of the five Southern states that voted for George Wallace's independent candidacy against Richard Nixon and Hubert Humphrey in 1968 (Alabama, Arkansas, Georgia, Louisiana and Mississippi), all five supported Carter in 1976 but only one (his home state of Georgia) supported Carter's bid for re-election in 1980. Since that time, Arkansas and Louisiana backed Clinton in 1992 and 1996 and Georgia supported Clinton in 1992.

Clinton also carried Tennessee twice in the 1990s, and he carried Florida in his bid for re-election in 1996.

The Southern states unanimously endorsed George W. Bush in 2000 and in 2004, in spite of the fact that Gore was the Democrats' presidential nominee in 2000 and John Edwards (from North Carolina) was the party's vice presidential nominee in 2004.

The Southern states also unanimously endorsed Bush's father in 1988, even though a Texan was on the Democratic ticket -- and the revival of that "Boston-Austin" connection was widely believed by Democrats to be the prescription they needed to recapture the White House.

Individual congressional districts have occasionally elected black Democrats (typically in districts with large black populations), but blacks have consistently lost statewide races in the South.

In Gore's home state, for example, a black Democrat with 10 years' experience in the House of Representatives was nominated for the Senate two years ago. The voters in that state elected a white Republican (whose only previous electoral success was his election as mayor of Chattanooga) in what was clearly a Democratic year.

The Republicans in Tennessee turned the race in their favor in the final days mostly by characterizing the Democrat's background -- prep school, Ivy League college -- as one of elitism and privilege.

The only black Democrat I know of who has won a statewide race in the South (excluding party primaries) was Doug Wilder, who was elected governor of Virginia in 1989.

Interestingly, Wilder's campaign suffered from what has been dubbed the "Bradley effect," named for Tom Bradley, the black former Democratic mayor of Los Angeles, who ran for governor of California in 1982 -- a year that tended to favor Democrats -- and lost despite leading in the polls in the closing days of the race.

The Bradley effect has been seen in other elections involving blacks and other minority candidates. It refers to white voters being more likely to tell pollsters that they will support minority candidates than they are to actually vote for them.

In 1989, Wilder's comfortable lead in the late polls evaporated by Election Day and the results were close enough to prompt a recount. Unlike Bradley, Wilder managed to win his race -- perhaps because he took positions that haven't been associated with most Democratic politicians, like his support for the death penalty.

And, as governor, Wilder backed up his support for the death penalty by overseeing more than a dozen executions. He also supported cuts in higher education to help bring the state budget into balance.

(As a postscript, Wilder launched an unsuccessful bid for the presidency in 1992 and, because state law prohibits a governor from serving successive terms, considered running for the Senate in 1994. He became an independent and was elected mayor of Richmond in 2004 but announced this year that he would not seek re-election.)

Since I've mentioned the use of "elitism" as an argument that helped the Republicans in the Senate race in Tennessee two years ago, I guess that brings me to another of Andersen's concerns -- "Is [Obama] 'elitist,' too condescending and glib and remote and full of himself?"

Andersen writes, "I don’t find him so -- but then again, I myself am an elitist who can seem condescending and glib and remote and full of himself, so who am I to judge?"

Americans of all races made TV's Frasier Crane perhaps the country's most beloved elitist, first in his supporting role on the series "Cheers" and then in his popular spinoff series, "Frasier."

That didn't necessarily mean they would like to have someone like Frasier Crane in the Oval Office. Americans seem to prefer to have someone like themselves in the White House.

Elitists aren't in the majority. I guess that's part of what makes them elitist.

A president doesn't have to "feel your pain," although that apparently worked for Clinton. But a president who seems to understand what ordinary Americans are facing is preferable to one who appears remote.

It remains to be seen if middle-class, white Americans -- most of whom did not attend Ivy League schools -- prefer Obama (a Harvard graduate) over McCain. Perhaps in his own way, McCain can be viewed as every bit as elitist, with his Naval Academy education, although he may be able to mitigate that impression with his experience as a prisoner of war during the Vietnam conflict.

Andersen also worries about what he calls "the evil in men’s hearts." Given the country's history, he has good reason.

"Every time I watch him work the crowds, I cringe a little, dreading the lurching nut and pop-pop," Anderson writes. "Any assassination is horrific; the murder of Obama could be a national trauma beyond reckoning. ... It would amount to a national statement concerning our racial divide: Nah, we can’t."

Of course, there are many precautions that can be taken to try to prevent an assassination. But it is clear that, at a time when suicide bombers have become distressingly commonplace, if someone is intent on doing harm, such precautions can be overcome.

There are many arguments, pro and con, to be made about the Obama candidacy -- and many fears attached to it. Some of those fears are legitimate, some are baseless. They're largely the result of the eternal tension between the known and the unknown.

A great president will help his people deal with that tension.

"The only thing we have to fear is fear itself," Franklin D. Roosevelt told the country in 1933.

A fearful America still had to endure the pain of the Great Depression. It trampled some, but most got through it because they recognized fear for the enemy it was -- and their president helped them have the faith that better days were ahead.

To become president, Obama's challenge is to help a majority of his countrymen have that faith in him.

Saturday, September 29, 2007

What's In A Name?


In one of his famous lines from one of his most famous plays ("Romeo and Juliet"), William Shakespeare writes, "What's in a name? That which we call a rose by any other name would smell as sweet."

I had to think long and hard when I was deciding which name to give this blog. I didn't want to choose a name that was overly pretentious. I wanted it to be something that suggested the challenge involved in writing about the issues of the day and the intention of delivering thoughtful, rational perspectives -- combined with respect for the ongoing efforts to make this country what it was designed to be.

A name is not something to be taken frivolously. This is a topic that was addressed, in a way, on "Frasier" a few years ago. Niles and Daphne wanted to put their as-yet unconceived child on the waiting list for an exclusive school, and they needed a name as a "placeholder." They tried a lot of methods but couldn't come up with anything that pleased both of them. In the end, Roz told them that she would choose a name, Niles and Daphne wouldn't have to know what it was, and it would serve its purpose.

Unfortunately, the name Roz chose -- "Ichabod" -- ultimately led the school's selection committee to conclude that Niles and Daphne didn't take the selection process seriously, and "Ichabod" Crane was rejected.

(Even so, I guess Roz's choice was better than the one Daphne came up with at random from a phone book. "Bob" Crane implied too many unsavory things. More unsavory than the "Headless Horseman.")

In making the same point many, many years ago, comedian George Carlin observed, "If Janitor in a Drum made a douche, no one would buy it."

I have a good friend who knows just about everything there is to know about the internet and computing. He believed I should choose a name that would jump to the top of the list when people run a search in Yahoo or Google or some other search engine.

But I was trained and educated as a journalist. Marketing isn't my thing. So the name I picked may not be the best choice for promotion, and it may not be easy to find, but I think it reflects the psychology behind this blog.

I chose "Freedom Writing" because it's a nice play on the name of the participants in an event of political and historical significance in America -- the Freedom Riders.

To test the Supreme Court's ruling in Boynton v. Virginia (1960), civil rights activists rode buses into the then-segregated Southern states. Those activists were called "Freedom Riders" and they embarked on their journey on May 4, 1961.

The Freedom Riders and their mission were overshadowed in the news at the time by the fact that astronaut Alan Shepard, on board Freedom 7, became the first American in space the next day with a 15-minute flight.

But the Freedom Riders have a place in the history of social justice in America, just as I hope "Freedom Writing" will have a place in the public debate about justice in this country.