Monday, November 30, 2009

The Always Quotable Mark Twain


Actor Hal Holbrook has been honored for his portrayals of Mark Twain.


It was 174 years ago today that the man who is, perhaps, my favorite writer of all time was born.

His birth name was Samuel Langhorne Clemens, but he is better known to history by his pen name — Mark Twain. He always said the name came from the lingo of Mississippi riverboats, on which he worked in his youth. When a riverboatman called out "mark twain," he was confirming that the water was two fathoms (12 feet) deep and, therefore, safe for a boat to pass.

He wrote about the origin of his pen name — and his experiences as a steamboat pilot in training — in his 1883 memoir, "Life on the Mississippi."

He was a prolific writer — a contributor to many American newspapers, a short–story writer and the author of great novels. No less a literary luminary than William Faulkner dubbed Twain "the father of American literature." Twain wrote many great stories in his life, including the one that is remembered by many as "the great American novel": "Adventures of Huckleberry Finn."

Unfortunately, in some quarters, the story of Huck Finn has fallen victim to 20th century political correctness because of its frequent use of the word "nigger," which was commonly used in the 19th century.

Today, racism is a charge that seems to be almost casually tossed about, but if anyone wants to accuse Twain of racism because a 125–year–old book routinely uses that word, more evidence will be needed. And that evidence will have to be sufficient to outweigh the facts that Twain
  • was a resolute abolitionist, and

  • paid the college expenses of black students who wished to study law and theology.
Truth is, Twain was a progressive who supported women's suffrage (which became the law of the land about a decade after his death) and the labor movement.

Anyway, a couple of months ago, I observed H.L. Mencken's birthday by reminding my readers of some of his most quotable quotes. It is only fitting that I should do the same today in honor of Mark Twain.
  • "[A] classic — something that everybody wants to have read and nobody wants to read."

  • "Honesty is the best policy — when there is money in it."

  • "Always do right. This will gratify some people, and astonish the rest."

  • "Be respectful to your superiors, if you have any."

  • "James Ross Clemens, a cousin of mine, was seriously ill two or three weeks ago in London, but is well now. The report of my illness grew out of his illness; the report of my death was an exaggeration."

  • "[H]eaven for climate, Hell for society."

  • "The only reason why God created man is because he was disappointed with the monkey."

  • "I have been complimented many times and they always embarrass me; I always feel that they have not said enough."

  • "Always acknowledge a fault frankly. This will throw those in authority off their guard and give you opportunity to commit more."

  • "Familiarity breeds contempt — and children."

  • "None but the dead have free speech."

  • "Figures don't lie, but liars figure."
It would be great if a complete anthology of Twain's writings could be purchased, but he wrote many articles for obscure newspapers and he used several pen names during his life. As a result, previously undiscovered writings are frequently coming to light, like hidden gifts from history.

Perhaps more such gifts will emerge between now and April 2010, when we will observe the centennial of Twain's death.

Friday, November 27, 2009

The Worst of Times

While I do believe it may be a wee bit premature to make this kind of judgment, TIME's Andy Serwer is ready to label the first decade of the 21st century the "decade from hell."

"We're still weeks away from the end of '09, but it's not too early to pass judgment," Serwer writes. "Call it the Decade from Hell, or the Reckoning, or the Decade of Broken Dreams, or the Lost Decade. Call it whatever you want — just give thanks that it is nearly over."

I'll grant you, no decade that starts with the Y2K scare, the tedious recounting in Florida and the actual horror of September 11 and ends with the nation in the grip of a cruel recession — with a major hurricane and two stock market crashes thrown into the mix — will be remembered with much fondness.

Perhaps closing the book on this decade will help us break out of the funk we've been in for the last two years. Maybe it will give us the psychological boost we need.

Probably not — but we can always hope.

What Would FDR Do?

There are a couple of things that stand out in my memory of last Thanksgiving:
  • The terrorist attacks on Mumbai.

  • The anticipation of the Barack Obama administration (accompanied by speculation that Obama's former rival for the Democratic nomination, Hillary Clinton, might be his secretary of state).
Here we are, a year later, and I'm seeing articles about a president who has been dead for more than half a century, Franklin D. Roosevelt. What gives?

Well, I guess it's only natural for thoughts to turn to FDR. He was elected to deal with the Great Depression, and Obama was elected to deal with the most severe economic downturn since the Great Depression.

For Democrats, there is no president who can match FDR's achievements. He rescued the nation from an economic calamity, and he led the nation to a victory in World War II that he missed seeing by a matter of weeks.

Yesterday, I wrote about an article that found fault with Obama's Thanksgiving proclamation when compared to Roosevelt's.

Today, Stephen Herzenberg writes, in the Pittsburgh Post–Gazette, about the steps FDR might have taken to breathe new life into the economy.

My eyes were drawn to an observation that, because Congress is under pressure with unemployment continuing to rise, "momentum is building for a federal tax credit that would give companies an incentive to hire new employees."

I found this interesting because it is something Obama proposed during the presidential campaign last year — but, as PolitiFact.com points out, it was not included in the stimulus package.

So, what seemed (to me) like a logical and potentially beneficial approach to the problem has never been tried. It remains where it has been since Obama first mentioned it on the campaign trail — on the drawing board, an untried theory.

From what I have been able to gather, the proposal was nixed by Democrats in Congress. But it was Obama's promise, so PolitiFact.com regards it as a broken promise. If the tax credit proposal had been part of the stimulus package, would it have made a difference? Herzenberg writes that Harvard economist Kenneth Rogoff says "the real appeal of a job–creation tax credit may be that it 'beats doing nothing.' "

The psychological impact of a tax credit notwithstanding, though, Herzenberg writes that, when compared to the things FDR did to combat joblessness in the 1930s, the proposal "looks like, well, nothing." It didn't look like nothing on the campaign trail, but, admittedly, that was a year ago. Millions of jobs have been lost since then.

The question that can't be answered, though, is whether as many jobs would have been lost if the administration and the Democrats in Congress had been more proactive about unemployment.

As far as the jobless are concerned, Democratic efforts to stem the tide now — whether in the guise of a much–ballyhooed jobs summit or any legislative measure — amount to little more than last–minute scrambling designed to save their jobs. But they may be able to do better than that by learning from what FDR did.

In the 1930s, Herzenberg writes, Roosevelt "championed the 'Big Four' social policies:
  • "a minimum wage to lift purchasing power at the bottom;

  • "a law strengthening workers' rights to unionize, laying the basis for the emergence of America's middle class through manufacturing unions;

  • "unemployment insurance, which enabled jobless workers to feed their families; and

  • "Social Security, which enabled the elderly poor to avoid destitution and increase their consumption."
"So far, what is Washington offering as the Great Recession's Big Four?" Herzenberg asks. "The Big Zero."

Herzenberg tries to be fair, pointing out that the Democrats have been devoting a lot of time and energy to health care reform, and he concedes that the reform plan will yield benefits to the economy in the decades ahead. But the issue now is reinvigorating the middle class, which has been brutalized by the recession.

"There are some ideas kicking around the margins that can help shape what today's Big Four might look like," he writes, adding that some "would update elements of the New Deal."

The problem, writes Jeanne Sahadi for CNN.com, is that all the strategies that are being considered for encouraging job creation have downsides.

But that may be the inevitable result of the dithering Congress and the administration have done on this subject all year. They've squandered precious time, but time is running out for the Democrats, who have operated on the false assumption that their triumphs in the 2006 and 2008 elections entitled them to congressional majorities indefinitely. No one knows what will happen in the 2010 midterm elections, but poll numbers suggest Democrats are losing the support of independent voters who were so crucial to their successes in the last two elections.

They may be able to regain the support of some of those independents if they pass health care reform, but as long as unemployment continues to go up, any such gains will be temporary. And the electorate is likely to be in a sour mood by November 2010.

Thursday, November 26, 2009

Another Year Older



Today is Thanksgiving Day, one that has made me wistful. You see, it is also my birthday, and I guess I should be thankful for that, but I don't really know what to feel. I've been out of work for more than a year, and, just yesterday, I learned that one of my high school classmates died of a heart attack a few days ago.

Anyway, today my thoughts have been, to borrow a description from Forrest Gump, floating along on a breeze, kind of accidental–like.

I've been remembering my birthdays during my childhood. I was born on Thanksgiving (I guess that means I've come full circle), and my birthday fell on Thanksgiving a couple of times when I was growing up. And of course there were other times when it was the day before Thanksgiving or the day after Thanksgiving.

Anyway, that always influenced my birthday parties. I don't think I ever had a birthday party on my actual birthday — well, there may have been an exception or two. It's funny now when I reflect on what an ordeal it was just to get some kids together for some ice cream and birthday cake. Mom always had to consult my friends' parents to avoid any conflicts with travel plans.

I guess it was almost a way of life for her when my brother and I were children. My brother was born the day after New Year's, another holiday that involved family travel plans. In fact, today I've been thinking about the day my brother was born. I was 3 years old, and my parents and I were visiting some friends on New Year's Day. My family didn't have a television at the time, and my father was eager to watch the bowl games.

That afternoon, as we were watching the football games, my mother came into the room to tell my father that she had gone into labor. "It's time to go to the hospital," she told him.

I will never forget his reply. It is my earliest memory. "Now? In the middle of the Rose Bowl?"

If there is a comparable story from my birth, no one ever told me. But, even though I was born on the American Thanksgiving, a story about my birth would not include televised football. My parents were missionaries in Africa when I was born, and November 26 was not Thanksgiving where they were. For that matter, I don't know where the nearest TV would have been.

I remember nothing about Africa because my parents and I returned to the United States when I was still very small. My father got a job in Arkansas, and that is where I grew up.

I've been trying to remember when I met my high school classmate who died earlier this week. We went to elementary school together. Perhaps we met in first grade. I know I was in school with his cousin through third grade.

His cousin, who was our age, had leukemia, as I recall. I remember that, when we were in third grade, his attendance was sporadic because of his frequent hospitalizations, and each time he came back he seemed to be doing worse. Once, he had a seizure in the middle of a school day; another time, he was in the bathroom and became so ill an ambulance had to be summoned.

I guess someone — the teacher, perhaps, or the principal — had told the class that Billy was very sick and that it might kill him. I don't remember if anyone ever told us that. But, on the day the ambulance came to pick him up, I vividly remember wondering if we would ever see him again.

Someone had planted that seed in my brain.

Anyway, that summer, he died. My mother took me to the funeral. All the kids in my grade at school were there, and so were the teachers. Our third grade teacher, who always struck me as being very strict, was there, sobbing constantly, muttering about how courageous Billy had been.

There had been other deaths in my world prior to that, but when Billy died, it was the first time I really understood its permanence. Maybe that is because I watched it happen.

And now Johnnie is gone, too. I have a picture in my mind of Johnnie at his cousin's funeral in that long ago Arkansas summer. He was there for everyone else in the family — in hindsight, I can only marvel at the burden he took on when he was only 9 years old. I can remember filing past the open casket and gazing at Billy one last time. He was emaciated. His hair was gone. But he was dressed in a nice suit. That fact has remained with me all these years, for some reason. He was dressed in a suit he probably never wore in life. That seemed odd to me.

Strange, the things a child's mind retains.

That afternoon has defined the death experience for me. And now Johnnie is gone, too.

I don't know if there is a lesson in Johnnie's death for me at this particular time. Maybe there isn't one. Maybe it's just one of those coincidences in life.

I just don't know.

I thought age was supposed to bring wisdom. I guess I'm a little pissed off to realize it doesn't.

Wishing for the Good Old Days


"What we're in is not a Republican recession or a Democratic recession; both parties had much to do with bringing us where we are today. But we're facing a national situation which calls for the best which all of us can produce, because we know the results will be something which we will regret."

Mike Mansfield

I've heard it called the "good old days syndrome."

It is a desire one often hears expressed by older people, a longing for the heroes of the past. It tends to imply that modern leaders/heroes lack something that those from yesteryear had.

I'm sure you know what I'm talking about. But, just so we're on the same page here, is there any better example of what I'm talking about than the exchange between Dan Quayle and Lloyd Bentsen in 1988?

When asked during the 1988 vice presidential debate what qualified him to step into the presidency, Quayle compared his experience to John F. Kennedy's. I suppose he could just as easily have compared his congressional experience to Walter Mondale's and Bob Dole's when they ran against each other for the vice presidency in 1976. But I guess Kennedy's iconic status was too tempting to resist. And Bentsen was waiting to pounce.

Anyway, this week, I've seen a couple of "good–old–days–syndrome" observations:
  • At Politico.com, Jake Sherman and Michael Calderone write that David Broder, a Pulitzer Prize–winning journalist, believes that Harry Reid is no Mike Mansfield.

    In an ideologically driven era — and, trust me, there have been times in our history when the houses of Congress were led by lawmakers who put the interests of the nation ahead of ideology and party labels — Broder "favors pragmatists over fierce ideologues." He has expressed his admiration for legislators like Mansfield and Howard Baker, and he has been open in his criticism of Reid.

    Mansfield had to negotiate some choppy waters during his years as majority leader, and he took positions opposing the Vietnam War and supporting civil rights that weren't always popular. But he knew how to work with the members of the other party.

    I'll acknowledge that Reid often appears incapable of keeping his fellow Democrats in line. I almost feel like I'm watching a Woody Allen movie sometimes. Annie Hall Goes to Washington.

  • The other interesting article I've seen was written by John Nichols in The Nation.

    Nichols reflects on the presidential Thanksgiving proclamation and compares Barack Obama (unfavorably) to Franklin D. Roosevelt.

    He finds Obama's proclamation to be "no more poetic, and no more adventurous, than those issued by George W. Bush."

    Ouch! Them's fightin' words when you're talking about a president whose speaking skills clearly set him apart from his predecessor.

    But I must admit that Obama's plain vanilla proclamation left a lot to be desired.

    And Nichols makes a good case for not "carrying on where Bush left off" but aiming higher. He sets as the target (unattainable as it may be) FDR's proclamation on Thanksgiving 68 years ago.
    "May we ask guidance in more surely learning the ancient truth that greed and selfishness and striving for undue riches can never bring lasting happiness or good to the individual or to his neighbors.

    "May we be grateful for the passing of dark days; for the new spirit of dependence one on another; for the closer unity of all parts of our wide land; for the greater friendship between employers and those who toil; for a clearer knowledge by all nations that we seek no conquests and ask only honorable engagements by all peoples to respect the lands and rights of their neighbors; for the brighter day to which we can win through by seeking the help of God in a more unselfish striving for the common bettering of mankind."


    Franklin D. Roosevelt
    Thanksgiving 1941

    "Here was a president seeking not to deny economic turbulence but to offer a vision for responding to that turbulence as united citizenry rather than as isolated individuals," writes Nichols. "This message was a constant for Roosevelt as he implemented the New Deal."

    FDR was a tough act to follow, all right. But, for Obama, it seems to me the proclamation was his opportunity to engage the nation as Roosevelt did. As it was with his failure to speak about unemployment on Labor Day, though, Obama the orator came up short of expectations.

    I am reminded of an exchange of dialogue from the final episode of The West Wing. The newly elected president was about to take the oath of office and then deliver his inaugural address. He and the outgoing president were riding to the Capitol, engaging in a little chit–chat, and the outgoing president asked about the speech. The new president replied that it had a few good lines, but there was no "ask not what your country can do for you ..."

    The outgoing president smiled. "Yeah, JFK really screwed us on that one, didn't he?"
Well, there's a good reason why these guys are remembered. They were giants.

And they give today's leaders standards to aim for.

In some ways, I guess, the good old days were better than you might have thought.

Wednesday, November 25, 2009

Thanksgiving Thoughts



Thanksgiving is one of those holidays that inevitably produces conflicting emotions. Wherever you are, you don't have to look far to find examples of people on opposite ends of the spectrum, people who are deliriously happy and people who are despondent.

I've often heard it said that, no matter how bad things are for you, there is always someone who is worse off. And that is true. But it doesn't mean your pain isn't genuine or that you aren't entitled to it.

On this Thanksgiving Eve, there may be much to be thankful for, but there still is a lot of pain in America. Those who are in charge insist on saying that, by traditional yardsticks, the recession is over — yet the unemployment rate is higher than it has been in more than a quarter of a century and many economists are saying it is likely to continue to go up in the first half of 2010. Last Thanksgiving, many Americans probably could not say that they knew anyone who was out of work. But, with more than 8 million jobs lost since the recession began, my guess is that far more are personally touched by joblessness this year.

I think I speak with a certain amount of authority when I say that most of the unemployed understand that the recession began nearly two years ago — and anyone who comprehends chronology knows that Barack Obama had not yet been elected president at that time.

Recently, I read an article in which the author said that (a) the recession was George W. Bush's fault and (b) it is too early to reach any conclusions about Obama. My response to that is (a) I don't dispute the fact that Bush was president in December 2007, but, even though I am not an economist, I am inclined to believe that a recession as severe as this one is the result of many decisions made by leaders from both parties, and (b) it may not seem fair to draw a conclusion about Obama, but that is the nature of the political calendar. Ready or not, the midterm election season is upon us and the economy is front and center.

I have been saying all year that job creation needed to be the focal point of the administration if it wanted to minimize its losses in the midterms, but Democrats preferred to blame Bush and devote their efforts to other matters. I cannot speak for everyone, of course, but recent public opinion surveys suggest to me that, however people may feel about who deserves the blame for the poor economy, a majority of Americans is running out of patience waiting for the president to fix it. In their eyes, Obama has not done what he was elected to do, even if he thinks that calling for a "jobs summit" next month is (however belatedly) addressing the problem.

Take the stimulus package that was passed back in February. Congressional Democrats insisted, at the time, that it would create jobs almost immediately, but this week a Chicago Tribune editorial called that "a snow job."

I have to think that a newspaper that serves a city like Chicago knows something about snow. And the Tribune makes a valid point about claims of jobs that have been "created or saved" by the stimulus. Especially in places that don't exist — like the 15th House District in Arizona — except at the government's web site, Recovery.gov.

Well, when the national unemployment rate is in double digits, that's the kind of scrutiny the party in power must expect. Like it or not.

While many Obama supporters may dismiss opinions expressed in the Tribune as the rantings of a conservative paper, it is worth remembering that the Tribune endorsed Obama's candidacy last year, the first Democratic presidential nominee the paper had endorsed in its 161–year history.

And that's a verifiable number — unlike the claims of jobs that have been "created or saved" by the stimulus.

Well, whether you have been personally affected by the recession or not, have a happy Thanksgiving.

Here's hoping that, next year, there are verifiable job gains for which we can be truly thankful.

Monday, November 23, 2009

Parallel Lives



The recent release of Sarah Palin's memoir, "Going Rogue: An American Life," has made me ponder the course of her political career and that of the other woman who was a running mate on a major party ticket a quarter of a century ago, Geraldine Ferraro.

Certain similarities jump out at me, starting with their ages. Both were in their 40s when chosen to be running mates. Ferraro was 48, which was within the range of most previous Democratic running mates. Palin, on the other hand, was 44, the youngest Republican running mate in 20 years (considerably younger than Dick Cheney or Jack Kemp had been).

Initial surveys indicated that both were popular choices, although they ran into trouble once their conventions were over and the campaigns began in earnest. Palin's problems in the 2008 campaign have been well documented, but, in case you need a reminder (or you are too young to remember the 1984 campaign), not only was Ferraro criticized for a style that was regarded by some as reckless and defiant, but she had problems with her family as well. Less than a month after being nominated, Ferraro had to face relentless questioning about her and her husband's finances.

That was a distraction, but Ferraro wasn't helped by her shoot–from–the–hip style. After telling reporters that she would release her tax returns but her husband would release only a tax statement (his explanation to her, she said, was "Gerry, I'm not going to tell you how to run the country, you're not going to tell me how to run my business"), she made a remark that dogged her: "You people married to Italian men, you know what it's like." Republicans sensed a gender–neutral opportunity to attack and they didn't let it go to waste.

Both Palin and Ferraro had somewhat limited political careers prior to being nominated, and their lack of experience frequently was compared (unfavorably) to the abundance of experience possessed by their opponents. After Ferraro's debate with George H.W. Bush and Palin's debate with Joe Biden, both were said to have performed better than expected, but they were hammered, nevertheless, by the opposition for their "extremist" political views, and both lost the general elections by wide margins — even though it could be rightly said that the opposition's presidential nominees were more popular personally than their policies.

Ferraro and Palin were chosen in large part to appeal to female voters. It was a roll of the dice that didn't pay off. They may well have attracted some female voters, but exit polls indicated that neither succeeded in winning the women's vote. After the 1984 election was over, most political observers agreed that no potential Democratic ticket could have defeated Ronald Reagan, and, following last year's economic meltdown, the same probably could be said of any potential Republican ticket in 2008. Blaming the female running mates strikes me as convenient but ultimately indefensible.

Like Palin, Ferraro published her memoir, "Ferraro: My Story," the year after her campaign, and it was a bestseller. There was talk about her political future, and she was labeled a "rising star" in party politics, but, beyond founding a political action committee that had as its mission the goal of electing 10 women in the 1986 Congressional elections and two unsuccessful bids for the Senate in the 1990s, Ferraro's political career was over.

There is talk today about Palin's political future as well. What that future holds has been debated since Palin's resignation as governor of Alaska a few months ago, but she still clearly appeals to some Republican voters.

I hear some of today's Democrats fretting about Palin. What will become of the nation, they ask, if Sarah Palin is nominated for president in 2012 — and, God forbid, actually wins? I've heard some cite, as an ominous sign, reports from the Des Moines Register that suggest that more than two–thirds of Iowa Republicans have a favorable opinion of Palin.

"That's close to the 70 percent who hold favorable views of former Arkansas Gov. Mike Huckabee, who won the 2008 caucuses," writes the Register's Thomas Beaumont, "and it's higher than the 66 percent who view former U.S. House Speaker Newt Gingrich favorably. Palin's number is also higher than that of former Massachusetts Gov. Mitt Romney, runnerup in the 2008 caucuses, who is viewed favorably by 58 percent of the state's Republicans."

I would give a lot more credibility to those numbers if this were November 2011 and the Iowa caucuses were a few weeks away. But even that couldn't be viewed as conclusive. In mid–November of 2007, polls indicated that Hillary Clinton and Mitt Romney would win the Iowa caucuses.

I may be wrong, but my inclination since July has been that, unless she runs for and wins a seat in the Senate or the House next year, Palin ultimately will not pose nearly as much of a threat as many Democrats fear. She will have no recent achievements to bolster a political record that was — to put it charitably — quite thin in 2008. A record that thin was acceptable for a vice presidential nominee; it will be far less plausible for a potential president.

Beaumont quotes a former director of the state's Republican Party, who claims Palin is misunderstood and has been victimized by mistakes that were not hers. Therefore, this isn't about achievements. "She's getting the chance to set the record straight."

Fine. I'm all in favor of personal redemption. But my belief has been that resigning her post will work against her when many Republicans ask themselves the tough questions that caucus participants must ask about every candidate. Typically, if you don't have recent achievements, you'd better have a record of achievements. Palin has no such record, and by resigning she gave up her chance to build recent ones. That's not exactly a bumper sticker slogan.

Even today, more than two years before the next Iowa caucuses, there are signs that decision will hurt a potential Palin candidacy. A GOP activist told the Register that Palin "needs a policy platform, with a conservative organization or media outlet, to boost her credibility."

And, even though she enjoys high favorable ratings from Iowa Republicans, Beaumont reports, "24 percent of Iowa Republicans view Palin unfavorably, compared with 12 percent for Huckabee." Party activists told Beaumont they believe the decision to resign has a lot to do with that.

Democrats who are worried about 2012 are getting ahead of themselves. They need to be promoting the idea of getting all their senators and representatives on the same page.

History says the party in power will lose ground in the midterm elections. Lately, public opinion surveys are saying the same thing.

Without the bullet–proof majorities in Congress, how much of his agenda can Barack Obama expect to push through in the last two years of his term? How will his record of achievements look then? I suppose that depends on exactly how much ground is lost in 2010.

See, that achievements thing works both ways.

Sunday, November 22, 2009

Presidential Succession



Today, of course, is the anniversary of one of the most significant events of the 20th century — the assassination of President Kennedy here in Dallas in 1963.

For 46 years, one of the things that has bothered conspiracy theorists is the behavior of Vice President Lyndon Johnson on that day. In a recent documentary on the History Channel, it was suggested that Johnson hastily arranged to take the oath of office on board Air Force One before leaving Dallas in part because he feared that Attorney General Robert F. Kennedy would try to find some way to deprive him of the presidency.

In 1963, that might have seemed unlikely to an American public that had long been conditioned to the idea that the vice president would be the next in line if a president died in office. But that procedure rested on a 122–year–old precedent that, without explicit constitutional authority, had elevated seven vice presidents to the presidency following a president's death. In fact, the vice president's succession to the presidency was not formally codified until the ratification of the 25th Amendment in 1967.

The truth is that Johnson's concerns were not unjustified even if LBJ appeared, on the surface, to be a bit paranoid.

This may be obvious to readers of this blog, but I have long been fascinated by history's ironic twists and turns. November 22 is loaded with them — and not just in the 20th century.

Take, for example, the case of Richard Nixon and the "Wilson desk."

When Nixon became vice president, he asked for the "Wilson desk" for his office, and his request was granted. But it turned out the desk didn't belong to the Wilson that Nixon had in mind. Nixon was an admirer of Woodrow Wilson, the 28th president, but the desk that adorned his vice presidential office and, later, the Oval Office had belonged to Henry Wilson, the 18th vice president, who served under Ulysses S. Grant.

Henry Wilson was Grant's running mate when Grant sought re–election in 1872. He replaced Vice President Schuyler Colfax on the ticket. Colfax was embroiled in a scandal and was considered too controversial; ironically, it was revealed after the election that Wilson was tainted by the same scandal.

That isn't the part that I find truly ironic, though. After being sworn in, Wilson suffered a serious stroke that affected his ability to preside over the Senate although he tried to persevere in spite of his limitations. Then, on this day in 1875, he suffered a second, fatal stroke, becoming the fourth vice president to die in office. The vice presidency remained vacant until Rutherford B. Hayes and his running mate, William Wheeler, took office in 1877.

Nearly a century later, in 1973, Nixon's vice president, Spiro Agnew, resigned amid charges of corruption, and about a month later, Nixon nominated a replacement for Agnew under the provisions of the 25th Amendment. Nixon, of course, chose Gerald Ford, who succeeded him when Nixon resigned in 1974 — and who then became the second president to nominate an unelected vice president when he chose Nelson Rockefeller.

Here's the ironic part — for me, anyway. If the 25th Amendment had been the law of the land when Henry Wilson was alive, Grant would have had to pick a replacement when Wilson died in 1875. Instead, the vice presidency remained vacant for nearly 16 months.

If Grant had died before his term ended, I suppose he would have been replaced by the president pro tempore of the Senate (which would have been Republican Thomas Ferry of Michigan — the state Ford represented in Congress). In Grant's day, the president pro tempore was next in line after the vice president. Congress changed the order in 1886, making members of a president's Cabinet the next in line until the Presidential Succession Act of 1947, which made the speaker of the House next after the vice president. The speaker remained second in line behind the vice president after the passage of the 25th Amendment.

America has been a work in progress for more than two centuries. If the Founding Fathers had been blessed with the ability to anticipate every possible scenario, they could have spelled out from the beginning the procedures for presidential succession and filling vice presidential vacancies.

If that had been the case, Johnson's legal ascendance to the presidency in 1963 could not have been questioned by RFK or anyone else. Instead of being in a hurry to establish a legitimate claim to the Oval Office, LBJ could have focused on whether the assassination had been an international conspiracy involving the Russians or the Cubans, a domestic conspiracy involving organized crime or rogue operatives in the intelligence community or the act of a lone individual — and on taking the appropriate steps.

But, as it turned out, the practice of a vice president succeeding a president who did not complete his term in office was not established until 1841, when William Henry Harrison died only a month after taking office and John Tyler, amid considerable confusion brought about by an unprecedented development, took the oath of office. At the time, the ambiguous language of Article II of the Constitution did not indicate whether a vice president would become president or merely an "acting president" if the duly elected president was unable to discharge the duties of the office.

Tyler then served the rest of his term with no vice president.

Back to the "Wilson desk."

Nixon apparently believed throughout his eight years as vice president that it was Woodrow Wilson's desk in his office because he asked for the same desk when he became president in 1969. He even referred to it once in a speech from the White House — his "silent majority" speech in November 1969.

After learning the truth about the desk, speechwriter William Safire took it upon himself to break the news to Nixon, and he briefly discussed — in his book "Before the Fall" — a memo he wrote to Nixon explaining what apparently had happened.

"Spin" was a concept that had not been defined at the time, but Safire proceeded to give the mistake the best spin he could, pointing out to Nixon that Henry Wilson had been an early abolitionist and one of the founders of Nixon's Republican Party.

Nixon, though, was never one to admit a mistake, and he never — to my knowledge — publicly corrected the error.

My guess is that he tried to cover it up.

Saturday, November 21, 2009

Admissions Tests

When I was in high school, I remember getting up early one Saturday morning to take my college entrance exam. Several years later, I went through the same thing when I took my entrance exam for graduate school.

In the weeks prior to both exams, I took practice tests that were available in local bookstores, but they were hardly the same as actually studying for the entrance exams. You couldn't really study for the exams. You could only prepare yourself for the format because there was no telling what kind of questions would be asked.

And the practice tests, as I recall, had strategic tips for the test taker. One such tip was that it was better to leave a question blank if you didn't know the answer because an incorrect guess was penalized while a blank answer was not.

The questions were multiple choice questions, like the ones on "Who Wants to Be a Millionaire?" but you were on your own when you began taking the test. There were no lifelines to utilize — no audience to consult, no friend to phone.

I'm sure those tests are still as stressful as they were when I took them. I was about 17 when I took my college entrance exam and I was about 10 years older than that when I took the entrance exam for grad school.

I can only imagine how stressful it must be for 4–year–olds in Manhattan who, according to the New York Times, are taking admissions tests for kindergarten.

And, apparently, some parents are committing large sums of money to preparing their young children to take these admissions tests, even though, as Sharon Otterman reports in the Times, "[p]rivate schools warn that they will look negatively on children they suspect of being prepped for the tests they use to select students."

It reminds me of an episode of Frasier, in which Niles and Daphne were preparing an application for a "pre–kindergarten and daycare center" on behalf of their not–yet–conceived child because the waiting list was several years long.

"It's pre–kindergarten," Daphne protested. "They run around, they sing, they nap."

How special can the school be, she wanted to know.

"I hear that the top 2% in coloring and putting away can pretty much write their own ticket," Niles replied.

Maybe there are kindergartens in Manhattan that really are that special. But I think of the kindergarten I attended when I was 5, and it wasn't very special at all.

It was called "Bluebird Kindergarten," and the teacher ran it from her home. A small classroom had been added to the back of the house, and it was in that room that the teacher instructed us in forming letters and numbers with pencil and paper. We colored pictures. Sometimes we sang songs. Sometimes we took naps. During our recess breaks, we went outside and used rocks to crack open pecans that fell from a tree in the back yard.

At the end of the year, there was a "graduation ceremony." The boys wore blue caps and gowns, the girls wore pink caps and gowns. I don't recall any lengthy speeches — it probably would have been difficult to keep two dozen 5– and 6–year–olds still during a speech. I think "graduation" was mainly a few words from the teacher thanking the parents for their cooperation, followed by distribution of certificates to the pupils.

The whole thing probably took less than half an hour.

It wasn't the Harvard of pre–schools, but my classmates and I learned the basics that gave us the foundations we needed for first grade.

And our parents didn't have to spend thousands of dollars to prepare us for admissions tests.

Friday, November 20, 2009

Going Rogue


rogue (noun)
  1. vagrant, tramp

  2. a dishonest or worthless person : scoundrel

  3. a mischievous person : scamp
Merriam–Webster Online Dictionary

The other night, a high school classmate of mine made an interesting observation on Facebook.

Referring to Sarah Palin's new book, "Going Rogue," my ex–classmate posted the Merriam–Webster link and wondered, "So, which of these definitions does Sarah Palin think best describes herself?"

He confessed that "ever since the book was announced I've been very puzzled by her choice of titles."

Another former classmate tried to clarify the point, saying, "She is using the term that McCain's staffers used about her."

The first classmate responded, "So she wants to emphasize the fact that the people whose job it was to sell her to the public thought she was unstable and irresponsible? If I were managing her 'brand' I would counsel her to reconsider. Maybe this is all part of being a 'maverick' ..."

I wasn't a Palin fan last year — in the interest of full disclosure, I wasn't an Obama fan, either — and she never seems to make it easy on people like me. This is the 24th post I have written in which "Palin" has been listed as a label, and I have tried — or, at least, I feel that I have tried — to give her the benefit of the doubt.

When it comes to Palin, I rarely agree with her on anything, but I do try to be fair. Her book hasn't hit the store shelves yet so I haven't read her side of the story, but I am unaware of any lobbying that she did (or that anyone did on her behalf) to encourage John McCain to pick her as his running mate.

In fact, Dan Balz and Haynes Johnson wrote, in "The Battle for America 2008," that, although McCain had been reviewing possible running mates since securing the nomination in the spring, Palin's name wasn't added to the list of prospects until about a month before the Republican convention — when the McCain campaign "became alarmed at the size of Obama's lead among women."

If it turns out that Palin waged an active campaign to be chosen, I would feel differently. But the Balz–Johnson account tends to confirm what I have suspected all along — that she was not chosen because of her political views but because of her gender.

McCain, I have contended, believed that the majority of Hillary Clinton's supporters could be persuaded to support his campaign if he had a female running mate. But, while Clinton's supporters undoubtedly were disappointed that she came up short in her bid for the Democratic nomination, it turned out they were driven more by ideology than gender.

McCain might have been more successful in winning their support if he had chosen a centrist woman to be his running mate — but I have seen no indication that ideology played a key role in Palin's selection.

Once she was on the ticket, campaign officials may have experienced "buyer's remorse" when it became clear that she was in over her head — but it should be noted that McCain was the buyer. He may have felt remorseful at times, but I think that was overridden by his desire to avoid appearing indecisive in what was perceived to be his first presidential–level decision.

The role of the campaign staff was to be supportive of the ticket and try to help shore up any weaknesses, like the fact that she came across as inexperienced and uninformed in her interviews with Katie Couric. If she was ill–prepared for the national spotlight during the fall campaign, it was in part because the campaign does not appear to have made much effort to address her deficiencies.

It was legitimate, for example, to question Palin about the leaders of foreign countries and America's relationships with those countries because, if elected vice president, that was the kind of knowledge she would need if she eventually became president. But five vice presidents had been elected since the last time a sitting vice president ascended to the presidency, and such knowledge wasn't strictly part of the definition of the job for which she was a candidate. The vice president presides over the Senate. In modern times, the vice president has been dispatched to represent the United States at the weddings and funerals of foreign dignitaries, so familiarity with the governments of foreign countries is a good thing to have, but it is not a constitutional requirement.

The vice president is next in line for the presidency, but it has been nearly 50 years since a vice president became president following the death of a president, and it has been 35 years since a vice president became president following the resignation of a president. As I observed before either running mate was chosen, we're overdue — historically — for such a thing to happen, but, in a lifetime of studying the presidency and presidential campaigns, I have seldom come across an instance in which a prospective vice president was chosen because he (or she) was believed to be the most qualified to take over as president if that became necessary.

I have seen running mates who were chosen because their presence on the ticket, it was believed, would heal political wounds and unify the party (e.g., Ronald Reagan's choice of George H.W. Bush, his main rival for the 1980 Republican nomination, and John Kerry's selection of John Edwards as his 2004 running mate). I have seen running mates who, like Palin, were picked because it was believed they would appeal to certain demographic groups (e.g., Walter Mondale's choice of Geraldine Ferraro in 1984).

I have even seen people who were mentioned frequently as potential running mates primarily as lip service to shaky supporters.

Perhaps my classmate was right. Perhaps McCain's staffers did view Palin as "unstable and irresponsible." If so, they weren't the only ones. But it was their task, as my classmate also observed, to "sell" her to the voters. And they failed.

I'm inclined to believe Palin is right when she says she has been made a scapegoat for McCain's defeat. As objectionable as she may have seemed to many voters, I don't believe any running mate could have salvaged the Republican ticket after the economic meltdown occurred.

Having a running mate who went "rogue" did not cost McCain the election. No matter which definition of the word one chooses.

Wednesday, November 18, 2009

The Right Name

Tonight, I've been musing about the difference having the right name can make.

Entertainers often adopt stage names and writers noms de plume, and it seems to be beyond dispute that having the right name can make all the difference, no matter what one's line of work may be.

And it's funny, sometimes, the names that gravitate to certain professions.

I recall seeing, for example, a sign for a dentist whose name was Dr. Paine. I often wondered how many potential patients were driven to a different dentist simply because of his name.

(Which reminds me of something George Carlin once said. "If 'Janitor in a Drum' made a douche," he said, "no one would buy it.")

I once knew a florist named Rose. No kidding.

A few times I have known of basketball players whose last name was Short. That doesn't sound like the right name for someone who plays a game in which height is looked upon as a decided advantage.

For many years, I worked on copy desks for daily newspapers. On one such job, I worked nights. It was a morning paper. For a while, I worked with a fellow named Day. I suppose that is considered ironic.

Another irony from those days involved one of our lead editors, the fellow who designed the front page of the paper every night. In those days, color was not as widespread as it is today. The editors could use it on the front pages of their sections if they wanted to — but it was with the understanding that the color process would make production more expensive.

Color gradually became commonplace for newspapers, but, in those days, the decision to use color was often a gut–wrenching experience for the editors, who frequently had to justify their decisions. For those who designed the front page, the most compelling reason to use a color photo was to attract readers. We were in the midst of a newspaper war, and the editors of the main section saw color as the key to boosting circulation.

I worked in sports, which seemed to have been given the green light to use color whenever we wanted to. And color really did bring sports photos to life, whether they were produced by one of our photographers or came across the wire. But the front page of the sports section was not the one on display in convenience stores and news boxes.

Anyway, this fellow on the news side had a reputation for resisting color. I always assumed it had something to do with his frugal nature — and, truth be told, I agreed with him most of the time. In those days, it was hard to justify using color for photos of people's faces — like the photos you see on columns.

The newspaper term for such photos is mugshot, and most of the mugshots that news had were of men wearing dark suits and dark ties standing in front of a dark background (think Blues Brothers without the sunglasses). What could color possibly add to that?

But there were times when I disagreed with the decisions that were made.

The name of the night editor who stoutly resisted using color photos on the front page was ... Ed Gray.

Known as the "aptly named Ed Gray" by some of the copy editors on the news side.

Still with me?

Well, the reason I'm on this tangent this evening has a lot to do with some interesting facts that are connected to tomorrow, November 19:
  • Forty years ago tomorrow, Apollo 12 landed on the moon, and Pete Conrad and Alan Bean became the third and fourth men to walk on its surface.

    For more than 40 years now, children have studied the Apollo program in school, and they have been told that Neil Armstrong spoke the now iconic words, "That's one small step for man, one giant leap for mankind," as he took the first steps on the surface of the moon.

    I suppose it was the luck of the draw that Armstrong and Buzz Aldrin were the first to step on the moon, but sometimes it seems like it was destiny. What if the crew had come down with something and a backup crew had to go instead? Suppose it had been Conrad or Bean who spoke the first words from the moon. What was said surely would have been different. And neither the name Conrad nor the name Bean has the same impact as the name Armstrong.

    Maybe that's because it conjures up images (for older Americans) of Jack Armstrong, the All–American Boy, a radio character who was popular when Neil Armstrong was a boy.

  • Ten years earlier, in 1959, the Ford Motor Co. discontinued the Edsel. It was named after Edsel Ford, son of Henry Ford, company founder. Edsel the man died of cancer in 1943. His namesake vehicle was introduced to the public in September 1957, but it was discontinued a little more than two years later because it was so unpopular.

    No one ever determined an overriding reason for the Edsel's lack of popularity. But I always thought the name had something to do with it. Edsel didn't have much magic.

  • Sometimes, I guess, it isn't necessary to do anything at all if you've got the right name. On Nov. 19, 1990, a pop duo that went by the name of Milli Vanilli was stripped of its Grammy Award because the two did not perform at all on the album that won the award — studio musicians and singers were used instead. The duo merely lip synched in live performances.

  • As I said, some names just seem destined for certain occupations. On Nov. 19, 1862, a boy named William A. Sunday was born in Iowa. As an adult, he was known as Billy Sunday, the most famous evangelist in America in the early part of the 20th century.

    Billy Sunday influenced national policy. He was a supporter of Prohibition, and it was widely believed that he played a key role in the passage of the 18th Amendment, which made the manufacture, transportation and sale (but not the consumption) of alcohol illegal.

    If your aspiration is to be a preacher, Sunday is a pretty good surname to have.

Monday, November 16, 2009

Common Sense and Breast Cancer

The United States Preventive Services Task Force today reversed a seven–year–old recommendation and urged women to start screening for breast cancer at the age of 50, not 40.

"It also says women age 50 to 74 should have mammograms less frequently — every two years, rather than every year," reports Gina Kolata in the New York Times. "And it said doctors should stop teaching women to examine their breasts on a regular basis."

In what I feel will ultimately prove to be an understatement, Kolata writes that the guidelines "are likely to touch off yet another round of controversy over the benefits of screening for breast cancer."

Many people are interpreting the new guidelines to mean that a woman's age should be the only determining factor. I don't think the guidelines are based exclusively on age, but, to be truthful, I haven't seen much justification for the revisions beyond age. Some, but not much.

Kolata writes that the "modest benefit of mammograms — reducing the breast cancer death rate by 15 percent — must be weighed against the harms. And those harms loom larger for women in their 40s, who are 60 percent more likely to experience them than women 50 and older but are less likely to have breast cancer, skewing the risk–benefit equation."

It seems to me that a number of factors, like family history, need to be considered. Age is one factor, and it is clearly the one on which critics are latching, but it isn't the only factor. And, while Kolata provides dramatic numbers that indicate the risk does increase as a woman gets older, the task force apparently did take into consideration other risk factors.

It underscores the importance of communicating with your doctor. He can't reach an accurate conclusion if he doesn't have all the facts.

Whether your family has a history of cancer or not, if you have questions, ask your doctor. Then, together you can decide on what is best for you — even if it doesn't fit someone else's guidelines.

Common sense is called for.

And if you're going to do your own research, I always point in the direction of the American Cancer Society's website.

The ACS is kind of busy these days, preparing for the Great American Smokeout this Thursday. But it did post a response to the new guidelines. And I consider the ACS a valuable resource on cancer. Any cancer.
"The American Cancer Society continues to recommend annual screening using mammography and clinical breast examination for all women beginning at age 40. Our experts make this recommendation having reviewed virtually all the same data reviewed by the USPSTF, but also additional data that the USPSTF did not consider."

Otis W. Brawley, M.D.,
chief medical officer,
American Cancer Society

In the weeks and months ahead, I expect the revisions to be discussed extensively. And I wouldn't be surprised if one or both sides in the health care reform debate use the findings in an attempt to score a few points with the public.

If and when that happens, I expect the ACS to be fully engaged in the discussion.

Cancer is far too serious to be reduced to a political football.

Sunday, November 15, 2009

The First Pacific President

The longer I live, the harder it seems to be to anticipate what people will get worked up about next.

Sometimes it isn't hard to guess. A good example is the attack at Fort Hood, Texas. It comes as no surprise to me that, in the aftermath of that event, people are discussing the red flags that always seem to be so abundant in hindsight.

Many years ago, this was called "going postal." I can't tell you how many times I have seen news reports about a tragedy that is similar to this one in which former co–workers and neighbors seem shocked — and then someone says something like, "You know, he wasn't quite the same after [pick a traumatic event]."

It is, perhaps, an unfortunate coincidence that, while the people at Fort Hood were honoring the memories of their fallen colleagues, news reports included the plans to hold the trial for 9/11 mastermind Khalid Shaikh Mohammed in New York, only a few city blocks from the site where the World Trade Center once stood.

Those seem to be the high–profile topics these days, although some journalists, like Bob Herbert of the New York Times, write that "it's fair to wonder why the president and his party have not been focused like fanatics on job creation from the first day he took office."

I think it's reasonable to ask that — and I have, frequently. Perhaps Mr. Herbert, being black, won't be accused of racism for wondering that, as I have.

In my opinion, the topics of employment and war and peace are always legitimate subjects for discussion. But sometimes there are topics that take center stage that I find bewildering.

Like Michael Scherer's item in his blog for TIME in which he takes Barack Obama to task for calling himself the "first Pacific president."

Scherer said he felt "obligated to object" to that assertion because of his California roots. He went on to point out that "two of our recent presidents" — Ronald Reagan and Richard Nixon — won statewide elections in California before being elected president.

The coast of California, of course, sits on the Pacific Ocean — as do the coasts of Oregon, Washington and Alaska. But many of Scherer's readers appeared to interpret Obama's remarks as referring to Pacific islands — presumably because he delivered his remarks in Tokyo.

But also because he spent his formative years in Hawaii and Indonesia.

This apparently strikes some readers as a case of splitting hairs, although many seem guilty of the same thing in their responses. I read comments that implied that Obama was referring to Pacific islands, not nations. One reader wrote, "i completely knew what he meant. the 'pacific' is the islands. california isn't in the pacific, it's on the west coast."

I must say that I often wonder how such people reach the conclusions they do.

If they had bothered to read the text of Obama's remarks, they would have found that he spoke repeatedly of America's alliance with Japan. The word "island" never appeared in the approximately 4,300–word statement, but very early on, Obama said, "The United States of America may have started as a series of ports and cities along the Atlantic, but for generations we also have been a nation of the Pacific."

The phrase "Pacific president" appears only once — at the end.

In that context, Obama spoke of "America's agenda." And he spoke of himself as America's first Pacific president — not the first Pacific islands president of America.

His remarks clearly labeled America as a Pacific nation.
"This is America's agenda. This is the purpose of our partnership — with Japan, and with the nations and peoples of this region. And there must be no doubt: as America's first Pacific President, I promise you that this Pacific nation will strengthen and sustain our leadership in this vitally important part of the world."

Let's see. Early in the statement, he told his listeners that America had considered itself a Pacific nation for generations. Based on the text of the statement, I got the impression he was speaking of nations that touch the Pacific and are, therefore, part of the Pacific region. Hawaii is the only state that is surrounded by the Pacific, but I would argue that the phrase "for generations" (plural, therefore two or more) would suggest that Obama was saying America looked upon itself as a Pacific nation before Hawaii became a state 50 years ago.

And the states on America's west coast have been part of the United States for a long time — California was admitted in 1850, Oregon was admitted in 1859 and Washington was admitted in 1889. That's a century and a half of being a Pacific nation.

And if having those states in the Union made the United States a bona fide member of the Pacific region's nations, any president with connections to one or all of the states on the Pacific coast would qualify as a "Pacific president." Because those states formed the nation's physical link to the Pacific Ocean.

That would include Reagan and Nixon. It could even include Herbert Hoover, who was born in Iowa but attended college at Stanford University in California and was a registered voter in the state as an adult.

"For generations" could mean only two generations — it is a vague phrase — but I found nothing in the statement that made me think Obama was limiting his remarks to a period that is only slightly longer than his own lifetime. Some people, like the reader I quoted, will insist that they know what Obama meant.

The problem with that is, he didn't say it.

Saturday, November 14, 2009

Around the World in Less Than 80 Days

To someone who has studied journalism and/or American history, the name Nellie Bly brings to mind many things.

She was a pioneering journalist. It is not uncommon in the 21st century to read a woman's byline or see a woman reporting on the events of the day on the evening newscast. But in the 19th century, it was virtually unheard of. It was so unusual, in fact, that "Nellie Bly" wasn't even her real name. The custom, in those days, was for female journalists to use pen names, and her editor chose the name "Nellie Bly."

(Whether it was intentional or not, the first name was a misspelling of the name of the lead character in a song by Stephen Foster — "Nelly Bly."

(Foster, who wrote such classic tunes as "Oh! Susanna," "Camptown Races" and "My Old Kentucky Home," died a few months before Bly, whose real name was Elizabeth Cochran, was born in 1864.)

A few years before the start of the 20th century, at the age of 31, she married a millionaire manufacturer who was 40 years older than she was, and she became president of the Iron Clad Manufacturing Co., which made steel containers. In that capacity, she invented and patented the 55–gallon oil drum that is still in use more than 100 years later.

But it was on this day 120 years ago that Bly began what may well have been the greatest adventure of her life.

In 1888, she suggested to her editor at the New York World that she should embark on a journey that mimicked the trip around the globe described in Jules Verne's "Around the World in Eighty Days." The following year, on Nov. 14, 1889, she did exactly that, leaving New York on a trip that would take her nearly 25,000 miles.

She completed the journey in 72 days, besting Verne's character by better than a week. That was a world record at the time, although businessman George Francis Train improved on it a few months later.

Interestingly, during her voyage, Bly actually met Verne in Amiens, France. In her book about her travels, "Around the World in Seventy–Two Days," Bly described the brief meeting. She said she asked Verne if he had ever been to America. He said he had, but that he had only been to America once "[f]or a few days only, during which time I saw Niagara." He said he had often wished to return, but his health prevented him from doing so.

Bly asked him about his inspiration for "Around the World in Eighty Days." Verne said he was inspired by something he read in a French newspaper that suggested that such a trip was possible.

Before leaving, Bly asked to be shown Verne's study. After that, Verne wished her luck and told her, "If you do it in 79 days, I shall applaud with both hands."

"[T]hen I knew he doubted the possibility of my doing it in seventy-five, as I had promised," she wrote.

In fact, she exceeded both their expectations.

Friday, November 13, 2009

Crime and Punishment


"The world breaks everyone and afterward many are strong in the broken places. But those that will not break it kills. It kills the very good and the very gentle and the very brave impartially. If you are none of these you can be sure it will kill you too but there will be no special hurry."

Ernest Hemingway
A Farewell to Arms

When I graduated from college with a B.A. in journalism, my first job was as a general assignment reporter at a newspaper in central Arkansas. I held that job for nearly two years. During that time, I covered several murder trials.

Consequently, I was familiar with the judicial procedure in the capital murder trial of Curtis Vance, who was convicted recently of last year's murder of Little Rock TV news anchor Anne Pressly.

Every state does things a little differently. When someone is being tried for murder, the procedure in Arkansas is to decide the issue of guilt or innocence and then, if the defendant is convicted, hold a second proceeding in which the jury decides the punishment.

Informally, it is called the "penalty" or "punishment" phase. And, when the defendant has just been convicted of capital murder, the only options are life in prison — or death.

As a young reporter, I always found the penalty phase to be the most dramatic part of the story of a murder trial because, at that point, guilt or innocence had been established and the defense attorneys were no longer protesting that their clients were not involved. Well, most of them weren't. A few kept up the pretense, but most had adapted their strategies to the reality of a conviction.

Their arguments and their witnesses were intended to support what they believed were mitigating circumstances, and I came to regard "mitigating circumstances" as excuses for the crime. Not explanations. How can one explain a premeditated murder?

I remember covering one murder trial in which the defendant, having been convicted, learned (allegedly for the first time) in testimony during the penalty phase that his mother had been mentally retarded. There had been some physical abuse in the family, all of which contributed to his actions as an adult. That was the defense's argument.

The defense attorney must have done a good job of selling that one. Or perhaps it was the moment during the testimony when the defendant spontaneously burst into tears and court had to adjourn for a while so he could compose himself. Maybe that was when the jurors decided they could not authorize the execution of that defendant.

I guess it's hard for juries to decide questions of life and death — although I suppose I would prefer to leave it up to 12 jurors than a single judge. But jurors must be emotionally vulnerable after a guilt phase that frequently has gruesome physical evidence. Maybe it is all too easy to manipulate jurors after such an experience.

Vance apparently broke into Pressly's home, beat her with a piece of wood, raped her and left her to die in the early morning hours of an October day last year. Her injuries caused a massive stroke, which led to her death a few days later. The photos of the crime scene must have been pretty unsettling.

The defense brought in Vance's mother, who tearfully testified that she struggled with an addiction to crack when Vance was a child and that she was physically abusive. On one occasion, she said, she slammed his head into a brick wall. Doctors testified that they believed he had suffered brain damage.

I don't know if the attorney actually linked the abuse to the possibility of brain damage or if that conclusion was left to the jurors to reach. Either way, it is a logical conclusion, and it is easy to see how it could persuade jurors to give him life.

Pressly's stepfather said he was not disappointed with the jurors' verdict. "There really aren't any winners tonight," he said. To me, that seems to be a very generous attitude, considering the magnitude of the loss his family has experienced.

But it's a tough task for jurors. I know I didn't always agree with the jurors in the trials I covered, but I never criticized their decisions.

We ask 12 citizens to do a dirty job for the rest of us. We must be willing to accept whatever their decision may be.

Denial Ain't Just a River in Egypt

There is plenty of bad news in the latest Pew Research findings.

Most of it appears to be bad news for Democrats, since they hold the White House and majorities in both houses of Congress. But there is some for the loyal opposition, even though the Republicans wield little, if any, power these days.

"The mood of America is glum," Pew reports. "Two–thirds of the public is dissatisfied with the way things are going in the country. Fully nine in 10 say that national economic conditions are only fair or poor, and nearly two–thirds describe their own finances that way — the most since the summer of 1992."

That's bad news for the party in power. The Democrats have held majorities in both houses of Congress since the midterm elections of 2006. They've held the White House for nearly a year. In the public's mind, they own the economy. Not George W. Bush. Not the Republicans.

And it isn't just domestic issues that contribute to this negative mood. "An increasing proportion of Americans say that the war in Afghanistan is not going well," Pew reports. To make matters worse for the Democrats, "a plurality continues to oppose the health care reform proposals in Congress."

Pew observes that Barack Obama's approval ratings haven't changed since July — but those ratings are far from the glittering numbers he enjoyed in the early days of his administration. "[O]pinions about congressional incumbents are another matter," Pew says — and that should concern congressional Democrats because, unlike 2008, they will not have Obama at the top of the ballot to attract the largely supportive but traditionally non–participatory demographic groups that propelled Democrats last year.

It has been my feeling for quite some time that Democrats would run into trouble in 2010. For one thing, a political party seldom enjoys three consecutive successful elections. For another, history says that midterm elections almost always go against the party in power.

And, if that wasn't enough, Obama and the Democrats have made no effort to encourage job creation in this country. I have yet to hear words spoken by Obama or someone in his administration — or one of his diehard supporters — that lead me to the conclusion that the speaker (whoever he or she may be) truly understands what it is like to lose a job in this economy.

Or that they comprehend the price they will pay at the ballot box next year.

I've heard all the excuses. And I've studied my history. And, on the rare occasions when I hear Democrats speak about unemployment, the same word pops into my head.

"Denial."

Yesterday there was a report that Obama has decided that joblessness — which, according to the latest reports, is at 10.2% — is serious enough to warrant a White House jobs summit — next month.

"[W]e have an obligation to consider every additional, responsible step that we can to encourage and accelerate job creation in this country," Obama said.

At the very least, this should have been done 10 or 11 months ago — if not in the first few weeks after last year's election.

Ian Swanson and Jordan Fabian write in The Hill that this is the latest evidence that Obama and the Democrats are getting nervous about next year's elections.

Actually, Swanson and Fabian sort of offer Democrats a new excuse when they write, "Unemployment wasn't expected to hit double digits that soon, and many economists now warn they do not expect the jobless rate to drop until at least next summer; in the most recent recessions, unemployment has not stopped rising until a year after the recession's end."

(I can accept that as an explanation, but not as an excuse for inaction. Not when the better part of this year has been squandered on a banal debate over health care.)

"That could doom dozens of incumbent lawmakers in the House and Senate, jeopardizing the large majorities Democrats enjoy in both chambers," Swanson and Fabian say. "That, in turn, could make it impossible for Obama to move his agenda forward in the latter half of his first term."

It seems to me that the very act of calling for a "summit" is a denial of reality. It suggests that joblessness is something new — a conclusion that has only recently been reached because conditions have suddenly worsened.

The conditions were in place before Obama was elected. He and Joe Biden can protest that they "misread" the economy, but that excuse doesn't hold much water. Their statements on the campaign trail last year clearly implied that they knew how devastating the situation was.

But, even if they didn't, wasn't it Obama who, as president–elect, spoke of assembling a team of experts who would be prepared to deal with every crisis? Where have these guys been all year?

The stimulus package that Congress passed and Obama signed back in February was promoted for its alleged potential to put Americans back to work. But, to those who were unemployed then and remain unemployed today, that sounds like nothing but lip service — just like the talk coming from the White House and Congress about how many jobs have been "saved."

The reality that the unemployed see is the billions being spent to bail out banks and big corporations while the little guys — who did nothing to bring on this economic calamity — are scolded for needing extensions of unemployment benefits because no one is hiring.

Some people do see the urgency. Paul Krugman writes, in the New York Times, that "these aren't normal times. Right now, workers who lose their jobs aren't moving to the jobs of the future; they're entering the ranks of the unemployed and staying there. Long–term unemployment is already at its highest levels since the 1930s, and it's still on the rise."

Krugman observes that long–term unemployment "inflicts long–term damage ... so it's time to try something different." Krugman has been saying, for months, that the stimulus package was not big enough. Now, he says, we must discuss "cheaper alternatives that address the job problem directly."

He concludes that "we need to start doing something more than, and different from, what we're already doing."

But, if you're a Democrat who prefers to live in that cozy cocoon of denial, Peter Fenn has a one–word response in Politico.com to those who wonder if it is possible for Democrats to get through next year's midterm elections unscathed. "Absolutely."

Denial.

Tuesday, November 10, 2009

Death of the D.C. Sniper

It was announced, just minutes ago, that John Muhammad, the adult half of the D.C. sniper team, has been executed.

Like most people, I remember where I was and what I was doing on September 11. But, unlike most people, I didn't see it as it was happening. I was working in an office, and there was no TV in that office so everyone sat riveted to their radios. I didn't actually see footage of the attacks until the middle of the afternoon. The manager decided to close the office early so I went home and then, about seven hours after the second plane hit the World Trade Center, I first saw footage of that event. And then I saw footage of the WTC collapsing.

That's been the source of a strange dichotomy for me. I feel as if I shared the experience I had on September 11 with the people in that office. But I didn't share the experience of people who saw it all unfold.

Anyway, I feel differently about my experience with the D.C. sniper case. It wasn't a one–day event. Instead, it was spread out over several weeks. But I felt more personally involved with it. I saw the reports of the latest shootings. Even though it was all happening in another part of the country, I couldn't help feeling that I could be next. The attacks seemed so random: people getting shot while doing ordinary, everyday things like buying gas or mowing the lawn or sitting at a bus stop reading a book.

When I was younger, I was against the death penalty because I felt there was always a possibility that the wrong person could have been convicted. But in the last 15 years or so, DNA evidence has emerged as a convincing element of most death penalty cases, which has eased my concerns.

But, whether you have DNA evidence or not, I've learned something else in my life.

Some people are evil, plain and simple.

John Muhammad was one of those people. And the world is a better place without him.

Good Advice

Bob Herbert's column in the New York Times has some good advice for Barack Obama.

"If I were a close adviser of President Obama's," Herbert writes, "I would say to him, 'Mr. President, you have two urgent and overwhelming tasks in front of you: to put Americans trapped in this terrible employment crisis back to work and to put the brakes on your potentially disastrous plan to escalate the war in Afghanistan.' "

I can only hope he reads it — either on his own or on the recommendation of one of his actual advisers.

"I would tell the president that the feeling is widespread that his administration went too far with its bailouts of the financial industry, sending not just a badly needed lifeline but also unwarranted windfalls to the miscreants who nearly wrecked the entire economy," Herbert writes. "The government got very little in return. The perception now is that Wall Street is doing just fine while working people, whose taxes financed the bailouts, are walking the plank to economic oblivion."

Herbert touches on the administration's insistence on emphasizing health care reform while unemployment has been virtually ignored. "We have spent the better part of a year locked in a tedious and unenlightening debate over health care while the jobless rate has steadily surged," he writes. "It's now at 10.2 percent. Families struggling with job losses, home foreclosures and personal bankruptcies are falling out of the middle class like fruit through the bottom of a rotten basket."

I do not recall health care reform being a part of the general election debate last year, but you wouldn't know it from the way it has been pushed to the top of the administration's first–year agenda.

The president's priorities are misplaced. That is Herbert's message. It has been mine, too.

I can only hope he gets it. Before it's too late.

Monday, November 9, 2009

The Definition of Insanity


"The definition of insanity is doing the same thing over and over again and expecting different results."

Albert Einstein

Professor Einstein never worked for a newspaper. At least, I don't think he did.

But, even if he didn't, I have little doubt that he would have seen the newspaper industry as the embodiment of his observation.

I offer, as Exhibit A, a memo (complete with copy editor's marks and comments) that has been making the rounds at the Toronto Star.

On one hand, I find it amusing because this memo is one long example of just about everything they warned us against when I was studying journalism in school — starting with a bloated, neutral language that snakes its way around the facts without ever really acknowledging them. The publisher is basically saying that the newspaper is losing money so the paper is going to slash its staff and cut back on what it produces (in more ways than one). The newspaper is going to eliminate those on the payroll who are trained and experienced in order to replace them with untrained and inexperienced people who will work for less.

It's a decision that is made by bean counters when it should be made by wordsmiths.

On the other hand, I find this depressing because it is symptomatic of the kinds of self–defeating decisions that newspaper publishers have had to make in recent years. Just about anywhere you go in America today, the local newspaper (if there is one) is considerably smaller than it was a few years ago. There are fewer people in the newsroom with the training and the experience that the work requires.

Today's newspapers are streamlined operations. On the surface, it would appear that the publishers have made the tough but sound business decisions they had to make to survive. But the newspaper business is not like other businesses. You can't cut corners like that with a product like a newspaper and reasonably expect most people to pay the same price for less.

But publishers can't understand why they're losing readers.

The original mistake that newspapers made was failing to recognize the potential of the internet and construct a business model that would succeed in a digital age. Newspapers had to act quickly, though, to take advantage of the initial opportunity, and few had business managers who were that nimble on their feet.

Anyway, that window of opportunity slammed shut fairly quickly. And now that the internet is clearly here to stay, some newspapers are repeating what has been shown to be an unsuccessful strategy — charging people for access to their content.

At some point, newspaper management will realize that internet consumers have many sources for news and information that are available to them for no additional cost. Unless a newspaper has this generation's H.L. Mencken or Red Smith on staff, there will be no compelling reason for most readers to fork over whatever is being charged. Most internet subscribers will gravitate to the free sources for news and sports scores.

Actually, small–town newspapers may be in a better position to profit online. A small–town newspaper is more likely to be the only source for articles on the latest local school board or city council meeting. With more local radio stations establishing an online presence, that may change in many small towns.

But many of the radio stations in metropolitan areas are online now, and they already are competing with the metro newspapers for local news. Consequently, there isn't much unique content that big–city newspapers can offer.

But reducing the quality of the product just makes it easier for many subscribers to conclude that it is a product they can live without.

I firmly believe that newspapers must offer a product that readers are convinced they must have if they are to survive. That won't be accomplished through slick advertising campaigns or offering less for the same price.

But no one has devised a business model for newspapers that will succeed in the digital world — so most newspapers insist on duplicating the mistakes others made before them.

Doing the same thing over and over again and expecting different results.

The definition of insanity.

Courtesy of the newspaper business.