Monday, December 26, 2011

A Close Call

Ordinarily, the American public doesn't know how a president perceives the decisions he must make while he is still in office.

There are, after all, so many decisions a president must make during his term. Seems like the destinies of most presidents, in the words of Forrest Gump, float on a breeze — and not a gentle one at that. They kind of go from one decision to the next without spending too much time (if any) reflecting on one that has already been made. By that time, there are already half a dozen more matters that need the president's immediate attention.

You usually have to wait until a president leaves office, catches his breath and writes his memoirs before you learn which decisions were the most gut–wrenching ones he had to make. They're usually pretty predictable, too — where (and whether) to put American troops in harm's way, which programs to support financially, etc.

Decisions that require courage, that call for the wisdom of Solomon.

Last week, Amie Parnes of The Hill offered Americans a rare glimpse into the mindset of a president — well, of this particular president — and it wasn't for all three years of his presidency, just the current "holiday season." Still, I rather doubt that it will be the subject of a political science lecture.

"The toughest call for the president this holiday season," wrote Parnes, "could be whether to join his family for Christmas in Hawaii or stay in lonely Washington."

Parnes conceded that "there's no ideal time for a presidential vacation," which is certainly true. I cannot remember a president who was not criticized for taking a little time away from the Oval Office (Harry Truman once called it the "crown jewel of the federal penal system"), but "this one comes at a particularly inopportune moment."

Yes, I guess you could say that. Obama had been insisting that Congress remain in Washington until an agreement was reached on the payroll tax cut extension. It wouldn't have looked very good if he had taken a "do as I say, not as I do" approach to the matter and skipped town to spend Christmas on the beaches of Hawaii.

Fortunately for Obama, the intransigent House Republicans gave in (no real reason why they shouldn't; as Obama observed, legislators in both parties favored the extension), the president signed the bill into law and got to Hawaii in time to spend the holiday with the wife and kids in their beach house.

While he was in Hawaii, the president also mixed some business with pleasure, attending Christmas services on a Marine base.

Whew! Crisis averted.

Of course, for millions of Americans, holiday travel wasn't an issue. They have no jobs — or, at least, no full–time ones — and, without one, holiday travel is something of a moot point.

The same could be said of the payroll tax cut, for that matter.

Friday, December 23, 2011

Ghosts of '68

As the polls of Republican voters have been careening from one anti–Mitt to the next, chatter about the possibility of a brokered convention has been rising above the din.

My original inclination was to dismiss such talk. The prospect just seemed too remote.

My parents weren't old enough to vote the last time there was a brokered convention. But the rapid rise and fall of challengers to the consistent frontrunner, Mitt Romney, has led me to conclude that a brokered convention is a real possibility. It may still be remote, but it is real.

Many Republicans seem to think Romney isn't committed enough to Republican values, so the search has been on for someone else. And there just might be enough of those disgruntled Republicans to deny Romney enough delegate support to wrap up the nomination on the first ballot.

The problem is that each alternative to whom Republicans have flocked has been shown to be flawed in some way. It wouldn't be so bad if the flaws were thought to be modest, but these flaws have been too great for most Republicans.

Romney is always accused of being too wishy–washy, a waffler, a flip–flopper. The same thing might be said of any Republican who ran for office in dark–blue Massachusetts — but he has been the one true constant in this volatile campaign. His rivals have risen and fallen, but Romney has remained about where he was from the beginning — around 25%.

That level of support won't be sufficient, his detractors argue, and I agree — if it remains where it has been through most of 2011.

But it won't.

I think it will start to fluctuate once the primaries and caucuses begin — and the fluctuations are likely to be upward. There is a kind of finality about primaries — and caucuses, too, for that matter. It is different from the phase we have been watching this year, which is dominated by fluid and non–binding polls.

Actual delegates are committed to candidates in the primaries and caucuses; there are real numbers to hang your hat on. True, those delegates are only committed through the first ballot — but delegates are not assigned to candidates at random. Those who come to the convention pledged to support Candidate A reached that point because they were true believers in Candidate A.

Unless Candidate A drops out after the first ballot, my guess is that they are likely to remain with Candidate A even if they are no longer bound by party rules to do so.

All of that remains hypothetical at this stage because no Republican convention has gone past the first ballot since Tom Dewey was nominated on the third ballot to face President Truman in 1948.

Democrats hope that history will repeat itself in 2012 — an embattled Democratic president, running against a "do–nothing Congress," comes from behind to stage an upset victory and triumphantly waves the early edition of a metro newspaper prematurely proclaiming his electoral demise.

(In the 21st–century version, I guess the image — digital, of course — would be of Obama holding up a laptop with the message "Romney (or Gingrich or Paul or whomever) Defeats Obama.")

Perhaps that is what fate has in store. I have my doubts. I still believe the prevailing economic conditions will have an overwhelming influence on the outcome of next year's elections — unless something totally unexpected happens to distract the public's attention a week or two before the election.

Republicans, on the other hand, have been wishing for a repeat of 1980, when a charismatic Republican (Ronald Reagan) emerged to defeat an embattled Democratic president (Jimmy Carter) in the general election.

Which scenario you see developing may depend upon which side you favor, but, from a purely historical standpoint, the election I see as having the most in common with the campaign upon which we have embarked is the 1968 campaign.

Then, as now, a Democrat was in the White House. He had been quite popular when he was elected four years earlier, but his approval ratings had steadily declined and his party had lost a lot of seats in Congress in the midterm elections.

"Of the paradox of Lyndon Johnson historians will write many books," wrote Theodore H. White in "The Making of the President 1968."

"Few men have done more good in their time, and no president has pressed more visionary ideas into law. Yet few have earned more abuse and roused less love, loyalty and affection from those he sought to help."

Seems like the kind of thing that some of Barack Obama's supporters might be inclined to write about him when the story of the 2012 presidential campaign is written.

Johnson decided not to seek renomination after he nearly lost the New Hampshire primary to Eugene McCarthy's insurgent campaign, and the story of the Democratic nomination battle that spring was the story of the McCarthy–Robert Kennedy duel in the primaries.

Obama is not likely to withdraw from the race — nor does he appear likely to drop Vice President Joe Biden from the ticket — so the prelude to the Democratic convention next year probably will be quite different from what it was in 1968.

But I believe the most striking similarities are to be found on the Republican side.

All along, the frontrunner for the Republican nomination was Richard Nixon, the former vice president who had been beaten in a close race for the presidency eight years earlier and then had lost a race for governor of California two years later. Nixon had a reputation as a conservative anti–communist, which he toned down in pursuit of the 1968 nomination.

And, as he positioned himself in the party's center, he turned back all challengers, one by one.

First, there was Mitt Romney's father, George, whose candidacy collapsed after he famously claimed he had been "brainwashed" into supporting the Vietnam War. Romney was ridiculed for the remark and wound up dropping out of the race before the New Hampshire primary.

Nixon's next challenger was New York Gov. Nelson Rockefeller, a darling of the antiwar wing of the Republican Party who emerged as its champion with a write–in campaign in New Hampshire.

Rocky did defeat Nixon in the Massachusetts primary, but, for the most part, his bid for the nomination was ineffective.

In the spring of 1968 — after the shooting of Martin Luther King and before the shooting of Bobby Kennedy — Nixon's fellow Californian, Gov. Ronald Reagan, was his next challenger.

At that time, Reagan was not the experienced executive he was when he was nominated in 1980. The 1968 Reagan had two years of experience as governor, which didn't really compare to Nixon's political record of more than two decades, yet he won the California primary as the state's favorite son. Thanks to his margin there, Reagan finished the primaries (which were not nearly as widespread as they are today) with a slight edge over Nixon in the national popular vote.

But Nixon went to the convention a handful of delegates shy of securing the nomination. Reagan and Rockefeller reportedly were going to join forces in a final effort to deny Nixon the nomination, but neither would agree to endorse the other.

When all was said and done, Nixon won by a wide margin on the first ballot — and went on to win the presidency by a narrow margin in a three–candidate race that fall.

While the manner in which Nixon's rivals fell was not the same as it has been for Romney's, I find the parallels between 1968 and 2012 compelling.

Wednesday, December 21, 2011

Fourth-Best President Ever?

"I would put our legislative and foreign policy accomplishments in our first two years against any president — with the possible exceptions of Johnson, FDR and Lincoln — just in terms of what we've gotten done in modern history."

Barack Obama
60 Minutes interview

My, someone certainly has a high opinion of himself and his place in American history.

I didn't watch the president's recent interview on CBS' 60 Minutes, but, apparently, in a segment that was not aired originally, he claimed that his administration's "legislative and foreign policy accomplishments" were as good or better than any other "with the possible exceptions of Johnson, FDR and Lincoln."

As I have said here before, I'm something of an amateur historian. I minored in history when I was in college, and I have always had an interest in the American presidency and American politics in general.

I'm also a journalist. That was my major in college, and it is the subject I am teaching now. I was trained to write and to think in Associated Press style, which constantly strives for clarity and consistency. So, when a president compares his presidency to "Johnson, FDR and Lincoln," my question is, "Which Johnson?"

The statement, you see, is imprecise. There have been two presidents named Johnson. I'm pretty sure I know which one Obama meant — Lyndon, who succeeded John Kennedy nearly 50 years ago, not Andrew, who succeeded Lincoln nearly 150 years ago.

Until the Clinton presidency, Andrew Johnson was the only president to face an impeachment trial in the Senate — where he was acquitted by a single vote. He sought, but failed to win, the Democratic nomination for a full term in 1868.

A Siena College survey that was released in July 2010 rated Andrew Johnson as one of the five worst presidents in American history.

No, I'm quite sure Barack Obama did not mean to compare himself to that President Johnson. His image has undergone some changes in a century and a half, but, in recent years, he has been remembered as a "white supremacist."

I'm convinced the first black president in American history does not want to be remembered as comparable to Andrew Johnson.

Lyndon Johnson, on the other hand, is almost a Lincoln–like figure for American blacks — and he was responsible for more advances — in housing, education, employment opportunities, voting rights, in fact rights in general — for blacks and all other underprivileged Americans than perhaps any other president.

But LBJ, as I wrote about a month ago, had the misfortune of being a president who wanted to do great things domestically (which he did) but served at a time when foreign affairs dominated.

I wrote that Obama appears likely to turn out to be LBJ in reverse — a president who first ran for the presidency because he wanted to end a war and wound up being undone by his inability to tame the economy.

In addition to teaching journalism, I have also been teaching basic writing, and one of the things I try to impress on my students is the importance of using the right word to express the right thought.

That isn't an easy thing for most people — even people who earn their livings (or who have earned their livings) as writers struggle at times to find the right word. I know I do. Most of the time, I keep a thesaurus within arm's reach whenever I sit down to write — and there are still times when I choose the wrong word.

Nor is it easy to select the right word when one is being interviewed without some notes or a TelePrompTer to help. Consequently, I do have some sympathy for Obama. I have seen many people "misspeak" (to use a word that was particularly popular during the Watergate days) in such a setting.

But this wasn't the first time Obama has been interviewed by someone. Far from it. He is no novice when it comes to being interviewed. He just has a tendency to stick his foot in his mouth when he does.

When Obama suggests that his presidency is the best in history "with the possible exceptions of Johnson, FDR and Lincoln," I really have to marvel at his use of the word "possible" and what it implies.

In hindsight, Obama himself might admit that it wasn't the most prudent word choice he could have made, but I believe it speaks volumes about what he really thinks of himself and his presidency.

I think he really does believe his presidency, in its first two years, accomplished more than any other president — but he will allow for the possibility that LBJ, FDR and Lincoln accomplished more.

Lincoln is kind of a no–brainer. The Siena survey listed him third, and most surveys rank Lincoln in the top three.

FDR was the top–rated president in Siena's survey, which is also kind of a no–brainer. The only president to be elected four times, he guided the country through its worst economic crisis ever and is credited with leading it through World War II even though he died a few weeks before hostilities ended in Europe.

But Siena's survey did not rank LBJ in its Top 10. Apparently, Obama holds him in much higher esteem than most historians — at least the ones who were surveyed.

They ranked Theodore Roosevelt second. Roosevelt is remembered for several achievements — trust busting, conservation, labor laws, public health and safety laws — that continue to influence American life.

T.R. was the first American to receive the Nobel Peace Prize — but, unlike Obama, he was rewarded for an actual achievement (negotiating the resolution of the Russo–Japanese War), not merely for his potential. By his omission from Obama's statement, though, it appears the president thinks his accomplishments in his first two years were greater than Roosevelt's.

The survey listed George Washington as the fourth–best president, and that should be a no–brainer, too. He is remembered as the father of the country, its first president. Thanks to his selflessness (he declined the salary that was offered to him, preferring not to tarnish, in any way, his image as a public servant) and his insistence that the leader of the new country should not be a monarch, we call our presidents "Mr. President," not "Your Highness."

It set the tone for the last 200 years, but I can only conclude that Obama also believes his contributions to American life in his first two years as president are greater than Washington's.

The Siena survey ranked Thomas Jefferson fifth. Once again, that should be a no–brainer, shouldn't it? Jefferson wrote the Declaration of Independence, and there are few documents in recorded history that have had the kind of influence on a culture that it has had.

Jefferson also was responsible for the Louisiana Purchase, which doubled the size of the United States at that time — and still represents roughly one–third of its land mass.

But, apparently, Obama feels his accomplishments in his first two years exceeded Jefferson's.

Sixth in Siena's survey was Jefferson's successor, James Madison. Before becoming president, he was the "Father of the Constitution." As president, he sought to continue Jefferson's policies, but he may be largely remembered for the crumbling of U.S.–British relations and the War of 1812, during which the White House, the Capitol and many other public buildings were burned.

Seventh in the rankings was Madison's successor, James Monroe, whose signature achievement probably was the Monroe Doctrine, which established the Western Hemisphere as the United States' sphere of influence and served notice to Europe that any attempt by any of its nations to interfere would be seen as an act of aggression and treated appropriately.

Ironically, America has not re–elected three consecutive presidents since Monroe's re–election in 1820. If Obama wins a second term next year, he would match Monroe's electoral achievement — but, apparently, he believes he has already bested Monroe as a president.

Siena's eighth–ranked president was Woodrow Wilson, a leader of the progressive movement. A Wilson biographer, John M. Cooper, wrote that Wilson's record of legislative achievement — which included child labor reform, the Federal Trade Commission Act and the Federal Farm Loan Act — was unmatched by any other president except FDR, and his advocacy of women's suffrage helped lead to the ratification of the 19th Amendment.

Perhaps it is subliminal, but Obama seems to think that what he did as president in 2009 and 2010 is greater than what Wilson achieved nearly a century earlier.

Ninth on the list was Harry Truman, whose low point in approval ratings (22%) went unmatched by any president until Obama's immediate predecessor, George W. Bush.

But that doesn't tell the whole story of Truman's presidency. From the day he succeeded FDR in April 1945 until he won the 1948 election, Truman did great things in spite of the fact that he had been virtually ignored by Roosevelt in his 82 days as vice president.

He knew nothing of the Manhattan Project, which gave him the weapon that he used to bring the war in the Pacific to a quick conclusion. Attitudes about his use of nuclear weapons in 1945 have changed over the years, but at the time and for years thereafter, it was believed to have saved the hundreds of thousands of lives that would have been lost in a fight–to–the–death invasion of Japan.

He had to deal with the transition from a wartime economy to a peacetime one, which always seems to be uneasy but was especially so after World War II. There were several economic conflicts that had gone unaddressed during the war years but boiled over when the war ended; Truman managed to deal with them all.

He was an advocate of the "Fair Deal," national health insurance and civil rights.

I would guess that Obama has quite a bit of respect for what Truman did as president — so much that he is clearly trying to duplicate Truman's "upset" victory in his re–election campaign in 1948. Truman won a full term largely by running against a "do–nothing Congress," and that seems to be Obama's strategy as well.

For that to work, you need a solid record of achievement to contrast with Congress'. Obama clearly believes he does, and so do his adoring supporters, but, judging from presidential approval ratings, millions are not convinced.

They are not convinced for much the same reason that the people of the late 1960s were not convinced about LBJ. They felt out of sync with their president's priorities. He was focused on domestic issues, which were (and are) important, but they were more concerned about the meat grinder of Vietnam.

In recent months, Obama's highest approval ratings have been for his handling of foreign affairs — at a time when Americans are hurting at home, struggling to keep a roof over their heads and food in their stomachs. They need jobs.

The Siena survey ranked Dwight Eisenhower 10th. Eisenhower earned Americans' respect when he led the Allies to victory over the Axis powers in World War II, and he presided over a country that was at peace in the world but suffering from some postwar growing pains in the 1950s.

His most lasting legacy, I suppose, is the interstate highway system — and his warning, in the final days of his presidency, against the growing influence of the "military–industrial complex."

Both continue to influence American life, but Obama thinks his achievements are equal to or greater than Eisenhower's.

Maybe they are, but that will be up to the voters to decide next year.

Sunday, December 18, 2011

Cell Phones Don't Kill People

I was listening to the radio yesterday morning, and, for a while, the topic of the discussion was banning cell phone use while driving. Should we or shouldn't we?

I missed the beginning of the conversation, but I assume it was in response to the National Transportation Safety Board's proposal this week to ban the use of cell phones and text messaging devices while driving.

Now, before I go any further with this, I guess I should say that there are times when I feel like a refugee from another time.

Not to say that I am old — not yet (although there are times when I feel that I need to be wearing a shirt like the one my mother had — it said, "Hill? What hill? I didn't see any hill!") — but there are definitely times when I feel that technology has gone galloping past me.

Time, I have discovered, doesn't merely fly. It sprints. You younger folks will understand that one day.

Anyway, that's how I feel about cell phones.

As I have written here before, I taught journalism on the college level in the mid–1990s. I left the classroom for several years, but I gravitated back to it last year, taking a job as an adjunct journalism professor in the local community college system.

When I did, I quickly discovered how many things had changed in the intervening years. In the '90s, for example, none of my students had cell phones. Today, they all do. It was essential to implement rules about their use in class to maintain order — and get anything done.

It's a battle I'm still fighting.

On a personal level, I resisted cell phones for many years, and I had pretty good reasons. I'm not married, and I have no children. It was an additional expense, and, in the event of an emergency on the road, I figured (at first) that I could always use a pay phone.

Well, I'm still not married, and I still have no children. Cell phones are still an additional expense, but pay phones have just about disappeared. I finally decided it might be worth the expense to be sure I would have one if something happened — but I only use it when it is absolutely necessary.

See, I've learned that anything can happen — and it can happen all by itself. It doesn't need anyone's assistance.

And I have been wary of cell phones because I have long believed that they were likely to contribute to the accident rate — which certainly doesn't need any help.

When my parents taught me to drive, the thing they emphasized, more than any other, was to keep my eyes on the road. If your attention is distracted, they told me, even for a second, it can have tragic consequences, and one must be ever vigilant — because anything can be a distraction.

A distraction can be a very modest, very momentary thing, like the sound of a dog barking or a sudden movement one catches from the corner of one's eye. But cell phone conversations can go on indefinitely, and the distraction from the task at hand can be far from modest.

The introduction of texting into the mix just raised the risk level, as far as I was concerned. It certainly raised my awareness of the risks.

Perhaps it was due, in part, to the fact that I went without a cell phone for so long, but there were certain things about them that I just never considered — and, to be fair, there were other things that just weren't factors until recently.

Like texting.

And, perhaps because my cell phone is so basic, so ordinary, I'm not entirely acclimated to a world in which the internet is at your fingertips, wherever you are. When I was in graduate school, there was no internet (well, no real commercial internet). A few years later, that was a reality. It was a new frontier, but you could only explore it from your desk at home or at work.

Then, along came laptops, and you weren't tied to a physical location anymore. But laptops are still too big and bulky for some people, so access to all of it has been condensed into the "smart phone," a gadget that fits in the palm of your hand.

(Oh, what we could have done with those when I was a general assignment reporter fresh out of college!)

The speed of technological advancements has made so many things possible that my poor mind never imagined most of them — and still needs time to absorb it all.

That point was made clear to me when I heard the listeners' calls.

One observed that he frequently uses the GPS app on his cell phone when he is driving in an unfamiliar area. The cell phone is equipped to "speak" to him so it isn't necessary for him to look at the cell phone, as he would if texting. And his car is equipped for hands–free operation of the cell phone so it really is no different than speaking to a human occupant of the vehicle.

He travels a lot, he said, but he rarely has a traditional conversation on his cell phone — and almost never does so when he is behind the wheel. But, when he is using this GPS feature, which he often does because his work requires him to spend a lot of time in unfamiliar territory, "I'm still talking on my phone," he pointed out, "so, technically, I would be in violation of the law."

True — but not necessarily its spirit.

The law is intended to discourage people from talking on the phone while they're driving — which is certainly a noble objective — although, in a culture in which people can be seen trying to eat cereal, apply makeup, even get dressed behind the wheel during the daily morning rush hour, one can be forgiven for wondering if such legislation goes far enough.

Before the discussion ended, a veteran police officer came on the line. Now, most policemen with whom I have spoken about this agree that cell phone use should be curtailed while driving; they just disagree on how the law should address it.

But this particular officer wasn't too concerned about the use of cell phones behind the wheel. It's just another distraction, he said, no worse than having a conversation with someone else in the vehicle — and he went on to point out that he had many electronic distractions in his police car.

It's all a matter of being mature enough to handle it, he said.

Cell phones don't kill people.

Wednesday, December 14, 2011

A Memory and a Milestone

My parents posed with me after I received
my master's degree from North Texas in 1991.

Today is a milestone for me.

It was on this day 20 years ago that I received my master's degree in journalism from the University of North Texas. That was a proud moment in my life.

Sometimes I must admit that it all seems like a dream. Maybe that is a by–product of the passage of time. The farther removed I am from an experience, the more it seems like another lifetime — and, in a way, it is.

I will always remember that day. It was a moment of real triumph after what had been maybe the most challenging year of my life — at least, to that point.

It was special, too, because I was able to share it with my parents. For reasons that I would rather not discuss in great detail, I didn't participate in graduation exercises when I received my B.A. It's a long story, but it boils down to some administrative snafus stemming from the fact that I transferred to the University of Arkansas midway through my sophomore year.

I eventually got my degree, but it was issued three months after I completed my degree work, by which time I had relocated roughly 150 miles away, where I was working as a general assignment reporter. I didn't participate in my graduation ceremony — the U of A mailed my degree to me — and my parents didn't get to see me walk across the stage to accept my B.A.

It was, to put it mildly, anticlimactic to open an envelope and take out my degree. Boom! You're a college graduate. I always imagined hearing my name called out and walking across a stage to accept my degree. Never dreamed it would be like that. It was no more special than opening the monthly telephone bill.

But I was able to share this day with my parents 20 years ago — and, for that, I will always be thankful. My mother has been gone for more than 16 years, but she saw me walk across that stage. It was the fulfillment of her dream for me as well as my own.

And it is a memory that means everything to me now.

She and my father posed with me after the ceremony was over. You can see the picture at the top of this post. That is an irreplaceable souvenir for me.

It was kind of a typical December day in Texas, as I recall — a little chilly, overcast, a bit windy. It was the kind of day that reminds you that Christmas is coming, which, in turn, reminds me of a story.

For those of you who don't know it, when a person receives his/her master's degree, the ceremony is usually called a "hooding."

I know that may sound like some kind of Ku Klux Klan ritual, but it isn't.

If you look closely at the picture, you may notice a splash of red around my neck. That is neither a cape nor a muffler nor a scarf of some kind. (It isn't blood, either, although I often felt, as I pursued my master's degree, that I was shedding plenty of blood.)

It is a hood, the academic dress of one who has earned a post–graduate degree. The hood is in the color of your academic major. I don't know if that varies from one American school to the next, but, when I got my master's at the University of North Texas, the color for journalism majors was red.

The traditions of academic regalia originated in the medieval universities of Europe, and the colors may vary from school to school.

Consequently, I don't know if red is the color for journalism master's or doctoral students at other schools, or if that is just the designation at North Texas.

My understanding, in fact, is that not every school has a hooding ceremony for its master's and doctoral candidates; the ones that do seem to follow their own rules.

Thus, it seems safe to conclude that journalism majors at other schools with hooding ceremonies may wear different colors.

In at least one country, red is the color for those receiving post–graduate law degrees. I definitely wasn't a law student, but I did have to study communications law (and that really does have more credibility than spending the night at a Holiday Inn Express).

Hoods also tend to have the primary color of the school where the degree was earned, and, at North Texas, that color is green. That meant that my hood was red and green — Christmas colors.

My mother pointed that out to me when I picked up my graduation cap, gown and hood shortly before the ceremony. That appealed to her sense of order, I guess. She loved the Christmas season, and she was pleased that my graduation came during it (even though that happened only because a close friend of mine was diagnosed with lymphoma that spring, and I put off finishing my degree work until after his death that summer).

I'm teaching journalism as an adjunct at the local community college these days. I haven't worn my gown and hood in years, but they hang in my closet, and I see them from time to time. On those occasions, I am reminded of that period in my life, of that accomplishment for which I worked so long and so hard.

Of course, I can be reminded of that at any time. My master's degree is on a shelf in my apartment, and I see it every day.

But seeing the gown and hood that I wore on that day is different.

It's like a tangible link to my past.

Of course, the degree itself is, too, I guess — but not really.

The piece of paper that I was given when I walked across the stage on that December Saturday afternoon in 1991 was kind of an academic I.O.U., a promissory note. My degree would be mailed to me, it said.

It was like one of those dummy hand grenades that soldiers use in basic training — the ones that look official on the outside but are totally ineffective.

My memory is that I received my actual degree — the one that sits on my shelf today — a few weeks later. There was no extended wait for it, but that wasn't what I was holding in the picture you see attached to this post.

That was the dummy, the prop for pictures such as the one for which I posed with my parents.

The actual souvenirs that I have from that day are the graduation program (which I have somewhere although I can't put my hands on it right away), my gown and hood and the photo you see with this post.

And the memories they evoke.

Friday, December 9, 2011

Georgia On My Mind

I have this friend who lives in Atlanta. I would describe him as a devoted supporter of Barack Obama.

He says he has been disappointed and frustrated with Obama at times, but it often seems to me that he finds ways to justify or excuse those policies that he says have been disappointing and frustrating. This also leads, at times, to overly optimistic electoral expectations.

At one time, we were living parallel lives. We were pursuing our master's degrees in journalism at the University of North Texas, we were working full time at the same newspaper, and we were working part time as graduate assistants in UNT's editing lab.

Frequently, we were enrolled in the same classes. I used to tease him that I saw more of him than his wife or children did.

We got to know each other pretty well, and we found that we had a lot in common. We both considered ourselves Democrats, and we shared much the same world view.

Anyway, that friend and I went our separate ways eventually. He got his degree, and I got mine. He went on to get his doctorate at another school. I got a job teaching journalism. We had our different life experiences, as friends do.

To an extent, we've moved in different directions. He still considers himself a Democrat; I consider myself an independent. I guess his philosophy hasn't changed much; perhaps mine has, although I don't think of it that way.

But even if it is true, I don't look at it as a bad thing — more like what Joni Mitchell described in "Both Sides Now."
"But now old friends are acting strange,
They shake their heads,
They say I've changed.
Something's lost
But something's gained
In living every day."

Life has taken my friend to Atlanta, as I say — where, I presumed, he would obtain unique insights into the voting behavior of people in Georgia.

Maybe he has, but I'm inclined to think they are colored by his personal political perceptions, not necessarily by reality.

In 2008, he told me that Obama would win Georgia for two reasons — the black population of Georgia (roughly 30% of the total) would vote heavily for him (which it did, I suppose) and the presence of Libertarian — and Georgia native — Bob Barr on the ballot.

Barr, he said, would siphon off enough votes from John McCain to hand the state to the Democrats. He didn't.

More than 3.9 million people voted in Georgia in November 2008. About 28,000 of them voted for Barr.

That didn't really surprise me. Georgia has never struck me as being unusually susceptible to quixotic third–party candidacies.

When such a third–party candidate has caught fire elsewhere, in the region or the country at large — e.g., Ross Perot in '92 or George Wallace in '68 — Georgia has jumped right in there.

But, otherwise, third–party candidates have been non–factors in Georgia. Maybe the concept of a two–party system is too deeply ingrained in Georgians.

As someone who has lived in the South all my life, I think that is true of the South in general, and the percentages from the last election in which a third–party candidate played a prominent role — 1992 — support that.

According to "The Almanac of American Politics 1994," states in the South Atlantic region of the country (Florida, Georgia, Virginia and the Carolinas) gave a much smaller share of their vote to Perot (16%) than almost any other region. The states in the Mississippi Valley — Alabama, Arkansas, Kentucky, Louisiana, Mississippi and Tennessee — gave the smallest (11%).

In other words, even in a year in which the third–party candidate was bringing millions of previously politically inactive voters into the process, the South resisted the temptation to abandon the two–party arrangement.

The authors of the 1994 "Almanac," Michael Barone and Grant Ujifusa, used the numbers from the 1992 election to make the case for their observation of the "phenomenon" of straight–ticket voting that year. And I suppose it was a compelling argument for those who sought to explain what had happened that year.

Their analysis always struck me as being somewhat short–sighted, focused as it was on a single election.

See, I never really bought the idea that it was an isolated phenomenon. I have long believed that straight–ticket voting is a reality of American politics, particularly Southern politics. It was true in 1992. I believe it will be true in 2012 — and that the numbers from 2010 and recent presidential elections clearly suggest that the Democrats will lose every Southern state next year.

I know it was always a reality in Arkansas — but that was due, in large part, to the fact that there was really only one political party in Arkansas when I was growing up. The Democrats had a near monopoly on political power in Arkansas — and most of the South — in those days.

But that was really a different Democratic Party. As I have noted before, the politicians who led the Democratic Party in those days probably had much more in common philosophically with today's Republicans.

Eventually, in fact, many of them switched their party affiliations, but it took some time. The Southern Democrats of a generation or two back were trained at their mothers' knees to be wary of Republicans.

Republicans were damn yankees, and so the transition was a long time coming and was achieved incrementally. Southerners were voting for Republicans for president long before they started voting for Republicans for state and local offices.

The GOP, they were told, had inflicted Reconstruction on the South after the Civil War — and had been responsible for the poverty and misery that afflicted most who lived there, white and black, ever since. It was an article of faith, and so, with the exceptions of a few isolated pockets, most places in the South were run by Democrats for decades.

Many people mistakenly believe the South began moving away from the Democrats to the Republicans in 1980, when Reagan conservatives joined forces with Jerry Falwell and the Moral Majority, but, in hindsight, that was really more symbolic of the completion of the shift than its beginning. It was in 1980 that the Moral Majority served as the bridge for the last holdouts, the Christian evangelicals, who seemed, prior to that time, to exist outside politics — at least as an interest group or voting bloc.

The real breaking point came in the 1960s, in the midst of the civil rights conflict, campus unrest and general social upheaval. Even Lyndon Johnson, the architect of the Great Society, acknowledged that his greatest legislative triumphs, the ones that guaranteed voting rights and civil rights to all Americans, likely had handed the South to the opposition for a generation or more.

His words have truly been prophetic. Of the 11 elections that have been held since Johnson won by historic proportions in 1964, the Democratic nominee has lost every Southern state in six of them — and has only come close to sweeping the region once (in 1976) even though the party has nominated Southerners for president five times.

Most Southern states have voted for the Republican nominee for president even in years when Republicans were struggling elsewhere ... even in years when native Southerners were on the Democrats' national ticket.

I have always had mixed feelings about the fierce loyalty of Southerners. I have often felt it was more a point of pride, of not wanting to admit when one has been wrong, than a point of principle.

When Southerners give their hearts to someone, it is usually for life. Likewise, when the South gives its allegiance to a person or a political party, it is a long–term commitment — in spite of the behavior of some philandering politicians.

Giving up on a relationship — be it social or political — is a last resort for most Southerners. It is what you do when all else has failed.

(Regarding the dissolution of social/legal relationships, I have always suspected that attitude has more to do with the regional stigma about divorce that still persists, to an extent, today and the reluctance of many Southerners to legally admit a mistake was made than any theological concerns about promises made to a higher power.)

That's probably the main reason why it was so surprising when Obama won the states of Virginia and North Carolina in 2008. Virginia hadn't voted for a Democrat since LBJ's day. North Carolina voted for Jimmy Carter in 1976 but had been in the Republican column ever since.

For those states to vote for a Democrat after regularly voting for Republicans for years was an admission that could not have been easy for many of the voters in those states to make.

Numerically, it seems to have come a little easier to Virginians, who supported the Obama–Biden ticket by nearly 250,000 votes out of more than 3.7 million cast. North Carolinians, on the other hand, barely voted for Obama, giving him a winning margin of less than 15,000 votes out of 4.3 million.

I'm not really sure what this means for 2012. I mean, the 2008 results can't be explained strictly in racial terms, can they? The white share of the population is about the same in both states (64.8% in Virginia, 65.3% in North Carolina), and the black populations are comparable as well (19.0% in Virginia, 21.2% in North Carolina).

If anything, one would expect that a higher black population (along with half a million more participants) would produce a higher margin for Obama in North Carolina than Virginia — but the opposite was true.

What can be said with certainty is that both states voted Republican — heavily — in the 2010 congressional midterms.
  • North Carolina re–elected Republican Sen. Richard Burr with 55% of the vote. That's pretty high for North Carolina. Statewide races frequently are much closer.

    North Carolina Republicans also captured a House seat from the Democrats.

  • Virginia elected Republican Gov. Bob McDonnell in the off–year election of 2009, providing perhaps the first glimpse of what was to come.

    Neither of the state's senators was on the ballot in 2010, but Democratic Sen. Jim Webb, who defeated George Allen in the 2006 midterm election, announced earlier this year that he would not seek a second term. Ostensibly, his reason is that he wants to return to the private sector, but I can't help wondering if he has concluded that he caught lightning in a bottle six years ago and cannot duplicate the feat in 2012.

    Virginia Republicans grabbed three House seats from Democrats in 2010.

It was less surprising that Florida voted for Obama in 2008.

That's understandable. For quite a while, Florida has been a melting pot for retirees from all over the nation, so its politics tends to be quite different from just about any other Southern state's. Until the advent of air conditioning, Florida was mostly a backwater kind of place with a population to match, but in recent decades, the only thing that has truly been Southern about Florida is its geographic location.

In many ways, its diverse population bears watching as an election year unfolds. It may be the closest thing to a political barometer, a cross–section of the American public, that one is likely to find.

The scene of an excruciating recount in 2000, Florida has now been on the winning side in 11 of the past 12 elections — and conditions in 2008 were probably more favorable for the out–of–power party than at any other time that I can remember.

More than perhaps any other state in the region, Florida's vote seems likely to be influenced by prevailing conditions in November 2012. Obama won the state with 51% of the vote in 2008, but, again, few solid conclusions can be reached based on the racial composition of the electorate. Whites represent a smaller share of the population in Florida (about 58%) than in Virginia or North Carolina.

But the black vote in Florida is also smaller (around 15%).

In fact, half again as many Floridians are Hispanic (more than 22%), and, while those voters will be affected by economic conditions like anyone else, they may also be sensitive to immigration issues and particularly responsive to proposed solutions to those problems.

There may well be compelling reasons for Hispanic voters to feel either encouraged or discouraged by U.S. immigration policy under Obama.

What can be said of Florida in 2010 is that its voters made a right turn.

Republicans seized four House seats from Democrats, elected one of the original tea partiers to the U.S. Senate and replaced an outgoing Republican governor with another Republican governor.

There has been persistent talk, in fact, that the senator — Marco Rubio — will be the GOP running mate, no matter who the presidential nominee turns out to be.

And if that turns out to be true, the party really will be over in Florida ...

... and elsewhere in the South.

Wednesday, December 7, 2011

War and Peace

We'll be hearing a lot today about war and peace.

Mostly war, I suppose, and that is understandable. Today is, after all, the 70th anniversary of the attack on Pearl Harbor — the event that literally pushed the United States into World War II although one could argue that it had been getting more and more involved in the conflict in the months leading up to the attack.

It is an event that still resonates with people of my parents' generation. They were children when the attack occurred, and, although my mother has been gone for many years now, I remember her telling me of the peaceful Sunday afternoon that suddenly changed when the news came across the radio that Pearl Harbor had been the victim of a sneak attack.

It is hard for me to imagine anyone going through the American education system and not hearing a recording of FDR's famous speech to Congress, when he said that Dec. 7, 1941 was a "date which will live in infamy."

That date has certainly lived on in people's memories.

For me, today brings back memories of 20 years ago when I was working for a small daily newspaper, and I participated in the production of a special section commemorating the 50th anniversary of that attack. For weeks, we solicited 1940s–era photos of local residents, both living and dead, who served their country — and we published articles about many of them and their experiences.

That project coincided with my graduation from graduate school. A week later, I was going to receive my master's degree. There were many things demanding my time and attention.

It was a grueling period in my journalism career, to be sure. I had no idea there were so many WWII veterans in the county where I was living — until we took on that project.

Most of them were living then. Far fewer are apt to be living today — and it does make me wonder when we will stop observing Pearl Harbor Day in the kind of semi–official way that we have in recent years. It seems we are moving in that direction with the attrition of people who still remember that day.

That happens with some of history's significant dates. So much time goes by, and the people who remember the event pass away, and we are left with holidays and/or anniversaries whose origins we must be reminded of.

Take Veterans Day. It used to be called Armistice Day, which observed the anniversary of the end of World War I.

Hostilities in that conflict ended in 1918, more than 90 years ago. The last time I recall anyone mentioning that event was when I studied history in high school — and my memory is that my history teacher really didn't spend much time on it.

To be sure, the outcome of World War I wasn't very popular in Germany, which paid a heavy price — and that could be said to have played a role in the eventual rise to power of the Nazis in the 1930s. Kinda depends on one's interpretations of things.

Chronologically, though, it is beyond dispute that anyone who was old enough to serve in that war would have to be around 110 years old today. There are a few of those left in the entire world, but not many. Armistice Day long ago lost its meaning as the World War I generation dwindled — so today it is known by the more generic designation of Veterans Day.

Which is not to be confused with Memorial Day. That is a completely different holiday in a completely different time of year — but it does have a similar history.

It started out as Decoration Day, a day for honoring those who died during the Civil War. I don't think there is a particular anniversary connected with it; the graves of Confederate soldiers were decorated in several Southern cities during the war and the practice simply continued after it ended.

Obviously, no one who was alive in the mid–19th century is still living — so there is no one for whom Decoration Day has any meaning. We continue to observe it, though, under the more generic name of Memorial Day.

The purpose evolved to include remembering those who fought in all wars, not just the Civil War, and in recent years it has expanded to include memories of anyone who is no longer living, even if that person didn't serve in the military.

George Carlin used to point out that sports like football that tend to emulate war are played in facilities that use such generic names as War Memorial Stadium or Soldier Field. It is part of the competitive nature of sports, I suppose, that the places where these games are played should bear names that conjure violent images — even though a sport will never be as violent as war.

But not everything that happened on Dec. 7 has been violent.

Sometimes there has been peace and hope.

On this day in 1972, for example, Apollo 17, the last manned mission to the moon, was launched. As they left the earth and began making their way to the moon, the crew looked back and took a picture of the earth that is known today as the "Blue Marble."

Seen from that vantage point, the blue marble looks so peaceful, just floating along in the black velvet of space. One would never guess that so much turmoil exists on the surface of that marble, that there is savagery loose upon the land capable of causing great pain to millions without the slightest hint of remorse.

Yet the image of the blue marble sparks in many of us that wish for peace on earth and good will to men.

Not a bad thought to keep in mind during the Christmas season.

Tuesday, December 6, 2011

Quayle's Endorsement

If former Vice President Dan Quayle is smart — and I believe the ship sailed on that one quite a while ago — he will avoid taking sides in the 2012 Republican presidential race.

Publicly, anyway.

Privately, of course, he can do as he pleases — like anyone else.

But Quayle apparently is going to publicly announce his endorsement of Mitt Romney for the presidency today in Arizona.

And that could really open a Pandora's box.

Quayle, who was born in Indiana, grew up in Arizona, then returned to Indiana where he graduated from high school and worked for the family newspaper and practiced law before embarking on a political career that took him to the U.S. House and U.S. Senate before his four–year term as George H.W. Bush's vice president.

I guess Quayle had a pretty good image in Indiana when Bush picked him to be his running mate. He never got less than 54% of the vote, including the time he unseated incumbent Sen. Birch Bayh in 1980, and, for most people outside Indiana, I guess, the sight of him at the 1988 Republican convention was their first real exposure to him.

The choice was controversial from the start.

Quayle didn't help matters — either during the campaign, when Lloyd Bentsen memorably told him he was "no Jack Kennedy," or after the election and subsequent inauguration, when he told American Samoans they were "happy campers" or when he supposedly said he regretted not having studied Latin in school so he could converse with a group of Latin Americans.

That latter item, incidentally, is said to have started as a joke about Quayle that took on a life of its own. Some of Quayle's defenders clearly have indulged in some revisionist history — it's hard to deny the statements that live on in video and audio tape — but others are correct when they suggest that many of Quayle's alleged malapropisms started as jokes that appeared credible because he really did utter so many others.

Whoever the GOP's eventual running mate turns out to be, he or she should study the Bush–Quayle 1988 campaign for tips on what not to do — and how to handle the inevitable setbacks and misstatements. It's all part of living under the microscope.

It's a pity that Sarah Palin — or her handlers — didn't try to apply any of the lessons that should have been learned from the Quayle experience.

Quayle, it seems, is still learning. He's been away from the vice presidency for nearly 20 years now, but people still remember things he said — even if he doesn't.

Shira Schoenberg of the Boston Globe observed that "Quayle is known for his rhetorical blunders — once, spelling potato with an 'e' on the end."

As long as Quayle doesn't jump headlong into the campaign and draw more attention to himself, as long as he makes his endorsement and then retreats into private life, there probably won't be too many more reminders of the weird old days — when Quayle said things that were actually attributable, like when he mangled the United Negro College Fund's slogan by saying "what a waste it is to lose one's mind, or not to have a mind is being very wasteful."

So my advice to Quayle would be this:

Express your opinion. Make an endorsement. Put a Romney sticker on your car.

Then shut up.

We already have enough of your misstatements to write a book.

Come to think of it, several people already have. No sense in providing ample material for a sequel.

Monday, November 28, 2011

Hoosier Buddy?

In the ongoing countdown to next year's election, there are 344 days to go until the votes are counted on Nov. 6, 2012.

That's 49 weeks from tomorrow.

Just think of all the things that will be determined — one way or another — between now and that night 49 weeks from tomorrow night.

In spite of that, though, there are a few things that can be taken for granted.

It is generally assumed, for example, that Barack Obama will receive his party's nomination. No challenger has emerged; in fact, no Democrat, prominent or otherwise, is even said to be considering a challenge.

Diehard Democrats have been saying for months that the absence of competition for the nomination is a good sign. Jimmy Carter was challenged for his party's nomination in 1980, they have pointed out, and went on to lose the general election. Bill Clinton, on the other hand, was not seriously challenged for his party's nomination in 1996 — and easily won a second term.

No challenger means Obama doesn't have to spend campaign resources on his pursuit of the nomination. He can hold the funds for the fall campaign, when he can concentrate on winning the battleground states and the states that he carried last time that Democrats rarely win — and he can start slinging mud, as an incumbent with an unemployment rate as high as the one in America today must (and, inevitably, will) do, at whoever is leading in the polls this week.

That wasn't Lyndon Johnson's problem. LBJ's nomination was never in doubt, but he did have some modest opposition. Primaries didn't play the pivotal role in the nominating process in 1964 that they do today, but there were a few, and Alabama Gov. George Wallace challenged Johnson — and did astonishingly well — in some primaries in Northern states.

In those primaries, historian Theodore H. White wrote in "The Making of the President 1964," Wallace sought "to test whether racism could magnetize votes in the North as well as the South."

In Indiana, Wisconsin and Maryland, Wallace got his answer.
"Wallace astounded political observers not so much by the percentage of votes he could draw for simple bigotry (34 percent of the Democratic vote in Wisconsin, 30 percent in Indiana, 43 percent in Maryland) as by the groups from whom he drew his votes. For he demonstrated pragmatically and for the first time the fear that white working–class Americans have of Negroes. ... in the mill town of Gary, Indiana, he actually carried every white precinct in the city among Democratic voters ..."

Theodore H. White
The Making of the President 1964

Barring the most wildly improbable of developments, Obama will be the Democrats' standard bearer in 2012. No suspense there.

But the identity of Obama's opponent remains a mystery, and no one knows what the economy will be like when people go to the polls next fall.

So there is some suspense as America prepares for the start of the primary/caucus season.

The conventional wisdom is that people make up their minds about a presidency, not necessarily a president, about six months before an election. And, while today's Democrats would like to think that people will make their voting decision based on whether they like Obama on a personal level, the fact is that liking an incumbent and approving of the job he has done are two entirely different things.

It does help if voters like the president, and survey after survey shows that Americans tend to like Obama personally. But those same surveys show that most Americans think the country is going in the wrong direction.

That can be decisive in places where the outcome is in doubt — in the modern–day battleground states, where many voters may feel torn between the fact that they like Obama but don't like where they think the country is headed.

For many reasons, I feel safe in predicting that the Republican nominee — whoever that turns out to be — will win Indiana next year.

Indiana was an unexpected bonus for Democrats on Election Day 2008. The state votes for a Democrat about once in a generation — if that. Obama's victory there was the first for a Democratic presidential nominee in 44 years.

If Johnson hadn't carried Indiana in 1964, Obama would have been the first Democrat in his lifetime to carry the state.

LBJ was the only Democrat to carry Indiana in the lifetime of Obama's mother. She was born in 1942, and the last Democrat to carry Indiana before Johnson was Franklin D. Roosevelt in 1936.

Indiana voted for FDR in 1932, too. It took something as big as the Great Depression to get Indiana to vote Democratic in consecutive elections. Before the 1930s, the last time Indiana voted Democratic in consecutive elections was in the years just before the outbreak of the Civil War — in the middle of the 19th century.

Indiana did vote Democratic four times in the 20th century. In addition to LBJ's 1964 landslide and FDR's landslides of 1932 and 1936, Woodrow Wilson won the state in 1912 — when Republicans were divided between President William Howard Taft and former President Theodore Roosevelt.

If the Republicans had been united that year behind either Taft or Roosevelt, their combined vote would have exceeded Wilson's by nearly 35,000 out of more than 650,000 cast — a narrow margin, sure, but more substantial than the margin in Indiana for the Republican who ran against Wilson when he sought re–election four years later.

When he wrote about Johnson's landslide nearly 50 years ago, White also wrote about patterns he detected in the election returns, including the "ripples and bubbles of protest" spawned by the civil rights movement and the general racial unrest across the nation.

Such "ripples and bubbles," White wrote, were so hard to spot that one was forced to "pore over charts to find them." But he did observe evidence that the Democrats, as LBJ himself would say the following year, were handing the South to the Republicans for half a century.

The South, White wrote, showed "significant" declines in Democratic support, and those declines clearly continued in the 1970s, 1980s and 1990s, through good years and bad years for both parties.

I guess it wasn't hard to identify that trend in the South in 1964. Five states in the Deep South voted Republican — some heavily — and the ones that remained in the Democratic column, as they had for generations, did so by much narrower margins than ever before, even when popular Republicans like Teddy Roosevelt and Dwight Eisenhower were on the ballot.

Almost no other states, even traditionally Republican ones, voted against Johnson in 1964. Nevertheless, White identified some ethnic "ripples and bubbles" in some northern states like Indiana — "Polish working–class wards" where the Republicans "managed to shave the Democratic percentages" in spite of the fact that it was an overwhelmingly Democratic year.

White acknowledged that he could not determine "whether this was an echo of backlash" or "ethnic identification" with the Republican running mate's Polish–American wife.

But the next 10 presidential elections suggested that Indiana's support for the Democrat in 1964 was an aberration, not the start of political realignment there.

And there is no reason to believe that Obama's victory there in 2008 was a realignment, either. His coattails weren't just short in Indiana, they were nonexistent. While Obama was winning a squeaker (50% to 49%) against John McCain with the help of young and minority voters in the cities, the Republican governor was being re–elected with 58% of the vote.

In 2010, Republican Dan Coats, who spent a decade in the U.S. Senate previously, was elected the state's junior senator with 55% of the vote. Six of the state's nine House districts elected Republicans, most of them with more than 60% of the vote.

Indiana's roots are planted deep in Republican soil, and its support for the Democrat in 2008 was an aberration. Any state–by–state prediction for 2012 that suggests that Obama will retain Indiana can be dismissed as unreliable.

On the evening of Nov. 6, 2012, Indiana is likely to be one of the first states projected for the Republican nominee.

You can take it to the bank.

Thursday, November 24, 2011

A Cold Case Turns 40

It was 40 years ago today that a man known to history primarily as D.B. Cooper hijacked a Northwest Orient Airlines 727, demanded $200,000 and parachuted from the plane into legend somewhere between Portland, Ore., and Seattle.

The conventional wisdom for these last four decades has been that Cooper (who actually purchased his ticket under the apparent alias of Dan Cooper, but, because of miscommunication, is remembered almost exclusively as D.B. Cooper) couldn't have survived the jump, given the terrain and the weather at the time — and the fact that he was wearing an ordinary business suit that offered little protection against the subzero temperatures.

But, if he did not survive, no sign of his remains has ever been found, and neither has any sign of the money he jumped with — except for a few thousand dollars found in 1980 that are said to have been part of the ransom that was paid to Cooper.

The balance — nearly $195,000 — remains unaccounted for.

So, 40 years later, Cooper still commands the attention of the FBI, which has maintained an active investigation and continues to follow up on leads, however remote they may seem. Special Agent Larry Carr has been heading a citizens' research unit for nearly five years; that unit recently caused a bit of a stir when it was revealed that traces of pure titanium, aluminum, stainless steel and bismuth had been found on the necktie Cooper left on the airplane.

There was also a claim made by a woman that Cooper was her uncle.

As Gar Swaffar of Digital Journal writes, those traces did provide some clues — not about where Cooper was when he leaped into popular lore on that cold, stormy night 40 years ago but where he came from.

"The primary use of pure titanium at the time was in the chemical industry," notes Swaffar, "and the other place it would be found was in the facility producing the titanium."

Swaffar doesn't really talk about bismuth, which may be the least familiar to most people. It has recently been found to be slightly radioactive, but that would not have been known to the people of 1971 — so, while the introduction of radioactivity into the conversation may invite all sorts of sinister thoughts, one must remember to focus on how bismuth was used in the early 1970s if one expects it to serve as a legitimate clue to Cooper's origin.

Its presence on anything in 1971 suggests to me a possible link to cosmetics and some over–the–counter medicines like Pepto–Bismol (which contains a bismuth compound).

Anyway, the examination of that trace evidence appears to have yielded nothing that could help close the book on the story of D.B. Cooper — and the woman's claim to be the niece of a man her family always called "L.D." appears to have been discredited as well.

Today, 40 years after his daring jump, D.B. Cooper's fate is still as mysterious as it was in 1971. Did he survive the jump? If he did, did he get away with the rest of the money? And, if he did not, what happened to the money? And what happened to his remains?

The world may never know.

Friday, November 18, 2011

The LBJ Factor

I've watched the rapid descent of the popular image of Barack Obama since he took the oath of office.

And I've been intrigued by his apparent public evolution — from the early days of his presidency, when he was widely seen as the reincarnation of Lincoln, Washington and/or FDR, to the recent comparisons between the president and (at best) Bill Clinton following his party's disastrous losses in the 1994 midterms or (at worst) Jimmy Carter's one–term administration.

No one knows with absolute certainty what will happen between now and next November; consequently, no one knows if the voters will deny a second term to Obama or if they will re–elect him with a rousing vote of confidence.

Thus, all these comparisons — while each has certain valid points — are based largely on self–serving speculation.

Republicans would like everyone to believe we are witnessing Carter Redux — because that would mean we are on the brink of the ascendance of another Reagan.

Democrats would like everyone to believe that, in spite of criticism of Obama, we are witnessing a reprise of the Clinton years — and, in 2012, will see a reinvigorated president win a second term by approximately the same margin in the Electoral College that elected him the first time.

Time will tell if either scenario is correct — or if an entirely new paradigm is being written.

My money is on the latter — because, while history truly does repeat itself, it never seems to do things exactly the way it did before. Times change.

In other words, we might be witnessing a Republican resurgence similar to the one that overwhelmed Carter and the Democrats in 1980 — but it might not necessarily produce another Ronald Reagan.

And, even if it did, the times are different. This isn't like a TV rerun (outside of syndication, do they still do that anymore?) — or even a remake. The people would be different. The circumstances — and, hence, the decisions they must make — would be different.

Perhaps the differences would be subtle. Perhaps they wouldn't be so subtle.

Likewise, we could be witnessing another rebound of an embattled Democratic president whose party suffered massive midterm losses (or perhaps, as some Democrats have been suggesting with fondness, another "Dewey Defeats Truman" election in which the incumbent scores a completely unexpected victory), but it doesn't necessarily mean that Obama's second term would be more successful than his first.

(Actually, second terms often seem to be worse than the first. It's worth remembering that Truman's popularity really began to sink irreversibly after his inauguration in 1949, and, while Clinton was re–elected two years after the GOP seized both chambers of Congress, his second term was largely mired in his impeachment defense.)

At the moment, if I am inclined to compare Obama to anyone, it is Lyndon Johnson. I see several similarities/parallels between the two presidents.

At what may be the most basic level, Johnson was the last Democrat to carry states like Virginia and Indiana — until Obama in 2008 (Carter was the only other Democrat to win North Carolina; Carter and Clinton were the only other Democrats to carry Florida).

Both Johnson and Obama won landslide victories in the Electoral College — but so did Clinton (twice). He just didn't receive a majority of the popular vote.

Voting patterns, like poll results, are not infallible indicators of what to expect — but they do provide a certain amount of guidance in the right direction.

It is not in the voting patterns, though, that I see the most striking similarities between Obama and LBJ. It's in their priorities as president — and the public's response, via its approval ratings.

One could say many things about LBJ — and, without a doubt, most, if not all, of them, the good and the bad, were true — but one thing that is absolutely undeniable is that he was a great admirer of Franklin D. Roosevelt.

And LBJ wanted to leave his mark on domestic policy — as he believed FDR had. He wanted to exceed what his idol had accomplished.

Oh, sure, there were contemporaries of Roosevelt who would tell you that his skill in foreign affairs was evident in his handling of American participation in World War II — both before and after America officially entered the conflict.

But you can still see his hand behind many of the programs and policies that were created to battle the Great Depression of the 1930s — and still exist today.

LBJ was raised in poverty. Such conditions strongly (and, often, adversely) affect how a man approaches the issues and relationships in his later life, and LBJ earnestly wanted to eliminate poverty. He appreciated FDR's courage in the face of a savage economy, which he witnessed as a young man in Texas and then as a member of the House. "[Roosevelt] was the one person I ever knew, anywhere, who was never afraid," he said after FDR's death in 1945.

When LBJ became president, his #1 goal was to expand on FDR's New Deal with his "Great Society."

And when he won a full term on his own — with a share of the popular vote that surpassed anything that Roosevelt ever received — it was seen by many as an endorsement of his domestic agenda, whether it really was or not.

That part probably was irrelevant because the times forced the American people — as the times so often do — to re–focus their attention.

In the 1960s, that meant Vietnam.

It may have been Johnson's misfortune to become president right when fate made foreign affairs the topic that was increasingly of the most concern to Americans, but presidents don't get to choose what kind of world exists when they are in office.

When LBJ won by a landslide in 1964, Vietnam was still a faraway land that most Americans knew nothing about. The campaign did not focus on foreign policy, but, before long, Americans were dying at a terrifying pace in the jungles of Southeast Asia, and the Johnson administration seemed powerless to do anything about it.

Much may stem from the fact that Johnson's military experience was so limited. Born in 1908, he was too young to serve in World War I, and his early life was influenced by poverty, not the battlefield.

When America entered World War II, Johnson was in Congress. He became a commissioned officer in the Naval Reserve and asked for a combat assignment but was sent to inspect stateside shipyards instead. The closest he came to actual combat was his short–term assignment to a three–man observation team that was sent to look into conditions in the Southwest Pacific.

Two years after Johnson won more than 60% of the popular vote and about 90% of the electoral vote, his Democrats suffered a severe setback in the midterm election, losing 47 House seats. The war had been escalating and, in spite of his efforts to combat poverty, Johnson's domestic agenda didn't seem to be all that successful, either — with race riots occurring from coast to coast.

(Interestingly, one of the freshman Republicans elected to the House in 1966 was George H.W. Bush, the future president and father of another.)

LBJ's popularity dropped sharply, so sharply that, even though he could have run for another term in 1968, he decided not to.

It must have been a disappointing — not to mention dizzying — decline for LBJ. I really believe he wanted to be remembered as a great domestic president — and instead he was consumed by an ugly little war in Vietnam.

LBJ probably benefited enormously from the sympathy and good will of Americans following the assassination of President Kennedy. His popularity never dropped below 60 — and often was much higher — in his first two years as president — but then began its perilous decline in 1966 after Johnson said U.S. troops should remain in Vietnam until Communist aggression had been stopped there.

And when a president's approval numbers go in the tank and stay there for awhile, my experience is that it is really hard to pull them out.

We'll never know if Johnson could have overcome those numbers. By the end of April 1966, a quarter of a million American troops were in Vietnam, and Johnson's approval rating dropped below 50 for the first — but far from the last — time. Less than two years later, he dropped out of the race for the 1968 Democratic nomination.

Fast forward 40 years.

Obama is kind of Lyndon Johnson in reverse — at least when it comes to his policy preference. He wasn't interested in domestic policy. He certainly wasn't interested — or experienced — in economics. He wanted to be a foreign affairs president.

I'm not really sure what drove him to focus on foreign policy. Critics would say it was the political angle, that it was the topic everyone wanted to talk about in 2008. But I think it goes deeper than that.

He never served in the armed forces — but that isn't unusual for people in his age range or younger. The draft ended at the conclusion of the Vietnam War; registration resumed in 1980, but the military has been an all–volunteer force ever since. The most Obama was obliged to do by law was register for a nonexistent draft after he turned 18.

Maybe it has something to do with the multicultural environments in which he grew up. Perhaps it is rooted in his biracial parentage. Whatever the influence was, he specialized in international relations when he studied political science at Columbia University in the early 1980s.

Clearly, that interest was there long before it may have been expedient for a presidential campaign. And it was reflected in his Senate committee assignments when he announced his intention to seek the presidency — Foreign Relations, Homeland Security & Governmental Affairs, Veterans Affairs.

And recent polls suggest that it is one of the few areas in which Americans tend to give him positive marks.

But the times don't call for a foreign affairs president.

That doesn't mean foreign affairs isn't important. It is always important, and, most of the time, the need for an international leader is sudden and unexpected — after all, even with Osama bin Laden gone, who knows when or if another 9–11 will occur?

But poll after poll after poll reports that the voters are overwhelmingly concerned about pocketbook issues, and Obama brought no practical experience in that to the White House. When Obama entered the 2008 race, the unemployment rate was about half what it is today. There were — as there always are — economic naysayers who claimed that this policy or that one would lead the country to ruin.

No one really took that kind of talk seriously when Obama launched his seemingly quixotic campaign in February 2007. In fact, the Democrats who were lining up to run for the 2008 nomination — Obama included — were intent upon foreign policy, too — specifically, ending American involvement in Iraq and Afghanistan.

But once again, fate intervened. In the month before the first primaries and caucuses of the 2008 election season, a recession began. It wasn't clear to most Americans how severe things would get, how many jobs would be lost in the months ahead, but that recession would gather momentum and plunge America into an abyss.

The frustration has grown.

Obama isn't going to bow out of the race the way LBJ did. I don't think he is that pragmatic. He probably still thinks he can win — and maybe he can. But I strongly doubt it.

In the early days of his presidency, polls showed Obama's approval in the 60s, but, with the exception of a brief uptick following the death of Osama bin Laden, his ratings have been below 50 in most surveys for a couple of years now.

The numbers aren't quite as bad as LBJ's — and they are comparable to Clinton's — but I can't help but think that, even if Obama fights it out to the end, he faces essentially the same fate as LBJ.

Perhaps their mutual legacy is that they and the times in which they served were mismatched.

Sunday, November 6, 2011

When They Liked Ike

Americans are a puzzling bunch.

They can be hopelessly nostalgic, yearning for the simplicity of the past, yet demanding and unforgiving when things don't happen as quickly as they would like, ignoring completely the fact that speed is often achieved at the expense of other things.

It is a lesson that history has taught us repeatedly, but each generation seems intent upon re–learning it, and modern presidents are often the whipping boys, deservedly or not.

Since the midway point of the 20th century, nine different men have been elected president and only five have been re–elected.

One (John F. Kennedy) was assassinated before he could seek a second term, so I suppose he really doesn't count. His successor (Lyndon Johnson) served less than a year before winning a full term on his own, but, although he could have sought a second term, he was so unpopular that he chose not to.

Another (Gerald Ford) was never elected; he was appointed to fill a vacancy in the vice presidency and then became president when the duly elected president resigned. When he ran for president 35 years ago, it was for the first time — even though he had been president for more than two years.

But even when you allow for those exceptions, America still has seen — in my lifetime — three sitting presidents (including the unelected Ford) who asked voters for four–year terms and were refused.

Such a thing was practically unheard–of for people of my parents' and grandparents' generations.

Of course, before 1950, one man (Franklin Roosevelt) was elected four times. And my grandparents were old enough to remember Woodrow Wilson, who was narrowly re–elected in large part because he had kept America out of war — only to be sucked into World War I the following year.

A third president, William McKinley, was re–elected in 1900 but was assassinated the following year.

In the first half of the 20th century, two presidents (Theodore Roosevelt and Calvin Coolidge) decided not to run a second time, and one (Warren Harding) died before he could make the decision.

Two sitting presidents were refused re–election in the first half of the 20th century, and there were extenuating circumstances for each. One (William Howard Taft) didn't like the job and, from the accounts I have read of his re–election campaign in 1912, didn't make much of an effort to keep it. The other (Herbert Hoover) had presided over the start of the Great Depression.

Otherwise, the American people seemed willing, if not eager, to renew a president's contract in the first half of the 20th century. They just didn't always have the opportunity to do so.

Times change, of course, but I think it is fair to conclude that modern Americans have grown impatient. Perhaps it is due, to a certain degree, to the instantaneous nature of modern society.

When I was a child, it was a given that just about anything that was worth doing or worth having would take some time as well as an investment of money. Somehow, though, the investment of time seemed to make the achievement that much more special and valuable.

For instance, when I became old enough to receive an allowance and start making money decisions for myself instead of asking my parents for things I wanted, I had to learn, when appropriate, to hold on to my money and accumulate it for larger purchases.

And sometimes I had to make sacrifices. I remember once in the late spring, when I was perhaps 9 or 10, and the neighborhood kids and I were idly tossing small rocks at the roof of my house, trying to get them to land on the roof and stay there.

Why were we doing that? I haven't a clue. Why do kids do anything?

My house was a two–story building, and it took considerable effort for a 9– or 10–year–old to heave even a small rock as high as our roof.

I remember throwing one as hard as I could — and hearing a sickening "cra–a–a–ack!" as it struck the window in my parents' bedroom.

My father came rushing out the front door minutes later, demanding to know what had happened. We were all too stunned, I guess, to make up an alternative story, and the truth came tumbling out.

I was told that I would not receive my allowance until a new window had been paid for. As I recall, the window cost $3, which doesn't seem like very much now, but it represented a summer's worth of allowance money for me at the time. I wasn't able to buy baseball cards all summer.

When the window was paid for and I began receiving my weekly quarter again, I felt a genuine sense of accomplishment. In many ways, the time I had sacrificed in pursuit of this goal was as significant to me as the money itself.

When I was in college and I was working on a research paper, I had to spend hours, if not days, in the library, following leads that might or might not contribute much to my paper. A "term paper" was frequently descriptive — the work often did take an entire term to complete.

The same research, in the internet age, can be done in minutes.

Things are different today. We eat pre–cooked meals that we heat in microwaves, or pick up fast, artery–clogging food on the run. We record TV programs and watch them at times that are convenient to us instead of sharing the experience with millions at the same time. We take pills if we have even a slight pain or if sleep doesn't come to us right away.

We are a highly fragmented culture, obsessed with ourselves as individuals and our needs. It really isn't surprising that the names of some of the more popular magazines in the United States focus on the individual or small groups — i.e., Self, Us, etc.

The people of my parents' day became known as "the Greatest Generation" because of their dedication to long–term group goals and their willingness to sacrifice themselves for each other.

My generation was more self–centered, and it seems to have become easier to exist in that mode as time has passed. I've noticed that the people who have come along since my generation are even more prone to this kind of behavior.

We want what we want when we want it.

That's what makes what happened on this day in 1956 so intriguing for historians.

It may well have been the last time a president was elected almost entirely according to the standards that motivated the "Greatest Generation." Dwight Eisenhower, who was re–elected president 55 years ago today, had no political background when he ran for president the first time. He'd been an Army man most of his life, and he was in charge of the "Greatest Generation" when it stood up to the Germans, Italians and Japanese.

It probably didn't require much effort for the people of that time, who had entrusted their lives and futures to Eisenhower, to trust Ike with the presidency as well.

In many ways, it is the world of the 1950s to which people have been trying to return ever since. It was a world before my time so I can't say whether life was preferable then or whether the leaders of that time were more successful at selling the concept to people as the way things should be.

My thoughts are that it was a time like any other time. There were new and seemingly miraculous inventions, and there were the almost constant growing pains of an evolving culture. The civil rights movement was beginning to blossom, which meant white America had to start coming to terms with its racial past.

And there was a nuclear tension between the superpowers. Of course, terrorism was not part of the equation then — so I guess that's kind of a wash.

I remember, though, when the Happy Days show was on the air, and some of the kids in my class asked one of our teachers if the 1950s really had been "happy days."

He pondered the question for a minute, smiled, shook his head and said, "No."

I guess it's really all a matter of perspective. When Happy Days was on the air, I knew many people who would watch it and tell you, wistfully, that the 1950s really were happy days.

Those times would seem primitive — no cell phones, no computers, no cable TV — and hopelessly naive — no security procedures to speak of in most airports, even in the largest cities — to 21st century Americans if they could go back in time like Michael J. Fox in "Back to the Future."

But, from what I have read, the Eisenhower years were a time when Americans felt they had a paternal role model in the White House, a kindly father figure who could be trusted.

With the possible exception of the Reagan years (which is kind of ironic in itself), there has been no period like it in my memory.

Was it better? Was it happier? Who knows?

But that hasn't kept Americans from pursuing it, anyway.