Monday, December 26, 2011

A Close Call

Ordinarily, while a president is still in office, the American public doesn't know how he perceives the decisions he must make.

There are, after all, so many decisions a president must make during his term. It seems the destinies of most presidents, in the words of Forrest Gump, float on a breeze — and not a gentle one at that. Presidents tend to go from one decision to the next without spending much time (if any) reflecting on one that has already been made. By then, there are already half a dozen more matters that need their immediate attention.

You usually have to wait until a president leaves office, catches his breath and writes his memoirs before you learn which decisions were the most gut–wrenching ones he had to make. They're pretty predictable, too — where (and whether) to put American troops in harm's way, which programs to support financially, etc.

Decisions that require courage, that call for the wisdom of Solomon.

Last week, Amie Parnes of The Hill offered Americans a rare glimpse into the mindset of a president — well, of this particular president — and it wasn't for all three years of his presidency, just the current "holiday season." Still, I rather doubt it will ever be the subject of a political science lecture.

"The toughest call for the president this holiday season," wrote Parnes, "could be whether to join his family for Christmas in Hawaii or stay in lonely Washington."

Parnes conceded that "there's no ideal time for a presidential vacation," which is certainly true. I cannot remember a president who was not criticized for taking a little time away from the Oval Office (Harry Truman once called the White House the "crown jewel of the federal penal system"). But, as Parnes observed, "this one comes at a particularly inopportune moment."

Yes, I guess you could say that. Obama was already insisting that Congress remain in Washington until an agreement was reached on the payroll tax cut extension. It wouldn't have looked very good if he had taken a "do as I say, not as I do" approach to the matter and skipped town to spend Christmas on the beaches of Hawaii.

Fortunately for Obama, the intransigent House Republicans gave in (there was no real reason why they shouldn't; as Obama observed, legislators in both parties favored the extension), the president signed the bill into law, and he got to Hawaii in time to spend the holiday with the wife and kids at their beach house.

While he was in Hawaii, the president also mixed some business with pleasure, attending Christmas services on a Marine base.

Whew! Crisis averted.

Of course, for millions of Americans, holiday travel wasn't an issue. They have no jobs — or, at least, no full–time ones — and, without one, holiday travel becomes something of a moot point.

It has much the same effect on the payroll tax cut, too. No paycheck, no payroll tax to cut.

Friday, December 23, 2011

Ghosts of '68



As the polls of Republican voters have been careening from one anti–Mitt to the next, chatter about the possibility of a brokered convention has been rising above the din.

My original inclination was to dismiss such talk. The prospect just seemed too remote.

My parents weren't old enough to vote the last time there was a brokered convention. But the rapid rise and fall of challengers to the consistent frontrunner, Mitt Romney, has led me to conclude that a brokered convention is a real possibility. It may still be remote, but it is real.

Many Republicans seem to think Romney isn't committed enough to Republican values, so the search has been on for someone else. And there just might be enough of those disgruntled Republicans to deny Romney enough delegate support to wrap up the nomination on the first ballot.

The problem is that each someone else to whom Republicans have flocked has turned out to be flawed in some way. It wouldn't be so bad if the flaws were modest, but they have been too great for most Republicans to overlook.

Romney is always accused of being too wishy–washy, a waffler, a flip–flopper. The same might be said of any Republican who ran for office in dark–blue Massachusetts — but he has been the one true constant in this volatile campaign. His rivals have risen and fallen, but Romney has remained about where he was from the beginning — around 25%.

That level of support won't be sufficient, his detractors argue, and I agree — if it remains where it has been through most of 2011.

But it won't.

I think it will start to fluctuate once the primaries and caucuses begin — and the fluctuations are likely to be upward. There is a kind of finality about primaries — and caucuses, too, for that matter. It is different from the phase we have been watching this year, which is dominated by fluid and non–binding polls.

Actual delegates are committed to candidates in the primaries and caucuses; there are real numbers to hang your hat on. True, those delegates are only committed through the first ballot — but delegates are not assigned to candidates at random. Those who come to the convention pledged to support Candidate A reached that point because they were true believers in Candidate A.

Unless Candidate A drops out after the first ballot, my guess is that they are likely to remain with Candidate A even if they are no longer bound by party rules to do so.

All of that remains hypothetical at this stage because no Republican convention has gone past the first ballot since 1948, when Tom Dewey needed three ballots to win the nomination to face President Truman.

Democrats hope that history will repeat itself in 2012 — an embattled Democratic president, running against a "do–nothing Congress," comes from behind to stage an upset victory and triumphantly waves the early edition of a metro newspaper prematurely proclaiming his electoral demise.

(In the 21st–century version, I guess the image — digital, of course — would be of Obama holding up a laptop with the headline "Romney (or Gingrich or Paul or whoever) Defeats Obama.")

Perhaps that is what fate has in store. I have my doubts. I still believe the prevailing economic conditions will have an overwhelming influence on the outcome of next year's elections — unless something totally unexpected happens to distract the public's attention a week or two before the election.

Republicans, on the other hand, have been wishing for a repeat of 1980, when a charismatic Republican (Ronald Reagan) emerged to defeat an embattled Democratic president (Jimmy Carter) in the general election.

Which scenario you see developing may depend upon which side you favor, but, from a purely historical standpoint, the election I see as having the most in common with the campaign upon which we have embarked is the 1968 campaign.

Then, as now, a Democrat was in the White House. He had been quite popular when he was elected four years earlier, but his approval ratings had steadily declined and his party had lost a lot of seats in Congress in the midterm elections.

"Of the paradox of Lyndon Johnson historians will write many books," wrote Theodore H. White in "The Making of the President 1968."

"Few men have done more good in their time, and no president has pressed more visionary ideas into law. Yet few have earned more abuse and roused less love, loyalty and affection from those he sought to help."

Seems like the kind of thing that some of Barack Obama's supporters might be inclined to write about him when the story of the 2012 presidential campaign is written.

Johnson decided not to seek renomination after he nearly lost the New Hampshire primary to Eugene McCarthy's insurgent campaign, and the story of the Democratic nomination battle that spring was the story of the McCarthy–Robert Kennedy duel in the primaries.

Obama is not likely to withdraw from the race — nor does he appear likely to drop Vice President Joe Biden from the ticket — so the prelude to the Democratic convention next year probably will be quite different from what it was in 1968.

But I believe the most striking similarities are to be found on the Republican side.

All along, the frontrunner for the Republican nomination was Richard Nixon, the former vice president who had been beaten in a close race for the presidency eight years earlier and then had lost a race for governor of California two years later. Nixon had a reputation as a conservative anti–communist, which he toned down in pursuit of the 1968 nomination.

And, as he positioned himself in the party's center, he turned back all challengers, one by one.

First, there was Mitt Romney's father, George, whose candidacy collapsed after he famously claimed he had been "brainwashed" into supporting the Vietnam War. Romney was ridiculed for the remark and wound up dropping out of the race before the New Hampshire primary.

Nixon's next challenger was New York Gov. Nelson Rockefeller, a darling of the antiwar wing of the Republican Party who emerged as its champion with a write–in campaign in New Hampshire.

Rocky did defeat Nixon in the Massachusetts primary, but, for the most part, his bid for the nomination was ineffective.

In the spring of 1968 — after the shooting of Martin Luther King and before the shooting of Bobby Kennedy — Nixon's fellow Californian, Gov. Ronald Reagan, was his next challenger.

At that time, Reagan was not the experienced executive he was when he was nominated in 1980. The 1968 Reagan had two years of experience as governor, which didn't really compare to Nixon's political record of more than two decades, yet he carried the California primary. Thanks to his margin there, Reagan finished the primaries (which were not nearly as widespread as they are today) with a slight edge over Nixon in the national popular vote.

But Nixon went to the convention a handful of delegates shy of securing the nomination. Reagan and Rockefeller reportedly were going to join forces in a final effort to deny Nixon the nomination, but neither would agree to endorse the other.

When all was said and done, Nixon prevailed on the first ballot — and went on to win the presidency by a narrow margin in a three–candidate race that fall.

While the manner in which Nixon's rivals fell was not the same as it has been for Romney's, I find the parallels between 1968 and 2012 compelling.

Wednesday, December 21, 2011

Fourth-Best President Ever?



"I would put our legislative and foreign policy accomplishments in our first two years against any president — with the possible exceptions of Johnson, FDR and Lincoln — just in terms of what we've gotten done in modern history."

Barack Obama
60 Minutes interview

My, someone certainly has a high opinion of himself and his place in American history.

I didn't watch the president's recent interview on CBS' 60 Minutes, but, apparently, in a segment that was not aired originally, he claimed that his administration's "legislative and foreign policy accomplishments" were as good or better than any other "with the possible exceptions of Johnson, FDR and Lincoln."

As I have said here before, I'm something of an amateur historian. I minored in history when I was in college, and I have always had an interest in the American presidency and American politics in general.

I'm also a journalist. That was my major in college, and it is the subject I am teaching now. I was trained to write and to think in Associated Press style, which constantly strives for clarity and consistency. So, when a president compares his presidency to "Johnson, FDR and Lincoln," my question is, "Which Johnson?"

The statement, you see, is imprecise. There have been two presidents named Johnson. I'm pretty sure I know which one Obama meant — Lyndon, who succeeded John Kennedy nearly 50 years ago, not Andrew, who succeeded Lincoln nearly 150 years ago.

Until the Clinton presidency, Andrew Johnson was the only president to face an impeachment trial in the Senate — where he was acquitted by a single vote. He sought, but was denied, his party's nomination for a full term in 1868.

A Siena College survey that was released in July 2010 rated Andrew Johnson as one of the five worst presidents in American history.

No, I'm quite sure Barack Obama did not mean to compare himself to that President Johnson, whose image has undergone some changes in a century and a half but who, in recent years, has been remembered as a "white supremacist."

I'm convinced the first black president in American history does not want to be remembered as comparable to Andrew Johnson.

Lyndon Johnson, on the other hand, is almost a Lincoln–like figure for American blacks — he was responsible for more advances in housing, education, employment opportunities, voting rights, in fact rights in general, for blacks and all other underprivileged Americans than perhaps any other president.

But LBJ, as I wrote about a month ago, had the misfortune of being a president who wanted to do great things domestically (which he did) but served at a time when foreign affairs dominated.

I wrote that Obama appears likely to turn out to be LBJ in reverse — a president who first ran for the presidency because he wanted to end a war and wound up being undone by his inability to tame the economy.

In addition to teaching journalism, I have also been teaching basic writing, and one of the things I try to impress on my students is the importance of using the right word to express the right thought.

That isn't an easy thing for most people — even people who earn their livings (or who have earned their livings) as writers struggle at times to find the right word. I know I do. Most of the time, I keep a thesaurus within arm's reach whenever I sit down to write — and there are still times when I choose the wrong word.

Nor is it easy to select the right word when one is being interviewed without some notes or a TelePrompTer to help. Consequently, I do have some sympathy for Obama. I have seen many people "misspeak" (to use a word that was particularly popular during the Watergate days) in such a setting.

But this was hardly the first time Obama had been interviewed. Far from it. He is no novice when it comes to being interviewed. He just has a tendency to stick his foot in his mouth when he does.

When Obama suggests that his presidency is the best in history "with the possible exceptions of Johnson, FDR and Lincoln," I really have to marvel at his use of the word "possible" and what it implies.

In hindsight, Obama himself might admit that it wasn't the most prudent word choice he could have made, but I believe it speaks volumes about what he really thinks of himself and his presidency.

I think he really does believe his presidency, in its first two years, accomplished more than any other president — but he will allow for the possibility that LBJ, FDR and Lincoln accomplished more.

Lincoln is kind of a no–brainer. The Siena survey listed him third, and most surveys rank Lincoln in the top three.

FDR was the top–rated president in Siena's survey, which is also kind of a no–brainer. The only president to be elected four times, he guided the country through its worst economic crisis ever and is credited with leading it through World War II even though he died a few weeks before hostilities ended in Europe.

But Siena's survey did not rank LBJ in its Top 10. Apparently, Obama holds him in much higher esteem than most historians — at least the ones who were surveyed.

They ranked Theodore Roosevelt second. Roosevelt is remembered for several achievements — trust busting, conservation, labor laws, public health and safety laws — that continue to influence American life.

T.R. was the first American to receive the Nobel Prize — but, unlike Obama, he was rewarded for an actual achievement (negotiating the resolution of the Russo–Japanese War), not merely for his potential. By his omission from Obama's statement, though, it appears the president thinks his accomplishments in his first two years were greater than Roosevelt's.

The survey listed George Washington as the fourth–best president, and that should be a no–brainer, too. He is remembered as the father of the country, its first president. Thanks to his selflessness (he declined the salary that was offered to him, preferring not to tarnish, in any way, his image as a public servant) and his insistence that the leader of the new country should not be a monarch, we call our presidents "Mr. President," not "Your Highness."

That precedent set the tone for the last 200 years, but I can only conclude that Obama also believes his contributions to American life in his first two years as president are greater than Washington's.

The Siena survey ranked Thomas Jefferson fifth. Once again, that should be a no–brainer, shouldn't it? Jefferson wrote the Declaration of Independence, and there are few documents in recorded history that have had the kind of influence on a culture that it has had.

Jefferson also was responsible for the Louisiana Purchase, which doubled the size of the United States at the time — and still represents roughly a quarter of its land mass.

But, apparently, Obama feels his accomplishments in his first two years exceeded Jefferson's.

Sixth in Siena's survey was Jefferson's successor, James Madison. Before becoming president, he was the "Father of the Constitution." As president, he sought to continue Jefferson's policies, but he may be largely remembered for the crumbling of U.S.–British relations and the War of 1812, during which the White House, the Capitol and many other public buildings were burned.

Seventh in the rankings was Madison's successor, James Monroe, whose signature achievement probably was the Monroe Doctrine, which established the Western Hemisphere as the United States' sphere of influence and served notice to Europe that any attempt by any of its nations to interfere would be seen as an act of aggression and treated appropriately.

Ironically, America has not re–elected three consecutive presidents since Monroe's re–election in 1820. If Obama wins a second term next year, he would match Monroe's electoral achievement — but, apparently, he believes he has already bested Monroe as a president.

Siena's eighth–ranked president was Woodrow Wilson, a leader of the progressive movement. A Wilson biographer, John M. Cooper, wrote that Wilson's record of legislative achievement — which included child labor reform, the Federal Trade Commission Act and the Federal Farm Loan Act — was unmatched by any other president except FDR, and his advocacy of women's suffrage helped lead to the ratification of the 19th Amendment.

Perhaps it is subliminal, but Obama seems to think that what he did as president in 2009 and 2010 is greater than what Wilson achieved nearly a century earlier.

Ninth on the list was Harry Truman, whose low point in approval ratings (22%) was unmatched by any president until Obama's immediate predecessor, George W. Bush.

But that doesn't tell the whole story of Truman's presidency. From the day he succeeded FDR in April 1945 until he won the 1948 election, Truman did great things in spite of the fact that he had been virtually ignored by Roosevelt in his 82 days as vice president.

He knew nothing of the Manhattan Project before becoming president, yet it gave him the weapon that he used to bring the war in the Pacific to a quick conclusion. Attitudes about his use of nuclear weapons in 1945 have changed over the years, but, at the time and for years thereafter, it was believed to have saved hundreds of thousands of lives that would otherwise have been lost in a fight–to–the–death invasion of Japan.

He had to deal with the transition from a wartime economy to a peacetime one, which always seems to be uneasy but was especially so after World War II. There were several economic conflicts that had gone unaddressed during the war years but boiled over when the war ended; Truman managed to deal with them all.

He was an advocate of the "Fair Deal," national health insurance and civil rights.

I would guess that Obama has quite a bit of respect for what Truman did as president — so much so that he is clearly trying to duplicate Truman's 1948 "upset" victory in his own re–election campaign. Truman won a full term largely by running against a "do–nothing Congress," and that seems to be Obama's strategy as well.

For that to work, you need a solid record of achievement to contrast with Congress'. Obama clearly believes he does, and so do his adoring supporters, but, judging from presidential approval ratings, millions are not convinced.

They are not convinced for much the same reason that the people of the late 1960s were not convinced about LBJ. They felt out of sync with their president's priorities. He was focused on domestic issues, which were (and are) important, but they were more concerned about the meat grinder of Vietnam.

Today, the situation is reversed: Obama's highest approval ratings have been for his handling of foreign affairs at a time when Americans are hurting at home, struggling to keep a roof over their heads and food in their stomachs. They need jobs.

The Siena survey ranked Dwight Eisenhower 10th. Eisenhower earned Americans' respect when he led the Allies to victory over the Axis powers in World War II, and he presided over a country that was at peace in the world but suffering from some postwar growing pains in the 1950s.

His most lasting legacy, I suppose, is the interstate highway system — and his warning, in the final days of his presidency, against the growing influence of the "military–industrial complex."

Both continue to influence American life, but Obama thinks his achievements are equal to or greater than Eisenhower's.

Maybe they are, but that will be up to the voters to decide next year.

Sunday, December 18, 2011

Cell Phones Don't Kill People



I was listening to the radio yesterday morning, and, for a while, the topic of discussion was banning cell phone use while driving. Should we or shouldn't we?

I missed the beginning of the conversation, but I assume it was in response to the National Transportation Safety Board's proposal this week to ban the use of cell phones and text messaging devices while driving.

Now, before I go any further with this, I guess I should say that there are times when I feel like a refugee from another time.

Not to say that I am old — not yet (although there are times when I feel that I need to be wearing a shirt like the one my mother had — it said, "Hill? What hill? I didn't see any hill!") — but there are definitely times when I feel that technology has gone galloping past me.

Time, I have discovered, doesn't merely fly. It sprints. You younger folks will understand that one day.

Anyway, that's how I feel about cell phones.

As I have written here before, I taught journalism on the college level in the mid–1990s. I left the classroom for several years, but I gravitated back to it last year, taking a job as an adjunct journalism professor in the local community college system.

When I did, I quickly discovered how many things had changed in the intervening years. In the '90s, for example, none of my students had cell phones. Today, they all do. It was essential to implement rules about their use in class to maintain order — and get anything done.

It's a battle I'm still fighting.

On a personal level, I resisted cell phones for many years, and I had pretty good reasons. I'm not married, and I have no children. It was an additional expense, and, in the event of an emergency on the road, I figured (at first) that I could always use a pay phone.

Well, I'm still not married, and I still have no children. Cell phones are still an additional expense, but pay phones have just about disappeared. I finally decided it might be worth the expense to be sure I would have one if something happened — but I only use it when it is absolutely necessary.

See, I've learned that anything can happen — and it can happen all by itself. It doesn't need anyone's assistance.

And I have been wary of cell phones because I have long believed that they were likely to contribute to the accident rate — which certainly doesn't need any help.

When my parents taught me to drive, the thing they emphasized, more than any other, was to keep my eyes on the road. If your attention is distracted, they told me, even for a second, it can have tragic consequences, and one must be ever vigilant — because anything can be a distraction.

A distraction can be a very modest, very momentary thing, like the sound of a dog barking or a sudden movement one catches from the corner of one's eye. But cell phone conversations can go on indefinitely, and the distraction from the task at hand can be far from modest.

The introduction of texting into the mix just raised the risk level, as far as I was concerned. It certainly raised my awareness of the risks.

Perhaps it was due, in part, to the fact that I went without a cell phone for so long, but there were certain things about them that I just never considered — and, to be fair, there were other things that just weren't factors until recently.

Like texting.

And, perhaps because my cell phone is so basic, so ordinary, I'm not entirely acclimated to a world in which the internet is at your fingertips, wherever you are. When I was in graduate school, there was no internet (well, no real commercial internet). A few years later, that was a reality. It was a new frontier, but you could only explore it from your desk at home or at work.

Then, along came laptops, and you weren't tied to a physical location anymore. But laptops are still too big and bulky for some people, so all of it has been condensed into the "smart phone," a gadget that fits in the palm of your hand.

(Oh, what we could have done with those when I was a general assignment reporter fresh out of college!)

The speed of technological advancements has made so many things possible that my poor mind never imagined most of them — and still needs time to absorb it all.

That point was made clear to me when I heard the listeners' calls.

One observed that he frequently uses the GPS app on his cell phone when he is driving in an unfamiliar area. The cell phone is equipped to "speak" to him, so it isn't necessary for him to look at it, as he would if texting. And his car is equipped for hands–free operation of the cell phone, so it really is no different from speaking to a human occupant of the vehicle.

He travels a lot, he said, but he rarely has a traditional conversation on his cell phone — and almost never does so when he is behind the wheel. But, when he is using this GPS feature, which he often does because his work requires him to spend a lot of time in unfamiliar territory, "I'm still talking on my phone," he pointed out, "so, technically, I would be in violation of the law."

True — he would be violating the letter of the law, but not necessarily its spirit.

The law is intended to discourage people from talking on the phone while they're driving — which is certainly a noble objective — although, in a culture in which people can be seen trying to eat cereal, apply makeup, even get dressed behind the wheel during the daily morning rush hour, one can be forgiven for wondering if such legislation goes far enough.

Before the discussion ended, a veteran police officer came on the line. Now, most policemen with whom I have spoken about this agree that cell phone use should be curtailed while driving; they just disagree on how the law should address it.

But this particular officer wasn't too concerned about the use of cell phones behind the wheel. It's just another distraction, he said, no worse than having a conversation with someone else in the vehicle — and he went on to point out that he had many electronic distractions in his police car.

It's all a matter of being mature enough to handle it, he said.

Cell phones don't kill people.

Wednesday, December 14, 2011

A Memory and a Milestone


My parents posed with me after I received
my master's degree from North Texas in 1991.


Today is a milestone for me.

It was on this day 20 years ago that I received my master's degree in journalism from the University of North Texas. That was a proud moment in my life.

Sometimes I must admit that it all seems like a dream. Maybe that is a by–product of the passage of time. The farther removed I am from an experience, the more it seems like another lifetime — and, in a way, it is.

I will always remember that day. It was a moment of real triumph after what had been maybe the most challenging year of my life — at least, to that point.

It was special, too, because I was able to share it with my parents. For reasons that I would rather not discuss in great detail, I didn't participate in graduation exercises when I received my B.A. It's a long story, but it boils down to some administrative snafus stemming from the fact that I transferred to the University of Arkansas midway through my sophomore year.

I eventually got my degree, but it was issued three months after I completed my degree work, and by then I had relocated roughly 150 miles away, where I was working as a general assignment reporter. I didn't participate in my graduation ceremony — the U of A mailed my degree to me — and my parents didn't get to see me walk across the stage to accept my B.A.

It was, to put it mildly, anticlimactic to open an envelope and take out my degree. Boom! You're a college graduate. I always imagined hearing my name called out and walking across a stage to accept my degree. Never dreamed it would be like that. It was no more special than opening the monthly telephone bill.

But I was able to share this day with my parents 20 years ago — and, for that, I will always be thankful. My mother has been gone for more than 16 years, but she saw me walk across that stage. It was the fulfillment of her dream for me as well as my own.

And it is a memory that means everything to me now.

She and my father posed with me after the ceremony was over. You can see the picture at the top of this post. That is an irreplaceable souvenir for me.

It was kind of a typical December day in Texas, as I recall — a little chilly, overcast, a bit windy. It was the kind of day that reminds you that Christmas is coming, which, in turn, reminds me of a story.

For those of you who don't know it, when a person receives his/her master's degree, the ceremony is usually called a "hooding."

I know that may sound like some kind of Ku Klux Klan ritual, but it isn't.

If you look closely at the picture, you may notice a splash of red around my neck. That is neither a cape nor a muffler nor a scarf of some kind. (It isn't blood, either, although I often felt, as I pursued my master's degree, that I was shedding plenty of it.)

It is a hood, the academic dress of one who has earned a post–graduate degree. The hood is in the color of the wearer's academic major — at least it was at the University of North Texas, where the color for journalism majors was red.

The traditions of academic regalia originated in the medieval universities of Europe, and the colors may vary from school to school, so I don't know if red is the color for journalism master's and doctoral students elsewhere or just the designation at North Texas.

My understanding, in fact, is that not every school even has a hooding ceremony for its master's and doctoral candidates, and the ones that do seem to follow their own rules — so journalism majors at other schools that do have hooding ceremonies may well wear different colors.

In at least one country, red is the color for those receiving post–graduate law degrees. I definitely wasn't a law student, but I did have to study communications law (and that really does have more credibility than spending the night at a Holiday Inn Express).

Hoods also tend to have the primary color of the school where the degree was earned, and, at North Texas, that color is green. That meant that my hood was red and green — Christmas colors.

My mother pointed that out to me when I picked up my graduation cap, gown and hood shortly before the ceremony. That appealed to her sense of order, I guess. She loved the Christmas season, and she was pleased that my graduation came during it (even though that happened only because a close friend of mine was diagnosed with lymphoma that spring, and I put off finishing my degree work until after his death that summer).

I'm teaching journalism as an adjunct at the local community college these days. I haven't worn my gown and hood in years, but they hang in my closet, and I see them from time to time. On those occasions, I am reminded of that period in my life, of that accomplishment for which I worked so long and so hard.

Of course, I can be reminded of that at any time. My master's degree is on a shelf in my apartment, and I see it every day.

But seeing the gown and hood that I wore on that day is different.

It's like a tangible link to my past.

Of course, the degree itself is, too, I guess — but not really.

The piece of paper that I was given when I walked across the stage on that December Saturday afternoon in 1991 was kind of an academic I.O.U., a promissory note. My degree would be mailed to me, it said.

It was like one of those dummy hand grenades that soldiers use in basic training — the ones that look official on the outside but are totally ineffective.

My memory is that I received my actual degree — the one that sits on my shelf today — a few weeks later. There was no extended wait for it, but it isn't what I am holding in the picture attached to this post.

That was the dummy, the prop for pictures such as the one for which I posed with my parents.

The actual souvenirs that I have from that day are the graduation program (which I have somewhere although I can't put my hands on it right away), my gown and hood and the photo you see with this post.

And the memories they evoke.

Friday, December 9, 2011

Georgia On My Mind

I have this friend who lives in Atlanta. I would describe him as a devoted supporter of Barack Obama.

He says he has been disappointed and frustrated with Obama at times, but it often seems to me that he finds ways to justify or excuse those policies that he says have been disappointing and frustrating. This also leads, at times, to overly optimistic electoral expectations.

At one time, we were living parallel lives. We were pursuing our master's degrees in journalism at the University of North Texas, we were working full time at the same newspaper, and we were working part time as graduate assistants in UNT's editing lab.

Frequently, we were enrolled in the same classes. I used to tease him that I saw more of him than his wife or children did.

We got to know each other pretty well, and we found that we had a lot in common. We both considered ourselves Democrats, and we shared much the same world view.

Anyway, that friend and I went our separate ways eventually. He got his degree, and I got mine. He went on to get his doctorate at another school. I got a job teaching journalism. We had our different life experiences, as friends do.

To an extent, we've moved in different directions. He still considers himself a Democrat; I consider myself an independent. I guess his philosophy hasn't changed much; perhaps mine has, although I don't think of it that way.

But even if it is true, I don't look at it as a bad thing — more like what Joni Mitchell described in "Both Sides Now."
"But now old friends are acting strange,
They shake their heads,
They say I've changed.
Something's lost
But something's gained
In living every day."

Life has taken my friend to Atlanta, as I say — where, I presumed, he would obtain unique insights into the voting behavior of people in Georgia.

Maybe he has, but I'm inclined to think they are colored by his personal political perceptions, not necessarily by reality.

In 2008, he told me that Obama would win Georgia for two reasons: the black population of Georgia (roughly 30% of the total) would vote heavily for him (which it did, I suppose), and Libertarian — and Georgia native — Bob Barr would be on the ballot.

Barr, he said, would siphon off enough votes from John McCain to hand the state to the Democrats. He didn't.

More than 3.9 million people voted in Georgia in November 2008. About 28,000 of them voted for Barr.

That didn't really surprise me. Georgia has never struck me as being unusually susceptible to quixotic third–party candidacies.

When such a third–party candidate has caught fire elsewhere, in the region or the country at large — e.g., Ross Perot in '92 or George Wallace in '68 — Georgia has jumped right in there.

But, otherwise, third–party candidates have been non–factors in Georgia. Maybe the concept of a two–party system is too deeply ingrained in Georgians.

As someone who has lived in the South all my life, I would say that is true of the South in general, and the percentages from the last election in which a third–party candidate played a prominent role — 1992 — support that.

According to "The Almanac of American Politics 1994," states in the South Atlantic region of the country (Florida, Georgia, Virginia and the Carolinas) gave a much smaller share of their vote to Perot (16%) than almost any other region. The states in the Mississippi Valley — Alabama, Arkansas, Kentucky, Louisiana, Mississippi and Tennessee — gave the smallest (11%).

In other words, even in a year in which the third–party candidate was bringing millions of previously politically inactive voters into the process, the South resisted the temptation to abandon the two–party arrangement.

The authors of the 1994 "Almanac," Michael Barone and Grant Ujifusa, used the numbers from the 1992 election to make the case for their observation of the "phenomenon" of straight–ticket voting that year. And I suppose it was a compelling argument for those who sought to explain what had happened that year.

Their analysis always struck me as being somewhat short–sighted, focused as it was on a single election.

See, I never really bought the idea that it was an isolated phenomenon. I have long believed that straight–ticket voting is a reality of American politics, particularly Southern politics. It was true in 1992. I believe it will be true in 2012 — and that the numbers from 2010 and recent presidential elections clearly suggest that the Democrats will lose every Southern state next year.

I know it was always a reality in Arkansas — but that was due, in large part, to the fact that there was really only one political party in Arkansas when I was growing up. The Democrats had a near monopoly on political power in Arkansas — and most of the South — in those days.

But that was really a different Democratic Party. As I have noted before, the politicians who led the Democratic Party in those days probably had much more in common philosophically with today's Republicans.

Eventually, in fact, many of them switched their party affiliations, but it took some time. The Southern Democrats of a generation or two back were trained at their mothers' knees to be wary of Republicans.

Republicans were damn yankees, and the transition was a long time coming, achieved incrementally. Southerners were voting for Republicans for president long before they started voting for Republicans for state and local offices.

The GOP, they were told, had inflicted Reconstruction on the South after the Civil War — and had been responsible for the poverty and misery that afflicted most who lived there, white and black, ever since. It was an article of faith, and so, with the exceptions of a few isolated pockets, most places in the South were run by Democrats for decades.

Many people mistakenly believe the South began moving away from the Democrats to the Republicans in 1980, when Reagan conservatives joined forces with Jerry Falwell and the Moral Majority, but, in hindsight, that was really more symbolic of the completion of the shift than its beginning. It was in 1980 that the Moral Majority served as the bridge for the last holdouts, the Christian evangelicals, who seemed, prior to that time, to exist outside politics — at least as an interest group or voting bloc.

The real breaking point came in the 1960s, in the midst of the civil rights conflict, campus unrest and general social upheaval. Even Lyndon Johnson, the architect of the Great Society, acknowledged that his greatest legislative triumphs, the ones that guaranteed voting rights and civil rights to all Americans, likely had handed the South to the opposition for a generation or more.

His words have proved prophetic. Of the 11 elections that have been held since Johnson's historic landslide in 1964, the Democratic nominee has lost every Southern state in six of them — and has come close to sweeping the region only once (in 1976), even though the party has nominated Southerners for president five times.

Most Southern states have voted for the Republican nominee for president even in years when Republicans were struggling elsewhere ... even in years when native Southerners were on the Democrats' national ticket.

I have always had mixed feelings about the fierce loyalty of Southerners. I have often felt it was more a point of pride, of not wanting to admit when one has been wrong, than a point of principle.

When Southerners give their hearts to someone, it is usually for life. Likewise, when the South gives its allegiance to a person or a political party, it is a long–term commitment — in spite of the behavior of some philandering politicians.

Giving up on a relationship — be it social or political — is a last resort for most Southerners. It is what you do when all else has failed.

(Regarding the dissolution of social and legal relationships, I have always suspected that attitude has more to do with the regional stigma about divorce, which still persists to an extent today, and the reluctance of many Southerners to legally admit a mistake was made than with any theological concerns about promises made to a higher power.)

That's probably the main reason why it was so surprising when Obama won the states of Virginia and North Carolina in 2008. Virginia hadn't voted for a Democrat since LBJ's day. North Carolina voted for Jimmy Carter in 1976 but had been in the Republican column ever since.

For those states to vote for a Democrat after regularly voting for Republicans for years was an admission that could not have been easy for many of the voters in those states to make.

Numerically, it seems to have come a little easier to Virginians, who supported the Obama–Biden ticket by nearly 250,000 votes out of more than 3.7 million cast. North Carolinians, on the other hand, barely voted for Obama, giving him a winning margin of less than 15,000 votes out of 4.3 million.

I'm not really sure what this means for 2012. I mean, the 2008 results can't be explained strictly in racial terms, can they? The white share of the population is about the same in both states (64.8% in Virginia, 65.3% in North Carolina), and the black populations are comparable as well (19.0% in Virginia, 21.2% in North Carolina).

If anything, one would expect that a higher black population (along with half a million more participants) would produce a higher margin for Obama in North Carolina than Virginia — but the opposite was true.

What can be said with certainty is that both states voted Republican — heavily — in the 2010 congressional midterms.
  • North Carolina re–elected Republican Sen. Richard Burr with 55% of the vote. That's pretty high for North Carolina. Statewide races frequently are much closer.

    North Carolina Republicans also captured a House seat from the Democrats.

  • Virginia elected Republican Gov. Bob McDonnell in the off–year election of 2009, providing perhaps the first glimpse of what was to come.

    Neither of the state's senators was on the ballot in 2010, but Democratic Sen. Jim Webb, who defeated George Allen in the 2006 midterm election, announced earlier this year that he would not seek a second term. Ostensibly, his reason is that he wants to return to the private sector, but I can't help wondering if he has concluded that he caught lightning in a bottle six years ago and cannot duplicate the feat in 2012.

    Virginia Republicans grabbed three House seats from Democrats in 2010.

It was less surprising that Florida voted for Obama in 2008.

That's understandable. For quite a while, Florida has been a melting pot for retirees from all over the nation, so its politics tend to be quite different from just about any other Southern state's. Until the advent of air conditioning, Florida was mostly a backwater kind of place with a population to match, but, in recent decades, the only thing that has truly been Southern about Florida is its geographic location.

In many ways, its diverse population bears watching as an election year unfolds. It may be the closest thing to a political barometer, a cross–section of the American public, that one is likely to find.

The scene of an excruciating recount in 2000, Florida has now been on the winning side in 11 of the previous 12 elections — and conditions in 2008 were probably more favorable for the out–of–power party than at any other time that I can remember.

More than perhaps any other state in the region, Florida's vote seems likely to be influenced by prevailing conditions in November 2012. Obama won the state with 51% of the vote in 2008, but, again, few solid conclusions can be reached based on the racial composition of the electorate. Whites represent a smaller share of the population in Florida (about 58%) than in Virginia or North Carolina.

But the black vote in Florida is also smaller (around 15%).

In fact, half again as many Floridians are Hispanic (more than 22%), and, while those voters will be affected by economic conditions like anyone else, they may also be sensitive to immigration issues and particularly responsive to proposed solutions to those problems.

There may well be compelling reasons for Hispanic voters to feel either encouraged or discouraged by U.S. immigration policy under Obama.

What can be said of Florida's voting behavior in 2010 is that its voters made a right turn.

Republicans seized four House seats from Democrats, elected one of the original tea partiers to the U.S. Senate and replaced an outgoing Republican governor with another Republican governor.

There has been persistent talk, in fact, that the senator — Marco Rubio — will be the GOP running mate, no matter who the presidential nominee turns out to be.

And if that turns out to be true, the party really will be over in Florida ...

... and elsewhere in the South.

Wednesday, December 7, 2011

War and Peace



We'll be hearing a lot today about war and peace.

Mostly war, I suppose, and that is understandable. Today is, after all, the 70th anniversary of the attack on Pearl Harbor — the event that literally pushed the United States into World War II, although one could argue that the country had been getting more and more involved in the conflict in the months leading up to the attack.

It is an event that still resonates with people of my parents' generation. They were children when the attack occurred, and, although my mother has been gone for many years now, I remember her telling me of the peaceful Sunday afternoon that suddenly changed when the news came across the radio that Pearl Harbor had been the victim of a sneak attack.

It is hard for me to imagine anyone going through the American education system without hearing a recording of FDR's famous speech to Congress, in which he said that Dec. 7, 1941, was a "date which will live in infamy."

That date has certainly lived on in people's memories.

For me, today brings back memories of 20 years ago, when I was working for a small daily newspaper and participated in the production of a special section commemorating the 50th anniversary of that attack. For weeks, we solicited 1940s–era photos of local residents, both living and dead, who served their country — and we published articles about many of them and their experiences.

That project coincided with the end of my graduate studies. A week later, I was to receive my master's degree. There were many things demanding my time and attention.

It was a grueling period in my journalism career, to be sure. I had no idea there were so many WWII veterans in the county where I was living — until we took on that project.

Most of them were living then. Far fewer are apt to be living today — and it does make me wonder when we will stop observing Pearl Harbor Day in the kind of semi–official way that we have in recent years. It seems we are moving in that direction with the attrition of people who still remember that day.

That happens with some of history's significant dates. So much time goes by, and the people who remember the event pass away, and we are left with holidays and anniversaries whose origins we must be reminded of.

Take Veterans Day. It used to be called Armistice Day, the observance of the anniversary of the end of World War I.

Hostilities in that conflict ended in 1918, more than 90 years ago. The last time I recall anyone mentioning that event was when I studied history in high school — and my memory is that my history teacher really didn't spend much time on it.

To be sure, the outcome of World War I wasn't very popular in Germany, which paid a heavy price — and that could be said to have played a role in the eventual rise to power of the Nazis in the 1930s. Kinda depends on one's interpretations of things.

Chronologically, though, it is beyond dispute that anyone who was old enough to serve in that war would have to be around 110 years old today. There are a few of those left in the entire world, but not many. Armistice Day long ago lost its meaning as the World War I generation dwindled — so today it is known by the more generic designation of Veterans Day.

Which is not to be confused with Memorial Day. That is a completely different holiday in a completely different time of year — but it does have a similar history.

It started out as Decoration Day, a day for honoring those who died during the Civil War. I don't think there is a particular anniversary connected with it; the graves of Confederate soldiers were decorated in several Southern cities during the war and the practice simply continued after it ended.

Obviously, no one who was alive in the mid–19th century is still living — so there is no one for whom Decoration Day has any meaning. We continue to observe it, though, under the more generic name of Memorial Day.

The purpose evolved to include remembering those who fought in all wars, not just the Civil War, and in recent years it has expanded to include memories of anyone who is no longer living, even if that person didn't serve in the military.

George Carlin used to point out that sports like football that tend to emulate war are played in facilities that use such generic names as War Memorial Stadium or Soldier Field. It is part of the competitive nature of sports, I suppose, that the places where these games are played should bear names that conjure violent images — even though a sport will never be as violent as war.

But not everything that happened on Dec. 7 has been violent.

Sometimes there has been peace and hope.

On this day in 1972, for example, Apollo 17, the last manned mission to the moon, was launched. As they left the earth and began making their way to the moon, the crew looked back and took a picture of the earth that is known today as the "Blue Marble."

Seen from that vantage point, the blue marble looks so peaceful, just floating along in the black velvet of space. One would never guess that so much turmoil exists on the surface of that marble, that there is savagery loose upon the land capable of causing great pain to millions without the slightest hint of remorse.

Yet the image of the blue marble sparks in many of us that wish for peace on earth and good will to men.

Not a bad thought to keep in mind during the Christmas season.

Tuesday, December 6, 2011

Quayle's Endorsement

If former Vice President Dan Quayle is smart — and I believe that ship sailed quite a while ago — he will avoid taking sides in the 2012 Republican presidential race.

Publicly, anyway.

Privately, of course, he can do as he pleases — like anyone else.

But Quayle apparently is going to publicly announce his endorsement of Mitt Romney for the presidency today in Arizona.

And that could really open a Pandora's box.

Quayle, who was born in Indiana, grew up in Arizona, then returned to Indiana, where he graduated from high school, worked for the family newspaper and practiced law before embarking on a political career that took him to the U.S. House and the U.S. Senate before his four–year term as George H.W. Bush's vice president.

I guess Quayle had a pretty good image in Indiana when Bush picked him to be his running mate. He never got less than 54% of the vote there, including the time he unseated incumbent Sen. Birch Bayh in 1980. For most people outside Indiana, though, the sight of him at the 1988 Republican convention was their first real exposure to him.

The choice was controversial from the start.

Quayle didn't help matters — either during the campaign, when Lloyd Bentsen memorably told him he was "no Jack Kennedy," or after the election and subsequent inauguration, when he told American Samoans they were "happy campers" or when he supposedly said he regretted not having studied Latin in school so he could converse with a group of Latin Americans.

That latter item, incidentally, is said to have started as a joke about Quayle that took on a life of its own. Some of Quayle's defenders clearly have indulged in some revisionist history — it's hard to deny the statements that live on in video and audio tape — but others are correct when they suggest that many of Quayle's alleged malapropisms started as jokes that appeared credible because he really did utter so many others.

Whoever the GOP's eventual running mate turns out to be, he or she should study the Bush–Quayle 1988 campaign for tips on what not to do — and how to handle the inevitable setbacks and misstatements. It's all part of living under the microscope.

It's a pity that Sarah Palin — or her handlers — didn't try to apply any of the lessons that should have been learned from the Quayle experience.

Quayle, it seems, is still learning. He's been away from the vice presidency for nearly 20 years now, but people still remember things he said — even if he doesn't.

Shira Schoenberg of the Boston Globe observed that "Quayle is known for his rhetorical blunders — once, spelling potato with an 'e' on the end."

As long as Quayle doesn't jump headlong into the campaign and draw more attention to himself — as long as he makes his endorsement and then retreats into private life — there probably won't be too many more reminders of the weird old days, when Quayle said things that were actually attributable, like the time he mangled the United Negro College Fund's slogan by saying "what a waste it is to lose one's mind, or not to have a mind is being very wasteful."

So my advice to Quayle would be this:

Express your opinion. Make an endorsement. Put a Romney sticker on your car.

Then shut up.

We already have enough of your misstatements to write a book.

Come to think of it, several people already have. No sense in providing ample material for a sequel.