Wednesday, December 31, 2014

Remember the Y2K Scare?



How naive we were as we approached the new year 15 years ago.

In the days leading up to New Year's Day 2000, there was an overwhelming anxiety about what would happen to the nation's computers when they were asked to roll over from 1999 to 2000. Apparently, the storyline went, computers hadn't been programmed to handle a situation in which all four digits of a year changed. (The real issue was narrower: to save memory, many older programs stored years as two digits, which meant the year 2000, stored as "00," could be read as 1900.)
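That two–digit shortcut is easy to demonstrate. Here is a minimal sketch in Python (purely illustrative; not any real system's code, and the function name is my own):

```python
# A toy illustration of the classic Y2K defect: storing only the last
# two digits of the year and doing arithmetic on them directly.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-time calculation using two-digit years,
    the way many legacy systems stored dates to save memory."""
    return end_yy - start_yy

print(years_elapsed(95, 99))  # 4 -- works fine within the 1900s
print(years_elapsed(95, 0))   # -95 -- the year 2000, stored as "00"
```

A negative age on a loan, a license or a pension record was exactly the kind of error people feared would cascade through billing and scheduling systems.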

Which made me wonder ...

Personal computers were still relatively new in 1999. It was still news in those days when someone established an online presence. Online shopping may be pervasive today, but then it was still a new thing for many people. Prior to Y2K, I can recall an intensive effort by many businesses to get people to shop online — but I honestly don't remember now whether that effort extended through the Christmas season of 1999.

Perhaps it required too much courage in the face of all the doomsday predictions that were circulating.

My point is, the developers of the personal computer were considered the best and the brightest of their generation. Weren't they bright enough to know that the year 2000 was coming up?

All sorts of apocalyptic scenarios were proposed in the days leading up to New Year's Day, causing considerable fear among the many Americans for whom personal computers were still new and intimidating things. I'd like to think that people have learned since then, but sometimes you have to wonder.

As they apprehensively approached the dawn of a new millennium — a label that was incorrect, too, but I long ago reached the conclusion that I wasn't going to win that argument — many of those Americans believed they could engage in any behavior that suited their whims and remain completely anonymous online, or that, by simply pressing Delete, they could permanently remove embarrassing or incriminating comments or photographs. Unfortunately, it appears some people still believe those things.

Well, anyway, back to New Year's Day 2000.

Remember what happened? Nothing. Well, that isn't completely true. As I recall, there were a few very minor glitches — the kinds of things that wouldn't raise any eyebrows today. But lots of people took it seriously.

Businesses, too. Somehow some folks got the idea that they could avoid any problems if they switched off their computers before midnight on New Year's Eve, then switched them back on the next day.

Which made me wonder ...

If computers really weren't programmed to accept a four–digit year change, what made those people think the machines would behave any differently when the power came back on? What was so special about having the power off at midnight? The computers still wouldn't be programmed to accept a four–digit year change.

It seemed like the logical heir to the thinking of those who, when forced to deal with video issues on an old–fashioned TV that needed rabbit ears to pick up signals, responded by hitting it on the side. Aside from maybe knocking loose some of the TV's innards, I never could figure out what they hoped to accomplish.

Maybe people lost their ability to reason because we weren't changing one digit or even two. We were changing all four digits — and people approached New Year's Day 2000 (dubbed "Y2K") with more apprehension than they did Mayan Calendar Day a couple of years ago.

"Of course, it wasn't long before it became clear that all the fears associated with the turn of the millennium were for naught," wrote TIME's Lily Rothman.

Well, I guess it's a good thing we don't have to worry about a computer revolt at midnight this year. If you don't buy into the end–of–days scenarios, the next generation that will have to worry about issues surrounding a millennium change won't begin to show up for more than 900 years.

Happy New Year.

Friday, December 26, 2014

A Decade After the Boxing Day Tsunami



Do you remember what you were doing on this day in 2004?

It was, of course, the day after Christmas. I had made plans to meet my brother to see "The Life Aquatic with Steve Zissou," which had just premiered the day before.

We never really decided on a time or place to see it, though, until virtually the last minute that day. We met for lunch at a burger place and looked through the movie listings in the Dallas paper until we found a good starting time at a theater that was reasonably close. It was a Sunday, and it was kind of wet and dreary. The Cowboys were playing the Washington Redskins that afternoon, and we kind of hoped that would keep people at home in front of their TV sets, but nobody really seemed to care about the game. The Cowboys were on their way to a dismal 6–10 finish.

Consequently, my memory is that the theater was kind of full, and we wound up getting seats that were less than ideal.

Such a problem would be seen as trivial a few hours later after the world became aware of the deadly tsunami that had rolled across the Indian Ocean that day. The tsunami was triggered by an underwater earthquake that registered a magnitude of 9.3; that is only an estimate, of course, but if it is accurate, that would make it the second– or third–strongest earthquake in recorded history.

How strong is a 9.3 earthquake? The one that struck 10 years ago today is thought to have had the energy of 23,000 Hiroshima bombs. It caused at least 227,898 deaths.

In the aftermath of the tsunami, some proposed the creation of a global tsunami warning system. But tsunamis are relatively rare in some regions — including the Indian Ocean, even though earthquakes are fairly common in Indonesia — so a truly global network of sensors would be necessary, and that can be too costly for poor countries. The world also has so little experience with tsunamis that it would be extremely difficult to find enough people with the expertise to monitor and assess global conditions for tsunamis in the making. The first real sign of a tsunami is the earthquake that spawns it, but if the quake happens far from shore, the tsunami may travel a great distance, as it did in 2004, and strike areas where the earthquake was barely felt, if at all.

If they never strike land, tsunamis eventually reach a point where they begin to dissipate, but they can still cause damage when they do strike — and tsunamis can be deceptive. Initially, they may resemble rising tides.

Something else to keep in mind — not all undersea earthquakes produce tsunamis. An undersea earthquake in almost the same area about three months later was estimated at magnitude 8.7 (which would still make it one of the 15 strongest earthquakes in recorded history), but it produced no tsunami.
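For a rough sense of what that 0.6 difference on the scale means: magnitude is logarithmic in energy, and the standard Gutenberg–Richter energy relation gives a back–of–envelope comparison (my own arithmetic, not a figure from the seismologists):

\[
\log_{10} E \approx 1.5\,M + 4.8 \quad (E\ \text{in joules}), \qquad
\frac{E_{9.3}}{E_{8.7}} = 10^{1.5\,(9.3-8.7)} = 10^{0.9} \approx 8
\]

By that arithmetic, the December quake released roughly eight times the energy of the one that followed in March.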

The 2004 earthquake struck, as I recall, off the west coast of Indonesia that morning, which would have been Christmas evening here in the United States.

But, initially, no one knew what had happened, and it wouldn't become apparent to the world that anything out of the ordinary had occurred until the tsunami had traveled across the Indian Ocean to the east coast of Africa, a journey that probably took about 10 hours.

Actually, what many people don't realize is that a tsunami is not a single wave but rather a series of waves that can come in surges separated by five minutes to an hour. The first wave is not always the most dangerous.

"A tsunami, when it approaches, is silent," observed survivor Alexa Moses, a writer from Australia, in The Age. A tsunami simply doesn't attract attention until it strikes land. The longer it takes to strike land, the more strength it can accumulate — until it reaches that point where its strength begins to diminish.

And a portion of the tsunami did strike India shortly after the earthquake, but most of it traversed the Indian Ocean unobstructed until it reached Africa.

For that matter, more than 130,000 of the casualties were in Indonesia, but not all of those deaths could be blamed on the tsunami. If you've ever seen footage of the aftermath of land–based earthquakes, you know that people die when buildings and bridges collapse, when they are struck by falling debris, etc., and it is reasonable to assume that many of the deaths in Indonesia were the result of being near the epicenter of a 9.3–magnitude earthquake.

Many deaths, of course, were the result of the tsunami, which was quite powerful in the immediate vicinity. Take a look at the link to Moses' article. You will see aerial photographs that clearly show how the topography was changed.

None of what had happened was being reported on TV as I prepared to meet my brother or on the radio as I drove to the burger place. After I got home from the movie, I saw the first reports of the destruction. It was astonishing.

It was also astonishing to see the world's response to the disaster. Relief efforts raised $14 billion. Many of the survivors of the tsunami still have a long journey in front of them, but that money made getting started on that journey less difficult.

Thursday, December 25, 2014

Christmas Musing: Why I Write



It is early on Christmas morning, and I am awake, but it isn't like it was when I was a kid. I'm not up because I want to find out what is under the tree. I have no tree in my apartment.

Actually, I am up because I have had a touch of some sort of virus lately that has me congested, unable to breathe. So I am awake before sunrise on Christmas morning, like when I was a boy — although, clearly, not for the same reason.

It is cold and clear this morning. The forecasters have said it will be warmer today (but very windy), which would make it one of the milder Christmases I have experienced in Dallas. I didn't grow up here, but I spent most of my Christmases here visiting my grandparents and my parents' old friends, and I have spent most of the Christmases of my adult years here, too.

That doesn't make me an authority on Christmas in Dallas, but it's close! And, more often than not, Christmas in Dallas is cool — even cold at times. I remember a few warm ones when I was growing up, Christmases when my brother and I could go outside and play in shorts and T–shirts. We could climb the pecan trees in my grandmother's yard unencumbered by winter coats.

A couple of times when I was growing up, my family drove to South Padre Island near the U.S.–Mexico border to spend Christmas there, and it was always nice and warm (today, for example, the temperature is supposed to be 71° in Brownsville, close to 80° tomorrow and Saturday).

Anyway, this morning I have been listening to Mannheim Steamroller. I don't know how long they've been putting out Christmas albums — decades, I suppose — but I have one that came out nearly 20 years ago. It is the only purely Christmas album in my collection. I have Christmas songs that various artists have recorded, but they are always part of more general albums.

I remember when I got this album. It was about six months after my mother was killed in a flash flood. I was teaching journalism in Oklahoma and commuting to Dallas on weekends to see about my father. On one of my weekend trips, I heard "Pat a Pan" on the car radio and decided I had to have it. It has been in my collection ever since.

Listening to it really can be an exercise in free association. When I hear it, I think of those days after my mother died, and then I think about her (although I am sure that she never heard this album) — and that leads me to thoughts of my childhood. Mom was my biggest booster, and I am sure she must have encouraged me to take the path I took in life — writing. I have worked at other kinds of jobs, but writing has always been at the core of who I am.

It is a path that has led me to the job I have today as editorial manager for a stock–trading–oriented website. I am very happy to have that job on Christmas 2014. Of course, I guess an argument can be made that, after slogging my way through the last six years following the economic implosion, I would be very happy to have any job. And I suppose there is an element of that. But the truth is that I like the people with whom and for whom I work.

Not everyone can say that, and I really am thankful for my job. It allows me to write for a living. I know some professional writers who fret about a lot of things, including writer's block, and writing becomes work for them.

Not me. Writing has always been fun for me. When I have some spare time, I would just about always prefer to write about something. I write three blogs (one of which is this one) so I always have an outlet for any inspiration I may have.

That's what it is. Inspiration. That must have been what my mother encouraged in me when I was little. Mom was about creativity, which has a symbiotic relationship with inspiration. She taught first grade, and I think most of the people who came through her classroom and their parents would tell you she was the most creative teacher they ever knew.

After she died, my family received hundreds of letters from old friends scattered across the country, a few even halfway around the world. One friend who knew her when she was a teenager sent us a letter with some photos of Mom participating in a play in junior high or high school. In the photos, she was clearly hamming it up in her usual way, and the friend remarked in his letter, "I always thought that, if Mary had not gone into teaching, she would have gravitated to the stage."

A career on the stage might have satisfied her yearning for creative outlets. She found other outlets, one of which was encouraging me to write. I had other influences along the way, but I am quite sure she was my earliest. When I was in elementary school, she arranged for me to take piano lessons, which I did for many years. I haven't kept up with it, but all that practice made my fingers quite nimble, and I am sure it contributed to my typing ability, which has been valuable to me all these years. I have certainly found it to be an advantage since personal computers took over the workplace. Many of my colleagues still hunt and peck, but I took typing in junior high, and by then I already had the advantage of several years of piano lessons under my belt.

Of course, typing alone is not the same as writing. Simply stringing words together in grammatically correct sentences is not the same as writing unless you explore related ideas and themes. That is something I have worked on for years, and I really think it has paid off. I have people who read my blogs all over the world. Some sign up as followers who are notified whenever I post something new; others just pop in from time to time to catch up on what I've written.

Occasionally, they write to me. One wrote, "I can't wait to see what you will write about next."

I suppose that sums up how I feel about writing. I often know what I want to write about; I just don't know what I will say about it until I sit down and write.

That is the pleasure I get from writing — discovering what I think or how I feel as a result of writing about it. Sometimes I honestly do not know how I feel about something until I start writing about it. Sometimes, I am as surprised as my readers at what I think.

And it is appropriate to think about that on Christmas — because that is a gift my mother gave me.

Thanks, Mom.

Saturday, December 6, 2014

A Nation of Witch Hunters

When I was growing up, "innocent until proven guilty" was practically a mantra whenever someone was accused of a crime. Even if everyone knew the accused was guilty, it simply was not considered American to speak of someone as guilty until a jury had reached that conclusion.

That, after all, was the kind of thing the early settlers came to America to escape (and then, ironically, engaged in their own witch hunting in Salem, Mass.).

The newsrooms where I worked in my newspaper days were always sensitive to that. For a time, when I was a police/courts reporter, my editors always reminded me, when I came to the newsroom to write about the day's proceedings in court, to refer to the defendant as "the accused" or "the alleged" until the jury reached its verdict.

Even if we knew the defendant was guilty. We couldn't say so until it was official — meaning that a jury had reached that conclusion.

Saying so in print only made it seem — and rightly so — that the press had already reached its conclusion. To hell with the jury.

That has never been the role of the press. The press' job is to be the eyes and ears of the community. The newspapers for which I worked, as I say, were always very sensitive about that kind of thing. They earnestly sought to maintain an aura of neutrality, and most of the reporters with whom I have worked would have bristled at the suggestion that they were not absolutely fair.

It's been a while since I worked in a newsroom, so I don't know when that began to change. All I know is that it did — tentatively at first, probably, but growing progressively bolder as the press began to discover that no one was going to hold it accountable for prejudging criminal defendants.

Even if the press was wrong.

Today, all that is needed for the public to turn on someone is for someone else to say something. Anything. Doesn't matter if it is true. It is accepted at face value. Look how quickly people have turned on Bill Cosby, one of the most beloved entertainers of his day. He has been accused of truly reprehensible behavior. If those accusations are true, he should be held accountable. But they haven't been proven in court, which is where every American who is accused of something is entitled to face his/her accuser and defend himself/herself against the charges if possible. That's what the people who braved the unknown to settle this land wanted.

Well, at least, that's how it used to be.

And how about the case of cable TV cooking star Paula Deen, who admitted using the "N word" many years ago and apologized profusely — only to be driven from the airwaves anyway by those whose only motive appeared to be a desire to see how the other half had been living all these years, not a quest for justice?

In Ferguson, Missouri, the grand jury, as you undoubtedly know, has been investigating the August shooting death of Michael Brown, an 18–year–old black man. The grand jury's decision not to indict the white police officer who shot Brown sparked riots and looting.

If you look at the transcripts of the grand jury proceedings, you will see that most of the witnesses' accounts supported the officer's version of events — and most, if not all, of those witnesses were black. The facts simply did not support accusing the officer of a crime and spending who knows how many taxpayer dollars in a futile attempt to convict him.

And that is what grand juries really are designed to do — filter the unsupported cases from the supported ones. Do you believe that there are too many frivolous cases clogging up the judicial system? Grand juries have been doing their part to keep the frivolous cases out of the system in this country for a couple of centuries. If you think it is bad now, try living in an America that doesn't have grand juries to serve as courthouse gatekeepers.

Apparently, though, there are people who — to misquote Jack Nicholson — can't handle the truth. In spite of the testimony of those witnesses, there are still people who say justice wasn't served — and that race was the reason.

That is mere speculation unless there is proof to support it. Astonishingly, there are people who continue to cling to claims that have been recanted, citing them as evidence in this case — when, in fact, they are no such thing.

Things are a bit murkier in the choking death of Eric Garner in New York in July. I haven't seen those grand jury transcripts, and I would like to because it could give me some insight into the jurors' mindset. From looking at the video, it appears that, at the least, a charge of negligent homicide might be in order — but a video doesn't tell you everything you need to know.

Videos do help, of course, and I like the idea of equipping police officers with body cameras so investigators can see precisely what the officer saw when something like this happens. It's a worthy goal, but Barack Obama's pledge to provide federal funds to help police departments pay for such cameras is one more example of how Obama ignores feasibility in order to pursue what he believes would be an ideal world.

America is already $18 trillion in debt. The wise thing — the prudent thing — would be to focus on bringing down the debt, not adding to it. Hard choices must be made. Such choices almost always involve sacrifice, and, in the last six years, many Americans have had to make sacrifices they never thought they would have to make. Their leaders must give careful consideration before asking for more.

Of course, homicides aren't the only things getting attention these days. There have been a couple of cases of rape — or, rather, alleged rape — in the news. Now, don't get me wrong. I'm not saying that rape is anything other than what it is — an act of violence. But it is the kind of charge that sticks to someone even if he's been cleared.

I covered a rape trial once. The defendant was acquitted, but he was forever linked to the charge. He lost his job, couldn't find another one locally and, eventually, had to leave town. I've always hoped he was able to pick up the loose threads of his life and get back on track.

I also left that experience thinking that, if newspapers voluntarily withhold the names of alleged rape victims (and that is a voluntary thing — it is not mandated by law — freedom of the press, don't you know), they should also withhold the names of the accused until they have been convicted.

Rape is an incendiary charge. Bill Cosby, as I have pointed out, hasn't been convicted. He hasn't even been formally charged, yet his long–time associates are throwing him under the bus, one after the other. Maybe they're right to do so. But what if they are wrong?

Yes, sexual assault is an incendiary charge. It must be handled judiciously, which makes the case of actress Lena Dunham both fascinating and troubling.

For the last couple of months, Dunham has been hawking her memoir, "Not That Kind of Girl: A Young Woman Tells You What She's 'Learned,'" which includes her account of an occasion when she was raped.

Well, to be fair, she never actually accuses anyone of rape. But she does recount an evening of what is best described as non–consensual sex.

Dunham, in case you don't remember, made advertisements for Obama's re–election two years ago. Those advertisements were intended to appeal to young voters, equating casting one's first vote with losing one's virginity.

I do not mention that to explain any conclusions I may have reached about Dunham or her moral compass or anything like that — I think most readers are capable of doing that on their own — but because her political leanings are important to remember in the context of a portion of her narrative. I refer to her description of an occasion when she claims to have been raped by a prominent "campus Republican" named Barry when she was a student at Oberlin College.

Oberlin is in Ohio and, from what I have heard, put the liberal in "liberal arts." Just about any Republican would stick out like a sore thumb there.

Her account was praised for its "truthiness" in TIME back in September. More recently, though, it has been effectively debunked by John Nolte of Breitbart.

Now that the reliability of the story has been brought into question, Eugene Volokh of the Washington Post wonders if this prominent "campus Republican," identified in Dunham's book as "Barry," has grounds for legal action against her.

The most egregious example of this willingness — nay, eagerness — to blindly accept anything that is said could be found in the pages of Rolling Stone last month. The article described the horrific gang rape of a woman identified as Jackie at a University of Virginia frat house.

There were angry protests and the school suspended all fraternity activities for a year. Those would be appropriate responses except for one thing — "there now appear to be discrepancies" in the account, Rolling Stone's managing editor says. More than a few, actually. There are more holes in the story than you'll find in the average block of Swiss cheese.

As a journalist, I am embarrassed by the blatantly sloppy fact checking. It is shoddy journalism, and it is inexcusable.

Rolling Stone's managing editor was right to acknowledge that the "failure is on us," but the mistakes were so basic that a first–year journalism student, never mind a newsroom full of seasoned vets, would have spotted them.

The thing that concerns me, though, is this: What if the editors at Rolling Stone knew in advance about the problems with the story, and they gambled that no one would call them on it? That it wasn't sloppiness after all?

I am reminded of the bogus charges leveled by Tawana Brawley against a group of white men back in the late '80s. Do you happen to recall who one of her chief supporters was? Al Sharpton.

Thursday, November 20, 2014

The Fine Art of Compromise ... and Lost Opportunity



"The trusts and combinations — the communism of pelf — whose machinations have prevented us from reaching the success we deserve should not be forgotten nor forgiven."

Letter from Grover Cleveland to Rep. Thomas C. Catchings (D–Miss.)
August 27, 1894

I have mentioned here that I have been studying the presidency most of my life.

And Grover Cleveland has always fascinated me. He always stood out because he was — and still is — the only president to serve two nonconsecutive terms. (He was also president half a century before presidents were limited to two terms — so, presumably, he could have sought a third term in 1896, but his party repudiated him. More on that in a minute.)

I have found it fascinating, too, to observe all the different presidents in American history to whom Barack Obama has been compared.

That didn't really begin with Obama. Incoming presidents are almost always compared to presidents from the past. I don't know why. Maybe to try to get an idea of what to expect. There have been no other black presidents, so Obama couldn't be compared to anyone on a racial level.

When he was about to take the oath of office for the first time, Obama was compared, at different times and for different reasons, to great presidents from American history like Abraham Lincoln and Franklin D. Roosevelt.

Lincoln, of course, was a natural, having presided over the Civil War and issued the Emancipation Proclamation. The Roosevelt comparisons came mostly because FDR had taken office during the most perilous economic period in the nation's history. There were comparisons, too, to John F. Kennedy, perhaps because both men were young and their elections made history.

Over the course of his presidency, Obama has been compared to less accomplished presidents. In recent years, it has frequently been asked if he is more incompetent than Jimmy Carter, who is generally regarded as the most incompetent president in recent memory.

Six years ago, about three weeks before Obama took the oath of office the first time, political analyst Michael Barone suggested that Dwight Eisenhower might be the more appropriate comparison, and I wrote about that.

Barone's point was that Eisenhower had done little to help his fellow Republicans, many of whom "grumbled that Ike ... was selfish."

"Eisenhower, I suspect, regarded himself as a unique national figure," Barone wrote, "and believed that maximizing his popularity far beyond his party's was in the national interest."

I was reminded of that tonight when I heard Obama's speech on immigration. Many congressional Democrats are supporting the president — publicly, at least — but some are not. Regardless of the negative ramifications of his executive order — and a poll conducted Wednesday night indicates that nearly half of respondents oppose Obama's acting via executive order — Obama seems determined to prove that he is still relevant.

The speech came a mere two weeks after Democrats lost control of the U.S. Senate in the midterm elections. It seems to me that a president who was more concerned about his party's future than his own would act more prudently. Bill Clinton, after all, lost control of both chambers of Congress in the midterms of 1994, and Democrats didn't regain the majority in either chamber for 12 years.

Clinton did manage to retake some of his party's lost ground when he ran for re–election in 1996, and, defying all logic, his party gained ground again in the 1998 midterms even as the Republicans were moving to impeach him.

I've always felt that a lot of that was because Clinton was appropriately chastened by his party's massive losses in the midterms. I felt, at the time, that many of the voters who had voted Republican in 1994 believed Clinton had learned an important lesson and were more open to supporting him and the members of his party in 1996.

Obama has now been through two disastrous midterm elections, and he has emerged from the second not chastened but defiant. He appears to be entirely ready to do everything on his own, completely ignoring the role that the Founding Fathers intended for Congress to play. An opportunity to let compromise and cooperation be what the Founding Fathers envisioned in their fledgling republic is being squandered.

Once such an opportunity is lost, once such a president takes this kind of approach, it is hard, if not impossible, to establish a rapport with the other side.

Obama isn't the first to do this, which brings me back to Grover Cleveland. A little background information is called for here.

Cleveland was first elected president in 1884. He was the first Democrat elected to the office in more than a quarter of a century — in spite of the revelation that Cleveland had fathered a child out of wedlock. It was close, but Cleveland managed to pull it off.

Four years later, when Cleveland sought a second term, conditions were good. The nation was at peace, and the economy was doing pretty well, but there was division over the issue of tariff policy. The election was another cliffhanger. Cleveland again won the popular vote by a narrow margin, but his opponent, Benjamin Harrison, received enough electoral votes to win.

So Cleveland left the White House in March 1889, but he returned as the Democratic nominee in 1892 and defeated Harrison. It was the second time a major party had nominated someone for president three straight times. The first, Andrew Jackson, also won the popular vote all three times; like Cleveland, though, Jackson was denied the presidency once, in his case in 1824, when he won a plurality but not a majority of the electoral vote and the House of Representatives chose John Quincy Adams instead.

Perhaps it was the experience of having been returned to the White House after losing the electoral vote four years earlier that contributed to Cleveland's messianic complex. To be fair, it would be hard not to feel that there was an element of historical inevitability at work.

But that doesn't really excuse how Cleveland approached the outcome of the 1894 midterms.

One cannot tell the story of the 1894 midterms without telling the story of the Panic of 1893 for it defined Cleveland's second term as well as the midterms. It was the worst economic depression the United States had experienced up to that time. Unemployment in America was about 3% when Cleveland was elected in 1892. After a series of bank failures, it ballooned into double figures in 1893 and stayed there for the remainder of Cleveland's term.

The depression was a key factor in the debate over bimetallism in 1894. Cleveland and his wing of the Democratic Party were known as "Bourbon Democrats," supporters of a kind of laissez–faire capitalism. They supported the gold standard and opposed bimetallism, in which both gold and silver are legal tender.

The economy was already the main topic of the campaign, and a major coal strike in the spring didn't help. In fact, it hammered the fragile economies of the states in the Midwest and the Northeast. Republicans blamed Democrats for the poor economy, and the argument found a receptive audience.

Republicans gained House seats just about everywhere except the Southern states, which remained solidly Democratic, and states where Republicans already held all the House seats. Democrats went from a 220–106 advantage to a 104–226 deficit. It remains the most massive shift in House party division in U.S. history.

Under circumstances such as these, a president has two choices — he can be conciliatory and try to move to the political center, as Bill Clinton, George W. Bush and Ronald Reagan did, or he can dig in his heels and be even more intransigent.

Much as Obama is doing 120 years later, Cleveland chose the latter approach after the midterms in 1894. Perhaps he felt he had no allies in Washington anymore, but I've always felt his go–it–alone approach was a big reason why he was repudiated by the Democrats in 1896. The fragmented party chose instead to go with William Jennings Bryan, who would be nominated three times and lose each time. In fact, with the exception of the Woodrow Wilson presidency, no Democrat would win the White House for the next 36 years.

For that matter, they didn't regain the majority in the House until the 1910 midterms, but they lost that majority six years later in spite of the fact that President Wilson was at the top of the ballot. It took the stock market crash of 1929 to restore Democrats to majority status in the House in the midterms of 1930.

That is one cautionary tale that emerges from this year's midterms. Another is the exaggerated importance given to the turnout. I know it is a popular excuse to use after a party has been slammed in the midterms, but it is misleading.

In 2006, when Democrats retook the majority in both chambers for the first time in 12 years, they treated it as a mandate for change. But roughly the same number of voters participated in 2006 as participated in 2014. Granted, there has been an increase in the overall population in those eight years, so the share of registered voters who participated is different, but the overall numbers are about the same.

Republicans, too, pointed to low turnout in 2006. My advice to them would be not to duplicate the Democrats' mistake. They believed their success was permanent — and it never is in politics.

It can last longer, though, if you lead.

Saturday, November 15, 2014

The Anniversary of the 'In Cold Blood' Killings



"Now, on this final day of her life, Mrs. Clutter hung in the closet the calico house dress she had been wearing and put on one of her trailing nightgowns and a fresh set of white socks. Then, before retiring, she exchanged her ordinary glasses for a pair of reading spectacles. Though she subscribed to several periodicals (the Ladies' Home Journal, McCall's, Reader's Digest and Together: Midmonth Magazine for Methodist Families), none of these rested on the bedside table — only a Bible. A bookmark lay between its pages, a stiff piece of watered silk upon which an admonition had been embroidered: 'Take ye heed, watch and pray: for ye know not when the time is.'"

Truman Capote
In Cold Blood

They happened before I was born, but the murders of the Clutter family 55 years ago today in Holcomb, Kansas, still have the power to grip people.

I re–read Truman Capote's riveting account of those murders, "In Cold Blood," about a year ago. I was just as engrossed by it as I was when I first read it in college. As a reading experience, it reminded me of Vincent Bugliosi's account of the Manson Family murders, "Helter Skelter."

Capote did a lot of writing in his life, but "In Cold Blood" was the book he was born to write. It seems almost like the kind of book that would write itself, that all it needed was a person to be the go–between. But writers are a funny sort, and my understanding is that Capote agonized over aspects of his book. Some writers are like that. The creative process makes impossible demands on them.

So writing "In Cold Blood" may have been a very emotionally trying experience for Capote. It may have been unimaginably wrenching to try to put everything on paper. I know it took a while for him to finish it. Some writers find it very difficult to achieve the level of detachment that is necessary to write about unpleasant things. It is often essential, I have observed, to be detached in the news business. You must express in print the shock and revulsion people feel upon hearing about such things — without letting those things affect you personally. It is why many talented writers don't make it as news writers.

Such a level of detachment must have been necessary for the local officials who investigated the murders. In a small town like Holcomb (which, more than half a century later, has a population that barely exceeds 2,000), everyone knows everyone else, and Herb Clutter, the family patriarch, was a pillar of the community. He was a farmer, he hired people to work on his farm, and, by all accounts, he treated them well. He was rumored to be very wealthy — after all, he didn't drink or smoke. Had no vices of any kind, as far as anyone could tell. He was also rumored to keep all his money in a safe in his home.

At least, that is what one fellow in particular had heard. This fellow had worked for Clutter about 10 years earlier and told a jailhouse cellmate about him and the money he supposedly had in his remote country farmhouse. Truth was, Herb Clutter didn't have a fortune in his home. He didn't have a safe, either. This cellmate didn't know that, though, and he started planning to rob this farmer as soon as he and another buddy of his were released.

Fifty–five years ago, they were both free, and they made their way to Holcomb, where they intended to rob the Clutters. When they discovered that there was no safe and no fortune, they could have left and, in all probability, never been charged with a crime. Instead, they killed each member of the family so there would be no witnesses and left with $42 in cash, a radio and a pair of binoculars.

The crime shocked America, which was a more innocent place (at least, it seems so in hindsight) in the 1950s than many people today realize — even with all the jokes that are made about the simplicity of that decade. It's my opinion, though, that the difference between that time and today is the level of technology. I doubt that shocking crimes happened any less frequently then than they do today; people just didn't hear about them as much.

Nearly two years earlier, the nation was transfixed by the murder spree of Charles Starkweather and Caril Ann Fugate, the inspiration for "Natural Born Killers." It must have taken a lot to transfix the nation in those days. TVs were not fixtures in every American home in those days — maybe 60% would be my guess. Cable didn't exist, nor did the internet. The primary sources for news and information probably were newspapers and radio.

Those same news sources must have been the primary sources for most Americans when the Clutter family was killed, and the word spread so far that it reached Truman Capote via the New York Times — and he and his lifelong friend, Harper Lee (author of "To Kill a Mockingbird"), traveled to Holcomb to do research for a book on the case.

What is often lost in the telling of the murders is the fear that the victims must have experienced in those early morning hours. They did what people are usually told to do if they are abducted — cooperate with your abductor, do whatever you must to stay alive. Yet, they did not live through the night.

Their deaths led to Capote's book and at least two movies of which I am aware. For Capote, of course, it was a career–defining book — which has been criticized frequently since its publication for fabricating conversations and scenes it described. Sometimes that was obviously necessary, given that it described conversations and/or scenes that no living person could verify. But sometimes Capote appears to have deliberately misquoted some people whose versions of events did not support his narrative.

Sometimes that wasn't terribly important to the story; other times, though, it was. That seems to be how it is with the new journalism, the nonfiction novel.

One fact cannot be changed or fabricated. The Clutter family has been dead for 55 years.

Sunday, November 9, 2014

The Day the Wall Fell Down



Unless you are at least 60 years old today, you probably have no memory of the time before the Berlin Wall went up. It was 25 years ago today that the wall was brought down, fulfilling Ronald Reagan's famous 1987 challenge to "tear down this wall."

If you are under 30, you almost certainly have no memory of a time when the Berlin Wall did exist.

But, for anyone who remembers most or all of the years between 1961 and 1989, the Berlin Wall was a constant reminder of the tensions between East and West.

It was a fact of life for seven presidents, from John F. Kennedy, whose administration witnessed the construction of the wall in the summer of 1961, to George H.W. Bush, whose administration saw it fall 25 years ago today.

Most Americans — regardless of age — probably had no idea the wall was about to fall, probably had no understanding of the events in that part of the world that were leading to this day. My memory is that it caught most Americans by surprise. They had heard Reagan's plea a couple of years earlier — if they were old enough, they remembered Kennedy's "Ich bin ein Berliner" speech in the shadow of the wall two years after its construction — but such speeches were mostly regarded as symbolic, valuable as propaganda for stirring up the masses. Just as the wall itself was a symbol. I guess Americans were conditioned to believe the wall would always exist. The Berlin Wall took on the same kind of mythical aura as the Great Wall of China — with the added value of armed guards. It was there. It would continue to be there. Never mind that it had not always been there.

("Whatever happened to the kind of inspirational presidential oratory that helped bring down that wall — and Soviet communism?" wonders USA Today's Rick Hamson.)

After it happened, it was easy to see — as it always is — the progression of events that led to that moment. But, before it happened, the collapse of the Berlin Wall was seen as, at best, wishful thinking and, at worst, delusional fantasy.

Personally, I never thought it would happen. I couldn't imagine a world with a unified Berlin. And today I can't imagine a world in which the wall could be resurrected — yet, with Russian aggression in the Ukraine and militant Muslim aggression in the Middle East, one can only wonder if the last 25 years have been merely an interlude.

Freedom, the adage says, isn't free.

Is it possible there could be another wall — perhaps not in Berlin but somewhere else?

Saturday, November 8, 2014

No Way to Run a Railroad



"It is better to be roughly right than precisely wrong."

John Maynard Keynes

And so it begins again. One political party was slapped down by the voters, and everyone wonders if this is the end of that party. Eight years ago, they wondered the same thing about the Republican Party. It has happened several times in my lifetime. It will happen again. That is the one sure thing about politics in America. Success is never permanent, yet still the recriminations come. The finger pointing begins.

In spite of evidence to the contrary, people always want to believe that one party or the other is on the verge of extinction. In the aftermath of Watergate, it was popular to wonder if the Republican Party was dying. A decade later, when Ronald Reagan and the Republicans were ascendant, people wondered if the Democrats were finished. The pendulum seems to swing one way, then the other, about every 8–10 years.

The finger pointing happens every election cycle, though. It's like clockwork. Every election. Doesn't matter which party is judged the overall winner and which is judged the overall loser — or by how much (although I think everyone pretty much agrees that this was a decisive setback for the Democrats). At one time or another, each has been both, and the same scenario has been played out.

Well, this time it is a little different. Whoever the president happens to be is usually humbled, chastened by the experience, because it is almost always the president's party that suffers in the midterms — especially when the president has told the world that his policies are on the ballot even if he isn't. That's like when Gary Hart dared reporters to follow him in search of evidence of infidelity. They did, and they found out that he was being unfaithful to his wife.

And that was the end of Gary Hart's presidential ambitions.

Most presidents have been too smart to remind voters that their policies were being judged — and hand the opposition a neat little sound bite in the bargain — especially when their approval ratings were in the crapper.

In light of the fact that Obama did precisely that, though, there really is no other way to view Tuesday's results except as a rejection of those policies from sea to shining sea. This wasn't simply the South throwing a tantrum. Polls showed Senate races in the South to be close; they weren't. Polls showed Democratic incumbents outside the South had a good chance of winning. They didn't.

This wasn't merely a wave election. This was a tidal wave election.

The numbers on the federal level tell a somber story for Democrats. Since Obama first took the oath of office in January 2009, Democrats have lost more than 60 seats in the House and about 15 seats in the Senate. The numbers aren't official yet, but that is how it is looking. On the state level, Democrats were expected to gain ground, since Republicans were defending more governorships than Democrats were, but Republicans picked up governorships instead. That is about the worst showing for any two–term president in American history, and the losses weren't just on the federal and state levels, either. They were local, too.

For a party that insists on living in the 19th century, the Democrats have, appropriately, fallen to controlling "the lowest number of state legislatures since 1860," reports Reuters. This was across–the–board, top to bottom repudiation.

The president says he will work with the members of the victorious opposing party — because that is clearly what the voters want, and it is probably what most people expected to hear from Barack Obama's post–election press conference.

"No one reasonable expected the president to grovel," wrote Megan McArdle for BloombergView, "but it seemed reasonable to think that he'd seek whatever narrow ground he and Republicans can share."

Think again. This president has burned so many bridges with the Republican leadership that one wonders if there are any left. If there is one left standing, it seems to me it is the immigration issue. There is common ground to be found there, and presidents with experience in consensus–building — like, for example, Bill Clinton — would see this as an opportunity. But Obama seems determined to drench this bridge in gasoline and strike a match before anyone tries to cross it.

The prudent thing for a president to do in this situation would be to encourage compromise. Obama's presidency only has two years left, and, like it or not, this is the Congress with which he must work. Both sides must be willing to give a little to get a little.

But that is a lesson in leadership that Obama never learned.

Thursday, November 6, 2014

Reagan's Resounding Re-Election



Thirty years ago tonight, the nation witnessed its most recent classic national landslide when President Ronald Reagan won 49 of the 50 states against former Vice President Walter Mondale. When the numbers were counted, Reagan had more than 58% of the popular vote and more than 500 electoral votes.

Other presidential elections have been labeled landslides, but they weren't really — not by the statistical definition of a landslide. The generally accepted benchmarks for a landslide are that a candidate receives (1) at least 55% of the popular vote, (2) at least 400 electoral votes and (3) more votes than anyone else in at least three–fourths of the states.

In 1988, Reagan's vice president, George H.W. Bush, came closest of anyone since Reagan's time to winning a true landslide. Bush carried more than three–fourths of the states, worth more than 400 electoral votes, but his popular vote tally was 53%. If he had won about 1.5 million of the votes that went instead to Democrat Michael Dukakis or other candidates on the ballot, Bush could have claimed a legitimate landslide.

Bill Clinton's victories in 1992 and 1996 have been mentioned as landslides, but Clinton never exceeded 50% of the popular vote, nor did he win at least 400 electoral votes or carry three–fourths of the states.

George W. Bush was the winner of two cliffhangers. Barack Obama's margins were larger than Bush's, but he didn't meet any of the three requirements for a landslide, either.
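Those three benchmarks are concrete enough to put into code. Here is a minimal sketch in Python (the function is my own illustration of the three tests described above, not any official definition; the vote figures are the historical totals):

```python
# Checks the three landslide benchmarks discussed above:
# >= 55% of the popular vote, >= 400 electoral votes, and a plurality
# in at least three-fourths of the 50 states.

def is_landslide(pop_pct: float, electoral_votes: int, states_carried: int) -> bool:
    return (pop_pct >= 55.0
            and electoral_votes >= 400
            and states_carried >= 0.75 * 50)

print(is_landslide(58.8, 525, 49))  # True  -- Reagan 1984 clears all three
print(is_landslide(53.4, 426, 40))  # False -- Bush 1988 fails only the popular-vote test
```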

It isn't easy to win by a landslide. Frankly, it is hard enough for most candidates simply to win. But Reagan was one of those people to whom triumph seemed to come easily. That impression was misleading, though. Reagan had his share of setbacks earlier in his life. Most Americans — outside the Californians who knew him as their governor — only really knew him in his later years, when things really did seem to come easily to him.

Reagan had his issues, and there are those who claim to this day that, when he won his second term, he was already experiencing the early stages of the dementia that eventually took his life, but his electoral accomplishments are beyond dispute.

His first election had been impressive — beating an incumbent president by 10 percentage points and sweeping all but half a dozen states — but his second election was resounding. It left no room for doubt about who was preferred by the voters.

Teddy Tosses His Hat in the Ring



Through much of American history, if a sitting president wanted to be nominated for another term, it was his. Incumbent presidents have seldom been challenged from within their own party, no matter how much of a mess they may have made of things.

But, for a while there in the latter part of the 20th century, an incumbent president could not depend on that.

In 1968, President Lyndon Johnson faced an insurgent challenge from Sen. Eugene McCarthy. Primaries were not the place where most delegates were won in 1968, but McCarthy did far better than expected against Johnson in the New Hampshire primary, and Johnson announced shortly thereafter that he would not seek another four years in the White House.

In 1976, President Gerald Ford was challenged by former Gov. Ronald Reagan in a down–to–the–wire fight for the GOP nomination that wasn't resolved until the party's convention that summer.

And four years later, President Jimmy Carter faced a challenge from Sen. Ted Kennedy that began — officially — in Boston's famed Faneuil Hall on this day in 1979.

It was a moment that most, if not all, political observers never expected to witness after the Chappaquiddick tragedy 10 years earlier. In the 13 months following Bobby Kennedy's assassination, nearly every pundit of the time expected Teddy to pick up his brothers' dropped torch and seek the presidency, but after Chappaquiddick his name was seldom mentioned in connection with the office.

Even before Kennedy jumped into the race, I wondered why he was doing it. He hadn't really seemed to desire the presidency earlier in his political career. He seemed content to leave that to his brothers. But his brothers were gone, and I believe Ted felt obligated to seek the presidency on their behalf. He never seemed to take any joy from the campaign.

And, frankly, I sensed a certain relief on his part when it became official that he would not be the party's nominee. He acted disappointed in public, but I suspect that, privately, he was relieved. He had given it a shot, and he had fallen short.

He had done his duty, and he never sought the presidency again — even though his speech to the delegates at the Democratic convention left the door open for another run sometime in the future.

Tuesday, November 4, 2014

The Beginning of the Iranian Hostage Crisis



Today is Election Day in the United States.

It is also the 35th anniversary of an event that had a tremendous influence on the election that was held the next year, in 1980, and many elections to come. It still influences thoughts and acts in the 21st century.

I'm speaking of the takeover of the American embassy in Iran on Nov. 4, 1979.

I guess the world of 1979 seems quaint when stacked up against the world of today. In today's world, an American diplomat can be killed in an attack on a U.S. embassy, and many Americans won't even bat an eye. But, in 1979, the takeover of an American embassy was a shock to complacent Americans.

The only real interaction Americans had had with the Middle East was over the price of oil. Now, they were faced with political Islam, and they had no idea what to do.

Not unlike Barack Obama's experience with Benghazi, the Jimmy Carter administration was warned by the embassy in Tehran that radical Islamists would attack it. This warning came only weeks before the actual takeover. After the Islamic revolution, the American–supported shah of Iran had fled the country, eventually settling in Mexico, where it was discovered that he was suffering from cancer. It was recommended that he be allowed into the United States for treatment.

The embassy warned Washington that it would be overrun by radical Islamists if the shah was allowed into the United States. Carter admitted the shah anyway, and the embassy was taken over.

We may not know how Obama reacted to Benghazi until after he leaves office and writes his memoirs — if then. According to Carter, he agonized over the hostage crisis. "I would walk in the White House gardens early in the morning," Carter wrote in his memoirs, "and lie awake at night, trying to think of additional steps to gain their freedom without sacrificing the honor and security of our nation."

Carter did mention the warning in his memoirs, observing that Secretary of State Cyrus Vance told him in early October that diplomat Bruce Laingen was reporting that "local hostility toward the shah continues and that the augmented influence of the clerics might mean an even worse reaction than would have been the case a few months ago if we were to admit the shah — even for humanitarian reasons."

One by one, Carter wrote, his foreign policy advisers sided with allowing the shah into the United States for medical treatment. "I was the lone holdout," Carter wrote. Eventually, though, he relented, permitting the shah into the country. Less than two weeks later, a group of Iranian students, believing that the move was part of a plot to restore the shah to power, stormed the U.S. embassy.

Nov. 4, 1979, was "a date I will never forget," Carter wrote. "The first week of November 1979 marked the beginning of the most difficult period of my life. The safety and well–being of the American hostages became a constant concern for me, no matter what other duties I was performing as president."

Nevertheless, he believed initially that "the Iranians would soon remove the attackers from the embassy compound and release our people. We and other nations had faced this kind of attack many times in the past but never, so far as we knew, had a host government failed to attempt to protect threatened diplomats."

Things were different this time, though. The hostages were held through the next year's presidential election and were not released until after Ronald Reagan, Carter's successor, had been sworn in. Iran insisted the captors had treated the hostages well, and many Americans took solace in the belief that their countrymen did not suffer needlessly — but stories of beatings and torture eventually emerged.

Carter probably will be forever linked in the public's memory to the Iranian hostage crisis, just as Richard Nixon is linked to Watergate and Lyndon Johnson is linked to Vietnam. For many Americans, it summed up the feeling of powerlessness with which they were all too familiar.

President Carter — by that time former President Carter — flew to Germany to greet the hostages, who had been released within minutes of Ronald Reagan being sworn in as his successor. Apparently, the hostages were divided over whether they held Carter responsible for their ordeal. Carter hugged each one as he greeted them, but some let their arms hang at their sides, refusing to return the hug.

It reminded me of the scene at the Democrats' convention the previous summer, when Carter brought everyone of note in the Democrat Party to the podium and shook each one's hand, even the ones with whom he had clashed, in a show of party unity. But Carter had to chase Ted Kennedy, the man who had challenged him in the primaries, around in a fruitless pursuit of the handshake he desired the most, the one that might reconcile him with disaffected Democrats.

All that was still in the future on this day in 1979. As I recall, the takeover of the embassy didn't really cause that much of a stir initially in the United States. Maybe that was because Americans just hadn't dealt with this kind of thing very much. Maybe they figured it was simply a matter of negotiating with the students who had taken over the embassy and that the hostages would be released in a day or two. That was how it usually worked out.

Not this time.

Why Do You Want to Be President?



It was a very simple question, and the answer to it is the very least that voters should know about anyone who seeks to lead the United States. Any voter can ask that question, and every voter deserves an answer to it. But when Ted Kennedy was asked on this day in 1979, he stumbled through an obviously off–the–cuff response to the one question for which he should have had a definitive answer.

"Why do you want to be president?"

If there is such a thing as a softball question in presidential politics, that is it. After all, it didn't suggest that Kennedy should not have run — although I think the results of the 1980 Democrat primaries indicate that quite clearly (Carter received 51.13% of the primary vote; Kennedy won 37.58%). It was an uphill battle from the start. Incumbent presidents are seldom challenged for their party's renomination, and they usually prevail whether the challenge is serious or not. To succeed, Kennedy needed to be able to articulate a vision the way many Americans remembered his brother doing 20 years earlier.

Kennedy swung wildly when CBS' Roger Mudd asked him that question in an interview that was broadcast on this night in 1979 — and he missed with a rambling recitation of loosely linked talking points.

It inspired one of my favorite Doonesbury comic strips, which depicted an exchange between Kennedy and reporters. I don't remember now whether the Kennedy of the comic strip was asked why he wanted to be president or about some more specific topic, but the answer was another rambling recitation. By the fourth frame of the strip, one of the reporters impatiently blurted out, "A verb, senator! We need a verb!"

There was more to it than the rambling answer, though. Kennedy had that kind of deer–caught–in–the–headlights look when Mudd asked him that question. How could he possibly have failed to prepare an answer for it? After all, he hadn't been asked to defend a bad, possibly embarrassing vote he cast in the Senate or some poor or reckless decision he had made, either professionally or personally. He hadn't even been asked about Chappaquiddick. He was merely asked why he wanted to be president. What did he want to accomplish? What was his vision for the nation?

If that isn't a softball pitch, what is?

It was an invitation to summon forth the Kennedy charisma, the soaring eloquence of "Ask not what your country can do for you." In hindsight, I believe that was exactly the kind of thing Americans yearned for in 1979 and 1980; the country sought inspiration. Mudd's question tried to coax it from Kennedy.

It didn't even summon forth a grammatically correct sentence.

Monday, November 3, 2014

Nixon's Appeal to the Great Silent Majority



"So tonight, to you, the great silent majority of my fellow Americans, I ask for your support. I pledged in my campaign for the presidency to end the war in a way that we could win the peace. I have initiated a plan of action which will enable me to keep that pledge. The more support I can have from the American people, the sooner that pledge can be redeemed. For the more divided we are at home, the less likely the enemy is to negotiate at Paris."

Richard Nixon
Nov. 3, 1969

If it hadn't been for the speeches and statements he made during the Watergate scandal, the speech that Richard Nixon gave 45 years ago tonight might have been remembered as his most significant presidential address.

The American people were divided over the war in Vietnam, a division that came to be perceived along all sorts of other lines — by race, by age, by gender, by region, by economic status — lines that were mostly, but not entirely, irrelevant.

On this night in 1969, Richard Nixon introduced the concept of the mythical "great silent majority" into the American dialogue — suggesting that, in spite of themselves, most middle–class Americans agreed with each other but were too polite to say so, and implying that he was one of them, a true–blue patriotic American who had remained silent too long. He contrasted his political realism with the vocal minority's unrealistic idealism.

He tried to strike a somewhat defiant note. "If I conclude that increased enemy action jeopardizes our remaining forces in Vietnam," Nixon declared, "I shall not hesitate to take strong and effective measures to deal with that situation." He assured his listeners that was not a threat but a promise.

Nixon appealed for their support — and urged them (again by implication) not to participate in demonstrations against the war or to side with the counterculture — on this night 45 years ago. He had been elected president a year earlier by an extremely narrow margin, and he wanted to build a consensus.

He did better than that, at least initially. Before the speech, Gallup reported that his approval rating was 56%; after the speech, his approval was up 11 points to 67%, nearly the highest of his presidency.

And, in reality, it may well have laid the foundation for his 49–state landslide re–election in 1972.

In many ways, though, I have long believed that the speech was a logical extension of the "Southern strategy" Nixon used to win the 1968 presidential election. At that time, Democrats still dominated Southern politics, but, by using subtle and not–so–subtle appeals to racism, Republicans began chipping away at the Democrats' grip on the region.

The most immediate effect was to siphon off votes on which the Democrats' presidential nominee had always been able to depend — with the most direct beneficiary being independent candidate George Wallace. The Republicans hoped to pick off a few Southern states with Wallace and Hubert Humphrey dividing a vote that almost certainly would have defeated Nixon if it had remained united. At least, that is how Nixon saw it.

Nixon learned that divide and conquer works. Wallace still won nearly half a dozen Southern states, but Nixon managed to carry Florida, Virginia, Tennessee and the Carolinas — and, in so doing, won the election. But the Southern strategy was regionally confining. "Silent majority" transcended regional boundaries.

After his speech 45 years ago tonight, Nixon used appeals to patriotism to define Republicans — and to divide groups of Americans — having already begun a process that would fulfill Lyndon Johnson's prophecy that his advocacy of civil rights legislation had handed the South to the Republicans for a generation.

In hindsight, I would say the ongoing shift from Democrat to Republican in the South truly began in that 1968 election. What other conclusion can one draw? The Democrats swept the nation and many Southern states in 1964, when Johnson faced Barry Goldwater. But, in the 1964 election results, there were clues to be found, hints about the direction the South was traveling.

It happened quietly and gradually. It certainly was not achieved overnight. But little by little, one by one, Southern states voted for Republicans on the federal level, then they began to do it on the state and local levels as well. And, one by one, Republicans picked off each state.

Tomorrow, my home state of Arkansas quite likely will be the last Southern domino to fall, with Sen. Mark Pryor appearing poised to lose his bid for a third term — perhaps the last time that Nixon's political legacy will be felt in America.

The Southern strategy has seen some backsliding in recent elections, though. Virginia and North Carolina have voted for a Democrat for president for the first time in decades, Florida has voted Democratic as well, and Democrats have represented states like Louisiana and North Carolina in the Senate — but much of that can be attributed to the arrival of Democrats whose jobs have brought them there from Northern and coastal cities. So the Southern strategy may be around for a few more elections. There may still be some work to be done in some places.

But, by and large, that transition is nearly complete now.

That shift in party preference may have been the most remarkable domestic political development I have witnessed in my lifetime — the transformation of an entire region, the South, from reliably Democrat to reliably Republican. I grew up in the South; it simply went without saying that just about everyone there was a Democrat, and political squabbles came down to the liberal, conservative and moderate wings of the party. Winning the primary in the spring or early summer was tantamount to election; beating the Republican in November was a formality.

Some people would say that Republicans have always been right–wingers, but the truth is that there really was no rightward shift in Republican ideology, in the South or elsewhere, until 1980, when the Reagan campaign and the emergence of the Moral Majority combined to coax conservative Christians into politics. There were pockets of Republican support throughout the South before then, but, up until that time, I saw no real political involvement on the part of the churches — and I sure saw it after that.

Nearly all of the Republicans who were nominated in the decades before Reagan — including both Nixon himself and the man under whom Nixon served as vice president, Dwight Eisenhower — are increasingly viewed as too moderate for the modern Republican Party.

It was only after Reagan won the nomination that the momentum for Republicans in the South really became noticeable.

All that was still many years away when Nixon spoke to the "great silent majority" 45 years ago tonight. Nixon drew the line in the dirt that night.

"Let historians not record that, when America was the most powerful nation in the world, we passed on the other side of the road and allowed the last hopes for peace and freedom of millions of people to be suffocated by the forces of totalitarianism," Nixon said.

Something to think about when you watch the election returns tomorrow night.

Saturday, November 1, 2014

Prepare Yourself for the Sixth-Year Itch



When I was growing up, political observers spoke of Election Day as if an invisible army of voters marched to the polls on that one day. It was often referred to, informally, as Decision Day.

But since the advent of early voting, Election Day really is more of a deadline, a finish line if you will. Election Day in the United States is on November 4 this year. In most states, voters have been trickling in for weeks. For the most part, I guess the only ones left who haven't voted really are undecided — or they have been prevented from voting early for any of a number of reasons, like work or illness or family obligations.

In short, the decisions probably have already been made in many places. We just won't know the outcomes until the votes are counted Tuesday night.

And so the suspense, such as it is, continues.

There is no suspense in the House. Republicans are all but sure to retain the majority, perhaps even add to it. Conventional wisdom holds, though, that Republicans probably already control nearly all of the districts in which (officially or unofficially) Republicans outnumber Democrats. After the 2010 midterms, Republicans held 242 House seats, their highest number since the first Truman midterm in 1946, when the GOP held 246 seats.

Two years ago, the Republicans lost eight seats in the House, so their total now is 234 — still greater than the number of seats they held after the 1994 midterms. They would need a net gain of 12 seats to match their postwar high of 246.

I don't really pay much attention to House races besides the one in my own district. They aren't very good barometers of national trends or moods. They're primarily local races, especially in the big cities where they may cover only a few square miles — as opposed to the mostly rural Arkansas district in which I grew up, which encompassed (and still does) several counties. However large or small they may be geographically, a district's issues tend to be local in nature. What matters to voters here in Dallas County, Texas, probably will not matter at all to folks in King County, Washington, or Franklin County, Missouri.

So I don't spend much time on House races — unless there is clearly an illogical imbalance that seems likely to be reversed. There was a time, earlier in this election cycle, when the popular mindset among Democrats was that they would hold the Senate and perhaps seize a majority in the House. Those hopes took a pounding when Republicans won a special election to fill a House vacancy left by the death of the Republican incumbent, who had won more than 20 consecutive elections. Democrats believed they had a good chance to win the seat because the district voted for Barack Obama in 2008 and 2012 — and the idea of making that seat flip fueled hopes of an unlikely midterm shift in the direction of the president's party.

No one spoke of that after the special election. Larry Sabato's Crystal Ball, which is almost always accurate in its predictions, says the GOP is on course to gain nine seats, which would give the Republicans their second–highest total since World War II, eclipsing the number of seats they held after the 2010 midterms.

There is no real suspense in this year's House races, except for a handful of districts, many of which are open seats.

There is, however, a lot of suspense surrounding Senate races. The Republicans need a net gain of six seats to seize control of the chamber. Five would produce a 50–50 split, and, since the vice president votes in case of a tie, any vote that went straight down party lines would go the Democrats' way because Joe Biden would break the tie.

The only way Republicans can avoid that is to win an outright majority. That seemed much more problematic for them a year or so ago, but today Sabato says Republicans are likely to win between five and eight Senate seats. He says it is all but certain Republicans will win open seats in Montana, South Dakota and West Virginia, which have been generally conceded to Republicans for months now, as well as the seat in Arkansas.

Sabato also thinks Republicans are in a position to win Democrat–held seats in Alaska, Colorado and Iowa, but polls have been showing those races as neck and neck.

And, to further complicate matters, there are those races in Louisiana, Georgia and Kansas. Louisiana and Georgia could go to overtime, so to speak, if no one wins 50% of the vote on Tuesday. In Kansas, the incumbent Republican is facing a serious challenge from an independent who has been coy about which party he would caucus with if elected.

Republicans insist that North Carolina Democrat Kay Hagan is in trouble — indeed, recent polls show her lead within the margin of error. Heading into the final weekend of the campaign, the North Carolina race is regarded as too close to call.

As I have been saying all along, a president's approval rating is always a factor in a midterm election — especially a president's second midterm election. I'm sure everyone remembers the 2010 midterms, when Republicans took 63 seats from the Democrats. Barack Obama's approval rating was in the mid–40s just before that election. It's two or three points lower than that now.

OK, let's look at the approval ratings for presidents who were midway through their second terms. That doesn't apply to everyone, of course — only those presidents who were in office for two midterm elections. One–term presidents like Jimmy Carter and George H.W. Bush are excluded.

In 2006, George W. Bush's approval rating was mostly in the upper 30s when voters went to the polls. Democrats gained a net of six Senate seats and 30 House seats in that election.

In 1998, Bill Clinton's approval rating was in the 60s just before the election. He managed to buck the trend of the so–called six–year itch, in large part because of the public's perception of congressional Republicans having overreached in their attempt to impeach Clinton. The numbers in the Senate were unchanged as each party took three seats from the other. Democrats won a net of five seats in the House.

In 1986, Ronald Reagan's approval rating was in the 60s when the elections were held, but his party still lost eight seats in the Senate and five seats in the House.

In 1974, Gerald Ford had to preside over the midterms in the wake of Watergate and Richard Nixon's resignation. Ford's approval rating plummeted after he pardoned Nixon, but it was still in the upper 40s, even lower 50s when the elections were held. In what was likely more backlash against Nixon (as well as Ford's pardon), voters gave Democrats 49 House seats that had been held by Republicans and three Senate seats.

In 1966, Lyndon Johnson was in office for his first midterm, but it was the second of the Kennedy–Johnson years. Johnson had been elected by a landslide in 1964, but the public mood had soured in the subsequent two years, and Johnson's approval rating was in the mid–40s. Johnson's Democrats lost 47 House seats and three Senate seats.

President Eisenhower was pretty popular through most of his presidency. In 1958, his approval rating was in the low to mid–50s, but that didn't help his party in his sixth–year midterm. Republicans lost 48 House seats and 13 Senate seats.

Harry Truman wasn't elected to two terms, but, having succeeded Franklin D. Roosevelt three months into Roosevelt's fourth term, he came pretty close. He was president during the midterm of 1946 and again during the midterm of 1950. His approval rating in late October 1950 closely mirrors Obama's today. Democrats lost 28 House seats and five Senate seats.

That is the trend just since the end of World War II, but it has been repeated throughout American history. Prior to the end of World War II, the last two–term president whose party did not lose ground in both chambers of Congress in the sixth–year midterm was Theodore Roosevelt in 1906. Roosevelt had succeeded the assassinated William McKinley in the first year of McKinley's second term, and in what would have been the second midterm of McKinley's presidency (and the first of Roosevelt's), Republicans actually gained ground in both the House and Senate.

All that predates approval rating polls; we do know, however, that Roosevelt's party won three Senate seats in the 1906 midterm but lost 28 House seats. And Woodrow Wilson's Democrats lost ground in both chambers in 1918. Eight years later, in the sixth–year midterm of the Harding–Coolidge administration, Republicans lost ground in both the House and Senate.

Even Franklin D. Roosevelt's Democrats lost ground in the House and Senate in the second midterm of his presidency in 1938, two years after he was re–elected by a landslide.

The odds are always against an incumbent president's party in the second midterm of his presidency. To beat the six–year itch, a president has to have phenomenal approval ratings, which Obama doesn't have, and extremely favorable domestic and foreign conditions, which he obviously doesn't have.

I'm going to predict that Republicans win the Senate seats in (1) the three states that have been conceded to them all along — Montana, South Dakota and West Virginia — plus (2) the Democrat–held seats in Alaska, Arkansas, Louisiana, Colorado and Iowa. I also think they will hold on to their seats in Kansas and Georgia. The Louisiana and Georgia races might come down to runoffs, but, in the end, I think the Republicans will prevail.

I think Kay Hagan might be re–elected in North Carolina simply because she appears to have run a smarter race than most of her colleagues.

Thus, my prediction is that Republicans will gain eight Senate seats — enough to give them the majority in the Senate and, with it, control of both chambers of Congress.

Monday, October 27, 2014

A Rendezvous With Destiny



"If we lose freedom here, there is no place to escape to. This is the last stand on earth."

Ronald Reagan
Oct. 27, 1964

It was 50 years ago today that Ronald Reagan gave the speech that is often credited with launching his political career — "A Time for Choosing."

"There are perhaps four speeches in American history that so electrified the public that they propelled their orators to the front rank of presidential politics overnight: Abraham Lincoln's Cooper Union Address of 1860, William Jennings Bryan's 'Cross of Gold' speech at the 1896 Democratic convention, Barack Obama's keynote address to the 2004 Democratic convention and Ronald Reagan's 'A Time for Choosing' speech," writes Steven F. Hayward in the Washington Post.

You may disagree with some — or all — of those choices. I certainly do. But all should be in the conversation.

Of course, there have been people whose political careers clearly began with a single speech or a single event, but, in my experience, most followed a gradual path to political prominence — if, indeed, it could be said that they achieved prominence. And Reagan certainly did, defeating a sitting president and winning re–election by a landslide four years later.

But most went into politics — or politically oriented fields — early in life. I suppose it is somewhat ambiguous in Reagan's case. He began his professional life as an actor and spent the better part of the next three decades making movies. His first political office, I guess, came in the early 1940s, when he was an alternate member of the Screen Actors Guild's board of directors. He later served as SAG's vice president and president.

Reagan was a Democrat early in his life and campaigned for Democrats, but the last Democrat he actively supported for the presidency was Harry Truman. He supported Dwight Eisenhower and Richard Nixon before officially switching parties in 1962.

And 50 years ago today, he revealed his political ideology. It didn't help Republican nominee Barry Goldwater, who went on to lose to President Lyndon Johnson in one of the most lopsided landslides in American history, but it laid the foundation for Reagan's rise to the presidency.

"The Founding Fathers knew a government can't control the economy without controlling people," he said. "And they knew when a government sets out to do that, it must use force and coercion to achieve its purpose. So we have come to a time for choosing."

If the emergence of modern conservatism can be traced to a single event, it is Reagan's speech. He put the choice in the starkest terms he could.

"This is the issue of this election," he said, "whether we believe in our capacity for self–government or whether we abandon the American revolution and confess that a little intellectual elite in a far–distant capitol can plan our lives for us better than we can plan them ourselves."

American Rhetoric ranks the speech higher than any of Reagan's speeches as president — except the one he gave following the Challenger disaster.

But the speech that Reagan gave 50 years ago today was different, as Hayward (the Ronald Reagan distinguished visiting professor at Pepperdine University's School of Public Policy) observes.

"The Reagan whom Americans saw ... was not the avuncular, optimistic Reagan of his film roles, or of his subsequent political career that emphasized 'morning in America' and the 'shining city on a hill,'" Hayward writes, "but a comparatively angry and serious Reagan."
"In this vote–harvesting time, they use terms like the 'Great Society,' or as we were told a few days ago by the president, we must accept a greater government activity in the affairs of the people."

When I read the text of Reagan's speech today, I cannot help but see stark parallels between that time and this one, particularly with an election only a week away — as it was when Reagan delivered his speech.

As he wrapped up his speech, Reagan told his listeners, "You and I have a rendezvous with destiny. We'll preserve for our children this, the last best hope of man on earth, or we'll sentence them to take the last step into a thousand years of darkness."

A columnist for the Paris (Tenn.) Post–Intelligencer says Reagan's words "ring true to this day, though the magnitude of today's problems dwarf[s] those faced then."

That may or may not be an exaggeration. Every generation is warned that it is taking the path to destruction. It hasn't happened so far.

But the fact that it hasn't happened doesn't mean that it won't.

For that reason, I guess, messages like Reagan's "A Time for Choosing" will always find an audience, just as there will always be an audience for the message of "hope and change."

How loudly the message resonates depends upon the nature of the times — and the appeal of the messenger.