Friday, December 25, 2015

Over the Line

"I have to admit yesterday when I saw that cartoon — not much ticks me off but making fun of my girls, that'll do it."

Sen. Ted Cruz (R–Texas)

I have always been an advocate of the First Amendment.

Now, I was brought up to believe in all of the freedoms outlined in the Bill of Rights and the Constitution, but the First Amendment has always been my thing. That is no surprise, I guess, given my background; ordinarily, I will come down on the side of freedom of speech and freedom of the press over just about anything else.

When I was in college, I took what amounted to an exception–free stance. I saw no circumstances in which freedom of the press or freedom of speech could justifiably be abridged. To do so, I felt, was contrary to the concept of true liberty.

As time has passed, though, my positions have modified, and I have come to believe that there are limits. Freedom of speech does not give one the right to yell "Fire!" in a crowded theater — to actively encourage public hysteria. There is the greater good to be considered.

And freedom of the press does not give anyone the right to publish anything. People who are in the public eye are one thing. Most of them chose to be where they are — there are exceptions, of course, but I'm not talking about people who are thrust into the spotlight through no choice of their own. I'm talking about politicians, movie stars, professional athletes. They knew — or should have known — what to expect. But usually their families are off limits.

The Washington Post crossed that line with its cartoon of Ted Cruz and his two young daughters this week.

Now, it is important to remember that there is no law that prevents a publication from running a cartoon on any topic the editor and/or the editorial board desire. There is no legal obligation for any newspaper or magazine or TV program to avoid mentioning a politician's children, but there is a moral one. It is the guideline of good taste and sound judgment, and it is a line that most news outlets, regardless of their editorial leanings, will not cross. This week the Washington Post went over the line.

One can debate, I suppose, Cruz's judgment in using his children in one of his television commercials, but the truth is that he is far from the first politician to do so. In fact, I can't recall a truly serious candidate for the presidency in my lifetime, whether he was his party's nominee or not, who did not use his family in his campaign. And I can't recall a single candidate for a lesser office, from my developmental years in Arkansas through my adult years in Oklahoma and Texas, who didn't bring forth the family during the campaign. Photo ops, TV commercials, rallies — the spouse and kids were everywhere, especially if they were photogenic.

This is the first time in my memory, however, that a candidate's children were attacked editorially for participating in that candidate's campaign advertising.

The editor of the Post tried to wriggle out of it by observing that, because Cruz had used his family in a Christmas–themed political commercial, he could understand why cartoonist Ann Telnaes thought the Post's prohibition on such depictions of a prominent politician's children had been lifted, at least in this case. He admitted failing to review the cartoon before it was published and said he disagreed with Telnaes' assessment.

"When a politician uses his children as political props, as Ted Cruz recently did in his Christmas parody video in which his eldest daughter read (with her father's dramatic flourish) a passage of an edited Christmas classic, then I figure they are fair game."

Ann Telnaes
Washington Post cartoonist

But the damage has been done, and the Post now acknowledges that the episode was a "gift" to the Cruz campaign, which has criticized the media for its double standard in its coverage of Democrats and Republicans. It gives him lots of ammunition to whip up the faithful in the weeks and months ahead. It may give Cruz added momentum heading into the Iowa caucuses and New Hampshire primary.

I can only imagine the outcry if Barack Obama's daughters were portrayed in an editorial cartoon as monkeys.

Tuesday, December 22, 2015

The Return of the Bradley Effect

I have written here before of the "Bradley effect," but that was in the context of the 2008 presidential election, a time when voters were deciding whether to elect the first black president. That wasn't as inconsequential a decision as you might think now, more than seven years after the fact. After all, Barack Obama has been elected president twice now. Voters face a different kind of decision in 2016.

Simply put, the Bradley effect takes its name from the 1982 California gubernatorial campaign of Los Angeles Mayor Tom Bradley. Bradley was black, and polls prior to the election showed him leading the race. But he lost — narrowly — to Republican George Deukmejian (who won by less than 100,000 votes out of more than 7.5 million).

I suppose it goes without saying that the outcome of that election prompted a lot of soul searching, and the general conclusion that most people seemed to reach was that Bradley fared better in the polls because those who were polled did not wish to appear racist — so even if they were undecided or leaning toward the Republican, they told the pollsters that they would vote for Bradley, thus creating an artificial lead for him.

When they went into the voting booths, though, the voters did not have to concern themselves with what others would think of them, and they pulled the lever for the Republican — regardless of what they may have told the pollsters.

I bring this up because I think we could be seeing a new — and fascinating — twist on that theme in the campaign of Donald Trump. It isn't necessary to say much about Trump. So much has already been said about him, including the seemingly daily assertions that the Trump campaign has peaked, which always seem to be followed by a new poll showing Trump with even more support than he had before. Clearly, this guy is tapping into something, with his rhetoric about Muslims and immigration, but it's something a lot of people don't seem to want to acknowledge.

It's all about perceptions. Thirty–three years ago, a lot of Californians didn't want to appear to be racist. Today, perhaps a lot of people don't want to appear to be supporting a racist. Well, a perceived racist.

In case you haven't heard, a Quinnipiac University poll that was released today suggests that half of Americans would be embarrassed if Trump became the president. Could the same dynamic be at work here?

It reminds me in a way of a congressional campaign in my district in central Arkansas a couple of years after the Bradley election. There was a rather flamboyant sheriff in Pulaski County at the time named Tommy Robinson, who had apparently been angling for higher office with some publicity stunts.

Now, that district was represented for decades by Democrat Wilbur Mills, and he was succeeded by a Democrat when he stepped down, but that Democrat chose to run for the U.S. Senate after serving a single term, and a Republican was elected to the seat. The Republican held the seat for six years, then he, too, chose to run for the Senate, and Tommy Robinson announced he was running for the House seat as a Democrat. Historically, it was the right choice. Until the Republican incumbent was elected, the district hadn't been represented by a Republican since Reconstruction.

I don't remember much about the advertising in the Democratic primary (which Robinson won in a runoff) or the general election, but I do remember one of Robinson's ads. I think he used it in both the primary and the general election. It showed a series of vignettes in which friends were talking and one would say, "Who are you voting for for Congress?" and the other would say something like, "Well, you'll probably be surprised, but I'm voting for Tommy Robinson."

And that opened the door to confront all the negative stories that had been circulating about Robinson for years.

Now, as it turned out, Robinson was a crook who got caught up in the House banking scandal. But that was still several years in the future when Robinson first ran for the House.

And my memory of that campaign is that pollsters were quite certain that Robinson's Republican opponent, Judy Petty (who lost to Mills 10 years earlier), would win. Polls were showing her leading, which wasn't really hard to imagine. Petty was running as a Ronald Reagan Republican in a year when Reagan carried three–fifths of the vote in that district.

But the Gipper's popularity didn't trickle down enough for Petty to prevail. Or perhaps those people who told the pollsters they would vote for Petty actually voted for Robinson instead.

And perhaps Donald Trump, like George Deukmejian and Tommy Robinson, will have the last laugh.

Saturday, December 19, 2015

Going in the Wrong Direction?

A couple of months ago, I wrote about the "Six–Year Itch" — the clear tendency since the end of World War II for American voters to turn on the party in power by the time of the administration's sixth year in office. We saw ample evidence of that happening in last year's midterms.

I wrote that with presidential approval ratings in mind, but I have been studying the results of the elections a little more closely since I wrote that, and I have concluded that you really can't grasp the situation unless you consider another question that pollsters usually ask.

It concerns the direction of the country, and the question that Gallup tends to ask is this: "In general are you satisfied or dissatisfied with the way things are going in the United States?"

Pollsters haven't been asking that question very long, at least when compared to other basic polling questions. But it's kind of like another question that tries to gauge the country's mood — the one that asks whether people approve or disapprove of the job Congress is doing. The answer in both cases is almost always negative, resoundingly so. That is why presidential nominees' coattails are so important down the ballot in Senate and House races.

It's the trickle–down theory applied to politics, but it is still just a theory. Some presidential candidates have been more successful than others in transferring their political popularity to other candidates in their parties.

Folks who are in government service — and those who want to be — are wise to watch the results closely. Just how negative is the response? How did the out–of–power party fare when disapproval of the direction of the country was similar to the level we have today?

Obviously, the lower the better for the incumbent party, but that is seldom the case. The country consistently falls short of most people's expectations — even though, if you study your history, it is clear that this nation has always been a work in progress. To expect it to become a utopia in a single presidency is naive and unrealistic.

Well, there's a lot of that going around.

It's been more than 10 years since the question about the direction of the country got a positive response — and that positive response probably had more to do with the residual rally–'round–the–flag atmosphere following the 9–11 attacks than with anything else. Usually, more than 50% of respondents — typically far more than 50% — are negative. Dissatisfaction with the direction of the country has been over 70% most of the time for years.

Satisfaction with the direction of the country rose into the 30s in the early months of the Obama presidency, but it slipped below 30% by his first Labor Day. It went above 30% in the months before Obama's re–election but quickly fell below 30% again, then briefly returned to the 30s last winter.

Dissatisfaction with the direction of the country isn't always fatal to a president's hope of being re–elected, but it is almost always impossible for a candidate of a term–limited president's party to win if both satisfaction with the direction of the country and approval of a president's job performance are in negative territory.

Barack Obama's most recent approval figure was at 42%, his lowest level in at least a year. Combined with 71% of respondents who currently say the country is going in the wrong direction, that makes Hillary Clinton's task of becoming Obama's successor considerably more daunting.

Does that mean Donald Trump will be the next president of the United States? Possibly, but the selection of the out–of–power party's nominee is a different subject — related to presidential approval and satisfaction with the direction of the country but not governed by them. It's just that there are other factors to consider — and, while Trump's campaign has been defying political wisdom, it is important to remember that no one has voted yet. Republicans won't officially begin the process of choosing their nominee until after the holidays.

We're still more than 10 months from Election Day, and that is an eternity in politics. Much can happen, and those numbers could turn around. The window won't stay open indefinitely.

Realistically, voter attitudes tend to harden by the May before an election, so there isn't as much time as Democrats probably would like to think.

Sunday, December 13, 2015

Terrorism and Politics

"Lawmakers burst into debate over gun control, philosophers analyzed the nature of violence, and the nation was described as grieving.

"Yet 'grief' suddenly seemed like a faintly obsolete word. Nor would 'shock,' 'rage,' 'dismay' do, either. Such anthropomorphic words have been, for generations, the most convenient shorthand of political observation, inviting writers to describe millions of people as if their emotions were fused by a single spasm of 'agony,' 'despair,' 'vengeance' or 'sorrow' — as if, indeed, they were one community. But it is impossible ever to describe a great nation as if it were a community — and, in 1968, the essence of the matter was that the old faith of Americans in themselves, as a community of communities, seemed to be dissolving."

Theodore H. White, 'The Making of the President 1968'

Donald Trump's meteoric rise in the polls — in defiance of all conventional wisdom — is clearly baffling to many people (although the latest poll from Iowa hints that Trump may finally have peaked). They don't know what it means. Is it racism? Is it fascism? Should we pass more laws that would have been totally ineffective in preventing the latest massacre?

I think it is fairly easy to see what is happening in this country today — in large part because I can remember what happened in this country many yesterdays ago — and I have formed a theory about it and the 2016 presidential campaign.

I am speaking of a time when the United States really appeared to be coming apart at the seams — 1968 — when political assassinations and violence in city streets were commonplace.

I was only a child at the time, and I didn't fully understand everything that I saw and heard, but I could comprehend a lot of it. I saw TV reports of riots in the streets of big cities. I saw protesters being beaten by police, and I saw protesters throwing rocks and bottles at the police in response. I saw reports of prominent Americans being assassinated.

I knew fear and chaos when I saw them, and I see the same thing happening now.

Don't get me wrong. There was unrest all over the world. There always is — somewhere. But not usually everywhere — and that is what seemed to be happening in 1968. I'm not saying that actually is what was happening. But it sure seemed like it.

And it was frightening.

You had a pretty good idea in those days which places were best to avoid. In the summer of '68, for example, you didn't want to be near the Democrats' convention hall in Chicago.

You could avoid the obvious places for protests — but those places aren't so obvious anymore. We've seen riots recently that occurred in unpredictable places. That kind of thing tends to make people feel unsafe, you know?

So do seemingly random attacks like the one in San Bernardino, Calif., less than two weeks ago.

Now, we all know that bad things can happen to any one of us at any time. That's life. And, eventually, life is going to end for us all. We may get sick or injured and never recover, or we may be in a fatal accident of some kind. Or any of 10,000 or so other potential causes of death. (The list is virtually endless.) I think most of us have accepted that. So we continue to drive our cars to restaurants and concerts and work, always with that reality tucked away in the backs of our minds.

We know that we will never get out of this world alive. We don't like to be reminded of it on a daily basis. And we don't expect death to come when we're shopping or eating — or participating in an office holiday party.

I think it was Woody Allen who said, "I'm not afraid to die. I just don't want to be there when it happens." If we're honest with ourselves, no matter what we think happens or doesn't happen when we stop breathing, that's how we feel, too.

Of course, the kind of spiritual leaders that we have historically had here in the West — ministers, priests, rabbis — remind us that we will die, but they do so as part of a long–term campaign for souls, not to encourage listeners to hasten the day when others' souls will be won or lost — for good. That image of a pulpit used to incite killing, fairly or not, is what many Americans see in their mind's eye when they think of mosques and Muslims.

I think Barack Obama and Hillary Clinton are probably correct when they say that most Muslims are peaceful, but you can't ignore the fact that most of the recent terrorist acts, both here and overseas, have been committed by Muslims — and that causes fear. And when refugees are streaming across the border, it isn't possible to tell which ones can be trusted and which ones cannot.

I see 1968 and 2016 as being comparable. Then, as now, people felt unsafe, and they looked for a leader who would take a firm stand against what scared them. President Johnson left a vacuum in this regard, much as Barack Obama has left a vacuum; Hubert Humphrey was left holding the bag for the administration in '68, and he lost a close race to Richard Nixon — the only man in modern American history to win the presidency after having lost a previous presidential election — because the administration had repeatedly demonstrated that it didn't have a clue what to do.

And Nixon won with a third–party firebrand named George Wallace running. Wallace received more than 13% of the vote, with most of those votes coming from the South, and he carried five Southern states that almost certainly would have voted for Nixon if Wallace had not been in the race.

History tells us that the Republicans won five of the next six presidential elections — in large part because they won the battle for the hearts and minds of the voters on the issue of law and order.

I hear many Republicans fretting about Trump running as an independent if denied the GOP nomination. If that happens, the logic says, Hillary Clinton will be the beneficiary just as they allege that her husband was elected because Ross Perot got one–fifth of the popular vote in 1992. I don't think that is true. History shows that third–party candidacies, when they are most appealing to voters, tend to be a problem for whichever party is in power.

Exit–poll surveys in 1992 indicated that, if Perot had not run, Clinton and George H.W. Bush each would have picked up about 40% of his supporters, and the remaining 20% would not have voted at all. The numbers would fluctuate by state, of course, and it is fair to suggest that states where the race between Clinton and Bush was close could have swung the other way if Perot had not been on the ballot. But in the states where Clinton or Bush had decisive leads, it is unlikely that Perot's absence from the race would have meant much.

If the '92 exit polls are correct — and I have neither heard nor seen any evidence that would lead me to believe they are not — I suppose many Republicans believe Bush could have won that 20%, but I'm inclined to think those voters wouldn't have chosen from the major parties' nominees. They were drawn into the process by Perot and most likely would have receded into the shadows from which they came if he had not been on the ballot. They weren't responsive to Bush or Clinton.

A dozen years before that, in 1980, there was talk right up until Election Day that Rep. John Anderson would siphon off enough votes from both President Jimmy Carter and former Gov. Ronald Reagan to force their race into the House of Representatives. Anderson had run against Reagan in the Republican primaries before deciding to mount a third–party campaign, and he was widely praised as an alternative to the major nominees. But the experts overestimated his influence on the campaign. Anderson won no states and received only 6.6% of the vote.

Perhaps Carter would have won most of Anderson's votes if Anderson had not run as a third–party candidate, but, outside the South, where Carter lost nearly every state but held Reagan under 50% in most, it hardly seems it would have mattered. Reagan won in a landslide.

The issue right now is not whether Trump would fracture the party and allow Hillary Clinton to win next year. The issue for voters is who makes them feel safe. Trump has been successful at that. If his Republican challengers want to be relevant in the 2016 campaign, they will need to address it, too.

Because 2016, like 1968, is going to be about an increasingly insecure nation and how it deals with its greatest fear.

Sunday, December 6, 2015

A Rare Event

Tonight at 7 (Central) we will witness something that has been a rare event in this presidency — an address to the nation from the Oval Office.

People can — and do — call Barack Obama many things, but one thing no one could call him is camera shy. He seldom hesitates to say what is on his mind (which is the very definition of loose cannon, is it not? But I digress ...), yet, in fact, this will be only the third time in nearly seven full years as president that he has spoken to the American people from the Oval Office.

It took an oil spill in the Gulf of Mexico to prompt the first one. The second was a victory lap after the short–lived departure of troops from Iraq — and now it is (presumably) to clarify his position on ISIS, a fully functioning and extremely threatening terrorist group that Obama has famously dismissed as "the jayvee team" and, more recently, as being "contained."

Both assessments have been demonstrably false.

Presidential addresses from the Oval Office were commonplace for every president I can remember prior to Obama — and I'm not speaking of the weekly five–minute radio addresses that are usually delivered from the Oval Office. An Oval Office speech was usually (but not always) a good indication that the subject was an important one and every American needed to hear what the president had to say about it. Presidents from both parties misused the bully pulpit, and because the forum has been abused in the past, I can appreciate the notion that an Oval Office address should be reserved for truly significant moments and issues.

I really don't know if that is how Obama feels about it, but, whether it is or is not, the fact remains that, under Obama, the pendulum has swung much too far in the other direction. The American people have been left in the dark on too many important issues for too long. They are entitled to better than that from their leaders.

Obama's speech, of course, comes days after the horrific attack in San Bernardino, Calif., that killed 14. In the hours after that attack, Obama hemmed and hawed when asked whether it was terrorism even though his own FBI, which has followed the president's lead on terrorism in the past and hesitated to label such acts, was calling it terrorism within hours. One report observes, "This is the first sign that the Obama White House is preparing to address the threat of terrorism seriously after appearing reluctant to define the attack in California as terrorism."

That's pretty generous. And, you must admit, it is remarkably fair, a virtual acceptance that White House spokesman Josh Earnest was correct when he said that, in addition to speaking about the shooting itself, "[t]he president will also discuss the broader threat of terrorism, including the nature of the threat, how it has evolved and how we will defeat it."

I already know about the nature of the threat — and, whether they will admit it or not, I think most Americans do, too — and how it has evolved. I need no presidential lectures on those.

Personally, I'm still waiting to hear the president call this what it is — Islamic terrorism. These acts are being carried out by Islamic extremists who have interpreted the Qur'an as permission from God to kill all who disagree with them. The president has yet to acknowledge this. He and Hillary Clinton insist on reminding us that Islam is a peaceful religion, and the United States is not at war with Islam.

That's a straw man.

No one (to my knowledge) has suggested that this is a religious war. It is a war against extremists who are hell–bent on killing others. They clearly don't care about the religious beliefs of their victims. Other Muslims have been killed in their attacks as well as Christians and Jews.

The fact that these extremists, these murderers all claim to be Muslims is an identifying trait. Some people will say that is profiling, and I suppose it is, but it is also a fact that cannot be ignored. It may be a regrettable fact of modern life that we must take a closer look at Muslims who try to enter this country. That doesn't necessarily mean that Muslims who live here are being or will be denied their right to freedom of religion.

Well, I guess I can't make a blanket assertion like that. Most assuredly, there will always be bigots for whom unpleasant but necessary restrictions on certain groups are nice little byproducts.

But that is one of the things about which we need to have a national — and rational — conversation. We may also need to talk about how and whether to monitor and have legal provisions for shutting down mosques or any similar facility where violence is encouraged.

I know. This kind of thing smacks of the Nuremberg Laws, doesn't it? But the key difference, it seems to me, is that the Jews of Germany and Europe were not hijacking airplanes, attacking diners in restaurants or shooting up Christmas parties.

To deal with a modern threat it is necessary for us to label the enemy.

Identifying the enemy is the first step in defeating it. Once Obama has done that, I will listen to what he has to say about defeating it.

Until then, I have no tolerance for useless drivel about closing gun show loopholes or issuing executive orders to make it even more difficult for Americans to arm themselves.

If someone is determined to kill — and the willingness, even eagerness, of these animals to kill themselves and leave orphaned children, even infants, behind in the process is pretty good evidence of just how determined they are — whatever is available will do. These terrorists do not need guns to kill. They share information about making bombs, and they are constantly experimenting with new ways to conceal explosives. They have used knives in the past when no other means for killing were available. No doubt they would resort to throwing rocks if that was all they had.

No, they don't need guns to kill, but they won't let a minor annoyance like a gun control law keep them from getting guns if they need them.

Instead of talking about closing gun show loopholes, we should be talking about closing the other loopholes that made the San Bernardino shootings possible. There is an enormous loophole along this country's borders. If Obama doesn't think there are terrorist "sleeper cells" all across this country whose members have practically waltzed across the border, he is truly living in a fantasy world.

Everyone wants to be fair on immigration. No one wants to deny the hope of citizenship to those who truly wish to come to America and co–exist with all kinds of people. But it only makes sense to have a process in place that safeguards the people who are already here from immigrants who, knowingly or unknowingly, threaten their safety.

In the Ellis Island days, that usually meant temporarily quarantining people who might have been exposed to a deadly disease. Today, quarantining immigrants would be done with the intention of giving authorities enough time to conduct background checks.

And I'm not talking about the cursory background checks that have been conducted — if time and resources permitted them to be conducted at all — up to this point.

The time has long since passed when we could keep terrorist cells out of this country or quarantine enough of the suspicious immigrants long enough for background checks to weed out the most dangerous ones.

For tonight's Oval Office address, as rare as such addresses have been in Obama's tenure, to have any historical meaning, it must spark a serious discussion about the most effective way to keep the American people safe.

That doesn't mean belittling those who have dedicated their lives to being first responders when crisis strikes.

That doesn't mean letting political correctness overrule common sense.

It means being a leader. Under this president, who almost always leads from behind, being a true leader has been even more rare than Oval Office addresses.

Saturday, November 14, 2015

Paris Is in the Crosshairs, But the Target Is Western Civilization

I suppose I hoped that the attack on the offices of Charlie Hebdo nearly a year ago had made the threat too clear to be misunderstood or ignored. Yet the eyes of the world are drawn once again to Paris, the scene of yesterday's horrific series of coordinated terrorist attacks — because those who should have learned from that earlier experience did not.

A virtual anarchist's cookbook of tactics was on display as the terrorists struck at any place people tend to gather on an evening in Paris, one of the largest cities in the world. For centuries, Paris has been known the world over for its culture, its arts, its music, and people have been drawn there to experience it. Technology did not bring culture to Paris. Instead, Paris' culture brought technology there — and, lately, not for good.

On Friday terrorists used bombs and guns at cafes, at a stadium where a soccer match was in progress, at a theater where a concert was taking place. Even though most of the perpetrators appear to be dead now, those attacks are sure to have at least a temporary chilling effect on Paris' cultural scene — not unlike the dramatic drop in air traffic in the United States after the Sept. 11, 2001 hijackings.

Appropriately, it is the deadliest attack on French soil since World War II — and I say "appropriately" because this is a war. Too many people have been unwilling to acknowledge that — and, I am sure, many are still reluctant to do so, perhaps because they feel it is a war against Islam, which it is not.

But Muslim extremists are waging a war on Western civilization. The target today is Paris — but the real target, the objective, is the overthrow of Western civilization, and that will mean that the war, inevitably, will be waged on our soil. We did not seek this war any more than we sought a war with Japan in the 1940s, but Pearl Harbor dragged us into the conflict.

Wars are regrettable, but sometimes they are necessary to preserve a way of life.

But, at long last, we must acknowledge the fact that this war is not a conventional war. Just because there hasn't been a major attack like the one more than 14 years ago — with a high body count and lots of mayhem — doesn't mean the war is over. The terrorists are patient — and they're smart the way that criminals are always smart. They apply logic to their objectives. That was why, in 2001, they selected jets that had enough fuel for a coast–to–coast trip — they wanted plenty of jet fuel to cause maximum damage when the planes crashed into buildings — and why they chose weekdays instead of weekends to carry out their plots. They knew there would be fewer people on board to resist.

The attacks in Paris were well coordinated and indicate extensive planning. Why did they pick yesterday to carry them out? Was it in response to the United States' drone attack that killed Jihadi John? Or was it planned ahead of time, and the timing was a happy coincidence for the terrorists?

I'm pretty sure it wasn't because yesterday was Friday the 13th, but I guess you never know ...

I sympathize with the reluctance of many to see the United States engaged in a war. The Iraq/Afghanistan experience left a bad taste in many people's mouths, and it is an experience no one wishes to repeat. (Afghanistan, of course, was targeted because the terrorist attack was planned there. Iraq was different. It was a war of choice and could have been avoided. But that is a discussion for another time.)

In case you haven't noticed yet, life affords no one the luxury of controlling events. The United States has always desired peace, but outside influences sometimes force us to go to war (OK, one time it was due to inside influences). Those wars in Iraq and Afghanistan began as responses to the 9–11 attacks — well, Iraq got piggybacked in because of the alleged presence of weapons of mass destruction — and they were very popular at first. They became much less popular as they dragged on.

True, the perps in these terrorist attacks are always Muslims, but this is a war with the extremists, not with mainstream Islam. Those who call this what it is are not calling for an FDR–like roundup and segregation of everyone who fits a general description. They are being realists. Does that sound like profiling to you? If it does, remember that profiling, when correctly applied, serves a useful purpose. If, for example, there has been a series of break–ins somewhere, and witnesses report that the apparent perps were in a certain age group and appeared to belong to a particular racial group, authorities won't squander valuable time interrogating people who do not fit the description. But profiling can be abused. There is no doubt about that. There must be adequate, diligent oversight to prevent abuse.

The idea behind profiling is a good one — to provide useful information that can enable authorities to resolve criminal cases faster. The implementation needs to be fine–tuned.

In France today, there is no massive manhunt as there was in January. My understanding is that all the attackers are now dead. But if any were alive, it would be good for authorities to have a physical description of them and/or their colleagues.

As I write this, the death toll has fluctuated. CNN reported 128 deaths last night, and ABC News reports 127 this morning. I don't know the actual number — maybe no one does — but many, many more are injured, some critically, and the death toll is sure to rise in the coming days.

The latest figure is 129 — from The Telegraph. As I say, though, that number will surely rise.

French President François Hollande — who was attending that soccer match — calls it what it is. He said it was an "act of war."

It seems to be a little late to be reaching that conclusion — but better late than never, I suppose.

Sunday, November 8, 2015

Changing Times

My stepsister and I were talking about our vehicles the other night, and we discovered that we both drive standard transmissions. That, of course, is a vanishing breed.

I'm not sure when my stepsister bought her vehicle, but I bought mine about a year ago. It was used — a couple of years old — and it was a five–speed standard transmission. I saw it advertised on the internet and went to investigate on a Saturday.

The salesman was a friendly fellow — they always are, aren't they? — and he was glad someone was interested in the vehicle, but he eventually got around to mentioning, almost apologetically, that it was a standard transmission. Was I aware of that? he asked.

"Oh, yes," I replied. "That is what I want," and he seemed relieved to hear that. I explained that I have been driving standard transmissions nearly all of my driving life. I probably wouldn't know what to do with my left foot if I didn't drive a standard.

I guess the first car I drove regularly was an automatic. My mother and grandmother taught me how to drive. We went out in the country — there was a lot of it around where I grew up — and I practiced basic maneuvers. My parents had two cars, one an automatic and the other a standard. Mom felt I should learn to drive both.

She told me that there might come a time when an emergency would come up and the only vehicle that could be used was a standard. In such a situation, it would be good if I knew how to drive a standard. The other people around me might not know how.

That made sense to me — except that later, as I reflected on Mom's reasoning, I thought that, if I had not been the one who drove the standard to wherever this situation occurred, the owner of that standard must be there, too. Wouldn't that person be able to drive the vehicle? It seemed Mom had overlooked that detail. Perhaps not, though. Perhaps the owner broke a leg or was rendered unconscious. Then, by process of elimination, it might be up to me to save us all — or, at least, get us the hell out of Dodge.

So I could accept Mom's reasoning on that. Maybe she did touch all the bases in her reasoning based on what she knew to be true at the time — but she and I both failed to anticipate a time (in my lifetime) when standard transmissions would virtually cease to exist. That seems to be where we are headed. Standards, as I observed earlier, are dwindling. Someday in the future — perhaps the near future — a vehicle with standard transmission may be a special order kind of thing — if it still exists at all.

This vehicle I am driving now may well turn out to be the last of its kind for me. In the future, I may not have a choice about what kind of transmission to have in my vehicle. It might be regarded as a luxury option — luxury in the sense of additional cost.

That will mean yet another adjustment in my life, but that really doesn't bother me too much, I suppose. I've been through that kind of thing before.

What really bothers me is that future generations are being deprived of more of the simple pleasures of life without really getting something better — or even something equal — in return.

I saw a meme on Facebook the other day that pointed out that modern cell phone users will never know the satisfaction of slamming a telephone receiver to end a frustrating call. I'm sure it never sounded as dramatic on the other end, but it sure did feel good, didn't it? Pressing a button to end a call just never has been the same.

And future drivers of automatic transmissions will never know how liberating it feels to shift into fifth gear on an open highway and watch the countryside race by.

Of course, these days, there is talk of driverless cars. I'm not sure how I feel about that. It is said that driverless cars will permit their owners to relax, perhaps read the morning paper, while being taken to work by someone who shares the same family tree with Manti Te'o's girlfriend.

I don't think I could relax or read with a ghost behind the wheel.

Sunday, October 18, 2015

Scratching the Six-Year Itch

For the last seven or eight years, American voters have seemed to be intent upon turning history on its ear.

They elected and re–elected a black president while taking away his party's advantages in first the House and then the Senate. Aggrieved Democrats have complained that, somehow, the system is rigged against them in midterms. Yet, while these congressional shifts were extreme by historical standards, the pattern has been unmistakable.

The party that is not in possession of the White House almost always does better in midterms than the party in power. Sometimes the Democrats benefit. Sometimes the Republicans benefit. Depends on who holds the White House.

It is Americans' way of preventing the political pendulum from swinging too far in one direction or the other. We like to think of ourselves as fair and balanced, tolerant of all and open to all — whether we really are or not — and we use the ballot to pursue equilibrium. (If we ever actually achieve equilibrium, it is short–lived.)

I have written of this before, and you can find those posts archived elsewhere on this blog.

But a lot of that has addressed congressional politics. It naturally leads to an interesting phenomenon I have observed in presidential politics — but have not written about. Others have, though, to an extent. I think political analyst Charlie Cook wrote something to the effect that, in discussions of presidential politics, whenever the conversation turns to the dynamics of a campaign, the introduction of the phenomenon is "as sure as the sun coming up in the morning."

It is called the "Six–Year Itch," and it holds that voters are inclined to look favorably upon the out–of–power party by the time the current administration has been in place for six years. This really goes beyond the midterm elections, which, as I say, almost never go well for the incumbent party, and has more to do with the popularity of the incumbent during the time of the midterms.

After all, even popular presidents see their parties lose ground in midterms, especially second midterms (which fall in a re–elected president's sixth year in office). About a week before the second midterm of his presidency, Ronald Reagan's approval rating was 63% — but his personal popularity failed to help his Republican Party maintain its grip on its majority in the Senate — a majority it had held since Reagan was first elected in 1980.

Voters, though, treat legislative elections and executive elections differently. Following the '86 midterms, Reagan's popularity took a beating during the Iran–Contra affair, but he bounced back and helped his vice president win the presidency two years later, when Reagan himself was constitutionally prevented from running.

Following Barack Obama's re–election in 2012, the Washington Post sought to shoot holes in the notion of a six–year itch.

"It's overrated," wrote Aaron Blake for the Post. He wrote that column, it is worth noting, less than six weeks after his employer endorsed Obama's re–election bid, so you need to consider that as a counterweight to Blake's argument. I was inclined to agree with him, to an extent, when he wrote, "It's not so much that a second midterm isn't trouble for an incumbent president, as much as midterms in general are trouble. And the American public scratches that itch nearly as often in a president's second year as in his sixth year."

That, it seems to me, supports what I wrote about that political pendulum correction. So does the fact that today more Americans than ever do not identify with either party and call themselves independent.

Whether they do so consciously or not, I think most Americans are inclined to give a president — of either party — the two four–year terms in office to which he is constitutionally limited, all things being equal. I guess Americans tend to be reluctant to admit having made a mistake in electing someone a first time. But it depends on what he does with his first four years, and experience tells me that is largely a matter of perception.

If a negative perception takes hold early — if the president suffers a string of setbacks at the start of a presidency — and the perception of misfortune is allowed to harden, it can be almost impossible to overcome. If the president is perceived to have made a mess of things — as Jimmy Carter was — voters look elsewhere for leadership. If a president is perceived to have exceeded expectations, that reservoir of good will makes a landslide re–election likely.

That, I think, is a big reason why some presidents who don't seem to share the same belief system with many of their constituents nevertheless win their votes for second terms.

In those second terms, a president's popularity really is more of a concern to whoever his party nominates to replace him. That is the true coattail effect of which political analysts often speak, and it is the last (if not only) opportunity for a president to have an electoral influence. Coattails are not really factors in House and Senate midterm elections, which are not national and tend to be decided by issues that matter only within the boundaries of states and congressional districts, but they can be factors in national campaigns.

But there is a catch.

Historically, the United States has not been likely to elect candidates who are nominated by an incumbent president's party to succeed that president. Well, I guess that should be narrowed down to the post–World War II period. Prior to the war, the United States was hardly hesitant to stick with the same party in more than two consecutive presidential elections.

It elected Franklin D. Roosevelt four times, then elected the man who succeeded him following his death and just prior to the end of the war, Harry Truman, to a full four–year term of his own. Just prior to FDR's time, Republicans won three straight elections. In fact, Woodrow Wilson's two terms in office in the early 20th century and Grover Cleveland's two nonconsecutive terms in the late 19th century were the only interruptions in a period when Republicans won 14 of 18 national elections.

But since World War II and Truman's decision not to seek another term in 1952, Americans have only elected the same party three straight times once. That was in the 1980s, when Reagan won twice and then his vice president, George H.W. Bush, was elected to succeed him.

Reagan, as I pointed out, enjoyed solid approval ratings just before his party sustained significant Senate losses in 1986 — but that was on the legislative side. His personal popularity benefited Bush in the 1988 election.

Not that Bush's opponent, Michael Dukakis, didn't seem to do everything in his power to sabotage his own campaign.

And that, I think, underscores an important point about the six–year itch. It is susceptible to the dynamics that are unique to each campaign.

The popularity of the incumbent president seems to have a lot to do with the outcome, but that is no guarantee. Dwight Eisenhower enjoyed approval ratings that exceeded 50% for much of his presidency, but his vice president lost narrowly in his first bid for the presidency.

That leads me to another observation: It is also important for the president's would–be successor to take advantage of the resource of a popular incumbent. To my knowledge, Richard Nixon never distanced himself from Eisenhower, but the Republican ticket was hurt by the recession the country experienced in 1960.

Al Gore didn't embrace the popular Bill Clinton in 2000, and that was a decision that apparently cost him the presidency. Clinton's approval rating just before the 1998 midterms was over 60%, but Gore, while winning the 2000 popular vote, lost the Electoral College.

Since the advent of the polling era, few presidents have been popular with a majority of voters at the ends of their presidencies, and their would–be successors suffered for it. In 1966, after six years of the Kennedy–Johnson presidency, Democrat Lyndon Johnson had an approval rating of about 43% — roughly the share of the vote his vice president, Hubert Humphrey, received on Election Day two years later.

In 1974, six years into the Nixon–Ford presidency, Republican Gerald Ford's approval rating was around 47%, and he narrowly lost the election to Jimmy Carter in 1976.

History says the voters will have an itch to scratch next year, and Obama, like Ford, hovers below the 50% mark. Ford, of course, had the advantages of incumbency in the election year of 1976, and Obama will not be allowed to seek a third term, which suggests that 2016 will be an uphill climb for the Democrats' nominee.

It looks like it will be the Republicans' race to lose.

Saturday, October 10, 2015

America Needs Another Ike

"Censorship, in my opinion, is a stupid and shallow way of approaching the solution to any problem. Though sometimes necessary, as witness a professional and technical secret that may have a bearing upon the welfare and very safety of this country, we should be very careful in the way we apply it, because in censorship always lurks the very great danger of working to the disadvantage of the American nation."

Dwight D. Eisenhower
April 24, 1950

As a student of history, I tend to believe that Dwight Eisenhower could not have been elected president in the modern world, a world that was only beginning to take shape when he served as America's commander–in–chief.

Presidents tend to be products of their times, not the other way around. Even if they enter the presidency with a specific agenda, circumstances often force them to change direction in ways they never anticipated. Presidents aren't prophets, and few probably would have chosen the crises they had to face.

But they are also influenced by the technology that exists when they live and serve. Some presidents have been slower than others to embrace emerging technology, and some have been ill–equipped to do so. Most presidents have been the first to do something, and history remembers firsts like:
  • the first president to be photographed (John Quincy Adams — although he wasn't president when the photograph was taken);
  • the first president to ride in a train (Andrew Jackson);
  • the first president to have a telephone installed in the White House (Rutherford B. Hayes);
  • the first president to ride in a submarine and an airplane (Theodore Roosevelt in both instances);
  • the first president to own an automobile (William Howard Taft);
  • the first president to give a radio broadcast from the White House (Calvin Coolidge), and
  • the first president to appear on television (Franklin D. Roosevelt).
In a more recent presidential first, Bill Clinton was the first president to send an email, but apparently — and contrary to what his wife has said — he hasn't made any use of email in his post–presidential years.

Eisenhower, who was born 125 years ago next Wednesday, was the last president born in the 19th century. He was not far removed from his heroic military leadership in World War II, an experience that clearly shaped his view of the world, and he benefited from the public's good will because of it. But America was only beginning to see emerging technological advances, often made possible by war–related research and development, that would come to play important roles in American politics in the not–so–distant future.

In Ike's day, for example, it wasn't crucial to look good on television because TV wasn't yet a commonplace item in every home. By the time Ike's vice president, Richard Nixon, was elected president, there were a lot more TVs in American homes, and how a candidate came across on television mattered more. Today it is impossible to imagine a candidate who does not come across well on television being much of a success.

In many ways, that is reflected by a growing tendency to favor candidates because of which demographic group(s) they are believed to bring to the electoral table. The face of America is its president, and Americans increasingly show an inclination for that face to be a particular color or gender — and, in equal and opposite proportions, disdain for what Martin Luther King Jr. would call the content of a person's character.

Ike wasn't very photogenic, when you get right down to it. And he wasn't a stemwinder of a speaker, either. But he had some core virtues. Modern politicians would do well to follow his lead. The country certainly would benefit.

He said things that made a lot of sense, things that both Democrats and Republicans ought to study today, but he showed no penchant for what is known today as a "sound bite." He probably thought they were frivolous and overly simple, but such things win elections these days. Common sense often cannot be boiled down to a single phrase that is suitable for a bumper sticker — although "I Like Ike" wasn't bad for its day.

Ike might have been persuaded to run as a Democrat. He had no party affiliation and was pursued by officials from both parties to seek their nominations. It is interesting that House Speaker Sam Rayburn brushed off talk about Eisenhower seeking the presidency when the topic was first raised in 1948: "Good man," Rayburn said, "but wrong business."

Eisenhower decided not to seek the presidency in 1948, and many people thought he had passed up his only opportunity. It was widely assumed at that time that Tom Dewey, who had lost the 1944 election to Franklin D. Roosevelt, would be elected over Roosevelt's successor, Harry Truman. It was further assumed that Dewey would be re–elected in 1952 — and Eisenhower, at age 66, would be too old to seek the presidency by 1956.

But Truman won in what is still regarded as a major upset, then became phenomenally unpopular and chose not to seek another term in 1952. By that time, Eisenhower was ready to declare himself a Republican after voicing his disagreements with Democratic policies. He may have been just as motivated by a desire to prevent Sen. Robert Taft, a non–interventionist, from winning the Republican nomination.

Eisenhower did deny Taft the nomination — after one of the closest, most bitterly fought presidential nomination battles in American history — but I have always wondered if it had as much (if not more) to do with Taft's unpopular opposition to the postwar Nuremberg trials. (In the interest of fairness, I should point out that future President John F. Kennedy praised Taft in "Profiles in Courage" for taking a principled stand in spite of public opposition.)

During his tenure, Ike balanced the budget three times and cut the federal debt as a share of GDP. He was criticized as a "do–nothing" president, probably because of his domestic record, particularly his record on civil rights. Seen from the 21st century, Ike's record on promoting racial equality appears unimpressive, but he took some important steps. Truman gets credit, and rightfully so, for desegregating the military, but Ike took it further, ending the segregation that existed in VA hospitals and in schools on military installations. His administration also navigated the legislative waters in 1957 to pass the first civil rights act since Reconstruction.

Having grown up in Arkansas, one of the first things I learned about Eisenhower was that he enforced a desegregation court order that had been defied by Orval Faubus, the governor of Arkansas. It's worth noting that one of the members of Congress who opposed the president's action was Democrat John F. Kennedy.

I studied this when I took Arkansas history in school; in those days, I think it was a class everyone took in the fifth grade. For me that would have been more than a decade after the Little Rock Central crisis, but my memory is that our textbooks were brand–spanking new, so new that the books squeaked when you opened a cover or turned a page. Ours was the first class in my hometown to study an unbiased account of that moment in our home state's history. Those books had not been in use the previous year, when a text that was less balanced and tended to favor Faubus was used.

The New Republic's Richard Strout, bewildered by Eisenhower's soaring popularity (which seldom strayed below 50%), complained that "the less he does the more they love him." He didn't understand, as Ike did, that the American public was weary from the back–to–back experiences of the Great Depression and World War II. In the '50s, Americans craved stability.

Black Americans were still inclined to heavily support Democrats, as they had been since the ascent of Franklin D. Roosevelt to the presidency in 1932, but in 1956 Eisenhower received 39% of black America's vote when he sought a second term. Within a decade, Republican presidential nominees were receiving much less than 10% of black votes. Win or lose, has any Republican presidential nominee even come close to matching Eisenhower's achievement in the last 60 years?

In his rather modest, soft–spoken Midwestern way, Eisenhower achieved things without feeling the need to resort to self–promotion. He respected constitutional limits — on the use of military power, on the capacity of the government, on the role of the president — and worked within them. He didn't try to get around them.

But there were still times when he wanted credit for things he did.

"The United States never lost a soldier or a foot of ground in my administration," he said after leaving the White House. "We kept the peace. People ask how it happened — by God, it didn't just happen, I'll tell you that."

We could use another Eisenhower today. Unfortunately, no candidate in either party remotely resembles him.

Sunday, September 27, 2015

The Eternal Randomness of Presidential Politics

"There's something happening here
But what it is ain't exactly clear."

Buffalo Springfield

Peggy Noonan recently observed in The Wall Street Journal that, so far, the 2016 presidential campaign has been full of surprises.

She made this observation in the context of another column that she wrote earlier this year in which she anticipated a "bloody" battle for the GOP's presidential nomination and a "boring" one for the Democrats' nod.

Now, she writes, the Republican campaign has become "exciting" with a record–setting debate night, and the Democrats' campaign has become "ominous." In other words, the presidential campaign — in which not one single vote has been cast in either party — has been full of surprises for Noonan.

That in itself surprises me. I've been aware of Noonan for 30 years, going back to when she wrote President Reagan's moving speech to the nation after the explosion of the Challenger in January 1986. If she's been around presidential politics at least that long, she should know how unpredictable it can be. Really. When has it ever been anything else?

As we approached the time last spring when Hillary Clinton made her candidacy official, I began to have a peculiar feeling about this campaign. Everyone acted as if it was a done deal that Hillary would not only win the Democrats' nomination but would breeze to victory in the general election.

Now, in my experience, nothing in politics is that certain — and I have been following presidential politics most of my life. To be sure, there have been times when non–incumbent front–runners ended up cruising to the nomination as expected, but they usually struggle along the way, losing at least a primary or two. In keeping with history, Hillary Clinton's march to the nomination hasn't been the fait accompli it appeared to be only a few months ago — and no one has even voted yet.

Now, Hillary insists that she never expected an effortless glide to the nomination, that she always expected it to be competitive. Part of that may be the residual effect of having been the presumptive nominee in 2008 only to lose it to an inexperienced — and largely unknown — guy named Barack Obama when the party's voters began participating in primaries and caucuses. And at least part of it is sure to be P.R.

It reminds me of Election Night 1980, when Hillary's husband lost a narrow race for re–election as Arkansas' governor. I guess you had to be in Arkansas at the time to understand just how popular Bill Clinton was there then — and how shocking it was that he had been voted out of office. True, he lost his first race, in 1974, for the U.S. House seat representing Arkansas' Third District, but he took 48% of the vote in that heavily Republican northwest quadrant of the state. Two years later, he was elected Arkansas' attorney general, facing only modest opposition in the primary and none in the general election. Arkansas elected its statewide officials every two years in those days, and, in 1978, Bill Clinton was elected governor.

1980 turned out to be a Republican year, with Reagan sweeping Jimmy Carter out of the White House and Republicans seizing control of the U.S. Senate. There were clear indications prior to the election that it would turn out that way nationally.

But Arkansas was solidly Democratic in those days. Four years earlier, it had given Carter his highest share of the popular vote outside of Carter's home state of Georgia. Even with a Reagan victory more or less expected, the feeling in Arkansas was that Carter would prevail there again.

But he didn't, and neither did Clinton. Both lost narrowly, and, when speaking to his supporters that night, Clinton said that he and his campaign staff had been aware, in the closing days of the campaign, of shifts within the electorate that pointed to the possibility that he would lose. It didn't come as a shock to them, Clinton insisted.

But I'll guarantee it came as a shock to many Arkansans.

I was probably too young at the time to recognize that for what it was — an early manifestation of the Clintons' obsession with controlling the conversation, whatever it was about. Even if you have been blindsided, never let 'em know that.

That trait is often interpreted as deceitful, and perhaps it is. What I have known about Hillary Clinton for a long time — and others only seem to be understanding now — is that she is a cold fish politically. Her husband is a scoundrel, but he is a likable scoundrel. He has sure–footed natural political instincts. It is why he hasn't lost a general election since he was beaten in that 1980 campaign I mentioned earlier. He lost some presidential primaries but always won the nomination he sought.

Hillary has none of her husband's strengths and all of his weaknesses. It is a combination that isn't likely to hurt her much in the race for the nomination — but it is apt to be troublesome when she is trying to win as many independent and even Republican votes as possible. Because she can't win a national election on the votes from her party alone. No one can — not in a country where more than 40% of voters identify as independents.

Self–defined independents are important because they now outnumber Democrats and Republicans. They may lean to one side or the other, but the fact that they call themselves independent suggests that they cannot be taken for granted.

In spite of what Noonan says, though, I'm not sold — yet — on the narrative that the emergence of Bernie Sanders on the campaign trail and the possible entry of Vice President Joe Biden — who met with Sen. Elizabeth Warren recently in what may have been the strongest signal yet that he will throw his hat in the ring — mean that a race Noonan once described as "boring" has become "ominous." Well, perhaps "ominous" really isn't the right word. Perhaps Noonan — who is a gifted writer — should use a word like "threatening," because, at the moment, that is what this looks like to me.

As usual, I look to history for guidance. All history, really, but I prefer recent history when it is applicable.

There have been times in the last half century when insurgents have won their parties' nominations. Historically, Democrats have been more prone to it — eventual nominees George McGovern, Jimmy Carter, Michael Dukakis, even Bill Clinton and Barack Obama were nowhere in the polls more than a year before the general elections in which they were the standard bearers for the out–of–power party — so history does suggest that Sanders might have a chance to win the nomination, provided he can peel off some rich donors and make inroads into certain demographics that currently are in Hillary's camp.

But those donors and demographic groups are going to have to get a lot more nervous about Hillary before they'll be ripe for the picking. The fact that Sanders is drawing huge crowds on the campaign trail indicates to me that a sizable segment of the Democrats craves a real contest for this nomination, one that requires Democrats to take clear stands on issues and promote policies that are designed to help the voters, not the candidates.

I think that is true of voters of all stripes. They want to have a conversation about the issues that affect them and their children. They don't want that conversation to be disrupted by distractions. And the emergence of people like Donald Trump, Ben Carson and Carly Fiorina suggests voters have lost confidence in career politicians to confront and vanquish the problems and are looking for someone who can bring common sense from another field to the White House.

I would say that Hillary is still the odds–on favorite to win the nomination, but those odds are growing ever smaller. If Biden challenges her with a platform that appeals to an electorate that has clearly soured on politics as usual, things could get dicey for the Democrats. Hillary Clinton could find herself in political history books with all the other sure things — like Ed Muskie and Gary Hart.

Then there's Donald Trump.

A lot of Republicans fear that, if Trump is denied the GOP's nomination, he will run as an independent — and, in the process, hand the White House to the Democrats for four more years. I suppose they are the new Republicans, the ones whose party has lost the popular vote in five of the last six presidential elections, a skid that began with Ross Perot's first independent candidacy.

I'm not so sure about that one, either. Hey, it is still very early in the process, and the folks who fear that Trump, with his deep pockets, will keep the Republicans from winning the presidency by running as an independent overlook a few key points that separate 2016 from 1992.

In 1992, the Republicans had been the incumbent party for a dozen years. They never had majorities in both houses of Congress simultaneously — in fact, for half of that time, Democrats controlled both houses — but the general public perception was that the Republicans had ownership of just about everything.

In 2016, Democrats will have been in charge of the White House for eight years, and the policies that will be debated are policies that, by and large, are products of this administration. If historical trends persist, voters will hold them responsible for conditions that exist, even though Republicans have controlled one or both houses of Congress through most of the Obama presidency; and Trump, although he has been seeking the Republican nomination, was supportive of many of those policies — and may tend to draw as many votes from disaffected Democrats as Republicans if he runs as an independent in the general election.

In short, an independent Trump candidacy won't necessarily work against Republicans, as many fear.

I learned a long time ago not to predict what voters will do until we are close to the time when they have to go to the polls. Attitudes are volatile more than a year from the election, and there may be events ahead that will shape the race in ways we cannot imagine.

One thing that voters in both parties must decide is whether essentially political matters are best left to essentially non–political people. If the answer to that is no, the primaries will bear witness to a thinning of the Republican field. I think that is bound to happen anyway. Virtually none of the GOP candidates mired at 1% or 2% in the polls can afford to stay in the race for long, and I am convinced the field will be half its current size before New Year's Day. At least one of the non–politicians is certain to be among those who drop out.

That will make it possible for all the candidates to participate in the same debate — and voters can judge them side by side. The race will become more focused, as it should.

Tuesday, September 22, 2015

The Unintended Victim

From April 4, 1841 until Nov. 22, 1963, a period of 122 years, America averaged a presidential death about every 15¼ years (we have now gone more than 50 years without an incumbent president's death). Some of those deaths were the clear outcomes of assassination attempts, and others were rumored to be — but never proven to be — assassinations.

No president had ever been the target of two assassination attempts while in office — until this day in 1975.

I guess you really couldn't blame President Gerald Ford for wondering if there was a target on his chest. It was the second time in a month that he had been targeted for assassination — and both attempts were carried out by women in the state of California.

As a result of that first attempt, the Secret Service began putting more distance between Ford and the crowds that greeted him at his stops. That strategy was still evolving, but it may have prevented Ford's injury or death when, 40 years ago today, Sara Jane Moore attempted to shoot Ford from across a street in San Francisco. In the first attempt, the gun had never gone off; Moore's gun did go off, but the sights were off, so the shot missed.

The shot may also have been affected by the actions of a retired Marine standing next to Moore. Acting on instinct, he reached for her just as she fired; that shot missed Ford by about six inches. Before Moore could fire again, the ex–Marine grabbed for the gun and deflected her second shot, which ricocheted and wounded a taxi driver.

It turned out afterward that the retired Marine was gay, and his heroic act brought a lot of unwanted attention to him and his lifestyle. Worse, his family learned of his sexual orientation for the first time through those news reports.

The man was outed, so I hear, by gay politician Harvey Milk, a friend of his. Supposedly, Milk thought it was too good an opportunity to show the community that gays were capable of heroic deeds and tipped off the San Francisco Chronicle that the man was gay. That was the tragedy of the story. The man became estranged from his family, and his mental and physical health deteriorated over the years. Eventually, he reconciled with his family, but he drank heavily, gained weight and became paranoid and suicidal.

At times later in his life, he expressed regret at having deflected the shot intended for Ford. He was found dead in his bed in February 1989. Earlier in the day, he had told a friend he had been turned away by a VA hospital, where he had gone seeking treatment for breathing difficulty caused by pneumonia.

I don't know if that was his cause of death or not, but his treatment after the incident speaks volumes about the America of the mid–'70s and the America of today. The man asked that his sexual orientation and other aspects of his life be withheld from publication, but the media ignored his request. President Ford was criticized at the time for not inviting the man to the White House to thank him and was accused of being homophobic. Ford insisted that he did not know until later about the man's sexual orientation; my memory is that the topic was never mentioned the next year when Ford ran for a full four–year term as president.

Ford lost that election, but the ex–Marine, Billy Sipple, lost a lot more than that. He was the unintended victim.

Saturday, September 5, 2015

Taking Aim at Jerry Ford

"In the job of selling himself to the voters, Ford embarked, shortly after Labor Day, on a routine two–day trip to the West Coast. Before it was over, the nation was treated to yet another bizarre illustration of the unpredictability of American presidential politics."

Jules Witcover, Marathon: The Pursuit of the Presidency 1972–1976

For just a moment or two, put yourself in Gerald Ford's position 40 years ago. The summer of 1975 was Ford's first full summer as president; he had succeeded Richard Nixon in August 1974. To say that his first year in office had been challenging would be an understatement.

Most people who are old enough to remember Ford's presidency would tell you that he seemed like a nice guy, a decent guy, whether they agreed with him on most things or not. When Ford became president, the contrast between his easygoing disposition and the sullen Nixon was so stark that he enjoyed astonishing popularity from the start. He irretrievably lost a lot of the public's good will when he pardoned Nixon about a month after becoming president, but he didn't deserve to be targeted for assassination for it. I think even Ford's detractors would agree with that.

Yet it was 40 years ago today that Squeaky Fromme, one of the original members of the Manson Family, tried to assassinate Ford in Sacramento, Calif.

Now, to be fair, Squeaky's motive for trying to shoot Ford apparently had nothing to do with the pardon of Nixon. It was just that, even then, the timing of the attempt seemed spooky to me — just a few days shy of the one–year anniversary of the pardon.

I suppose most people don't remember Squeaky's real name (Lynette). Doesn't really matter, I guess. "Squeaky" suited her.

Most of the first half of 1975 had not been particularly kind to Ford. He came under frequent criticism from hard–liners in his party over his choice of Nelson Rockefeller to be vice president. The economy had been a drain on his presidency; only a few months after taking office, he went on national television to encourage anti–inflation sentiment — since inflation was regarded as a greater threat to economic stability than rising unemployment (which was high by the standards of the times; today's 5.1% rate seems modest by comparison). And the United States had suffered its greatest foreign policy humiliation — up to that time — when the North Vietnamese drove the Americans from South Vietnam. That led to rumblings of concern that Ford's national security team wasn't up to the job.

But in May 1975 Ford's luck began to change, thanks to an event half a world away, in the Gulf of Siam. Inexplicably, the Khmer Rouge seized the merchant ship Mayaguez and held its crew captive. The Ford administration freed the crew with a plan that was daring, if arguably overkill, subjecting the Cambodian mainland to heavy air strikes. It was a shot in the arm for those who had worried about a loss of U.S. influence in the region, and it was leverage that Ford supporters used — unsuccessfully — in an effort to persuade Ronald Reagan and his supporters not to challenge Ford for the Republican nomination in 1976.

The Mayaguez incident was a real turning point for Ford. Economic news was getting better, too. The recession that had plagued the economy was bottoming out. Unemployment was still higher than most would like, but there were signs of a recovery, which was seen as good news for the administration, and Ford announced his candidacy for a full term in July.

Also that July, California Gov. Jerry Brown, a Democrat, declined to commit to speaking at the annual "Host Breakfast" in Sacramento — a gathering of the state's politically influential business leaders. The business leaders saw Brown's response as a snub and, in apparent retaliation, invited Ford, a Republican, to speak. Ford believed California was crucial to his hopes of winning a full term in 1976 and accepted the invitation.

Meanwhile, Fromme apparently had become active in environmental causes and believed (due, in part, to a study that had been released by the Environmental Protection Agency) that California's redwoods were endangered by smog. An article in the New York Times about the study observed that Ford had asked Congress to ease provisions of the 1963 Clean Air Act.

Fromme wanted to bring attention to this matter, and she wanted those in government to be fearful, so she decided to kill the symbolic head of the government. On the morning of Sept. 5, she walked approximately half a mile from her apartment to the state capitol grounds — a short distance from the Senator Hotel, where Ford was staying — with a Colt .45 concealed beneath her distinctive red robe.

Ford returned from the breakfast around 9:30 a.m., then left the hotel on foot at 10, his destination the governor's office — and an apparent photo op with Jerry Brown. Along the way, he encountered Fromme, who drew the gun from beneath her robe and pulled the trigger. The weapon had ammunition — but no bullet in the chamber — so the gun didn't fire.

"It wouldn't go off!" Fromme shouted as Secret Service agents took the gun from her hands and wrestled her to the ground. "Can you believe it? It didn't go off."

Ford went on to the capitol and met with Brown for half an hour, only mentioning the assassination attempt in passing as he prepared to leave.

"I thought I'd better get on with my day's schedule," Ford later said.

Two months later, Fromme was convicted of attempting to assassinate the president and received a life sentence. She was paroled in August 2009, nearly three years after Ford's death.

Sunday, August 9, 2015

Seventy Years Ago Today

"The atomic bomb is too dangerous to be loose in a lawless world. That is why Great Britain, Canada and the United States, who have the secret of its production, do not intend to reveal that secret until means have been found to control the bomb so as to protect ourselves and the rest of the world from the danger of total destruction."

Harry Truman
Aug. 9, 1945

Seventy years ago today, an atomic bomb was dropped on one country by another for what was the last time — so far.

The rationale for using the bombs in 1945 was to prevent what was widely believed would be a bloodier invasion of the Japanese mainland. But that rationale has been questioned from the start, and proponents of the bomb's use have been raising their estimates of the lives saved ever since. If one is to defend the use of the atomic bomb, I suppose, the argument must be that saving any lives, even if it is only one or two rather than hundreds of thousands or millions, is justification enough.

But then we start getting into complicated math — because there were civilian casualties, too: somewhere between 50,000 and 150,000 initially in Hiroshima and Nagasaki combined. It is hard to be precise. Harry Truman had been told that a quick resolution of the war in the Pacific would save about 200,000 soldiers who could be expected to be lost in an invasion of Japan.

If you are of the opinion that all lives matter, though, then even the low–end civilian casualty figure would leave a much smaller net gain than a calculation that counts only the invasion casualties that were prevented.

But that is just one part of the story, and it really only compares apples to oranges. The estimated casualties from an invasion would be accumulated over weeks and months of painstakingly capturing ground from a determined enemy; the civilian casualties I just cited came from the bombs' immediate detonations. To be more accurate, you would have to include those who died weeks and months later from radiation poisoning, which would further reduce the number of lives that were presumably saved.

Those who supported the use of the bomb kept raising the estimate over the years; recent estimates have been in the millions.

Of course, the whole subject of how many lives were saved by dropping two atomic bombs 70 years ago is a purely hypothetical one — and, as a rule, I prefer to avoid hypotheticals. What really is of greater importance is where we are now, seven decades later.

I suppose the nuclear technology that was born in World War II could not have remained secret for long, especially when you consider that so many scientists on both sides had been trying to harness the power of the atom; showing the world what the bomb could do may well have made the world, as some people claimed, safer — for a while.

Until other countries began to get the technology, by legitimate or illegitimate means, and that was inevitable because, throughout history, unconventional weapons have, in time, become conventional weapons. It might have been delayed for a time by withholding the revelation from the public — but it could never have been kept under wraps forever.

That visual display of the bombs going off — and the photographs of victims that circulated later — may have been more valuable than anyone knew in preventing the use of nuclear weapons in the last 70 years. As more nations have joined the nuclear club, a sense of the awesome responsibility in their hands seems to have come with it. Perhaps that has been because, until fairly recently, everyone who acquired nuclear technology felt the weight of a moral obligation not to use it.

But now nations that sponsor terrorism are acquiring the technology, and I fear they will not hesitate to use it. They have already expressed their objectives, and the annihilation of perceived enemies is at the top of their lists. They have made no attempt to conceal their intention, and the United States has made no real attempt to prevent them from achieving it.

The "secret" to which Truman referred has been out for a long time, and there is much work to be done if his pledge to "control the bomb" is to be fulfilled.

Thursday, July 16, 2015

Rock Hudson's Revelation

It was 30 years ago today that Rock Hudson and his old friend and co–star, Doris Day, held a press conference to announce her new cable TV show, Doris Day's Best Friends. Hudson was going to be a guest on the show. It was a milestone moment.

All the talk after the press conference wasn't about Day's TV show, however. It was about Hudson — how emaciated he looked, how incomprehensible his speech was. He was practically unrecognizable. There had been rumors about Hudson's health for a long time, and his appearance with Day revived them.

A couple of days later, Hudson traveled to Paris for another round of treatment and collapsed in his hotel room, after which his publicist confirmed that Hudson was ill but attributed the illness to inoperable liver cancer. The publicist denied that Hudson suffered from AIDS — but then, only a few days later, he backpedaled and confirmed that Hudson did have AIDS and had been diagnosed more than a year earlier. Hudson hypothesized that he had been exposed to the virus through a blood transfusion when he had heart bypass surgery — long before anyone knew that blood carried the AIDS virus.

When it was confirmed that Hudson had AIDS, that triggered a lot of speculation about whether Hudson was homosexual. I don't recall if Hudson ever acknowledged that he was gay; I'm inclined to think he didn't, but People magazine ran a cover story about Hudson that discussed his AIDS diagnosis in the context of his sexuality about a month and a half before his death.

The 1980s were a trip. Ask anyone you know who is old enough to remember, and they'll tell you the same thing — if not in those words, then in words to that effect.

It was a decade that often provided examples of how kind and generous people can be — and, just as often, provided examples of how petty people can be, too. I guess most decades are like that, but the 1980s seemed to have even more than most.

In such an atmosphere, it was initially regarded as socially acceptable to be dying of liver cancer — but not of AIDS. Then, when it was impossible to continue denying that he was afflicted with AIDS, it became important for the public to believe that Hudson got sick through no fault of his own. That was the phrase that separated the good AIDS sufferers from the bad ones. It was the phrase that cast the blame. Did the sufferer get sick through his own recklessness? Or did he get sick through someone else's negligence? (And, if Hudson had been exposed to the virus via transfusion, it couldn't even be called negligence — because it would be years before anyone knew that AIDS could be transmitted through blood.)

I was in college when the '80s began. At that time, most people were just beginning to hear about a strange new disease that was, apparently, 100% fatal, but before it killed you, it stripped you of your immunities, making you vulnerable to all sorts of things that healthy people shrug off. The vast majority of Americans tended to feel secure because the disease only appeared to be striking certain groups — hemophiliacs, heroin users, Haitians and homosexuals. In fact, it could have been called the "4 H" disease. (Actually, I think it may have been called that for a while.)

They didn't know what to call it, frankly. Because it seemed to be striking the homosexual demographic disproportionately, it was initially called GRID for Gay–Related Immune Deficiency. Understandably, the gay community objected, feeling that the name unfairly singled out homosexuals when the record clearly showed that non–homosexuals were getting the disease, too.

And even though a non–judgmental name — Acquired Immune Deficiency Syndrome (AIDS) — was being used officially by the fall of 1982, the perception persisted that homosexuals had put the health of the rest of the population at risk.

People do strange things when they are frightened. I knew that from my studies of history, and AIDS gave me proof that irrational fear wasn't something that was unique to past generations. Human beings continue to have the potential for irrational fear; I guess they always will.

At first, AIDS was thought to be something of a medical anomaly, like Legionnaires' disease. It didn't take long for people to realize it was not a medical anomaly, but nevertheless the impression that homosexuals, through their reckless behavior, had put everyone at risk persisted. For a time, many people refused to use public restrooms or water fountains, afraid that AIDS sufferers might have been there before them.

It is necessary, you see, to recall the conditions that existed in the 1980s to understand what a big deal it was when Rock Hudson's affliction with AIDS became known in the summer of 1985. As imperfect as his acknowledgement was, it was a milestone in the AIDS story. Until that time, it was hard to get funding for research into the disease; consequently, it took years for the medical community even to discover that it was passed from one person to the next through bodily fluids.

Doctors learned that the highest concentrations of the virus could be found in blood and semen; it was present at much lower levels in tears and saliva. Thus, the odds against someone getting sick from exposure to tears or saliva were considerable. Even so, in light of the fact that Hudson's diagnosis was more than a year old, people in the media speculated about the passionate kiss he had shared with actress Linda Evans on Dynasty. Hudson knew he was sick when the scene was filmed, but he did not tell Evans, prompting a certain amount of panic. Some actresses insisted on having kisses written out of their scripts, and the Screen Actors Guild adopted new rules regarding "open–mouth kissing": actors had to be notified in advance — and were immune from penalty if they decided not to participate.

After the revelation that Hudson, one of Hollywood's most popular leading men, was sick with AIDS, roughly $2 million was raised, and Congress set aside more than $200 million to seek a cure.

Hudson's condition created issues for President Ronald Reagan, who was seen by a significant portion of the population as being indifferent to AIDS. But Reagan and his wife Nancy were Hudson's friends. On the strength of that friendship, a lot of people expected Reagan to break his long public silence on the subject.

But Reagan made no statement about Hudson, even when he had the opportunity at a press conference a couple of weeks before Hudson died.

He did, however, issue a brief statement on the occasion of Hudson's death on Oct. 2, 1985: "Nancy and I are saddened by the news of Rock Hudson's death. He will always be remembered for his dynamic impact on the film industry, and fans all over the world will certainly mourn his loss. He will be remembered for his humanity, his sympathetic spirit and well–deserved reputation for kindness. May God rest his soul."

Hudson's affliction and death were a milestone, however belated, in the fight against AIDS. People began talking about the disease. A cure was — and still is — a long way off, but, as the old saying goes, the journey of a thousand miles begins with a single step.