Saturday, November 14, 2015
I suppose I hoped that the attack on the offices of Charlie Hebdo nearly a year ago would have made it too clear to be misunderstood or ignored. Yet the eyes of the world are drawn once again to Paris, the scene of yesterday's horrific series of coordinated terrorist attacks — because those who should have learned from that earlier experience did not.
A virtual anarchist's cookbook of tactics was on display as the terrorists struck at any place people tend to gather on an evening in Paris, one of the largest cities in the world. For centuries, Paris has been known the world over for its culture, its arts, its music, and people have been drawn there to experience it. Technology did not bring culture to Paris. Instead, Paris' culture brought technology there — and, lately, not for good.
On Friday terrorists used bombs and guns at cafes, at a stadium where a soccer match was in progress and at a theater where a concert was taking place. Even though most of the perpetrators appear to be dead now, those attacks are sure to have at least a temporary chilling effect on Paris' cultural scene — not unlike the dramatic drop in air traffic in the United States after the Sept. 11, 2001, hijackings.
Appropriately, it is the deadliest attack on French soil since World War II — and I say "appropriately" because this is a war. Too many people have been unwilling to acknowledge that — and, I am sure, many are still reluctant to do so, perhaps because they feel it is a war against Islam, which it is not.
But Muslim extremists are waging a war on Western civilization. The target today is Paris — but the real target, the objective, is the overthrow of Western civilization, and that will mean that the war, inevitably, will be waged on our soil. We did not seek this war any more than we sought a war with Japan in the 1940s, but Pearl Harbor dragged us into the conflict.
Wars are regrettable, but sometimes they are necessary to preserve a way of life.
But, at long last, we must acknowledge the fact that this war is not a conventional war. Just because there hasn't been a major attack like the one more than 14 years ago — with a high body count and lots of mayhem — doesn't mean the war is over. The terrorists are patient — and they're smart the way that criminals are always smart. They apply logic to their objectives. It was why in 2001 they selected those jets that had enough fuel for a coast–to–coast trip — they wanted plenty of jet fuel to cause maximum damage when the planes crashed into buildings — and why they chose weekdays instead of weekends to carry out their plots. They knew there would be fewer people on board to resist.
The attacks in Paris were well coordinated and indicate extensive planning. Why did they pick yesterday to carry them out? Was it in response to the United States' drone attack that killed Jihadi John? Or was it planned ahead of time, and the timing was a happy coincidence for the terrorists?
I'm pretty sure it wasn't because yesterday was Friday the 13th, but I guess you never know ...
I sympathize with the reluctance of many to see the United States engaged in a war. The Iraq/Afghanistan experience left a bad taste in many people's mouths, and it is an experience no one wishes to repeat. (Afghanistan, of course, was targeted because the terrorist attack was planned there. Iraq was different. It was a war of choice and could have been avoided. But that is a discussion for another time.)
In case you haven't noticed yet, life affords no one the luxury of controlling events. The United States has always desired peace, but outside influences sometimes force us to go to war (OK, one time it was due to inside influences). Those wars in Iraq and Afghanistan began as responses to the 9–11 attacks — well, Iraq got piggybacked in because of the alleged presence of weapons of mass destruction — and they were very popular at first. They became much less popular as they dragged on.
True, the perps in these terrorist attacks are always Muslims, but this is a war with the extremists, not mainstream Islam. Those who call this what it is are not calling for an FDR–like roundup and segregation of all who fit a general description. They are being realists. Does that sound like profiling to you? If it does, remember that profiling, when correctly applied, serves a useful purpose. If, for example, there has been a series of break–ins somewhere, and witnesses report that the apparent perps were in a certain age group and appeared to belong to a particular racial group, authorities won't squander valuable time interrogating people who do not fit the description. But profiling can be abused. There is no doubt about that. There must be adequate, diligent oversight to prevent abuse.
The idea behind profiling is a good one — to provide useful information that can enable authorities to resolve criminal cases faster. The implementation needs to be fine–tuned.
In France today, there is no massive manhunt as there was in January. My understanding is that all the attackers are now dead. But if any were alive, it would be good for authorities to have a physical description of them and/or their colleagues.
As I write this, the death toll has fluctuated. CNN reported 128 deaths last night, and ABC News reports 127 this morning. I don't know the actual number — maybe no one does — but many, many more are injured, some critically, and the death toll is sure to rise in the coming days.
The latest figure is 129 — from The Telegraph. As I say, though, that number will surely rise.
French President François Hollande — who was attending that soccer match — calls it what it is. He said it was an "act of war."
It seems to be a little late to be reaching that conclusion — but better late than never, I suppose.
Sunday, November 8, 2015
I'm not sure when my stepsister bought her vehicle, but I bought mine about a year ago. It was used — a couple of years old — and it had a five–speed standard transmission. I saw it advertised on the internet and went to investigate on a Saturday.
The salesman was a friendly fellow — they always are, aren't they? — and he was glad someone was interested in the vehicle, but he hesitantly got around to mentioning (apologetically) that it was a standard transmission. Was I aware of that? he asked.
"Oh, yes," I replied. "That is what I want," and he seemed relieved to hear that. I explained that I have been driving standard transmissions nearly all of my driving life. I probably wouldn't know what to do with my left foot if I didn't drive a standard.
I guess the first car I drove regularly was an automatic. My mother and grandmother taught me how to drive. We went out in the country — there was a lot of it around where I grew up — and I practiced basic maneuvers. My parents had two cars, one an automatic and the other a standard. Mom felt I should learn to drive both.
She told me that there might come a time when an emergency would come up and the only vehicle that could be used was a standard. In such a situation, it would be good if I knew how to drive a standard. The other people around me might not know how.
That made sense to me — except that later, as I reflected on Mom's reasoning, I thought that, if I had not been the one who drove the standard to wherever this situation occurred, the owner of that standard must be there, too. Wouldn't that person be able to drive the vehicle? It seemed Mom had overlooked that detail. Perhaps not, though. Perhaps the owner broke a leg or was rendered unconscious. Then, by process of elimination, it might be up to me to save us all — or, at least, get us the hell out of Dodge.
So I could accept Mom's reasoning on that. Maybe she did touch all the bases in her reasoning based on what she knew to be true at the time — but she and I both failed to anticipate a time (in my lifetime) when standard transmissions would virtually cease to exist. That seems to be where we are headed. Standards, as I observed earlier, are dwindling. Someday in the future — perhaps the near future — a vehicle with standard transmission may be a special order kind of thing — if it still exists at all.
This vehicle I am driving now may well turn out to be the last of its kind for me. In the future, I may not have a choice about what kind of transmission to have in my vehicle. It might be regarded as a luxury option — luxury in the sense of additional cost.
That will mean yet another adjustment in my life, but that really doesn't bother me too much, I suppose. I've been through that kind of thing before.
What really bothers me is that future generations are being deprived of more of the simple pleasures of life without really getting something better — or even just equal — in return.
I saw a meme on Facebook the other day that pointed out that modern cell phone users will never know the satisfaction of slamming a telephone receiver to end a frustrating call. I'm sure it never sounded as dramatic on the other end, but it sure did feel good, didn't it? Pressing a button to end a call just never has been the same.
And future drivers of automatic transmissions will never know the liberating feeling of shifting into fifth gear on an open highway and watching the countryside race by.
Of course, these days, there is talk of driverless cars. I'm not sure how I feel about that. It is said that driverless cars will permit their owners to relax, perhaps read the morning paper, while being taken to work by someone who shares the same family tree with Manti Te'o's girlfriend.
I don't think I could relax or read with a ghost behind the wheel.
Sunday, October 18, 2015
For the last seven or eight years, American voters have seemed to be intent upon turning history on its ear.
They elected and re–elected a black president while taking away his party's advantages in first the House and then the Senate. Aggrieved Democrats have complained that, somehow, the system is rigged against them in midterms. Yet these congressional shifts, though extreme by historical standards, followed an unmistakable pattern.
The party that is not in possession of the White House almost always does better in midterms than the party in power. Sometimes the Democrats benefit. Sometimes the Republicans benefit. Depends on who holds the White House.
It is Americans' way of preventing the political pendulum from swinging too far in one direction or the other. We like to think of ourselves as fair and balanced, tolerant of all and open to all — whether we really are or not — and we use the ballot to pursue equilibrium. (If we ever actually achieve equilibrium, it is short–lived.)
I have written of this before, and you can find those posts archived elsewhere on this blog.
But a lot of that has addressed congressional politics. It naturally leads to an interesting phenomenon I have observed in presidential politics — but have not written about. Others have, though, to an extent. I think political analyst Charlie Cook wrote something to the effect that, in discussions of presidential politics, whenever the conversation turns to the dynamics of a campaign, the introduction of the phenomenon is "as sure as the sun coming up in the morning."
It is called the "Six–Year Itch," and it holds that voters are inclined to look favorably upon the out–of–power party by the time the current administration has been in place for six years. This really goes beyond the midterm elections, which, as I say, almost never go well for the incumbent party, and has more to do with the popularity of the incumbent during the time of the midterms.
After all, even popular presidents see their parties lose ground in midterms, especially second midterms (which fall in a re–elected president's sixth year in office). About a week before the second midterm of his presidency, Ronald Reagan's approval rating was 63% — but his personal popularity failed to help his Republican Party maintain its grip on its majority in the Senate — a majority it had held since Reagan was first elected in 1980.
Voters, though, treat legislative elections and executive elections differently. Following the '86 midterms, Reagan's popularity took a beating during the Iran–Contra affair, but he bounced back and helped his vice president win the presidency two years later, when Reagan himself was constitutionally barred from running again.
Following Barack Obama's re–election in 2012, the Washington Post sought to shoot holes in the notion of a six–year itch.
"It's overrated," wrote Aaron Blake for the Post. He wrote that column, it is worth noting, less than six weeks after his employer endorsed Obama's re–election bid so you need to consider that as a counterweight to Blake's argument. I was inclined to agree with him, to an extent, when he wrote, "It's not so much that a second midterm isn't trouble for an incumbent president, as much as midterms in general are trouble. And the American public scratches that itch nearly as often in a president's second year as in his sixth year."
That, it seems to me, supports what I wrote about that political pendulum correction. So does the fact that today more Americans than ever do not identify with either party and call themselves independent.
Whether they do so consciously or not, I think most Americans are inclined to give a president — of either party — the two four–year terms in office to which he is constitutionally limited, all things being equal. I guess Americans tend to be reluctant to admit having made a mistake in electing someone a first time. But it depends on what he does with his first four years, and experience tells me that is largely a matter of perception.
If a negative perception takes hold early — if the president suffers a string of setbacks at the start of a presidency — and the perception of misfortune is allowed to harden, it can be almost impossible to overcome. If the president is perceived to have made a mess of things — as Jimmy Carter was — voters look elsewhere for leadership. If a president is perceived to have exceeded expectations, that reservoir of good will makes a landslide re–election likely.
That, I think, is a big reason why some presidents who don't seem to share the same belief system with many of their constituents nevertheless win their votes for second terms.
In those second terms, a president's popularity really is more of a concern to whoever his party nominates to replace him. That is the true coattail effect of which political analysts often speak, and it is the last (if not only) opportunity for a president to have an electoral influence. Coattails are not really factors in House and Senate midterm elections, which are not national and tend to be decided by issues that matter only within the boundaries of states and congressional districts, but they can be factors in national campaigns.
But there is a catch.
Historically, the United States has not been likely to elect candidates who are nominated by an incumbent president's party to succeed that president. Well, I guess that should be narrowed down to the post–World War II period. Prior to the war, the United States was hardly hesitant to stick with the same party in more than two consecutive presidential elections.
It elected Franklin D. Roosevelt four times, then elected the man who succeeded him following his death and just prior to the end of the war, Harry Truman, to a full four–year term of his own. Just prior to FDR's time, Republicans won three straight elections. In fact, Woodrow Wilson's two terms in office in the early 20th century and Grover Cleveland's two nonconsecutive terms in the late 19th century were the only interruptions in a period when Republicans won 14 of 18 national elections.
But since World War II and Truman's decision not to seek another term in 1952, Americans have elected the same party three straight times only once. That was in the 1980s, when Reagan won twice and then his vice president, George H.W. Bush, was elected to succeed him.
Reagan, as I pointed out, enjoyed solid approval ratings just before his party sustained significant Senate losses in 1986 — but that was on the legislative side. His personal popularity benefited Bush in the 1988 election.
Not that Bush's opponent, Michael Dukakis, didn't seem to do everything in his power to sabotage his own campaign.
And that, I think, underscores an important point about the six–year itch. It is susceptible to the dynamics that are unique to each campaign.
The popularity of the incumbent president seems to have a lot to do with the outcome, but that is no guarantee. Dwight Eisenhower enjoyed approval ratings that exceeded 50% for much of his presidency, but his vice president lost narrowly in his first bid for the presidency.
That leads me to another observation: It is also important for the president's would–be successor to take advantage of the resource of a popular incumbent. To my knowledge, Richard Nixon never distanced himself from Eisenhower, but the Republican ticket was hurt by the recession the country experienced in 1960.
Al Gore didn't embrace the popular Bill Clinton in 2000, and that was a decision that apparently cost him the presidency. Clinton's approval rating just before the 1998 midterms was over 60%, but Gore, while winning the 2000 popular vote, lost the Electoral College.
Since the advent of the polling era, few presidents have been popular with a majority of voters at the ends of their presidencies, and their would–be successors suffered for it. In 1966, after six years of the Kennedy–Johnson presidency, Democrat Lyndon Johnson had an approval rating of about 43% — roughly the share of the vote his vice president, Hubert Humphrey, received on Election Day two years later.
In 1974, after six years of the Nixon–Ford presidency, Republican Gerald Ford's approval rating was around 47%, and he narrowly lost the election to Jimmy Carter in 1976.
History says the voters will have an itch to scratch next year, and Obama, like Ford, hovers below the 50% mark. Ford, of course, had the advantages of incumbency in the election year of 1976, and Obama will not be allowed to seek a third term, which suggests that 2016 will be an uphill climb for the Democrats' nominee.
It looks like it will be the Republicans' race to lose.
Saturday, October 10, 2015
"Censorship, in my opinion, is a stupid and shallow way of approaching the solution to any problem. Though sometimes necessary, as witness a professional and technical secret that may have a bearing upon the welfare and very safety of this country, we should be very careful in the way we apply it, because in censorship always lurks the very great danger of working to the disadvantage of the American nation."
Dwight D. Eisenhower
April 24, 1950
As a student of history, I tend to believe that Dwight Eisenhower could not have been elected president in the modern incarnation of a world that was only beginning to develop when he served as America's commander–in–chief.
Presidents tend to be products of their times, not the other way around. Even if they enter the presidency with a specific agenda, circumstances often force them to change direction in ways they never anticipated. Presidents aren't prophets, and few probably would have chosen the crises they had to face.
But they are also influenced by the technology that exists when they live and serve. Some presidents have been slower than others to embrace emerging technology, and some have been ill–equipped to do so. Nearly every president has been the first to do something, but history remembers firsts like these:
- the first president to be photographed (John Quincy Adams — although he wasn't president when the photograph was taken);
- the first president to ride in a train (Andrew Jackson);
- the first president to have a telephone installed in the White House (Rutherford B. Hayes);
- the first president to ride in a submarine and an airplane (Theodore Roosevelt in both instances);
- the first president to own an automobile (William Howard Taft);
- the first president to give a radio broadcast from the White House (Calvin Coolidge), and
- the first president to appear on television (Franklin D. Roosevelt).
Eisenhower, who was born 125 years ago next Wednesday, was the last president born in the 19th century. He was not far removed from his heroic military leadership in World War II, an experience that clearly shaped his view of the world, and he benefited from the public's good will because of it. But America was only beginning to see emerging technological advances, often made possible by war–related research and development, that would come to play important roles in American politics in the not–so–distant future.
In Ike's day, for example, it wasn't crucial to look good on television because TV wasn't yet a commonplace item in every home. By the time Ike's vice president, Richard Nixon, was elected president, there were a lot more TVs in American homes, and how a candidate came across on television was more important. Today it is impossible to imagine a candidate who does not give a good impression on television being much of a success.
In many ways, that is reflected by a growing tendency to favor candidates because of which demographic group(s) they are believed to bring to the electoral table. The face of America is its president, and Americans increasingly show an inclination for that face to be a particular color or gender — and, in equal and opposite proportions, disdain for what Martin Luther King Jr. would call the content of a person's character.
Ike wasn't very photogenic, when you get right down to it. And he wasn't a stemwinder of a speaker, either. But he had some core virtues. Modern politicians would do well to follow his lead. The country certainly would benefit.
He said things that made a lot of sense, things that both Democrats and Republicans ought to study today, but he showed no penchant for what is known today as a "sound bite." He probably thought they were frivolous and overly simple, but such things win elections these days. Common sense often cannot be boiled down to a single phrase that is suitable for a bumper sticker — although "I Like Ike" wasn't bad for its day.
Ike might have been persuaded to run as a Democrat. He had no party affiliation and was pursued by officials from both parties to seek their nominations. It is interesting that House Speaker Sam Rayburn brushed off talk about Eisenhower seeking the presidency when the topic was first raised in 1948: "Good man," Rayburn said, "but wrong business."
Eisenhower decided not to seek the presidency in 1948, and many people thought he had passed up his only opportunity. It was widely assumed at that time that Tom Dewey, who had lost the 1944 election to Franklin D. Roosevelt, would be elected over Roosevelt's successor, Harry Truman. It was further assumed that Dewey would be re–elected in 1952 — and Eisenhower, at age 66, would be too old to seek the presidency by 1956.
But Truman won in what is still regarded as a major upset, then became phenomenally unpopular and chose not to seek another term in 1952. By that time, Eisenhower was ready to declare himself a Republican after voicing his disagreements with Democratic policies. He may have been just as motivated by a desire to prevent Sen. Robert Taft, a non–interventionist, from winning the Republican nomination.
Eisenhower did deny Taft the nomination — after one of the closest, most bitterly fought presidential nomination battles in American history — but I have always wondered if it had as much (if not more) to do with Taft's unpopular opposition to the postwar Nuremberg trials. (In the interest of fairness, I should point out that future President John F. Kennedy praised Taft in "Profiles in Courage" for taking a principled stand in spite of public opposition.)
During his tenure, Ike balanced the budget three times and cut the federal debt as a share of GDP. He was criticized as a "do–nothing" president, probably because of his domestic record, particularly his record on civil rights. Seen from the 21st century, Ike's record on promoting racial equality appears to be unimpressive, but he took some important steps. Truman gets credit, and rightfully so, for desegregating the military, but Ike took it farther, ending the segregation that existed in VA hospitals and in schools on military installations. His administration also navigated legislative waters in 1957 to pass the first civil rights act since Reconstruction.
Having grown up in Arkansas, one of the first things I learned about Eisenhower was that he enforced a desegregation court order that had been defied by Orval Faubus, the governor of Arkansas. It's worth noting that one of the members of Congress who opposed the president's action was Democrat John F. Kennedy.
I studied this when I took Arkansas history in school; in those days, I think it was a class everyone took in the fifth grade. For me that would have been more than a decade after the Little Rock Central crisis, but my memory is that our textbooks were brand–spanking new, so new that the books squeaked when you opened a cover or turned a page. Ours was the first class in my hometown to study an unbiased account of that moment in our home state's history. Those books had not been in use the previous year, when a text that was less balanced and tended to favor Faubus was used.
The New Republic's Richard Strout, bewildered by Eisenhower's soaring popularity (which seldom strayed below 50%), complained that "the less he does the more they love him." He didn't understand, as Ike did, that the American public was weary from the back–to–back experiences of the Great Depression and World War II. In the '50s, Americans craved stability.
Black Americans were still inclined to heavily support Democrats, as they had been since the ascent of Franklin D. Roosevelt to the presidency in 1932, but in 1956 Eisenhower received 39% of black America's vote when he sought a second term. Within a decade, Republican presidential nominees were receiving much less than 10% of black votes. Win or lose, has any Republican presidential nominee even come close to matching Eisenhower's achievement in the last 60 years?
In his rather modest, soft–spoken Midwestern way, Eisenhower achieved things without feeling the need to resort to self–promotion. He respected constitutional limits — on the use of military power, on the capacity of the government and on the role of the president — and worked within them. He didn't try to get around them.
But there were still times when he wanted credit for things he did.
"The United States never lost a soldier or a foot of ground in my administration," he said after leaving the White House. "We kept the peace. People ask how it happened — by God, it didn't just happen, I'll tell you that."
We could use another Eisenhower today. Unfortunately, no candidate in either party remotely resembles him.
Sunday, September 27, 2015
"There's something happening here
But what it is ain't exactly clear."
Peggy Noonan recently observed in The Wall Street Journal that, so far, the 2016 presidential campaign has been full of surprises.
She made this observation in the context of another column that she wrote earlier this year in which she anticipated a "bloody" battle for the GOP's presidential nomination and a "boring" one for the Democrats' nod.
Now, she writes, the Republican campaign has become "exciting" with a record–setting debate night, and the Democrats' campaign has become "ominous." In other words, the presidential campaign — in which not a single vote has been cast in either party — has been full of surprises for Noonan.
That in itself surprises me. I've been aware of Noonan for 30 years, going back to when she wrote President Reagan's moving speech to the nation after the explosion of the Challenger in January 1986. If she's been around presidential politics at least that long, she should know how unpredictable it can be. Really. When has it ever been anything else?
As we approached the time last spring when Hillary Clinton made her candidacy official, I began to have a peculiar feeling about this campaign. Everyone acted as if it was a done deal that Hillary would not only win the Democrats' nomination but would breeze to victory in the general election.
Now, in my experience, nothing is ever that certain — and I have been following presidential politics most of my life. To be sure, there have been times when non–incumbent front–runners ended up cruising to the nomination as expected, but they usually struggle along the way, losing at least a primary or two. In keeping with history, it hasn't been the fait accompli that Hillary Clinton's march to the nomination appeared to be only a few months ago — and no one has even voted yet.
Now, Hillary insists that she never expected an effortless glide to the nomination, that she always expected it to be competitive. Part of that may be the residual effect of having been the presumptive nominee in 2008 only to lose it to an inexperienced — and largely unknown — guy named Barack Obama when the party's voters began participating in primaries and caucuses. And at least part of it is sure to be P.R.
It reminds me of Election Night 1980, when Hillary's husband lost a narrow race for re–election as Arkansas' governor. I guess you had to be in Arkansas at the time to understand just how popular Bill Clinton was there then — and how shocking it was that he had been voted out of office. True, he lost his first race, in 1974, for the U.S. House seat representing Arkansas' Third District, but he took 48% of the vote in that heavily Republican northwest quadrant of the state. Two years later, he was elected Arkansas' attorney general, facing only modest opposition in the primary and none in the general election. Arkansas elected its statewide officials every two years in those days, and, in 1978, Bill Clinton was elected governor.
1980 turned out to be a Republican year, with Reagan sweeping Jimmy Carter out of the White House and Republicans seizing control of the U.S. Senate. There were clear indications prior to the election that it would turn out that way nationally.
But Arkansas was solidly Democratic in those days. Four years earlier, it had given Carter his highest share of the popular vote outside of Carter's home state of Georgia. Even with a Reagan victory more or less expected, the feeling in Arkansas was that Carter would prevail there again.
But he didn't, and neither did Clinton. Both lost narrowly, and, when speaking to his supporters that night, Clinton said that he and his campaign staff had been aware, in the closing days of the campaign, of shifts within the electorate that pointed to the possibility that he would lose. It didn't come as a shock to them, Clinton insisted.
But I'll guarantee it came as a shock to many Arkansans.
I was probably too young at the time to recognize that for what it was — an early manifestation of the Clintons' obsession with controlling the conversation, whatever it was about. Even if you have been blindsided, never let 'em know that.
That trait is often interpreted as deceitful, and perhaps it is. What I have known about Hillary Clinton for a long time — and others only seem to be understanding now — is that she is a cold fish politically. Her husband is a scoundrel, but he is a likable scoundrel. He has sure–footed natural political instincts. It is why he hasn't lost a general election since he was beaten in that 1980 campaign I mentioned earlier. He lost some presidential primaries but always won the nomination he sought.
Hillary has none of her husband's strengths and all of his weaknesses. It is a combination that isn't likely to hurt her much in the race for the nomination — but it is apt to be troublesome when she is trying to win as many independent and even Republican votes as possible. Because she can't win a national election on the votes from her party alone. No one can — not in a country where more than 40% of voters identify as independents.
Self–defined independents are important because they now outnumber Democrats and Republicans. They may lean to one side or the other, but the fact that they call themselves independent suggests that they cannot be taken for granted.
In spite of what Noonan says, though, I'm not sold — yet — on the narrative that holds that the emergence of Bernie Sanders on the campaign trail and the possible entry of Vice President Joe Biden — who met with Sen. Elizabeth Warren recently in what may have been the strongest signal yet that he will throw his hat in the ring — suggest that a race Noonan once described as "boring" is becoming "ominous." Well, perhaps "ominous" really isn't the right word. Perhaps Noonan — who is a gifted writer — should use a word like "threatening," because, at the moment, that is what this looks like to me.
As usual, I look to history for guidance. All history, really, but I prefer recent history when it is applicable.
There have been times in the last half century when insurgents have won their parties' nominations. Historically, Democrats have been more prone to it — eventual nominees George McGovern, Jimmy Carter, Michael Dukakis, even Bill Clinton and Barack Obama were nowhere in the polls more than a year before the general election when they were the standard bearers for the out–of–power party — so history does suggest that Sanders might have a chance to win the nomination — provided he can peel off some rich donors and make inroads into certain demographics that currently are in Hillary's camp.
But those donors and demographic groups are going to have to get a lot more nervous about Hillary before they'll be ripe for the picking. The fact that Sanders is drawing huge crowds on the campaign trail indicates to me that a sizable segment of the Democrats craves a real contest for this nomination, one that requires Democrats to take clear stands on issues and promote policies that are designed to help the voters, not the candidates.
I think that is true of voters of all stripes. They want to have a conversation about the issues that affect them and their children. They don't want that conversation to be disrupted by distractions. And the emergence of people like Donald Trump, Ben Carson and Carly Fiorina suggests voters have lost confidence in career politicians to confront and vanquish the problems and are looking for someone who can bring common sense from another field to the White House.
I would say that Hillary is still the odds–on favorite to win the nomination, but those odds are growing ever smaller. If Biden challenges her with a platform that appeals to an electorate that has clearly soured on politics as usual, things could get dicey for the Democrats. Hillary Clinton could find herself in political history books with all the other sure things — like Ed Muskie and Gary Hart.
Then there's Donald Trump.
A lot of Republicans fear that, if Trump is denied the GOP's nomination, he will run as an independent — and, in the process, hand the White House to the Democrats for four more years. I suppose they are the new Republicans, the ones whose party has lost five of the last six popular votes, a skid that began with Ross Perot's first independent candidacy.
I'm not so sure about that one, either. Hey, it is still very early in the process, and the folks who fear that Trump, with his deep pockets, will keep the Republicans from winning the presidency by running as an independent overlook a few key points that separate 2016 from 1992.
In 1992, the Republicans had been the incumbent party for a dozen years. They never had majorities in both houses of Congress simultaneously — in fact, for half of that time, Democrats controlled both houses — but the general public perception was that the Republicans had ownership of just about everything.
In 2016, Democrats will have been in charge of the White House for eight years, and the policies that will be debated are policies that, by and large, are products of this administration. If historical trends persist, voters will hold them responsible for conditions that exist, even though Republicans have controlled one or both houses of Congress through most of the Obama presidency; and Trump, although he has been seeking the Republican nomination, was supportive of many of those policies — and may tend to draw as many votes from disaffected Democrats as Republicans if he runs as an independent in the general election.
In short, an independent Trump candidacy won't necessarily work against Republicans, as many fear.
I learned a long time ago not to predict what voters will do until we are close to the time when they have to go to the polls. Attitudes are volatile more than a year from the election, and there may be events ahead that will shape the race in ways we cannot imagine.
One thing that voters in both parties must decide is whether essentially political matters are best left to essentially non–political people. If the answer to that is no, the primaries will bear witness to a thinning of the Republican field. I think that is bound to happen anyway. Virtually none of the GOP candidates mired at 1% or 2% in the polls can afford to stay in the race for long, and I am convinced the field will be half its current size before New Year's Day. At least one of the non–politicians is certain to be among those who drop out.
That will make it possible for all the candidates to participate in the same debate — and voters can judge them side by side. The race will become more focused, as it should.
Tuesday, September 22, 2015
From April 4, 1841 until Nov. 22, 1963, a period of 122 years, America averaged a presidential death about every 15¼ years (we have now gone more than 50 years without an incumbent president's death). Some of those deaths were the clear outcomes of assassination attempts, and others were rumored to be — but never proven to be — assassinations.
No president had ever been the target of two assassination attempts in the space of a single month — presumably because nearly all of the previous assassination attempts were successful — until this day in 1975.
I guess you really couldn't blame President Gerald Ford for wondering if there was a target on his chest. It was the second time in a month that he had been targeted for assassination — and both attempts were carried out by women in the state of California.
As a result of that first attempt, the Secret Service began putting more distance between Ford and the crowds who greeted him at his stops. That strategy was still evolving, but it may have prevented Ford's injury or death when, 40 years ago today, Sara Jane Moore attempted to shoot Ford from across a street in San Francisco. The gun never went off in that first attempt; Moore's gun did go off, but the sights were off, so the shot missed.
The shot may also have been affected by the actions of a retired Marine standing next to Moore. Acting on instinct, he grabbed for her just as she fired the first shot. Before Moore could fire a second shot, the ex–Marine reached for the gun and deflected it; that second shot missed Ford by about six inches, ricocheted and wounded a taxi driver.
It turned out afterward that the retired Marine was gay, and his heroic act brought a lot of unwanted attention to him and his lifestyle. His big problem was that his family found out about his sexual orientation for the very first time through those news reports.
The man was outed, so I hear, by gay politician Harvey Milk, who was a friend of the man. Supposedly, Milk thought it was too good an opportunity to show the community that gays were capable of heroic deeds and advised the San Francisco Chronicle that the man was gay. That was the tragedy of the story. The man became estranged from his family, and his mental and physical health deteriorated over the years. Eventually, he reconciled with his family, but he drank heavily, gained weight and became paranoid and suicidal.
At times later in his life, he expressed regret at having deflected the shot intended for Ford. He was found dead in his bed in February 1989. Earlier in the day, he told a friend he had been turned away by a VA hospital where he had gone about difficulty he had been having breathing due to pneumonia.
I don't know if that was his cause of death or not, but his treatment after the incident speaks volumes about the America of the mid–'70s and the America of today. The man asked that his sexual orientation and other aspects of his life be withheld from publication, but the media ignored his request. President Ford was criticized at the time for not inviting the man to the White House to thank him and was accused of being homophobic. Ford insisted that he did not know until later about the man's sexual orientation; my memory is that the topic was never mentioned the next year when Ford ran for a full four–year term as president.
Ford lost that election, but the ex–Marine, Billy Sipple, lost a lot more than that. He was the unintended victim.
Saturday, September 5, 2015
"In the job of selling himself to the voters, Ford embarked, shortly after Labor Day, on a routine two–day trip to the West Coast. Before it was over, the nation was treated to yet another bizarre illustration of the unpredictability of American presidential politics."
Jules Witcover, Marathon: The Pursuit of the Presidency 1972–1976
For just a moment or two, put yourself in Gerald Ford's position 40 years ago. The summer of 1975 was Ford's first full summer as president; he had succeeded Richard Nixon in August 1974. To say that his first year in office had been challenging would be an understatement.
Most people who are old enough to remember Ford's presidency would tell you that he seemed like a nice guy, a decent guy, whether they agreed with him on most things or not. When Ford became president, the contrast between his easygoing disposition and the sullen Nixon was so stark that he enjoyed astonishing popularity from the start. He irretrievably lost a lot of the public's good will when he pardoned Nixon about a month after becoming president, but he didn't deserve to be targeted for assassination for it. I think even Ford's detractors would agree with that.
Yet it was 40 years ago today that Squeaky Fromme, one of the original members of the Manson Family, tried to assassinate Ford in Sacramento, Calif.
Now, to be fair, Squeaky's motive for trying to shoot Ford apparently had nothing to do with the pardon of Nixon. It was just that, even then, the timing of the attempt seemed spooky to me — just a few days shy of the one–year anniversary of the pardon.
I suppose most people don't remember Squeaky's real name (Lynette). Doesn't really matter, I guess. "Squeaky" suited her.
Most of the first half of 1975 had not been particularly kind to Ford. He came under frequent criticism from hard–liners in his party over his choice of Nelson Rockefeller to be vice president. The economy had been a drain on his presidency; only a few months after taking office, he went on national television to encourage anti–inflation sentiment — since inflation was regarded as a greater threat to economic stability than rising unemployment (which, while high by the standards of the times, makes today's 5.1% rate seem modest by comparison). And the United States had suffered its greatest foreign policy humiliation — up to that time — when the North Vietnamese drove the Americans from South Vietnam. That led to rumblings of concern that Ford's national security team wasn't up to the job.
But in May 1975 Ford's luck began to change, thanks to an event half a world away, in the Gulf of Siam. Inexplicably, the Khmer Rouge seized the merchant ship Mayaguez and held its crew captive. The Ford administration freed the crew with a plan that was both daring and overkill, subjecting the Cambodian mainland to heavy air strikes. It was a shot in the arm for those who had worried about a loss of U.S. influence in the region, and it was leverage that Ford supporters used — unsuccessfully — in an effort to persuade Ronald Reagan and his supporters not to challenge Ford for the Republican nomination in 1976.
The Mayaguez incident was a real turning point for Ford. Economic news was getting better, too. The recession that had plagued the economy was bottoming out. Unemployment was still higher than most would like, but there were signs of a recovery, which was seen as good news for the administration, and Ford announced his candidacy for a full term in July.
Also that July, California Gov. Jerry Brown, a Democrat, would not commit to speak to the annual "Host Breakfast" in Sacramento — a gathering of the state's politically influential business leaders. They saw Brown's response as a snub and, in apparent retaliation, invited Ford, a Republican, to speak. Ford believed California was crucial to his hopes of winning a full term in 1976 and accepted the invitation.
Meanwhile, Fromme apparently had become active in environmental causes and believed (due, in part, to a study that had been released by the Environmental Protection Agency) that California's redwoods were endangered by smog. An article in the New York Times about the study observed that Ford had asked Congress to ease provisions of the 1963 Clean Air Act.
Fromme wanted to bring attention to this matter, and she wanted those in government to be fearful, so she decided to kill the symbolic head of the government. On the morning of Sept. 5, she walked approximately half a mile from her apartment to the state capitol grounds — a short distance from the Senator Hotel, where Ford was staying — a Colt .45 concealed beneath her distinctive red robe.
Ford returned from the breakfast around 9:30 a.m., then left the hotel on foot at 10, his destination the governor's office — and an apparent photo op with Jerry Brown. Along the way, he encountered Fromme, who drew the gun from beneath her robe and pulled the trigger. The weapon had ammunition — but no bullet in the chamber — so the gun didn't fire.
"It wouldn't go off!" Fromme shouted as Secret Service agents took the gun from her hands and wrestled her to the ground. "Can you believe it? It didn't go off."
Ford went on to the capitol and met with Brown for half an hour, only mentioning the assassination attempt in passing as he prepared to leave.
"I thought I'd better get on with my day's schedule," Ford later said.
Two months later, Fromme was convicted of attempting to assassinate the president and received a life sentence. She was paroled in August 2009, nearly three years after Ford's death.
Sunday, August 9, 2015
"The atomic bomb is too dangerous to be loose in a lawless world. That is why Great Britain, Canada and the United States, who have the secret of its production, do not intend to reveal that secret until means have been found to control the bomb so as to protect ourselves and the rest of the world from the danger of total destruction."
Harry Truman, Aug. 9, 1945
Seventy years ago today, an atomic bomb was dropped on one country by another for what was the last time — so far.
The rationale for using the bombs in 1945 was to prevent what was widely believed to be a bloodier invasion of the Japanese mainland. But that has been questioned from the start, and proponents of the use of the bomb have been raising the estimate of lives saved ever since. If one is to defend the use of the atomic bomb, I suppose, any number of lives saved — even one or two, not hundreds of thousands or millions — is justification enough.
But then we start getting into complicated math — because there were civilian casualties, somewhere between 50,000 and 150,000 initially, in Hiroshima and Nagasaki combined. It is hard to be precise. Harry Truman had been told that a quick resolution of the war in the Pacific would save about 200,000 soldiers who could be expected to be lost in an invasion of Japan.
If you are of the opinion that all lives matter, though, then even the low–end civilian casualty figure produces a much smaller net gain than you get by counting only the invasion losses that were prevented.
But that is just one part of the story, and it really only compares apples to oranges. The estimated casualties from an invasion would be accumulated over weeks and months of painstakingly capturing ground from a determined enemy; the civilian casualties I just cited came from the bombs' immediate detonations. To be more accurate, you would have to include those who died weeks and months later from radiation poisoning, which would further reduce the number of lives that were presumably saved.
Those who supported the use of the bomb kept raising the estimate over the years; recent estimates have been in the millions.
Of course, the whole subject of how many lives were saved by dropping two atomic bombs 70 years ago is a purely hypothetical one — and, as a rule, I prefer to avoid hypotheticals. What really is of greater importance is where we are now, seven decades later.
I suppose the nuclear technology that was born in World War II could not have remained secret for long, especially when you consider that so many scientists on both sides had been trying to harness the power of the atom; showing the world what the bomb could do may well have made the world, as some people claimed, safer — for a while.
That lasted only until other countries began to get the technology, by legitimate or illegitimate means — and that was inevitable because, throughout history, unconventional weapons have, in time, become conventional weapons. It might have been delayed for a time by withholding the revelation from the public — but it could never have been kept under wraps forever.
That visual display of the bombs going off — and the photographs of victims that circulated later — may have been more valuable than anyone knew in preventing the use of nuclear weapons in the last 70 years. As more nations have joined the nuclear club, a sense of the awesome responsibility in their hands seems to have come with it. Perhaps that has been because, until fairly recently, everyone who acquired nuclear technology felt the weight of a moral obligation not to use it.
But now nations that sponsor terrorism are acquiring the technology, and I fear they will not hesitate to use it. They have already expressed their objectives, and the annihilation of perceived enemies is at the top of their lists. They have made no attempt to conceal their intention, and the United States has made no real attempt to prevent them from achieving it.
The "secret" to which Truman referred has been out for a long time, and there is much work to be done if his pledge to "control the bomb" is to be fulfilled.
Thursday, July 16, 2015
It was 30 years ago today that Rock Hudson and his old friend and co–star, Doris Day, held a press conference to announce her new cable TV show Doris Day's Best Friends. Hudson was going to be a guest on the show. It was a milestone moment.
All the talk after the press conference wasn't about Day's TV show, however. It was about Hudson, how emaciated he looked, how incomprehensible his speech pattern was. He was practically unrecognizable. There had been rumors about Hudson's health for a long time, and his appearance with Day revived them.
A couple of days later, Hudson traveled to Paris for another round of treatment and collapsed in his hotel room, after which his publicist confirmed that Hudson was ill but told everyone it was inoperable liver cancer. The publicist denied that Hudson suffered from AIDS — but then, only a few days later, he backpedaled and confirmed that Hudson did have AIDS and had been diagnosed more than a year earlier. Hudson hypothesized that he had been exposed to the virus through a blood transfusion when he had heart bypass surgery — long before anyone knew that blood could carry the AIDS virus.
When it was confirmed that Hudson had AIDS, that triggered a lot of speculation about whether Hudson was homosexual. I don't recall if Hudson ever acknowledged that he was gay; I'm inclined to think he didn't, but People magazine ran a cover story about Hudson that discussed his AIDS diagnosis in the context of his sexuality about a month and a half before his death.
The 1980s were a trip. Ask any people you know who are old enough to remember, and they'll tell you the same thing — if not in those words, then in words to that effect.
It was a decade that often provided examples of how kind and generous people can be — and, just as often, provided examples of how petty people can be, too. I guess most decades are like that, but the 1980s seemed to have even more than most.
In such an atmosphere, it was initially regarded as socially acceptable to be dying of liver cancer — but not of AIDS. Then, when it was impossible to continue denying that he was afflicted with AIDS, it became important for the public to believe that Hudson got sick through no fault of his own. That was the phrase that separated the good AIDS sufferers from the bad ones. It was the phrase that cast the blame. Did the sufferer get sick through his own recklessness? Or did he get sick through someone else's negligence? (And, if Hudson had been exposed to the virus via transfusion, it couldn't even be called negligence — because it would be years before anyone knew that AIDS could be transmitted through blood.)
I was in college when the '80s began. At that time, most people were just beginning to hear about a strange new disease that was, apparently, 100% fatal, but before it killed you, it stripped you of your immunities, making you vulnerable to all sorts of things that healthy people shrug off. The vast majority of Americans tended to feel secure because the disease only appeared to be striking certain groups — hemophiliacs, heroin users, Haitians and homosexuals. In fact, it could have been called the "4 H" disease. (Actually, I think it may have been called that for a while.)
They didn't know what to call it, frankly. Because it seemed to be striking the homosexual demographic disproportionately, it was initially called GRID for Gay–Related Immune Deficiency. Understandably, the gay community objected, feeling that the name unfairly singled out homosexuals when the record clearly showed that non–homosexuals were getting the disease, too.
And even though a non–judgmental name — Acquired Immune Deficiency Syndrome (AIDS) — was being used officially by the fall of 1982, the perception persisted that homosexuals had put the health of the rest of the population at risk.
People do strange things when they are frightened. I knew that from my studies of history, and AIDS gave me proof that irrational fear wasn't something that was unique to past generations. Human beings continue to have the potential for irrational fear; I guess they always will.
At first, AIDS was thought to be something of a medical anomaly, like Legionnaires' disease. It didn't take long for people to realize it was not a medical anomaly, but nevertheless the impression that homosexuals, through their reckless behavior, had put everyone at risk persisted. For a time, many people refused to use public restrooms or water fountains, afraid that AIDS sufferers might have been there before them.
It is necessary, you see, to recall the conditions that existed in the 1980s to understand what a big deal it was when Rock Hudson's affliction with AIDS became known in the summer of 1985. As imperfect as his acknowledgement was, it was a milestone in the AIDS story. Until that time, it was hard to get funding for research into the disease; consequently, it took years for the medical community even to discover that it was passed from one person to the next through bodily fluids.
Doctors learned the highest concentrations of the virus could be found in blood and semen; it was present in much lower levels in tears and saliva. Thus, the odds against someone getting sick from exposure to tears or saliva were considerable. Even so, in light of the fact that Hudson's diagnosis was more than a year old, people in the media speculated about the passionate kiss he had shared with actress Linda Evans on Dynasty. Hudson knew he was sick when the scene was filmed, but he did not tell Evans, prompting a certain amount of panic. Some actresses insisted on having kisses written out of their scripts, and the Screen Actors Guild adopted new rules regarding "open–mouth kissing." Actors had to be notified in advance — and were immune from penalty if they decided not to participate.
After the revelation that Hudson, one of Hollywood's most popular leading men, was sick with AIDS, roughly $2 million was raised, and Congress set aside more than $200 million to seek a cure.
Hudson's condition created issues for President Ronald Reagan, who was seen by a significant portion of the population as being indifferent to AIDS. But Reagan and his wife Nancy were Hudson's friends. On the strength of that friendship, a lot of people expected Reagan to break his long public silence on the subject.
But Reagan made no statement about Hudson, even when he had the opportunity at a press conference a couple of weeks before Hudson died.
He did, however, issue a brief statement on the occasion of Hudson's death on Oct. 2, 1985: "Nancy and I are saddened by the news of Rock Hudson's death. He will always be remembered for his dynamic impact on the film industry, and fans all over the world will certainly mourn his loss. He will be remembered for his humanity, his sympathetic spirit and well–deserved reputation for kindness. May God rest his soul."
Hudson's affliction and death were a milestone, however belated, in the fight against AIDS. People began talking about it. It was — and still is — a long way from a cure, but, as the old saying goes, the journey of a thousand miles begins with a single step.
Friday, June 26, 2015
I wasn't working full time last year — at least through the first half of the year — so I didn't enroll in the state–mandated health insurance. I couldn't afford it. (Well, I guess I could have — if I had stopped doing things like, you know, paying rent or eating.)
I am working full time now — and I didn't like being treated like a criminal because I didn't sign up for health insurance — so I signed up before the deadline this year, and now I am in compliance with the law. (Well, that is what I have been told ...)
I had my annual checkup earlier this month. It was the first time I had ever met my doctor. He was assigned to me by the state because the doctor I have been seeing for years isn't on the state–approved list. That meant I had to go through my medical history with a stranger rather than see a doctor who is already familiar with my medical history. I wasn't too thrilled about that.
Nor am I pleased with the fact that this insurance doesn't cover my monthly prescriptions. In fact, it doesn't kick in on anything at all until I pony up six grand.
I pay nearly $375 a month for this policy. I'll be damned if I can see any benefit to it.
Oh, excuse me. There is one benefit. I am entitled to one no–charge visit with my state–assigned doctor per year. I gather it's a no–frills thing. When I met my new doctor, one of the first questions he asked me was how extensive I wanted the appointment to be. I replied that it was my understanding that my policy entitled me to one visit per year.
His response? "Oh. You want the free stuff."
Now, I'm a journalist. I studied journalism in college. I have worked as a reporter, an editor, a journalism instructor. The study of language is a given in my line of work, and I know — probably better than most — how easily language can be manipulated and misused to achieve whatever the user wishes to achieve. Successful politicians know it, too. For that matter, I suppose, most people today have a smattering of a familiarity with how it works.
Anyway, as I just said, I'm shelling out nearly $375 a month for this policy, and the only thing I really get in return — unless I get hit by a bus or something like that (and then it will cost me $6,000 up front) — is one visit with my health care provider per year. What the hell is affordable about that?
It certainly is not free. It costs me nearly $4,500 a year — and it isn't nearly as thorough as the annual checkups for which I paid $300 before the state compelled me to carry this policy.
Oh, sure, I understand why the doctor calls it free stuff. As far as he is concerned, I suppose, it is free.
But not really. The doctor is paid for that annual visit by the health insurer, not the patient (and I use that term loosely). It's a very cursory, bare–bones examination. Whatever the insurer pays for it, the insurer is being overcharged.
Actually, we're all being overcharged so a small group of people can have their policies at discounted rates. That's what the Supreme Court upheld this week — the state's practice of using money from the working class to subsidize health insurance policies for others.
The policy doesn't cover prescriptions, but it does cover contraceptives. I mentioned to a friend that I was having to pay for someone else's contraceptives. This friend, whom I have known since before my high school days, is as devout a supporter of Barack Obama and Obamacare as you will find, and he tried to tell me that subsidizing contraceptives was a social obligation — the same way that we all (symbolically, at least) pitch in for the upkeep of roads and schools.
I really can't follow that logic — although God knows I've tried. Actually, I suppose I can follow it — up to a point. I agree that everyone is entitled to drive on good, well–maintained roads and send their children to good schools.
But contraceptives are different. Subsidizing contraceptives suggests that sex — like good roads and good schools — is a right. I disagree. If sex were a right, people would be entitled to grab anyone off the street and have sex with that person. Never mind whether the other person consented.
The law doesn't permit people to have sex with anyone, consent be damned. In fact, the law has a specific word for the act of sex with others without their consent. It's called rape — or sexual assault in the namby–pamby jurisdictions that won't call things what they are.
Sex is not a right. Sex is a privilege.
Even if you're one–half of a married couple. I have known many men who believed they were entitled to sex with their wives whenever they wanted it (and some even thought they were entitled to sex with their children). It was a wife's duty, they said — and then the courts began to rule that there was such a thing as spousal rape.
Clearly, unless you're talking about masturbation, sex is not a right.
(Now that the courts are handing down rulings that re–define marriage, I expect that sometime in the not–so–distant future there will be similar rulings establishing boundaries for sexual behavior in same–sex marriages. Seems like the next logical step to me. But I digress.
(I don't really care about that, though. I don't really have an opinion on same–sex marriage. I do have an opinion about the health care law.)
But it's that "free stuff" part that really bothers me. People believe it. Clearly, at least one doctor does.
I am an adjunct journalism professor at one of the community colleges here in Dallas, and I was there during the 2012 presidential campaign. I couldn't begin to tell you how many students told me they were voting for Obama "because he's going to give me free health insurance."
From the start, it reminded me of something I have heard all my life: There is no such thing as a free lunch. As a youngster, I thought that was absurd. Of course there were free lunches.
But as I have gotten older I have realized that the statement was true. Even if something appears to be free, you'll wind up paying for it in the end.