Freedom Writing

Sunday, September 27, 2015

The Eternal Randomness of Presidential Politics

"There's something happening here
But what it is ain't exactly clear."

Buffalo Springfield

Peggy Noonan recently observed in The Wall Street Journal that, so far, the 2016 presidential campaign has been full of surprises.

She made this observation in the context of another column that she wrote earlier this year in which she anticipated a "bloody" battle for the GOP's presidential nomination and a "boring" one for the Democrats' nod.

Now, she writes, the Republican campaign has become "exciting" with a record–setting debate night, and the Democrats' campaign has become "ominous." In other words, the presidential campaign — in which not one single vote has been cast in either party — has been full of surprises for Noonan.

That in itself surprises me. I've been aware of Noonan for 30 years, going back to when she wrote President Reagan's moving speech to the nation after the explosion of the Challenger in January 1986. If she's been around presidential politics at least that long, she should know how unpredictable it can be. Really. When has it ever been anything else?

As we approached the time last spring when Hillary Clinton made her candidacy official, I began to have a peculiar feeling about this campaign. Everyone acted as if it was a done deal that Hillary would not only win the Democrats' nomination but would breeze to victory in the general election.

Now, in my experience, nothing is that certain — and I have been following presidential politics most of my life. To be sure, there have been times when non–incumbent front–runners ended up cruising to the nomination as expected, but they usually struggled along the way, losing at least a primary or two. In keeping with history, Hillary Clinton's march to the nomination has not been the fait accompli it appeared to be only a few months ago — and no one has even voted yet.

Now, Hillary insists that she never expected an effortless glide to the nomination, that she always expected it to be competitive. Part of that may be the residual effect of having been the presumptive nominee in 2008, only to lose the nomination to an inexperienced — and largely unknown — guy named Barack Obama when the party's voters began participating in primaries and caucuses. And at least part of it is sure to be P.R.

It reminds me of Election Night 1980, when Hillary's husband lost a narrow race for re–election as Arkansas' governor. I guess you had to be in Arkansas at the time to understand just how popular Bill Clinton was there then — and how shocking it was that he had been voted out of office. True, he lost his first race, in 1974, for the U.S. House seat representing Arkansas' Third District, but he took 48% of the vote in that heavily Republican northwest quadrant of the state. Two years later, he was elected Arkansas' attorney general, facing only modest opposition in the primary and none in the general election. Arkansas elected its statewide officials every two years in those days, and, in 1978, Bill Clinton was elected governor.

1980 turned out to be a Republican year, with Reagan sweeping Jimmy Carter out of the White House and Republicans seizing control of the U.S. Senate. There were clear indications prior to the election that it would turn out that way nationally.

But Arkansas was solidly Democratic in those days. Four years earlier, it had given Carter his highest share of the popular vote outside of his home state of Georgia. Even with a Reagan victory more or less expected, the feeling in Arkansas was that Carter would prevail there again.

But he didn't, and neither did Clinton. Both lost narrowly, and, when speaking to his supporters that night, Clinton said that he and his campaign staff had been aware, in the closing days of the campaign, of shifts within the electorate that pointed to the possibility that he would lose. It didn't come as a shock to them, Clinton insisted.

But I'll guarantee it came as a shock to many Arkansans.

I was probably too young at the time to recognize that for what it was — an early manifestation of the Clintons' obsession with controlling the conversation, whatever it was about. Even if you have been blindsided, never let 'em know that.

That trait is often interpreted as deceitful, and perhaps it is. What I have known about Hillary Clinton for a long time — and others only now seem to understand — is that she is a cold fish politically. Her husband is a scoundrel, but he is a likable scoundrel. He has sure–footed natural political instincts. That is why he hasn't lost a general election since he was beaten in that 1980 campaign I mentioned earlier. He lost some presidential primaries but always won the nomination he sought.

Hillary has none of her husband's strengths and all of his weaknesses. It is a combination that isn't likely to hurt her much in the race for the nomination — but it is apt to be troublesome when she is trying to win as many independent and even Republican votes as possible. Because she can't win a national election on the votes from her party alone. No one can — not in a country where more than 40% of voters identify as independents.

Self–defined independents are important because they now outnumber Democrats and Republicans. They may lean to one side or the other, but the fact that they call themselves independent suggests that they cannot be taken for granted.

In spite of what Noonan says, though, I'm not sold — yet — on the narrative that the emergence of Bernie Sanders on the campaign trail and the possible entry of Vice President Joe Biden — who met with Sen. Elizabeth Warren recently in what may have been the strongest signal yet that he will throw his hat in the ring — mean that a race Noonan once described as "boring" is becoming "ominous." Perhaps "ominous" isn't even the right word. Perhaps Noonan — who is a gifted writer — should use a word like "threatening," because, at the moment, that is what this looks like to me.

As usual, I look to history for guidance. All history, really, but I prefer recent history when it is applicable.

There have been times in the last half century when insurgents have won their parties' nominations. Historically, Democrats have been more prone to it — eventual nominees George McGovern, Jimmy Carter, Michael Dukakis, even Bill Clinton and Barack Obama were nowhere in the polls more than a year before the general elections in which they were the standard bearers for the out–of–power party. So history does suggest that Sanders might have a chance to win the nomination — provided he can peel off some rich donors and make inroads into certain demographics that currently are in Hillary's camp.

But those donors and demographic groups are going to have to get a lot more nervous about Hillary before they'll be ripe for the picking. The fact that Sanders is drawing huge crowds on the campaign trail indicates to me that a sizable segment of the Democrats craves a real contest for this nomination, one that requires Democrats to take clear stands on issues and promote policies that are designed to help the voters, not the candidates.

I think that is true of voters of all stripes. They want to have a conversation about the issues that affect them and their children. They don't want that conversation to be disrupted by distractions. And the emergence of people like Donald Trump, Ben Carson and Carly Fiorina suggests voters have lost confidence in career politicians' ability to confront and vanquish the nation's problems and are looking for someone who can bring common sense from another field to the White House.

I would say that Hillary is still the odds–on favorite to win the nomination, but her advantage is shrinking. If Biden challenges her with a platform that appeals to an electorate that has clearly soured on politics as usual, things could get dicey for the Democrats. Hillary Clinton could find herself in the political history books with all the other sure things — like Ed Muskie and Gary Hart.

Then there's Donald Trump.

A lot of Republicans fear that, if Trump is denied the GOP's nomination, he will run as an independent — and, in the process, hand the White House to the Democrats for four more years. I suppose they are the new Republicans, the ones whose party has lost five of the last six popular votes, a skid that began with Ross Perot's first independent candidacy.

I'm not so sure about that one, either. Hey, it is still very early in the process, and the folks who fear that Trump, with his deep pockets, will keep the Republicans from winning the presidency by running as an independent overlook a few key points that separate 2016 from 1992.

In 1992, the Republicans had been the incumbent party for a dozen years. They never had majorities in both houses of Congress simultaneously — in fact, for half of that time, Democrats controlled both houses — but the general public perception was that the Republicans had ownership of just about everything.

In 2016, Democrats will have been in charge of the White House for eight years, and the policies that will be debated are, by and large, products of this administration. If historical trends persist, voters will hold the Democrats responsible for conditions that exist, even though Republicans have controlled one or both houses of Congress through most of the Obama presidency. And Trump, although he has been seeking the Republican nomination, was supportive of many of those policies; if he runs as an independent in the general election, he may draw as many votes from disaffected Democrats as from Republicans.

In short, an independent Trump candidacy won't necessarily work against Republicans, as many fear.

I learned a long time ago not to predict what voters will do until we are close to the time when they have to go to the polls. Attitudes are volatile more than a year from the election, and there may be events ahead that will shape the race in ways we cannot imagine.

One thing that voters in both parties must decide is whether essentially political matters are best left to essentially non–political people. If the answer to that is no, the primaries will bear witness to a thinning of the Republican field. I think that is bound to happen anyway. Virtually none of the GOP candidates mired at 1% or 2% in the polls can afford to stay in the race for long, and I am convinced the field will be half its current size before New Year's Day. At least one of the non–politicians is certain to be among those who drop out.

That will make it possible for all the candidates to participate in the same debate — and voters can judge them side by side. The race will become more focused, as it should.

Tuesday, September 22, 2015

The Unintended Victim

From April 4, 1841 until Nov. 22, 1963, a period of 122 years, America averaged a presidential death about every 15¼ years: eight deaths in office in all, from William Henry Harrison to John F. Kennedy. (We have now gone more than 50 years without an incumbent president's death.) Some of those deaths were the clear outcomes of assassination attempts, and others were rumored to be — but never proven to be — assassinations.

No president had ever been the target of two assassination attempts in a single month — until this day in 1975.

I guess you really couldn't blame President Gerald Ford for wondering if there was a target on his chest. It was the second time in a month that he had been targeted for assassination — and both attempts were carried out by women in the state of California.

As a result of that first attempt, the Secret Service began putting more distance between Ford and the crowds who greeted him at his stops. That strategy was still evolving, but it may have prevented Ford's injury or death when, 40 years ago today, Sara Jane Moore attempted to shoot Ford from across a street in San Francisco. In the first attempt, Squeaky Fromme's gun never went off. Moore's gun did go off, but its sights were off, so the shot missed.

The shot may also have been affected by the actions of a retired Marine standing next to Moore. Acting on instinct, he reached for her just as she fired, and before she could fire a second shot, he grabbed the gun and deflected it. The bullet missed Ford by about six inches, ricocheted and wounded a taxi driver.

It turned out afterward that the retired Marine was gay, and his heroic act brought a lot of unwanted attention to him and his lifestyle. Worse, his family learned of his sexual orientation for the first time through those news reports.

The man was outed, so I hear, by gay politician Harvey Milk, who was a friend of his. Supposedly, Milk thought it was too good an opportunity to pass up, a chance to show the community that gays were capable of heroic deeds, and advised the San Francisco Chronicle that the man was gay. That was the tragedy of the story. The man became estranged from his family, and his mental and physical health deteriorated over the years. Eventually, he reconciled with his family, but he drank heavily, gained weight and became paranoid and suicidal.

At times later in his life, he expressed regret at having deflected the shot intended for Ford. He was found dead in his bed in February 1989. Earlier in the day, he had told a friend that a VA hospital had turned him away when he sought treatment for breathing difficulty caused by pneumonia.

I don't know if that was his cause of death or not, but his treatment after the incident speaks volumes about the America of the mid–'70s and the America of today. The man asked that his sexual orientation and other aspects of his life be withheld from publication, but the media ignored his request. President Ford was criticized at the time for not inviting the man to the White House to thank him and was accused of being homophobic. Ford insisted that he did not know until later about the man's sexual orientation; my memory is that the topic was never mentioned the next year when Ford ran for a full four–year term as president.

Ford lost that election, but the ex–Marine, Billy Sipple, lost a lot more than that. He was the unintended victim.

Saturday, September 5, 2015

Taking Aim at Jerry Ford

"In the job of selling himself to the voters, Ford embarked, shortly after Labor Day, on a routine two–day trip to the West Coast. Before it was over, the nation was treated to yet another bizarre illustration of the unpredictability of American presidential politics."

Jules Witcover, Marathon: The Pursuit of the Presidency 1972–1976

For just a moment or two, put yourself in Gerald Ford's position 40 years ago. The summer of 1975 was Ford's first full summer as president; he had succeeded Richard Nixon in August 1974. To say that his first year in office had been challenging would be an understatement.

Most people who are old enough to remember Ford's presidency would tell you that he seemed like a nice guy, a decent guy, whether they agreed with him on most things or not. When Ford became president, the contrast between his easygoing disposition and the sullen Nixon was so stark that he enjoyed astonishing popularity from the start. He irretrievably lost a lot of the public's good will when he pardoned Nixon about a month after becoming president, but he didn't deserve to be targeted for assassination for it. I think even Ford's detractors would agree with that.

Yet it was 40 years ago today that Squeaky Fromme, one of the original members of the Manson Family, tried to assassinate Ford in Sacramento, Calif.

Now, to be fair, Squeaky's motive for trying to shoot Ford apparently had nothing to do with the pardon of Nixon. It was just that, even then, the timing of the attempt seemed spooky to me — just a few days shy of the one–year anniversary of the pardon.

I suppose most people don't remember Squeaky's real name (Lynette). Doesn't really matter, I guess. "Squeaky" suited her.

Most of the first half of 1975 had not been particularly kind to Ford. He came under frequent criticism from hard–liners in his party over his choice of Nelson Rockefeller to be vice president. The economy had been a drain on his presidency; only a few months after taking office, he went on national television to encourage anti–inflation sentiment — since inflation was regarded as a greater threat to economic stability than rising unemployment (which was high by the standards of the times and well above today's 5.1% rate). And the United States had suffered its greatest foreign policy humiliation — up to that time — when the North Vietnamese drove the Americans from South Vietnam. That led to rumblings of concern that Ford's national security team wasn't up to the job.

But in May 1975, Ford's luck began to change, thanks to an event half a world away, in the Gulf of Siam. Inexplicably, the Khmer Rouge seized the merchant ship Mayaguez and held its crew captive. The Ford administration freed the crew with a plan that was daring, even overkill, subjecting the Cambodian mainland to heavy air strikes. It was a shot in the arm for those who had worried about a loss of U.S. influence in the region, and it was leverage that Ford supporters used — unsuccessfully — in an effort to persuade Ronald Reagan and his supporters not to challenge Ford for the Republican nomination in 1976.

The Mayaguez incident was a real turning point for Ford. Economic news was getting better, too. The recession that had plagued the economy was bottoming out. Unemployment was still higher than most would like, but there were signs of a recovery, which was seen as good news for the administration, and Ford announced his candidacy for a full term in July.

Also that July, California Gov. Jerry Brown, a Democrat, would not commit to speak to the annual "Host Breakfast" in Sacramento — a gathering of the state's politically influential business leaders. They saw Brown's response as a snub and, in apparent retaliation, invited Ford, a Republican, to speak. Ford believed California was crucial to his hopes of winning a full term in 1976 and accepted the invitation.

Meanwhile, Fromme apparently had become active in environmental causes and believed (due, in part, to a study that had been released by the Environmental Protection Agency) that California's redwoods were endangered by smog. An article in the New York Times about the study observed that Ford had asked Congress to ease provisions of the 1963 Clean Air Act.

Fromme wanted to bring attention to this matter, and she wanted those in government to be fearful, so she decided to kill the symbolic head of the government. On the morning of Sept. 5, she walked approximately half a mile from her apartment to the state capitol grounds — a short distance from the Senator Hotel, where Ford was staying — with a Colt .45 concealed beneath her distinctive red robe.

Ford returned from the breakfast around 9:30 a.m., then left the hotel on foot at 10, his destination the governor's office — and an apparent photo op with Jerry Brown. Along the way, he encountered Fromme, who drew the gun from beneath her robe and pulled the trigger. The weapon's magazine was loaded — but there was no round in the chamber — so the gun didn't fire.

"It wouldn't go off!" Fromme shouted as Secret Service agents took the gun from her hands and wrestled her to the ground. "Can you believe it? It didn't go off."

Ford went on to the capitol and met with Brown for half an hour, only mentioning the assassination attempt in passing as he prepared to leave.

"I thought I'd better get on with my day's schedule," Ford later said.

Two months later, Fromme was convicted of attempting to assassinate the president and received a life sentence. She was paroled in August 2009, nearly three years after Ford's death.

Sunday, August 9, 2015

Seventy Years Ago Today

"The atomic bomb is too dangerous to be loose in a lawless world. That is why Great Britain, Canada and the United States, who have the secret of its production, do not intend to reveal that secret until means have been found to control the bomb so as to protect ourselves and the rest of the world from the danger of total destruction."

Harry Truman
Aug. 9, 1945

Seventy years ago today, an atomic bomb was dropped on one country by another for what was the last time — so far.

The rationale for using the bombs in 1945 was to prevent an invasion of the Japanese mainland, which was widely expected to be far bloodier. But that rationale has been questioned from the start, and proponents of the use of the bomb have been raising the estimate of lives saved ever since. If one is to defend the use of the atomic bomb, I suppose, any number of lives saved, even one or two rather than hundreds of thousands or millions, justifies it.

But then we start getting into complicated math — because there were between 50,000 and 150,000 initial civilian casualties in Hiroshima and Nagasaki combined. It is hard to be precise. Harry Truman had been told that a quick resolution of the war in the Pacific would save about 200,000 soldiers who could be expected to be lost in an invasion of Japan.

If you are of the opinion that all lives matter, though, then even the low–end civilian casualty figure produces a much smaller net gain than you get by focusing only on the invasion that was prevented.

But that is just one part of the story, and it really only compares apples to oranges. The estimated casualties from an invasion would be accumulated over weeks and months of painstakingly capturing ground from a determined enemy; the civilian casualties I just cited came from the bombs' immediate detonations. To be more accurate, you would have to include those who died weeks and months later from radiation poisoning, which would further reduce the number of lives that were presumably saved.

Those who supported the use of the bomb kept raising the estimate over the years; recent estimates have been in the millions.
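For the curious, the back–of–the–envelope arithmetic at issue can be made concrete. What follows is a minimal sketch in Python using only the figures cited above (the 200,000 estimate Truman was reportedly given and the 50,000 to 150,000 range of initial civilian casualties); the radiation figure in the final lines is a placeholder assumption of mine, not a historical number.

```python
# Back-of-the-envelope "net lives saved" arithmetic, using the figures
# cited in this post. Illustrative only; not a historical estimate.

soldiers_saved = 200_000  # the estimate Truman was reportedly given

for civilian_deaths in (50_000, 150_000):  # low- and high-end initial casualties
    print(f"Initial civilian deaths {civilian_deaths:,}: "
          f"net lives saved {soldiers_saved - civilian_deaths:,}")

# Deaths from radiation poisoning in the weeks and months that followed
# (left unquantified above) shrink the net further. The figure below is
# purely hypothetical, chosen only to show the effect:
radiation_deaths = 50_000
net = soldiers_saved - 150_000 - radiation_deaths
print(f"High-end casualties plus {radiation_deaths:,} later deaths: "
      f"net lives saved {net:,}")
```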

Of course, the whole subject of how many lives were saved by dropping two atomic bombs 70 years ago is a purely hypothetical one — and, as a rule, I prefer to avoid hypotheticals. What really is of greater importance is where we are now, seven decades later.

I suppose the nuclear technology that was born in World War II could not have remained secret for long, especially when you consider that so many scientists on both sides had been trying to harness the power of the atom; showing the world what the bomb could do may well have made the world, as some people claimed, safer — for a while.

Until other countries began to get the technology, by legitimate or illegitimate means, and that was inevitable because, throughout history, unconventional weapons have, in time, become conventional weapons. It might have been delayed for a time by withholding the revelation from the public — but it could never have been kept under wraps forever.

That visual display of the bombs going off — and the photographs of victims that circulated later — may have been more valuable than anyone knew in preventing the use of nuclear weapons in the last 70 years. As more nations have joined the nuclear club, a sense of the awesome responsibility in their hands seems to have come with it. Perhaps that has been because, until fairly recently, everyone who acquired nuclear technology felt the weight of a moral obligation not to use it.

But now nations that sponsor terrorism are acquiring the technology, and I fear they will not hesitate to use it. They have already expressed their objectives, and the annihilation of perceived enemies is at the top of their lists. They have made no attempt to conceal their intention, and the United States has made no real attempt to prevent them from achieving it.

The "secret" to which Truman referred has been out for a long time, and there is much work to be done if his pledge to "control the bomb" is to be fulfilled.

Thursday, July 16, 2015

Rock Hudson's Revelation

It was 30 years ago today that Rock Hudson and his old friend and co–star, Doris Day, held a press conference to announce her new cable TV show, Doris Day's Best Friends. Hudson was going to be a guest on the show. It turned out to be a milestone moment.

All the talk after the press conference wasn't about Day's TV show, however. It was about Hudson, how emaciated he looked, how incomprehensible his speech pattern was. He was practically unrecognizable. There had been rumors about Hudson's health for a long time, and his appearance with Day revived them.

A couple of days later, Hudson traveled to Paris for another round of treatment and collapsed in his hotel room, after which his publicist confirmed that Hudson was ill but told everyone it was inoperable liver cancer. The publicist denied that Hudson suffered from AIDS — but then, only a few days later, he backpedaled and confirmed that Hudson did have AIDS and had been diagnosed more than a year earlier. Hudson hypothesized that he had been exposed to the virus through a blood transfusion when he had heart bypass surgery — long before anyone knew that blood carried the AIDS virus.

When it was confirmed that Hudson had AIDS, that triggered a lot of speculation about whether Hudson was homosexual. I don't recall if Hudson ever acknowledged that he was gay; I'm inclined to think he didn't, but People magazine ran a cover story about Hudson that discussed his AIDS diagnosis in the context of his sexuality about a month and a half before his death.

The 1980s were a trip. Ask anyone you know who is old enough to remember, and they'll tell you the same thing — if not in those words, then in words to that effect.

It was a decade that often provided examples of how kind and generous people can be — and, just as often, examples of how petty people can be, too. I guess most decades are like that, but the 1980s seemed to offer more of both than most.

In such an atmosphere, it was initially regarded as socially acceptable to be dying of liver cancer — but not of AIDS. Then, when it was impossible to continue denying that he was afflicted with AIDS, it became important for the public to believe that Hudson got sick through no fault of his own. That was the phrase that separated the good AIDS sufferers from the bad ones. It was the phrase that cast the blame. Did the sufferer get sick through his own recklessness? Or did he get sick through someone else's negligence? (And, if Hudson had been exposed to the virus via transfusion, it couldn't even be called negligence — because it would be years before anyone knew that AIDS could be transmitted through blood.)

I was in college when the '80s began. At that time, most people were just beginning to hear about a strange new disease that was, apparently, 100% fatal — but before it killed you, it stripped you of your immunities, making you vulnerable to all sorts of things that healthy people shrug off. The vast majority of Americans tended to feel secure because the disease appeared to be striking only certain groups — hemophiliacs, heroin users, Haitians and homosexuals. In fact, it could have been called the "4 H" disease. (Actually, I think it may have been called that for a while.)

They didn't know what to call it, frankly. Because it seemed to be striking the homosexual demographic disproportionately, it was initially called GRID for Gay–Related Immune Deficiency. Understandably, the gay community objected, feeling that the name unfairly singled out homosexuals when the record clearly showed that non–homosexuals were getting the disease, too.

And even though a non–judgmental name — Acquired Immune Deficiency Syndrome (AIDS) — was being used officially by the fall of 1982, the perception persisted that homosexuals had put the health of the rest of the population at risk.

People do strange things when they are frightened. I knew that from my studies of history, and AIDS gave me proof that irrational fear wasn't something that was unique to past generations. Human beings continue to have the potential for irrational fear; I guess they always will.

At first, AIDS was thought to be something of a medical anomaly, like Legionnaires' disease. It didn't take long for people to realize it was not a medical anomaly, but nevertheless the impression that homosexuals, through their reckless behavior, had put everyone at risk persisted. For a time, many people refused to use public restrooms or water fountains, afraid that AIDS sufferers might have been there before them.

It is necessary, you see, to recall the conditions that existed in the 1980s to understand what a big deal it was when Rock Hudson's affliction with AIDS became known in the summer of 1985. As imperfect as his acknowledgement was, it was a milestone in the AIDS story. Until that time, it was hard to get funding for research into the disease; consequently, it took years for the medical community even to discover that it was passed from one person to the next through bodily fluids.

Doctors learned the highest concentrations of the virus could be found in blood and semen; it was present in much lower levels in tears and saliva. Thus, the odds against someone getting sick from exposure to tears or saliva were considerable. Even so, in light of the fact that Hudson's diagnosis was more than a year old, people in the media speculated about the passionate kiss he had shared with actress Linda Evans on Dynasty. Hudson knew he was sick when the scene was filmed, but he did not tell Evans, prompting a certain amount of panic. Some actresses insisted on having kisses written out of their scripts, and the Screen Actors Guild adopted new rules regarding "open–mouth kissing." Actors had to be notified in advance — and were immune from penalty if they declined to participate.

After the revelation that Hudson, one of Hollywood's most popular leading men, was sick with AIDS, roughly $2 million was raised, and Congress set aside more than $200 million to seek a cure.

Hudson's condition created issues for President Ronald Reagan, who was seen by a significant portion of the population as being indifferent to AIDS. But Reagan and his wife Nancy were Hudson's friends. On the strength of that friendship, a lot of people expected Reagan to break his long public silence on the subject.

But Reagan made no statement about Hudson, even when he had the opportunity at a press conference a couple of weeks before Hudson died.

He did, however, issue a brief statement on the occasion of Hudson's death on Oct. 2, 1985: "Nancy and I are saddened by the news of Rock Hudson's death. He will always be remembered for his dynamic impact on the film industry, and fans all over the world will certainly mourn his loss. He will be remembered for his humanity, his sympathetic spirit and well–deserved reputation for kindness. May God rest his soul."

Hudson's affliction and death were a milestone, however belated, in the fight against AIDS. People began talking about the disease. It was — and still is — a long way from a cure, but, as the old saying goes, the journey of a thousand miles begins with a single step.

Friday, June 26, 2015

Free Stuff?

I wasn't working full time last year — at least through the first half of the year — so I didn't enroll in the state–mandated health insurance. I couldn't afford it. (Well, I guess I could have — if I had stopped doing things like, you know, paying rent or eating.)

I am working full time now — and I didn't like being treated like a criminal because I didn't sign up for health insurance — so I signed up before the deadline this year, and now I am in compliance with the law. (Well, that is what I have been told ...)

I had my annual checkup earlier this month. It was the first time I had ever met my doctor. He was assigned to me by the state because the doctor I have been seeing for years isn't on the state–approved list. That meant I had to go through my medical history with a stranger rather than see a doctor who is already familiar with my medical history. I wasn't too thrilled about that.

Nor am I pleased with the fact that this insurance doesn't cover my monthly prescriptions. In fact, it doesn't kick in on anything at all until I pony up six grand.

I pay nearly $375 a month for this policy. I'll be damned if I can see any benefit to it.

Oh, excuse me. There is one benefit. I am entitled to one no–charge visit with my state–assigned doctor per year. I gather it's a no–frills thing. When I met my new doctor, one of the first questions he asked me was how extensive I wanted the appointment to be. I replied that it was my understanding that my policy entitled me to one visit per year.

His response? "Oh. You want the free stuff."

Now, I'm a journalist. I studied journalism in college. I have worked as a reporter, an editor, a journalism instructor. The study of language is a given in my line of work, and I know — probably better than most — how easily language can be manipulated and misused to accomplish whatever the user wishes. Successful politicians know it, too. For that matter, I suppose, most people today have a smattering of familiarity with how it works.

Anyway, as I just said, I'm shelling out nearly $375 a month for this policy, and the only thing I really get in return — unless I get hit by a bus or something like that (and then it will cost me $6,000 up front) — is one visit with my health care provider per year. What the hell is affordable about that?

It certainly is not free. It costs me nearly $4,500 a year — and it isn't nearly as thorough as the annual checkups for which I paid $300 before the state compelled me to carry this policy.

Oh, sure, I understand why the doctor calls it free stuff. As far as he is concerned, I suppose, it is free.

But not really. The doctor is paid for that annual visit by the health insurer, not the patient (and I use that term loosely). It's a very cursory, bare–bones examination. Whatever the insurer pays for it, the insurer is being overcharged.

Actually, we're all being overcharged so a small group of people can have their policies at discounted rates. That's what the Supreme Court upheld this week — the state's practice of using money from the working class to subsidize health insurance policies for others.

The policy doesn't cover prescriptions, but it does cover contraceptives. I mentioned to a friend that I was having to pay for someone else's contraceptives. This friend, whom I have known since before my high school days, is as devout a supporter of Barack Obama and Obamacare as you will find, and he tried to tell me that subsidizing contraceptives was a social obligation — the same way that we all (symbolically, at least) pitch in for the upkeep of roads and schools.

I really can't follow that logic — although God knows I've tried. Actually, I suppose I can follow it — up to a point. I agree that everyone is entitled to drive on good, well–maintained roads and send their children to good schools.

But contraceptives are different. Subsidizing contraceptives suggests that sex — like good roads and good schools — is a right. I disagree. If sex were a right, people would be entitled to grab anyone off the street and have sex with that person. Never mind if the other person didn't give his/her consent.

The law doesn't permit people to have sex with just anyone, consent be damned. In fact, the law has a specific word for the act of sex with others without their consent. It's called rape — or sexual assault in the namby–pamby jurisdictions that won't call things what they are.

Sex is not a right. Sex is a privilege.

Even if you're one–half of a married couple. I have known many men who believed they were entitled to sex with their wives whenever they wanted it (and some even thought they were entitled to sex with their children). It was a wife's duty, they said — and then the courts began to rule that there was such a thing as spousal rape.

Clearly, unless you're talking about masturbation, sex is not a right.

(Now that the courts are handing down rulings that re–define marriage, I expect that sometime in the not–so–distant future there will be similar rulings establishing boundaries for sexual behavior in same–sex marriages. Seems like the next logical step to me. But I digress.

(I don't really care about that, though. I don't really have an opinion on same–sex marriage. I do have an opinion about the health care law.)

But it's that "free stuff" part that really bothers me. People believe it. Clearly, at least one doctor does.

I am an adjunct journalism professor at one of the community colleges here in Dallas, and I was there during the 2012 presidential campaign. I couldn't begin to tell you how many students told me they were voting for Obama "because he's going to give me free health insurance."

From the start, it reminded me of something I have heard all my life: There is no such thing as a free lunch. As a youngster, I thought that was absurd. Of course there were free lunches.

But as I have gotten older I have realized that the statement was true. Even if something appears to be free, you'll wind up paying for it in the end.

Sunday, June 21, 2015

Learning From History

The mass shooting at an historically black church in Charleston, S.C., last week is disturbing on so many levels. It is overwhelming from a distance. I can only imagine what it must be like closer to Ground Zero.

There is, above and beyond all else, the disturbing story of the event itself — a young man sat through an hour or so of Bible study with a group of (presumably) strangers, then (apparently) calmly opened fire on them. Nine people were killed.

In case you're wondering why I used the parentheses on a couple of words in the preceding paragraph, it is because there is still so much we do not know. As there usually is at this stage of the investigation into a criminal act. But TV has conditioned many people to believe that all loose ends can be tied up in an hour's time, allowing for commercials — so they leap to conclusions without knowing all the facts that can put an event into context.

Usually, those conclusions are self–serving rubbish.

The intention is not, as some people would have you believe, to justify what happened. The intention is to satisfy the legal requirements for defining a criminal act accurately. Those definitions have evolved through millennia of human experience. When one person kills another, that is a tragedy, but the law must know certain things before a case can be dealt with appropriately.

There are a lot of emotions swirling around this case, and I sympathize with that, but the law should not be administered on emotion. It should be administered on facts. In a criminal case, especially one that involves the death of one or more people at the hands of another, facts establish the legal nature of the crime. Emotion, as we should have learned from recent events, is often mistaken and can cause another tragedy.

Do you recall the 1996 Atlanta Olympics? A bomb went off the night of July 27, 1996, at the Olympics, killing one person and injuring 111 more. A security guard discovered the bomb before it went off and managed to clear a lot of people from the area, probably preventing more injuries and deaths, but he still became the prime suspect in the case. The media was relentlessly aggressive in its pursuit of him, treating him as if it was a foregone conclusion that he was guilty — until he was cleared of all accusations a few months later. He had never been formally charged with anything, but the damage to his reputation had been done. His job was terminated on the basis of unreliable information provided to his employer.

Our experience has told us that a person is justified in killing another if the other person poses a threat to the first person's life. In other words, we have concluded that self–defense is a valid, albeit regrettable, reason to kill. So the law must answer the question: did this young man kill in self–defense? There has never been any indication whatsoever that he did, so any legal provisions on that point can be set aside.

And, because of our experience, we have decided that if someone causes another person's death through negligence, that, too, should be treated differently by the justice system than murder would be treated. The case in Charleston clearly did not result from negligence.

Our experience has told us that it is not right to hold someone responsible for his or her acts if he or she is insane. That is much more of a gray area, and it requires weeks and months of evaluation before it can be determined. It may also involve questioning a suspect's doctor to find out whether the suspect had any known mental issues and/or had been prescribed any medication to treat such a condition. At this point, it appears that the answer to both questions would be "yes" in this case.

The law also needs to know whether the killing was premeditated. If someone planned to kill another, that is — and should be — handled differently than a killing that occurs out of the blue (e.g., a couple of people get into an argument that turns into a fight with guns or knives or even fists, and one of them is killed — tragic but usually not premeditated).

A premeditated killing can be punished by death. Convictions for unpremeditated killings usually result in prison time.

If it was premeditated, the law needs to know if it was the result of a conspiracy. Did someone else participate, either at the scene or behind the scenes? That requires time, too — sometimes a lot of it. Certainly more than an hour. (Heck, I've heard suggestions all my life that the John F. Kennedy assassination was a conspiracy, but, after more than 50 years, that remains a matter of opinion only.)

In this case, the evidence suggests it was premeditated and, to this point, does not suggest that there was a conspiracy, but the investigators need to be allowed time to talk to all relevant witnesses and review available evidence. I have heard of no second shooters, nor have I heard any suggestion that someone paid this young man to shoot these people. It appears he acted alone, but the law must be satisfied, and laws vary from state to state.

And that isn't all there is to it.

Until recently, I would have thought that most killings of police officers are not premeditated. In most cases, the lack of premeditation would qualify a killing for a lesser charge, but the killing of a police officer is treated as a capital crime (and, therefore, eligible for the death penalty). It is our way of discouraging people from killing law enforcement officers (which, I suppose, includes judges and other court officers) under any circumstances.

As I say, the law evolves over time, and there are provisions in the law that make it legally possible to prosecute an unpremeditated killing as if it were premeditated — if someone is killed while another crime is taking place (e.g., a convenience store robbery), the person responsible for the killing is eligible for the death penalty even though the death of another person was not intended.

If you have ever served on a jury, you know that legal verdicts are seldom, if ever, as simple as "guilty" or "not guilty." The verdict forms consist of many pages of questions, most of which are designed to resolve specific issues like the ones I have just outlined.

Why must so many questions be answered? Well, it has a lot to do with the experiences that the Founding Fathers had. They came from environments where it was customary for the state to take a person into custody and hold that person indefinitely without informing him or her of the reason. The Founding Fathers believed all people were entitled to due process — and that suspects deserved protection against being repeatedly arrested and charged for the same offenses until a jury finally convicted them. (They called that "double jeopardy.")

Many times, convictions are appealed, and appellate courts have been known to overturn convictions for entirely unanticipated reasons. That is why it is so critically important for prosecutors to have their ducks in a row when they go to court. The burden of proof is on them, not on the defense. If a verdict is overturned, it is usually because of a mistake someone on the prosecution's team made.

I'm not a lawyer, but I have served on juries before, and I covered trials in my reporting days. Not having studied law, I don't know the history of law and justice, and I readily admit that I could be wrong on this, but my impression is that law and logic (which is a class everyone in Arts and Sciences was required to take when I was in college) must have evolved simultaneously. Jury verdict forms remind me so much of the logic questions I had to plot — If all A are B, and all B are C, then all A are C.

(For people who never had to take Logic, I suppose the most appropriate comparison would be your standard flow chart — "If the answer is yes, proceed. If the answer is no, stop.")
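To make the flow–chart comparison concrete, here is a toy sketch in Python. The categories, their order and the labels are illustrative assumptions on my part; real jury instructions vary by state and are far more detailed.

```python
# A toy version of the flow-chart logic behind a verdict form.
# Categories, order and labels are illustrative assumptions only;
# real jury instructions vary by state and are far more detailed.

def classify_killing(self_defense: bool, insane: bool, negligent: bool,
                     premeditated: bool, during_felony: bool,
                     victim_was_officer: bool) -> str:
    if self_defense:
        return "justified killing (no crime)"
    if insane:
        return "not guilty by reason of insanity"
    # Felony-murder and police-officer provisions treat some
    # unpremeditated killings as if they were premeditated.
    if premeditated or during_felony or victim_was_officer:
        return "capital murder (death-penalty eligible)"
    if negligent:
        return "negligent homicide (lesser charge)"
    return "murder (unpremeditated)"

# Example: a planned killing with no legal excuse.
print(classify_killing(self_defense=False, insane=False, negligent=False,
                       premeditated=True, during_felony=False,
                       victim_was_officer=False))
```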

It seems to me that is what much of the application of the law comes down to — logic, the law of averages, probability. Comparing cases without context often amounts to comparing apples to oranges when it would be more instructive to compare apples to apples. The fact is, though, that each person and each case is different from all the others. People are individual and imperfect; they don't share the same brain or possess the same experiences.

So, it seems to me, as long as the application of law is made by imperfect people, there will be imperfections in the system. But that doesn't mean we stop trying to make the system better. We try to learn from the situations we face, especially the ones that challenge what we have always believed, just as we learn from history in all walks of life. Discoveries are important in the evolving story of humankind. One discovery begets another, and together they form the building blocks for yet another discovery. We could not have sent rockets to the moon if we had not, at some point along the way, discovered how to make and control fire.

History is also about learning from mistakes and errors in judgment — what we have tried in the past that worked and what didn't (so we don't keep spinning our wheels). Sometimes history is shameful, but no useful purpose is served by revising it. The useful thing is to remember, like the child who learns from experience not to put his hand on a hot stove. That child probably won't try to go through life without a stove, but he will learn when not to touch one. Do you recall what George Santayana wrote? "Those who cannot remember the past are condemned to repeat it." Wise words.

My father was a religion professor at a small liberal arts college in Arkansas. One summer, when I was 13, my family spent a couple of months in Austria. We took side trips, and some of the places my father wanted to visit were important scenes from Germany's Nazi past — concentration camps, Hitler's retreat (Eagle's Nest) in the German Alps near the centuries–old, tiny town of Berchtesgaden. He wanted to take pictures of those places, pictures he could use in his college classes. He had heard about those places in the news as a teenager; this was his opportunity to see them.

But he was disappointed. The German government, in its misguided attempt to erase Nazism from its national consciousness, had allowed the concentration camps to fall into disrepair, and Eagle's Nest had become a restaurant and beer garden (the irony being that, although Hitler occasionally drank in private, he never drank in public and actually gave up alcohol altogether in his last years). In fact, the first law the Allied Control Council enacted after World War II abolished all Nazi symbols. Possession of Nazi flags has been forbidden in many countries since; the importation or display of them is particularly frowned upon in Germany.

What has been achieved by that in recent years? Well, neo–Nazism has been on the rise, and skepticism about the Holocaust abounds. Those who cannot remember the past ...

I am not a professional historian, but I have been studying history all my life, and one of the things I have learned from it is this: No possible good comes from denying the past. And that is why I reject the popular move to ban the Confederate flag. The Civil War was certainly a dark chapter in America's history, but there were important lessons to be learned from it. Banning the Confederate flag would prevent those lessons from being learned and, perhaps, as in modern Germany, at some point give rise to a new Confederate movement among the American young.

Would I have a Confederate flag in my home? Absolutely not — nor would I have a Nazi flag in my home. But pretending they didn't exist or that they didn't witness some horrific things is to deny history. And I have learned that history always has the last word.

Recently, the Supreme Court upheld the decision of Texas, the state where I live, not to allow people to have Confederate flags on their license plates on the grounds that license plates are government property. I support that ruling.

But possession of the flag is not banned. It is still legal to have one; choosing to display it is a matter of free speech on an individual basis. Some people find that offensive, just as some people find the burning of the American flag to be offensive, but free speech is protected under the First Amendment. Former Texas Gov. Rick Perry spoke in an interview yesterday about how the Confederate flag divides people. He is right about that, but we owe it to ourselves and future generations not to ban the symbols of hate but to learn from them.

History is not served when it is whitewashed.

Wednesday, June 3, 2015

Taking a Stroll in Space

"I'm coming back in ... and it's the saddest moment of my life."

Edward H. White
June 3, 1965

Fifty years ago today, an American walked in space for the first time.

The man who took the first walk in space, though, was not an American but a Russian. It was during the heated days of the U.S.–U.S.S.R. space race, and every first in the race to the moon was treated like something truly special, even if it wasn't.

Well, maybe it was special at the time, but not so much later on.

On this day in 1965, Edward White became the first American to walk in space. He wasn't one of the original "Mercury 7" astronauts. He was part of the second group chosen — along with Neil Armstrong, who would become the first man to walk on the moon, and Jim Lovell, who flew to the moon twice but never landed there.

White was the pilot of Gemini 4, the second manned space flight in NASA's Project Gemini. James McDivitt was the command pilot. White spent about 20 minutes outside the spacecraft, then reluctantly returned.

It was — without question — the highlight of the mission. Most people don't know that another first was planned on that mission, but it didn't work out nearly as well. McDivitt was slated to attempt a space rendezvous — an orbital maneuver that became almost routine in later missions but failed on this occasion. McDivitt made up for it a few years later as commander of Apollo 9, which was the first manned flight test of the lunar module.

(And he was Apollo spacecraft program manager from 1969 to 1972, the period in which all of NASA's missions to the moon — so far — were launched.)

The lunar module was the vehicle that carried astronauts to the surface of the moon. It was necessary for the command module to perform a space rendezvous with the lunar module before that part of the mission could commence.

So it is safe to say that McDivitt secured a better spot for himself in NASA's history later in his career than he did 50 years ago.

White, too, is remembered for something other than his space walk on Gemini 4 — something that was probably more important to the success of the program in the long run but hardly as personally triumphant. On Jan. 27, 1967, during a launch rehearsal for the first manned Apollo mission, White and two other astronauts perished when a fire broke out in the pure–oxygen environment of the cabin.

The astronauts' deaths revealed spacecraft flaws that NASA resolved before resuming the Apollo program, which went on to put 12 men on the moon and return them safely to earth.

Saturday, May 23, 2015

Hindsight Is 20/20

Hindsight is a wonderful thing. It really is. I believe it is an extremely good quality for a person to possess, to be able to look back at a decision that turned out to be the wrong one and learn from it.

The decision to invade Iraq in 2003 was the wrong decision. I believed it was the wrong decision at the time, but that was not a popular position to take. It took a certain amount of courage, back in those post–September 11 days, to tell one's friends and co–workers, many of whom supported the decision to invade Iraq, that it was a bad one. I did not always have the strength of will to argue about it, especially given how confident supporters of the invasion were that weapons of mass destruction would be found.

After a certain amount of time had passed and it became clear that the pretext for the invasion — the alleged existence of those weapons of mass destruction — was based on faulty information, public opinion began to sour on the war. But I think it is important to remember that a lot of people supported the invasion initially — including Hillary Clinton, the presumptive Democratic nominee for president in 2016 — no matter how much they may pretend otherwise today.

Mrs. Clinton wasn't the only Democrat who voted to authorize George W. Bush to use force against Iraq. When the Senate voted on Oct. 11, 2002, 29 of 50 Democrats joined 48 Republicans in a 77–23 vote giving Bush the authority he sought. Her colleague from New York, Chuck Schumer, voted to authorize the use of force. So did Joe Biden and Dianne Feinstein and Harry Reid.

In my lifetime, I have had the opportunity to vote for national tickets with a Bush on them half a dozen times. I have never voted for one and, if Jeb is nominated next year, it will make seven times I have refused to lend my support to a Bush in a national campaign.

But I find myself sympathizing — to an extent — with Jeb's recent stumble on the question of invading Iraq.

Fox News' Megyn Kelly asked him, "Knowing what we know now, would you have authorized the invasion?"

Bush tried to answer a different question. "I would've, and so would've Hillary Clinton, just to remind everybody, and so would have almost everybody that was confronted with the intelligence they got."

He kind of got back to what Kelly was getting at when he elaborated: "In retrospect, the intelligence that everybody saw, that the world saw, not just the United States, was faulty. And in retrospect, once we … invaded and took out Saddam Hussein, we didn't focus on security first. And the Iraqis, in this incredibly insecure environment, turned on the United States military because there was no security for themselves and their families."

Kelly was dealing in hypotheticals, and what Bush should have said — but, obviously, did not — was that he won't answer hypothetical questions. I'm an amateur historian, and what–if is the kind of game historians love to play. But it is a game that really cannot be won because the past is what it is. It's no trick to look back on a bad decision and know it was a mistake, but human beings are not blessed with the ability to see the future. If they were, I guess many would not marry the people they married or invest in companies that go belly up.

Or bet on the wrong horse at the racetrack.

There seems to be an impression among many Americans these days that a president must be infallible, that he must be capable of all things — including superhuman stuff like seeing the future. But anyone who looks for an infallible leader, someone around whom everyone can rally, is just asking to be disappointed. In the life of every presidency, there will be those who think the president does everything right and those who think the president does everything wrong — and everyone else who falls in between those two extremes. To misquote Abraham Lincoln, you can please some of the people all of the time and all of the people some of the time, but you can't please all the people all the time.

A president can only act within the reality of his times — and hope, at the end of the day, that he made the right decision. Seems to me that the best presidents have been the ones who second–guessed themselves and tried to learn from each decision they made — and the worst presidents were the ones who would not admit to having made a mistake.

If one is going to answer Kelly's question, though, it would have to be something like this: "In hindsight, it was a mistake to invade Iraq." That's it. Bush's inclination to defend his brother is admirable, but it does not have to be part of his answer to that question.

It can be the answer to another question if it is asked. He is right when he observes that a president must act on the information he has. But that is not the question that was asked. So don't answer it.

Better still, though, not to answer hypothetical questions at all. Politicians can't win hypotheticals, and politicians always want to play games they can win. Hypotheticals require proving a negative, and that cannot be done.

One time, I saw illusionist Penn Jillette talking about Nostradamus' prophecies that supposedly predicted Napoleon and Hitler and many other events that occurred long after his death. Jillette complained that the prophecies, which were apparently written in a deliberately obscure way, never named names, places or dates. What good is that, he wanted to know, if we want to prevent or avoid a certain event?

It's a fair point.

Let me ask you something. If time travel were possible, and you could go back in time, would you kill an infant Adolf Hitler sleeping in his crib? It is safe to say, I believe, that Nazism would not have seized control of Germany without a charismatic leader at the helm. Snuffing out an infant who, knowing what we know now, grew up to plunge the world into a war that claimed millions of lives could be seen as heroic.

But could you take the life of a baby? You might say now that you could, but, when the chips were down, you might find it incredibly difficult to kill a small child, even knowing that, by doing so, you could save millions of others.

In the two decades between his resignation and his death, Richard Nixon might have said that, in hindsight, having the taping system installed in the Oval Office was a mistake — but that would have been with the benefit of knowing how it eventually played out, producing the evidence that brought his presidency to an end. But when the system was installed, his motivation (ostensibly) was the preservation of the historical record.

As Dr. Phil would say, how did that work out for ya?

Monday, May 4, 2015

Of Flowers And Water And Bullets

There certainly were a lot of lingering images from Baltimore last week.

There were, of course, the images of the plundering of small businesses, the burning of public property, the clashes between protestors and police. Those images overwhelmed everything else.

There were also the images of a city government that was caught flat–footed in the aftermath of Freddie Gray's death while in police custody. One had to wonder if this was an isolated incident, or if this sort of thing, albeit on a much smaller scale, goes on all the time. Could the government of Baltimore really be that inept, that incompetent?

And there was the image of the Baltimore mom slapping her son around. I think there must have been a lot of people who applauded that assertion of parental authority. There seems to be far too little of it these days.

I felt some of the most powerful images from Baltimore came in the less public moments, the ones that photojournalists always seem to find. Sometimes, unfortunately, those moments have been manufactured, but the spontaneous ones have the power to remain in your memory.

Like the image at the top of this post: a black child distributing bottles of water to city police in riot gear.

It reminded me of a mental image I've carried with me for many years — I say mental because it is entirely the product of my imagination based on accounts I have read and heard. As far as I know, there is no photograph of it. But it is said to have happened 45 years ago — yesterday, I believe, maybe the day before — in Ohio.

To put it in historical perspective, President Nixon had just told the world about the Americans' previously secret invasion of Cambodia. Angry protests had erupted on college campuses all across the country. In Ohio, the National Guard had been called out to bring order to the campus of Kent State University.

Lots of people think that the Guard only appeared in Kent on the day of the shootings — Monday, May 4, 1970 — but the Guard was there that weekend. Sometime that weekend, Allison Krause, who had just turned 19, approached one of the Guardsmen with a flower in her hand. She placed it in the barrel of his weapon and said, "Flowers are better than bullets."

On Monday, May 4, Krause was one of four Kent State students who died after being shot by Guardsmen. Her comment about flowers and bullets is chiseled into the stone that marks her grave.

It seems to me that those two moments, separated by nearly half a century, summarize the differences in the thinking of the two sides in our ongoing political debates.

Liberals are like the image in my mind of Allison Krause. They see an ideal world that doesn't exist — but, in their minds, it should, and it frustrates them that it does not.

Conservatives are like the young black boy distributing water bottles to combatants. They see the world — and deal with it — as it is. They wish the world was better, but it is not, and it frustrates them that it is not.

I wonder if the two sides will ever find common ground.