
Freedom Writing

Thursday, July 16, 2015

Rock Hudson's Revelation



It was 30 years ago today that Rock Hudson and his old friend and co–star, Doris Day, held a press conference to announce her new cable TV show, Doris Day's Best Friends. Hudson was going to be a guest on the show. It was a milestone moment.

All the talk after the press conference wasn't about Day's TV show, however. It was about Hudson: how emaciated he looked, how incomprehensible his speech was. He was practically unrecognizable. There had been rumors about Hudson's health for a long time, and his appearance with Day revived them.

A couple of days later, Hudson traveled to Paris for another round of treatment and collapsed in his hotel room, after which his publicist confirmed that Hudson was ill but told everyone it was inoperable liver cancer. The publicist denied that Hudson suffered from AIDS — but then, only a few days later, he backpedaled and confirmed that Hudson did have AIDS and had been diagnosed more than a year earlier. Hudson hypothesized that he had been exposed to the virus through a blood transfusion when he had heart bypass surgery — long before anyone knew that blood carried the AIDS virus.

The confirmation that Hudson had AIDS triggered a lot of speculation about whether he was homosexual. I don't recall if Hudson ever acknowledged that he was gay; I'm inclined to think he didn't, but about a month and a half before his death, People magazine ran a cover story that discussed his AIDS diagnosis in the context of his sexuality.

The 1980s were a trip. Ask anyone you know who is old enough to remember, and they'll tell you the same thing — if not in those words, then in words to that effect.

It was a decade that often provided examples of how kind and generous people can be — and, just as often, provided examples of how petty people can be, too. I guess most decades are like that, but the 1980s seemed to have more of both than most.

In such an atmosphere, it was initially regarded as socially acceptable to be dying of liver cancer — but not of AIDS. Then, when it was impossible to continue denying that he was afflicted with AIDS, it became important for the public to believe that Hudson got sick through no fault of his own. That was the phrase that separated the good AIDS sufferers from the bad ones. It was the phrase that cast the blame. Did the sufferer get sick through his own recklessness? Or did he get sick through someone else's negligence? (And, if Hudson had been exposed to the virus via transfusion, it couldn't even be called negligence — because it would be years before anyone knew that AIDS could be transmitted through blood.)

I was in college when the '80s began. At that time, most people were just beginning to hear about a strange new disease that was, apparently, 100% fatal, but before it killed you, it stripped you of your immunities, making you vulnerable to all sorts of things that healthy people shrug off. The vast majority of Americans tended to feel secure because the disease only appeared to be striking certain groups — hemophiliacs, heroin users, Haitians and homosexuals. In fact, it might have been called the "4–H" disease. (Actually, I think it was called that for a while.)

They didn't know what to call it, frankly. Because it seemed to be striking the homosexual demographic disproportionately, it was initially called GRID for Gay–Related Immune Deficiency. Understandably, the gay community objected, feeling that the name unfairly singled out homosexuals when the record clearly showed that non–homosexuals were getting the disease, too.

And even though a non–judgmental name — Acquired Immune Deficiency Syndrome (AIDS) — was being used officially by the fall of 1982, the perception persisted that homosexuals had put the health of the rest of the population at risk.

People do strange things when they are frightened. I knew that from my studies of history, and AIDS gave me proof that irrational fear wasn't something that was unique to past generations. Human beings continue to have the potential for irrational fear; I guess they always will.

At first, AIDS was thought to be something of a medical anomaly, like Legionnaires' disease. It didn't take long for people to realize it was not a medical anomaly, but nevertheless the impression that homosexuals, through their reckless behavior, had put everyone at risk persisted. For a time, many people refused to use public restrooms or water fountains, afraid that AIDS sufferers might have been there before them.

It is necessary, you see, to recall the conditions that existed in the 1980s to understand what a big deal it was when Rock Hudson's affliction with AIDS became known in the summer of 1985. As imperfect as his acknowledgement was, it was a milestone in the AIDS story. Until that time, it was hard to get funding for research into the disease; consequently, it took years for the medical community even to discover that it was passed from one person to the next through bodily fluids.

Doctors learned the highest concentrations of the virus could be found in blood and semen; it was present at much lower levels in tears and saliva. Thus, the odds against someone getting sick from exposure to tears or saliva were considerable. Even so, in light of the fact that Hudson's diagnosis was more than a year old, people in the media speculated about the passionate kiss he had shared with actress Linda Evans on Dynasty. Hudson knew he was sick when the scene was filmed, but he did not tell Evans, prompting a certain amount of panic. Some actresses insisted on having kisses written out of their scripts, and the Screen Actors Guild adopted new rules regarding "open–mouth kissing." Actors had to be notified in advance — and were immune from penalty if they decided not to participate.

After the revelation that Hudson, one of Hollywood's most popular leading men, was sick with AIDS, roughly $2 million was raised for AIDS research, and Congress set aside more than $200 million to seek a cure.

Hudson's condition created issues for President Ronald Reagan, who was seen by a significant portion of the population as being indifferent to AIDS. But Reagan and his wife Nancy were Hudson's friends. On the strength of that friendship, a lot of people expected Reagan to break his long public silence on the subject.

But Reagan made no statement about Hudson, even when he had the opportunity at a press conference a couple of weeks before Hudson died.

He did, however, issue a brief statement on the occasion of Hudson's death on Oct. 2, 1985: "Nancy and I are saddened by the news of Rock Hudson's death. He will always be remembered for his dynamic impact on the film industry, and fans all over the world will certainly mourn his loss. He will be remembered for his humanity, his sympathetic spirit and well–deserved reputation for kindness. May God rest his soul."

Hudson's affliction and death were a milestone, however belated, in the fight against AIDS. People began talking about it. We were — and still are — a long way from a cure, but, as the old saying goes, the journey of a thousand miles begins with a single step.

Friday, June 26, 2015

Free Stuff?



I wasn't working full time last year — at least through the first half of the year — so I didn't enroll in the state–mandated health insurance. I couldn't afford it. (Well, I guess I could have — if I had stopped doing things like, you know, paying rent or eating.)

I am working full time now — and I didn't like being treated like a criminal because I didn't sign up for health insurance — so I signed up before the deadline this year, and now I am in compliance with the law. (Well, that is what I have been told ...)

I had my annual checkup earlier this month. It was the first time I had ever met my doctor. He was assigned to me by the state because the doctor I have been seeing for years isn't on the state–approved list. That meant I had to go through my medical history with a stranger rather than see a doctor who is already familiar with my medical history. I wasn't too thrilled about that.

Nor am I pleased with the fact that this insurance doesn't cover my monthly prescriptions. In fact, it doesn't kick in on anything at all until I pony up six grand.

I pay nearly $375 a month for this policy. I'll be damned if I can see any benefit to it.

Oh, excuse me. There is one benefit. I am entitled to one no–charge visit with my state–assigned doctor per year. I gather it's a no–frills thing. When I met my new doctor, one of the first questions he asked me was how extensive I wanted the appointment to be. I replied that it was my understanding that my policy entitled me to one visit per year.

His response? "Oh. You want the free stuff."

Now, I'm a journalist. I studied journalism in college. I have worked as a reporter, an editor, a journalism instructor. The study of language is a given in my line of work, and I know — probably better than most — how easily language can be manipulated and misused to achieve whatever the user wishes to achieve. Successful politicians know it, too. For that matter, I suppose, most people today have a smattering of familiarity with how it works.

Anyway, as I just said, I'm shelling out nearly $375 a month for this policy, and the only thing I really get in return — unless I get hit by a bus or something like that (and then it will cost me $6,000 up front) — is one visit with my health care provider per year. What the hell is affordable about that?

It certainly is not free. It costs me nearly $4,500 a year — and it isn't nearly as thorough as the annual checkups for which I paid $300 before the state compelled me to carry this policy.

Oh, sure, I understand why the doctor calls it free stuff. As far as he is concerned, I suppose, it is free.

But not really. The doctor is paid for that annual visit by the health insurer, not the patient (and I use that term loosely). It's a very cursory, bare–bones examination. Whatever the insurer pays for it, the insurer is being overcharged.

Actually, we're all being overcharged so a small group of people can have their policies at discounted rates. That's what the Supreme Court upheld this week — the state's practice of using money from the working class to subsidize health insurance policies for others.

The policy doesn't cover prescriptions, but it does cover contraceptives. I mentioned to a friend that I was having to pay for someone else's contraceptives. This friend, whom I have known since before my high school days, is as devout a supporter of Barack Obama and Obamacare as you will find, and he tried to tell me that subsidizing contraceptives was a social obligation — the same way that we all (symbolically, at least) pitch in for the upkeep of roads and schools.

I really can't follow that logic — although God knows I've tried. Actually, I suppose I can follow it — up to a point. I agree that everyone is entitled to drive on good, well–maintained roads and send their children to good schools.

But contraceptives are different. Subsidizing contraceptives suggests that sex — like good roads and good schools — is a right. I disagree. If sex were a right, people would be entitled to grab anyone off the street and have sex with that person. Never mind if the other person didn't give his/her consent.

The law doesn't permit people to have sex with whomever they please, consent be damned. In fact, the law has a specific word for the act of sex with others without their consent. It's called rape — or sexual assault in the namby–pamby jurisdictions that won't call things what they are.

Sex is not a right. Sex is a privilege.

Even if you're one–half of a married couple. I have known many men who believed they were entitled to sex with their wives whenever they wanted it (and some even thought they were entitled to sex with their children). It was a wife's duty, they said — and then the courts began to rule that there was such a thing as spousal rape.

Clearly, unless you're talking about masturbation, sex is not a right.

(Now that the courts are handing down rulings that re–define marriage, I expect that sometime in the not–so–distant future there will be similar rulings establishing boundaries for sexual behavior in same–sex marriages. Seems like the next logical step to me. But I digress.

(I don't really care about that, though. I don't really have an opinion on same–sex marriage. I do have an opinion about the health care law.)

But it's that "free stuff" part that really bothers me. People believe it. Clearly, at least one doctor does.

I am an adjunct journalism professor at one of the community colleges here in Dallas, and I was there during the 2012 presidential campaign. I couldn't begin to tell you how many students told me they were voting for Obama "because he's going to give me free health insurance."

From the start, it reminded me of something I have heard all my life: There is no such thing as a free lunch. As a youngster, I thought that was absurd. Of course there were free lunches.

But as I have gotten older I have realized that the statement was true. Even if something appears to be free, you'll wind up paying for it in the end.

Sunday, June 21, 2015

Learning From History



The mass shooting at an historically black church in Charleston, S.C., last week is disturbing on so many levels. It is overwhelming from a distance. I can only imagine what it must be like closer to ground zero.

There is, above and beyond all else, the disturbing story of the event itself — a young man sat through an hour or so of Bible study with a group of (presumably) strangers, then (apparently) calmly opened fire on them. Nine people were killed.

In case you're wondering why I used the parentheses on a couple of words in the preceding paragraph, it is because there is still so much we do not know — as there usually is at this stage of an investigation into a criminal act. But TV has conditioned many people to believe that all loose ends can be tied up in an hour's time, allowing for commercials — so they leap to conclusions without knowing all the facts that can put an event into context.

Usually, those conclusions are self–serving rubbish.

The intention of insisting on those facts is not, as some people would have you believe, to justify what happened. The intention is to satisfy the legal requirements to define a criminal act accurately. Those definitions have evolved through millennia of human experience. When one person kills another, that is a tragedy, but the law must know certain things before a case can be dealt with appropriately.

There are a lot of emotions swirling around this case, and I sympathize with that, but the law should not be administered on emotion. It should be administered on facts. In a criminal case, especially one that involves the death of one or more people at the hands of another, facts establish the legal nature of the crime. Emotion, as we should have learned from recent events, is often mistaken and can cause another tragedy.

Do you recall the 1996 Atlanta Olympics? A bomb went off in Centennial Olympic Park on the night of July 27, 1996, killing one person and injuring 111 more. A security guard discovered the bomb before it went off and managed to clear a lot of people from the area, probably preventing more injuries and deaths, but he still became the prime suspect in the case. The media was relentlessly aggressive in its pursuit of him, treating him as if it were a foregone conclusion that he was guilty — until he was cleared of all accusations a few months later. He was never formally charged with anything, but the damage to his reputation had been done. He lost his job on the basis of unreliable information provided to his employer.

Our experience has told us that a person is justified in killing another if the other person poses a threat to the first person's life. In other words, we have concluded that self–defense is a valid, albeit regrettable, reason to kill. So the law must answer the question, did this young man kill in self–defense? There has never been any indication whatsoever that he killed in self–defense, so any legal provisions on that point can be set aside.

And, because of our experience, we have decided that if someone causes another person's death through negligence, that, too, should be treated differently by the justice system than murder would be treated. The case in Charleston clearly did not result from negligence.

Our experience has told us that it is not right to hold someone responsible for his or her acts if he or she is insane. That is much more of a gray area, and it requires weeks or months of evaluation before that can be determined. It may also involve questioning a suspect's doctor to find out if that suspect had any known mental issues and/or had been prescribed any medication to treat such a condition. At this point, it appears that the answer to both questions would be "yes" in this case.

The law also needs to know whether the killing was premeditated. If someone planned to kill another, that is — and should be — handled differently than a killing that occurs out of the blue (e.g., a couple of people get into an argument that turns into a fight with guns or knives or even fists, and one of them is killed in the fight — tragic but usually not premeditated).

A premeditated killing can be punished by death; convictions for unpremeditated killings usually result in prison time.

If it was premeditated, the law needs to know if it was the result of a conspiracy. Did someone else participate, either at the scene or behind the scenes? That requires time, too — sometimes a lot of it. Certainly more than an hour. (Heck, I've heard suggestions all my life that the John F. Kennedy assassination was a conspiracy, but, after more than 50 years, that remains a matter of opinion only.)

In this case, the evidence suggests it was premeditated and, to this point, does not suggest that there was a conspiracy, but the investigators need to be allowed time to talk to all relevant witnesses and review available evidence. I have heard of no second shooters, nor have I heard any suggestion that someone paid this young man to shoot these people. It appears he acted alone, but the law must be satisfied, and laws vary from state to state.

And that isn't all there is to it.

Until recently, I would have thought that most killings of police officers are not premeditated. In most cases, the lack of premeditation would qualify a killing for a lesser charge, but the killing of a police officer is treated as a capital crime (and is, therefore, eligible for the death penalty). It is our way of discouraging people from killing law enforcement officers (which, I suppose, includes judges and other court officers) under any circumstances.

As I say, the law evolves over time, and there are provisions in the law (the felony murder rule, for example) that make it legally possible to prosecute an unpremeditated killing as if it were premeditated: if someone is killed while another crime is taking place (e.g., a convenience store robbery), the person responsible for the killing is eligible for the death penalty even though the death of another person was not intended.

If you have ever served on a jury, you know that legal verdicts are seldom, if ever, as simple as "guilty" or "not guilty." The verdict forms consist of many pages of questions, most of them designed to settle specific points like the ones I have just outlined.

Why must so many questions be answered? Well, it has a lot to do with the experiences that the Founding Fathers had. They came from environments where it was customary for the state to take a person into custody and hold that person indefinitely without informing him or her of the reason. The Founding Fathers believed all people were entitled to due process — and that suspects deserved protection against being tried repeatedly for the same offense until a jury finally convicted them. (They called that "double jeopardy.")

Many times, convictions are appealed, and appellate courts have been known to overturn convictions for entirely unanticipated reasons. That is why it is so critically important for prosecutors to have their ducks in a row when they go to court. The burden of proof is on them, not on the defense. If a verdict is overturned, it is usually because of a mistake someone on the prosecution's team made.

I'm not a lawyer, but I have served on juries before, and I covered trials in my reporting days. Not having studied law, I don't know the history of law and justice, and I readily admit that I could be wrong on this, but my impression is that law and logic (which is a class everyone in Arts and Sciences was required to take when I was in college) must have evolved simultaneously. Jury verdict forms remind me so much of the logic questions I had to plot — If all A are B, and all B are C, then all A are C.

(For people who never had to take Logic, I suppose the most appropriate comparison would be your standard flow chart — "If the answer is yes, proceed. If the answer is no, stop.")
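If it helps to make the flow–chart idea concrete, here is a minimal sketch in code (Python). The questions and the labels are my own hypothetical simplifications for illustration; they are not the language of any actual verdict form.

    # A purely illustrative sketch of a verdict form as a flow chart.
    # Each question is answered yes or no, and the answers walk the case
    # down to a legal category. These questions echo the ones discussed
    # above; real verdict forms are far more detailed.

    def classify_killing(self_defense, insane, premeditated, negligent):
        if self_defense:       # a justified, albeit regrettable, killing
            return "not guilty"
        if insane:             # not held responsible for his or her acts
            return "not guilty by reason of insanity"
        if premeditated:       # planned in advance; eligible for the death penalty
            return "first-degree murder"
        if negligent:          # a death caused by negligence, not intent
            return "negligent homicide"
        return "manslaughter"  # a killing that occurred "out of the blue"

    # If the answer is yes, proceed down that branch; if no, move on.
    print(classify_killing(self_defense=False, insane=False,
                           premeditated=True, negligent=False))
    # prints: first-degree murder

Real verdict forms run many pages longer, of course, but the underlying logic is the same: answer each question, then follow the branch.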

It seems to me that is what much of the application of the law comes down to — logic, the law of averages, probability. Comparing cases without context can become a matter of comparing apples to oranges when it is really more instructive to compare apples to apples. The fact is, though, that each person and each case is different from all the others. People are individual and imperfect; they don't share the same brain or possess the same experiences.

So, it seems to me, as long as the application of law is made by imperfect people, there will be imperfections in the system. But that doesn't mean we stop trying to make the system better. We try to learn from the situations we face, especially the ones that challenge what we have always believed, just as we learn from history in all walks of life. Discoveries are important in the evolving story of humankind. One discovery begets another, and together they form the building blocks for yet another discovery. We could not have sent rockets to the moon if we had not, at some point along the way, discovered how to make and control fire.

History is also about learning from mistakes, errors in judgment, what we have tried to do about them in the past that was successful and what didn't work (so we don't keep spinning our wheels). Sometimes history is shameful, but no useful purpose is served by revising it. The only useful purpose is to remember, like the child who learns from experience not to put his hand on a hot stove. That child probably won't try to go through life without a stove, but he will learn when not to touch one. Do you recall what George Santayana wrote? "Those who cannot remember the past are condemned to repeat it." Wise words.

My father was a religion professor at a small liberal arts college in Arkansas. One summer, when I was 13, my family spent a couple of months in Austria. We took side trips to places, and some of the places my father wanted to visit were important scenes from Germany's Nazi past — concentration camps, Hitler's retreat (Eagle's Nest) in the German Alps near the centuries–old, tiny town of Berchtesgaden. He wanted to take pictures of those places, pictures he could use in his college classes. He had heard about those places in the news when he was a teenager; this was his opportunity to see them.

But he was disappointed. The German government, in its misguided attempt to erase Nazism from the national consciousness, had allowed the concentration camps to fall into disrepair, and Eagle's Nest became a restaurant and beer garden (the irony being that, although Hitler occasionally drank in private, he never drank in public and actually gave up alcohol altogether in his last years). In fact, the first law the Allied Control Council enacted after World War II abolished all Nazi symbols. Possession of Nazi flags has been forbidden in many countries since; the importation or display of them is particularly frowned upon in Germany.

What has been achieved by that in recent years? Well, neo–Nazism has been on the rise, and skepticism about the Holocaust abounds. Those who cannot remember the past ...

I am not a professional historian, but I have been studying history all my life, and one of the things I have learned from it is this: No possible good comes from denying the past. And that is why I reject the popular move to ban the Confederate flag. The Civil War was certainly a dark chapter in America's history, but there were important lessons to be learned from it. Banning the Confederate flag would prevent those lessons from being learned and could, perhaps, as in modern Germany, at some point give rise to a new Confederate movement among the American young.

Would I have a Confederate flag in my home? Absolutely not — nor would I have a Nazi flag in my home. But to pretend they didn't exist, or that they didn't witness some horrific things, is to deny history. And I have learned that history always has the last word.

Recently, the Supreme Court upheld the decision of Texas, the state where I live, not to allow people to have Confederate flags on their license plates, on the grounds that license plates are government speech. I support that ruling.

But possession of the flag is not banned. It is still legal to have one; choosing to display it is a matter of free speech on an individual basis. Some people find that offensive, just as some people find the burning of the American flag to be offensive, but free speech is protected under the First Amendment. Former Texas Gov. Rick Perry spoke in an interview yesterday about how the Confederate flag divides people. He is right about that, but we owe it to ourselves and future generations not to ban the symbols of hate but to learn from them.

History is not served when it is whitewashed.

Wednesday, June 3, 2015

Taking a Stroll in Space



"I'm coming back in ... and it's the saddest moment of my life."

Edward H. White
June 3, 1965

Fifty years ago today, an American walked in space for the first time.

He was not the first man to walk in space, though; that distinction belongs to a Russian. It was during the heated days of the U.S.–U.S.S.R. space race, and every first in the race to the moon was treated like something truly special, even if it wasn't.

Well, maybe it was special at the time, but not so much later on.

On this day in 1965, Edward White became the first American to walk in space. He wasn't one of the original "Mercury 7" astronauts. He was part of the second group chosen — along with Neil Armstrong, who would become the first man to walk on the moon, and Jim Lovell, who flew to the moon twice but never landed there.

White was the pilot of Gemini 4, the second manned space flight in NASA's Project Gemini. James McDivitt was the command pilot. White spent about 20 minutes outside the spacecraft, then reluctantly returned.

It was — without question — the highlight of the mission. Most people don't know that another first was planned on that mission, but it didn't work out nearly as well. McDivitt was slated to attempt a space rendezvous — an orbital maneuver that became almost routine in later missions but failed on this occasion. McDivitt made up for it a few years later as commander of Apollo 9, which was the first manned flight test of the lunar module.

(And he was Apollo spacecraft program manager from 1969 to 1972, the period in which all of NASA's moon landings — so far — took place.)

The lunar module was the vehicle that carried astronauts to the surface of the moon. It was necessary for the command module to perform a space rendezvous with the lunar module before that part of the mission could commence.

So it is safe to say that McDivitt secured a better spot for himself in NASA's history later in his career than he did 50 years ago.

White, too, is remembered for something other than his space walk on Gemini 4 — something that was probably more important to the success of the program in the long run but hardly as personally triumphant. On Jan. 27, 1967, during a launch rehearsal test, White and two other astronauts, Gus Grissom and Roger Chaffee, perished when a fire broke out in the pure oxygen environment of the cabin.

The astronauts' deaths revealed spacecraft flaws that NASA resolved before resuming the Apollo program, which went on to put 12 men on the moon and return them safely to earth.

Saturday, May 23, 2015

Hindsight Is 20/20



Hindsight is a wonderful thing. It really is. The ability to look back at a decision that turned out to be the wrong one and learn from it is an extremely good quality for a person to possess.

The decision to invade Iraq in 2003 was the wrong decision. I believed it was the wrong decision at the time, but that was not a popular position to take. It took a certain amount of courage, back in those post–September 11 days, to tell one's friends and co–workers, many of whom supported the decision to invade Iraq, that it was a bad decision. I did not always have the strength of will to argue with people about it, especially given how confident supporters of the invasion were that weapons of mass destruction would be found.

After a certain amount of time had passed and it became clear that the pretext for the invasion — the alleged existence of those weapons of mass destruction — was based on faulty information, public opinion began to sour on the war. But I think it is important to remember that a lot of people supported the invasion initially — including Hillary Clinton, the presumptive Democratic nominee for president in 2016 — no matter how much they may pretend otherwise today.

Mrs. Clinton wasn't the only Democrat who voted to authorize George W. Bush to use force against Iraq. When the Senate voted on Oct. 11, 2002, 29 of 50 Democrats joined 48 Republicans in a 77–23 vote giving Bush the authority he sought. Her colleague from New York, Chuck Schumer, voted to authorize the use of force. So did Joe Biden and Dianne Feinstein and Harry Reid.

In my lifetime, I have had the opportunity to vote for national tickets with a Bush on them half a dozen times. I have never voted for one and, if Jeb is nominated next year, it will make seven times I have refused to lend my support to a Bush in a national campaign.

But I find myself sympathizing — to an extent — with his recent stumble on the question of invading Iraq.

Fox News' Megyn Kelly asked him, "Knowing what we know now, would you have authorized the invasion?"

Bush tried to answer a different question. "I would've, and so would've Hillary Clinton, just to remind everybody, and so would have almost everybody that was confronted with the intelligence they got."

He kind of got back to what Kelly was getting at when he elaborated: "In retrospect, the intelligence that everybody saw, that the world saw, not just the United States, was faulty. And in retrospect, once we … invaded and took out Saddam Hussein, we didn't focus on security first. And the Iraqis, in this incredibly insecure environment, turned on the United States military because there was no security for themselves and their families."

Kelly was dealing in hypotheticals, and what Bush should have said — but, obviously, did not — was that he wouldn't answer hypothetical questions. I'm an amateur historian, and what–if is the kind of game historians love to play. But it is a game that really cannot be won because the past is what it is. It's no trick to look back on a bad decision and know it was a mistake, but human beings are not blessed with the ability to see the future. If they were, I guess many would not marry the people they married or invest in companies that went belly up.

Or bet on the wrong horse at the racetrack.

There seems to be an impression among many Americans these days that a president must be infallible, that he must be capable of all things — including superhuman stuff like seeing the future. But anyone who looks for an infallible leader, someone around whom everyone can rally, is just asking to be disappointed. In the life of every presidency, there will be those who think the president does everything right and those who think the president does everything wrong — and everyone else who falls in between those two extremes. To misquote Abraham Lincoln, you can please some of the people all of the time and all of the people some of the time, but you can't please all the people all the time.

A president can only act within the reality of his times — and hope, at the end of the day, that he made the right decision. Seems to me that the best presidents have been the ones who second–guessed themselves and tried to learn from each decision they made — and the worst presidents were the ones who would not admit to having made a mistake.

If one is going to answer Kelly's question, though, it would have to be something like this: "In hindsight, it was a mistake to invade Iraq." That's it. Bush's inclination to defend his brother is admirable, but it does not have to be part of his answer to that question.

It can be the answer to another question if it is asked. He is right when he observes that a president must act on the information he has. But that is not the question that was asked. So don't answer it.

Better still, though, not to answer hypothetical questions at all. Politicians can't win hypotheticals, and politicians always want to play games they can win. Hypotheticals require proving a negative, and that cannot be done.

One time, I saw illusionist Penn Jillette talking about Nostradamus' prophecies that supposedly predicted Napoleon and Hitler and many other events that occurred long after his death. Jillette complained that the prophecies, which were apparently written in a deliberately obscure way, never named names, places or dates. What good is that, he wanted to know, if we want to prevent or avoid a certain event?

It's a fair point.

Let me ask you something. If time travel were possible, and you could go back in time, would you kill an infant Adolf Hitler sleeping in his crib? It is safe to say, I believe, that Nazism would not have seized control of Germany without a charismatic leader at the helm. Snuffing out an infant who, knowing what we know now, grew up to plunge the world into a war that claimed millions of lives could be seen as heroic.

But could you take the life of a baby? You might say now that you could, but, when the chips were down, you might find it incredibly difficult to kill a small child, even knowing that, by doing so, you could save millions of others.

In the two decades between his resignation and his death, Richard Nixon might have said that, in hindsight, having the taping system installed in the Oval Office was a mistake — but that would have been with the benefit of knowing how it eventually played out, producing the evidence that brought his presidency to an end. But when the system was installed, his motivation (ostensibly) was the preservation of the historical record.

As Dr. Phil would say, how did that work out for ya?

Monday, May 4, 2015

Of Flowers And Water And Bullets



There certainly were a lot of lingering images from Baltimore last week.

There were, of course, the images of the plundering of small businesses, the burning of public property, the clashes between protestors and police. Those images overwhelmed everything else.

There were also the images of a city government that was caught flat–footed in the aftermath of Freddie Gray's death while in police custody. One had to wonder if this was an isolated incident, or if this sort of thing, albeit on a much smaller scale, went on all the time. Could the government of Baltimore really be that inept, that incompetent?

And there was the image of the Baltimore mom slapping her son around. I think there must have been a lot of people who applauded that assertion of parental authority. There seems to be far too little of it these days.

I felt some of the most powerful images from Baltimore were the less public moments, the ones that photojournalists always seem to find. Sometimes, unfortunately, those moments have been manufactured, but the spontaneous ones have the power to remain in your memory.

Like the one at the top of this post of the black child distributing bottles of water to city police in riot gear.

It reminded me of a mental image I've carried with me for many years — I say mental because it is entirely the product of my imagination based on accounts I have read and heard. As far as I know, there is no photograph of it. But it is said to have happened 45 years ago — yesterday, I believe, maybe the day before — in Ohio.

To put it in historical perspective, President Nixon had just told the world about the Americans' previously secret invasion of Cambodia. Angry protests had erupted on college campuses all across the country. In Ohio, the National Guard had been called out to bring order to the campus of Kent State University.

Lots of people think that the Guard only appeared in Kent on the day of the shootings — Monday, May 4, 1970 — but the Guard was there that weekend. Sometime that weekend, Allison Krause, who had just turned 19, approached one of the Guardsmen with a flower in her hand. She placed it in the barrel of his weapon and said, "Flowers are better than bullets."

On Monday, May 4, Krause was one of four Kent State students who died after being shot by Guardsmen. Her comment about flowers and bullets is chiseled into the stone that marks her grave.

It seems to me that those two moments, separated by nearly half a century, summarize the differences in the thinking of the two sides in our ongoing political debates.

Liberals are like the image in my mind of Allison Krause. They see an ideal world that doesn't exist — but, in their minds, it should, and it frustrates them that it does not.

Conservatives are like the young black boy distributing water bottles to combatants. They see the world — and deal with it — as it is. They wish the world was better, but it is not, and it frustrates them that it is not.

I wonder if the two sides will ever find common ground.

Thursday, April 30, 2015

The Day Hitler Died



"Outside in the passageway, Dr. (Joseph) Goebbels, (Martin) Bormann and a few others waited. In a few moments a revolver shot was heard. They waited for a second one, but there was only silence. After a decent interval they quietly entered the fuehrer's quarters. They found the body of Adolf Hitler sprawled on the sofa dripping blood. He had shot himself in the mouth. At his side lay Eva Braun. Two revolvers had tumbled to the floor, but the bride had not used hers. She had swallowed poison."

William L. Shirer
The Rise and Fall of the Third Reich

World War II and Adolf Hitler and the Nazis all came before my time so I only know what I have read or seen in documentaries.

It was real for my parents, though. They were not quite grown when the war began, not even when the war ended, but they were old enough to know who was fighting and what the stakes were.

And when the news that Hitler had committed suicide 70 years ago today reached them, they must have known that the war in Europe would be over soon.

I don't know if that means they felt the war in general was over — or if they realized that the war in the Pacific continued.

My guess is that, in 1945, most people who were old enough to remember Pearl Harbor knew there would still be a fight to finish in the Pacific. There was considerable angst about the prospect of an invasion of Japan — widely believed in April 1945 to be the only way to end the fighting but just as widely believed to be likely to claim hundreds of thousands of American lives in the process.

The Japanese were determined fighters, and no one thought they would go down easily. The invasion of Japan was expected to be won by whoever was the last man standing.

But that was a matter to consider some other time. Seventy years ago today, Hitler was dead, and the German surrender was only days away.

Hitler's death, TIME magazine recalls, was shrouded in mystery.

"It wasn't immediately clear what had happened on April 30, 1945," wrote TIME. "This much the world knew: Adolf Hitler was gone, one way or another."

And Hitler had been at the core of Nazi Germany. The tide had turned against the Nazis — that was why he committed suicide — and, when he was gone, all motivation to continue fighting was gone, too.

Questions remain, though, about Hitler's final hours, even after seven decades. Was his suicide the last act of an irrational man who had been waiting vainly for the arrival of Nazi troops who never came? Or was it the cool, deliberate act of a man who had considered all the possible endings to the scenario and concluded suicide was the best choice? The people who were with him in the bunker insist they heard a single gunshot — and that Eva Braun's revolver was not fired. Papers in the Russians' files indicated that Hitler poisoned himself. Were both accounts true? Did Hitler shoot himself after (or while) biting down on the poison capsule? Or did someone else pull the trigger?

We'll probably never know — and it really doesn't matter, does it?

Forty Years Since the Fall of Saigon



The picture at the top of this post is the image that comes to my mind when I think of the end of the war in Vietnam 40 years ago today.

As far back as I can remember, the war in Vietnam was a fact of life. To a young boy, it seemed that there had never been a time when U.S. forces were not in Vietnam. Anyway, it seemed that way to me. It was probably different for people who were even a year older than I; I was born at the right time to have no real memory of the pre–Vietnam era, but I know that older brothers and sisters of my contemporaries did know of that time, had memories of it.

I knew nothing of it, and I guess I've always assumed that the others who were my age had no memories of it, either, but I could be wrong about that. I can think of a few people I knew who were probably more aware of the outside world than the rest of us, but they were definitely the exceptions. Anyway, Vietnam influenced everything. It was on the news every night with updated casualty counts. Late in the '60s, if there was a demonstration somewhere or someone important was giving a speech, it was a pretty good bet that it was about the war. It was everywhere.

My father was a religion professor at a small college in my hometown. For a small college, it had some impressive things, though, like an Olympic–sized swimming pool. In the summer, one hour was set aside each weekday for faculty members and their families to have exclusive use of that pool, and my brother and I were regulars there. Anyway, on one of those occasions, I have a vivid memory of swimming in the pool and, for whatever reason, I started to muse about whether the war would still be going on when I got old enough to be drafted. I didn't think about it that much; after all, the prospect still seemed far away, and I was still just a boy, cooling off on a hot summer day in Arkansas. But that moment made enough of an impression on me that I can still remember it all these years later.

I don't remember how I imagined the war would end. I guess I pictured a Hollywoodesque finish with bombs and rockets bursting, and the Americans finding some way to win the thing in the end. I guess I imagined a John Wayne movie. It wasn't like that, of course. The fall of Saigon was far from glamorous. The North Vietnamese swept the city, capturing all the important places, and South Vietnamese civilians fled as refugees.

In fact, the fall of the city actually came after many of the civilians and the Americans there had fled. In that picture, you can see some of the South Vietnamese trying to climb aboard a single helicopter on April 29, 1975. It looks reasonably orderly in the picture, but my memory is of chaos. I guess it was controlled chaos. In 24 hours, American helicopters evacuated about 7,000 people — roughly a dozen at a time — and it was not orderly.

But there were times when I watched the news coverage of helicopters like the one in the picture struggling to get off the ground, so heavy were they with passengers.

Strange as it might have seemed to people at the time — which explains why I never mentioned it to anyone — I found myself sympathizing with Gerald Ford. I liked him when he first became president. He was such a likable guy, a breath of fresh air after the Nixon years, and then he pardoned Nixon and threw away all the good will the American people had given him. In hindsight, I have to grudgingly admit that he was probably right when he said that pardoning Nixon was the only way to close the chapter on Watergate and move on. At the time, I thought it was a flimsy excuse. So, too, apparently, did a lot of people.

The Nixon/Watergate matter wasn't the only challenge Ford faced. The loss of Saigon was another. Ford's approval rating, which had been in the low 70s right after he took office but tumbled after the pardon, had been hovering around 40% since before Christmas in 1974, which was when the North Vietnamese broke the 1973 accords and invaded a South Vietnamese province along the Cambodian border. In Gallup's last survey before the fall of Saigon, Ford's approval stood at 39%.

Ford had a reputation for not being too bright, but I have come to believe that was mostly a facade for him. He used that image to his advantage. It made his adversaries underestimate him, some more than others.

I don't think anything illustrated that quite as well as the Mayaguez incident a couple of weeks after the fall of Saigon. The Mayaguez, a merchant ship, was seized by the Cambodians on May 12. Three days later, a rescue mission was launched, making Ford appear decisive and assertive — qualities he would need in the campaign for the Republican nomination against former Gov. Ronald Reagan. If a boost in his public standing was what he was seeking, I'd be inclined to say he got it: in Gallup's next survey, Ford's approval was over 50%.

Ford and his people were products of the Cold War — he had three chiefs of staff while he was president (Alexander Haig, Donald Rumsfeld, Dick Cheney), and they almost certainly influenced his actions in Southeast Asia. They were worried about the other Southeast Asian countries, whether they would be more likely to fall prey to communism after the fall of Saigon, and they were determined to make a stand.

At the time, the expectation had been that the South Vietnamese could resist the North Vietnamese until 1976. Obviously, that prediction fell a bit short of the mark.

It is a tricky proposition to see into the future.

Thursday, April 23, 2015

A Glance at the Race for the White House



Each time we prepare to elect a president, there always seems to be someone seeking a party's nomination who sought it before but fell short. Most of the time, that candidate (or those candidates in especially active presidential election cycles) is said to be taking a different approach this time — presumably because the original approach failed the first time.

The message may be different, or the candidate may choose a different way to convey that message. The latter appears to be what Hillary Clinton is doing. "Clinton plans to forgo the packed rallies that marked her previous campaign," writes the Associated Press' Lisa Lerer, "and focus on smaller round-table events with selected groups of supporters."

Sometimes that is a good idea; other times, not so much. I am skeptical that it will help Clinton avoid questions about her email or her acceptance of cash contributions from foreign governments seeking access while she was secretary of State. In the context of previous presidential campaigns, such a change in approach isn't really surprising. It is frequently — but not always — difficult to know whether changing the message, or the way the message is presented, was the right approach the second time around until after the campaign is over.

By that time, of course, one need look no further than the election results to decide if the candidate (should he or she win the nomination) made the right choice. If it was not the right choice, there will be no shortage of scapegoats and other excuses in what boils down to a circular firing squad.

What is more certain these days is that it is difficult for a party to prevail in three consecutive national elections. Some people attribute that to fatigue with the incumbent party. Since the postwar era has coincided with the advent of television — which, in turn, has led to Americans having unprecedented access to a president's daily activities — that makes sense.

And I do think that plays a role in it, but I think it is more complex than that. Now, I'm going to lay a little groundwork here. I apologize in advance if it seems elementary.

There are two kinds of presidential election years — incumbent years and non–incumbent years. An incumbent year is when America has an incumbent president who is eligible to run for another term — and usually does. I think the last such incumbent who chose not to seek another term was Lyndon Johnson in 1968. Three other presidents in the 20th century made the decision not to seek another term when they legally could have — Theodore Roosevelt in 1908, Calvin Coolidge in 1928 and Harry Truman in 1952.

(Truman was president when the 22nd Amendment was ratified. He had served nearly two full terms by 1952, having succeeded Franklin D. Roosevelt in 1945, but the amendment made the specific point that it would not apply to whoever was president upon its ratification.)

Since we always have an incumbent, the matter of eligibility would seem to be the determining factor, but it isn't. LBJ's decision not to seek another term, which was largely the product of the public's increasingly sour mood about the war in Vietnam, instantly turned 1968 into a non–incumbent year. That's a year when the incumbent is not on the ballot in the general election, whether by choice or circumstance.

In recent times, non–incumbent years have tended to favor the nominee of the out–of–power party because those years have come when the incumbent usually is ineligible to seek another term.

It wasn't always that way. For whatever reason, it seems to have been largely a byproduct of World War II that parties almost never win three straight national elections. At least, that's when this pattern emerged. Before that, victories tended to come in bunches. Democrats won five straight elections between 1932 and 1948. The Republicans won the three elections prior to that — and 11 of 15 between 1860 and 1916.

Of course, it was after World War II ended that the 22nd Amendment limiting presidents to two full terms in office was ratified, and that was a game changer. Few presidents were tempted to seek a third term before the amendment was ratified, but it was always a possibility. Since the 22nd Amendment was ratified, it has been generally understood that, after winning his second term, a president gradually slips into irrelevance, essentially becoming a lame duck the day he takes the oath of office for the second time. Maybe that explains the pattern that has emerged in the last 67 years.

Since Harry Truman's "upset" victory in 1948, Americans have voted for the same party's nominees for president three straight times only once — in 1988 when Vice President George H.W. Bush was elected to succeed Ronald Reagan. Otherwise, it has been so predictable you could set your calendar by it.

Bush was helped by the fact that President Reagan was still popular after eight years in office — Gallup had Reagan at 51% approval just before the 1988 election — but the popularity of the incumbent does not necessarily help the nominee of the president's party.

Prior to the 2000 election, Bill Clinton's approval rating was between 59% and 62%. Clinton's vice president, Al Gore, narrowly won the popular vote but lost the electoral vote — in large part because he did not take advantage of Clinton's popularity and political skills during his campaign against George W. Bush.

Of course, if the incumbent's popularity is below 50%, his party's nominee to replace him is probably toast before the convention adjourns. George W. Bush's approval ratings were mostly in the 20s just before the 2008 election, which John McCain lost in a modest landslide.

And Lyndon Johnson's approval rating just before the 1968 election (42%) almost precisely mirrored Democratic nominee Hubert Humphrey's share of the popular vote — and 1968 turned out to be a cliffhanger but only because independent candidate George Wallace was on the ballot.

Sunday, April 19, 2015

Rising From the Ashes of Oklahoma City



"The Oklahoma City bombing was simple technology, horribly used. The problem is not technology. The problem is the person or persons using it."

Rev. Billy Graham

It's hard for me to believe it has been 20 years since the bombing of the federal building in Oklahoma City.

I wrote about this back on the 15th anniversary, and I observed much the same thing then as I do now. It's hard to believe, probably even harder now. Maybe that's because it seems as if I have lived another lifetime since it happened.

There were many things going on in my life at that time — and other things that happened in the weeks and months that followed — that make my memory of the bombing something of a blur.

I was teaching journalism at the University of Oklahoma, about 30 miles southeast of Oklahoma City, when the bombing occurred. In fact, I was scheduled to be in the classroom less than half an hour after the bombing happened. My office was just across the hall from the student newspaper newsroom, and I had been doing some work in my office for about an hour or so. There were never very many students in the newsroom in the mornings — it was a daily paper, and the staffers worked in there in the late afternoons and into the evenings — but there were a few students in there that morning, and they had the TV on. I could hear the news reports — still sketchy — as I walked down the hall just before the start of my class.

I knew something had happened, but, like most of the people watching the news reports on the local TV stations at that time, no one really knew what it was. In those days, people didn't automatically think of terrorism when something unpleasant happened. Well, maybe some people did — there was a report that day of a man of Middle Eastern descent who had the misfortune of boarding a plane in Oklahoma City that morning and flying to Chicago, where authorities stopped and detained him after he got off the plane. There was some modest hysteria about that, but it was nothing, I am sure, compared to what it might have been if the Oklahoma City bombing had occurred maybe a decade later than it did.

In those more innocent times (by comparison), terrorism was one of many potential culprits; in fact, the early speculation that day was that a gas line had exploded. As far as most Americans were concerned in 1995, terrorism was still something that happened in the other hemisphere. I could be wrong, but I don't think that man had any idea what had happened when the agents descended upon him in Chicago. Fast forward a few years. If the bombing had occurred in 2005 instead of 1995, terrorism probably would have been the first — and, perhaps, only — suspect for many.

My class lasted for an hour; then I returned to my office to do some work before going home for lunch. While I was at home, I watched the news reports. Considerably more was known by that time. The gas line explosion theory had been ruled out by noon. It was now believed to have been a deliberate act.

That afternoon, I had a writing lab. Before it started, some of my students approached me about letting them leave early so they could donate blood for the injured. That was the kind of thing I wanted to encourage so I said I would try to wrap things up earlier than usual to allow them to do that — and that is what I did.

By mid–afternoon that day, a suspect was in custody. His name was Timothy McVeigh. He was convicted in 1997 and executed in 2001. His accomplice, Terry Nichols, is serving several life sentences in a super maximum security prison in Colorado.

For them, the Oklahoma City bombing is a closed chapter, I suppose — but not so for those who must live with the consequences of their acts.

The most obvious victims, I imagine, are the ones who were injured that day, and many have been the subjects of follow–up articles in newspapers and magazines. The survivors have not all been eager to share their stories. Some chose to avoid the spotlight on what must be a very personal anniversary for them; others reluctantly went ahead with the interviews but insisted that they would not let what happened 20 years ago define them.

I have to admire that.

But, as I have often said in these last 20 years, I also admire the commendable work that was done by the student journalists with whom I worked at the University of Oklahoma at that time. Many of them grew up in Oklahoma City or one of the many nearby towns; they were touched by the bombing, too, but they persevered with their work as journalists.

The student newspaper had its staffers at the bombing site for what remained of that semester. At a time when nearly every other newspaper — professional or academic — was using articles, photos and graphics supplied by the wire services, the OU student newspaper relied on its reporters, photographers and graphics artists to produce all original material — material that was posted online at a time when many professional periodicals still did not have an online presence, let alone most college newspapers.

They put aside their personal feelings and covered the event with the professionalism it deserved. That accomplishment was even more impressive than you may realize. One of the staffers actually lost her father in the bombing.

But she, like the city, has risen from the ashes. She has gone on to pursue a career in broadcast journalism and has refused to let what happened to her family 20 years ago define her.

At the site of the bombing, a memorial now stands.

I haven't been there, but I have heard it is a serene place with a reflecting pool, the "Gates of Time" and a field of chairs symbolizing each life that was lost that day. The chairs representing the adults are a little larger than the ones representing the children who died. That is a nice, subtle touch.

Another interesting touch is the "survivor tree." It was part of the building's original landscaping and, somehow, it survived the bombing and the fires that followed. It still stands. I presume it will be mentioned during today's memorial service.