Sunday, May 31, 2009
Take Me Home, Country Roads
A blogger friend of mine recently posted some pictures of his home state of New Jersey.
I guess the prevailing belief about New Jersey is that it is an extension of New York City, but my friend knows — and his friends and relatives who still live in New Jersey know — that there is more to it, and he posted pictures of waterfalls and beaches and all sorts of rural imagery from the state.
I've decided to — sort of — take a cue from him, and write a little about my home state of Arkansas. Most people think of Arkansas as a rural state, which is mostly true, but there are some appealing attractions if you know where to look.
Now, it's been more than 20 years since I lived in Arkansas, but I still think of it as home. My hometown is actually the place "American Idol" winner Kris Allen calls home — Conway.
I've written some about Conway recently, and I'm sure I'll get back to it, but in this post, I want to write about a special spot in Arkansas that I visited on several occasions — Petit Jean Mountain in nearby Conway County.
As a child, I remember visiting Petit Jean Mountain with my parents. There were picnic tables there, and there was the Museum of Automobiles, which was founded in 1964 by Winthrop Rockefeller. It housed Rockefeller's personal collection of vintage automobiles until 1975, two years after his death. In 1975, the collection was sold, and the land where the museum was located was donated to the state. The next year, the museum reopened under the auspices of a not–for–profit organization, with cars on loan from across the country. Today, there is a permanent collection of automobiles in the museum along with an antique gun collection.
Petit Jean Mountain is also the home of Winrock Farms and Winrock Enterprises, Rockefeller's cattle breeding and business operations that he founded there in the 1950s. They have remained in operation, even though the founder has been gone for more than 35 years.
Rockefeller renewed attention to Petit Jean when he moved there, and one of the things that Arkansans rediscovered was the legend of Petit Jean.
I heard many variations on the tale when I was growing up, but the version that seems to be the accepted one says that "Petit Jean" was the name given to a woman who disguised herself as a boy and signed on as a cabin boy with De Soto so she could be close to her fiancé, who had joined the expedition.
She survived the ocean voyage, but she apparently became ill when they reached the mountain in modern–day Arkansas. It was only at this time that she revealed her true identity to her fiancé. She died and was buried under the name "Petit Jean" — which means "Little John."
When I was in college, I lived for a time with an older couple. The husband worked at Winrock, and occasionally he had to attend cocktail parties for company clients. One time, he invited me to attend with him so we drove up to Petit Jean Mountain and hobnobbed with the clients. We ate some fancy appetizers and drank some wine — I've never cared much for wine so I couldn't honestly say whether it was good, but Winrock always did things first–rate so I assume the wine was top quality.
My teenage memories of Petit Jean include people closer to my own age. I remember going for hikes there as a high school senior. I took my girlfriend, and we usually had to bring her younger brother and/or her younger sister along. I admit that I wasn't thinking about legends or things like that; I was forever trying to find a few seconds when I could sneak a kiss (or more, if I thought I could get away with it) when we were out of her siblings' view.
She and I broke up shortly after my graduation, and I remember going up to Petit Jean about a month after that happened. I was going camping with my best friend. We stocked up on steaks and mushrooms and potatoes — and my friend used a fake ID to get some beer — and we were on our way. Even in the summer, it was fairly mild on that mountain, but we did encounter some heat, which made a dip in the campground pool very refreshing.
At night, we grilled steaks, baked potatoes, drank beer and smoked cigarettes, and we listened to the radio under the stars. These were the disco days, and it seemed that Barry Manilow was constantly on the airwaves, with "Can't Smile Without You," "Copacabana" and "Ready To Take a Chance Again." I remember my friend making me laugh by calling him "Barely Man Enough" whenever one of those songs came on.
I think, if I could be in Arkansas tonight, I'd like to visit Petit Jean Mountain again. I don't think it would ever be as good as it was, but it would be nice to be back there one more time.
Abortion Provider Killed in Kansas
Dr. George Tiller, 67, whose Wichita, Kansas, clinic is one of the few in the nation that perform late–term abortions, was shot and killed while he was serving as an usher at his church in Wichita this morning.
A 51–year–old suspect was taken into custody this afternoon. Police have not released his name at this point.
Tiller's clinic, Women's Health Care Services, has been a magnet for abortion battles for close to 20 years. In 1993, the doctor was shot in both arms outside his clinic. The woman who was convicted of shooting him currently is serving her prison sentence.
I've read speculative articles today suggesting that pro–choice activists may retaliate for this, but I fail to see how that will enhance the debate. I know abortion is an emotional issue — on both sides — but retribution is not the answer, and it certainly won't bring Dr. Tiller back.
As I have said before, I believe what Bill and Hillary Clinton have advocated: Abortion should be safe, legal and rare.
I also believe that those who provide abortions should feel safe in their workplaces. And they should feel safe when they attend services at their churches.
Abortion has been legal in this country for more than 35 years. If it is to become illegal now, let that be accomplished in an acceptable, legal manner.
Labels:
abortion,
Dr. George Tiller,
Kansas,
murder
An Irony of Titanic Proportions
Millvina Dean died today at the age of 97.
She was nine weeks old when the RMS Titanic left on its ill–fated voyage in April 1912. She was the youngest passenger on board and, consequently, the youngest survivor. And, with her passing, all the passengers on board the ship are now deceased.
Her father died when the Titanic sank. If his body was recovered, it was never identified. Her brother lived to be 82 years old, dying in 1992 — on the 80th anniversary of the sinking.
And, in the final irony, Millvina Dean died on the 98th anniversary of the day the ship was launched in Belfast, Northern Ireland.
(I suppose one of the most ironic stories involving Titanic has to be the one about stewardess Violet Jessop. Many people don't know that Titanic was one of three Olympic–class passenger liners owned by the White Star Line. Jessop was on board all three — the Olympic, the Titanic and the Britannic — when they met with disaster, and she survived them all. Titanic, of course, struck an iceberg. A few years later, Britannic struck a mine during World War I and sank. Olympic collided with another ship but didn't sink.)
(Socialite Margaret Brown became known as the "Unsinkable Molly Brown" when she survived the Titanic, but I think Jessop deserved to be called "unsinkable." She died in 1971 at the age of 84.)
Dean had been in ill health in recent months and had been forced to sell many family possessions to pay for her medical care. Earlier this month, the stars and director of the blockbuster movie based on the tragedy joined forces to raise money to defray her expenses.
Only a few days ago, actors Kate Winslet and Leonardo DiCaprio and director James Cameron provided $30,000 that had been raised to help with her medical costs.
The fund that was to be used to help Dean with her expenses was known as the Millvina Fund. I've heard nothing about what the fund may be used for now.
Labels:
history,
Millvina Dean,
obituary,
Titanic
Saturday, May 30, 2009
Be Careful What You Wish For
John McIntyre, a former editor for the Baltimore Sun, writes one of my favorite blogs, You Don't Say.
Mr. McIntyre is always educating me about things through his blog, which he wrote at a different web address when he was employed by the Sun. Since becoming a casualty of the economy, he has resumed the blog at a new address, but he still brings the same wit and wisdom to his writing that I found so appealing in his blog's earlier incarnation.
I hope I can return the favor in this post.
Yesterday, Mr. McIntyre noted a "correction ... from the Times Observer of Warren, Pa."
The correction stemmed from a classified advertisement that was placed in the newspaper. The advertisement said, "May Obama follow in the footsteps of Lincoln, Garfield, McKinley and Kennedy!"
In case you aren't up on your presidential history, Presidents Lincoln, Garfield, McKinley and Kennedy were the four presidents who were assassinated. The person who took the ad apparently didn't know enough about presidential history to comprehend what was really being said, and the ad was published. It was removed after someone read between the lines and determined the actual meaning.
I am sure that a similar thought, even if it wasn't expressed and even if it wasn't particularly serious, crossed the minds of some Americans with the inauguration of each new president. But the desire to see a president meet an untimely end is always of interest to people in law enforcement — and, when the president happens to be the first black to hold that office, it is particularly noteworthy. Mr. McIntyre reports that the newspaper gave the identity of the person who placed the ad to the city police in keeping with its policy. The city police department, in keeping with its policy, provided that information to federal law enforcement authorities.
The timing of this incident could hardly be more ironic.
I say that because today is the 203rd anniversary of a fatal duel involving a man who, more than two decades later, became the seventh president of the United States, Andrew Jackson.
On this date in 1806, Jackson (who was 39) killed a man named Charles Dickinson (who was in his mid–20s) in a duel. Dickinson had accused Jackson's wife, Rachel, of bigamy. Dickinson fired first and missed Jackson's heart by inches. The shot was deflected by Jackson's ribs and remained lodged in his body for the rest of his life.
When his own shot misfired, Jackson asked for his pistol to be reloaded and then fired again, killing Dickinson.
Was Mrs. Jackson guilty of bigamy? Well, I suppose, to misquote another former president, that may depend on what your definition of "is" is.
When she was 18, Mrs. Jackson married a man who was given to fits of jealous rage. She eventually left him, and he told her, in December 1790, that he had filed for divorce and it was final. Believing that the marriage was over, she married Jackson the following year.
In fact, however, Mrs. Jackson's first husband had only asked the state legislature to give its approval to an enabling act that would allow him to sue for a divorce. Legally, the divorce was not final, making the Jacksons' marriage invalid. When the divorce became final, the Jacksons remarried — this time legally — in 1794.
The issue of adultery dogged the Jacksons for the rest of their marriage. Jackson participated in 13 duels — many of them, reportedly, fought to defend his wife's honor — but the duel with Dickinson was the only one that resulted in a death.
In 1828, when Jackson won the first of two presidential terms, the national press found out about Mrs. Jackson's previous marriage and wrote endlessly about it during the campaign.
Mrs. Jackson had been in poor health for a number of years. She was known to be a heavy smoker — a corncob pipe was her trademark — and she suffered a fatal heart attack a few days before Christmas shortly after her husband was elected but a couple of months before he was inaugurated.
For his part, Jackson always believed his wife's death was brought on by the strain from dealing with the media accounts of her marital history. "May God almighty forgive her murderers as I know she forgave them," Jackson said. "I never can."
I don't know if Jackson was the only American president who ever killed a man. Logic tells me he couldn't have been. Of the 43 men who have been president, 31 served in the military, and logic tells me that at least one must have killed someone in combat.
But to my knowledge, Jackson was the only future president who killed someone in a non–combat setting.
Incidentally, the timing of the classified ad that I mentioned earlier is doubly ironic, I suppose. Jackson is believed to have been the first sitting president who was targeted by a would–be assassin. During his second term, as Jackson was leaving the Capitol Building following a funeral service for a congressman, an unemployed painter tried to shoot him, but his pistol misfired. He pulled out another pistol, but it, too, misfired.
Thirty years later, Abraham Lincoln became the first president to be assassinated.
Labels:
Andrew Jackson,
history,
presidency,
You Don't Say
Friday, May 29, 2009
Smoking in the Movies
The American Medical Association Alliance wants any movie with scenes that show people smoking to be given an R rating.
I'll confess to having mixed feelings about this. For many years, I was a smoker. Two years ago, for mostly personal reasons (which I do not wish to discuss here), I gave it up. So I feel that I can sympathize with both sides.
You can make the case that depictions in the movies of certain activities deserve an R rating because those activities are illegal. Most violent acts, for example (and I say "most" because there may be exceptions — and, by the way, boxing movies are in an entirely different category and thus part of an entirely different conversation), are illegal, but not all films that include violence are restricted.
Clearly, smoking is not a healthy activity, but it is a legal one.
Of course, sex is legal, too, but admission to films in which naked bodies can be seen usually is restricted as a means of protecting young people. Whether such restrictions have kept any young people from following up on their normal sexual curiosity is open to debate.
The idea behind ratings is to help parents decide whether a particular film is suitable for their children. But I would argue that the ratings themselves are too vague.
When I was a teenager and I saw that a film was rated R, it never occurred to me that it might be because of the language the characters used (I figured I had heard it all from people my own age, even younger) or because of the substances the characters consumed (I had seen adults, as well as people my own age, consuming legal and illegal substances) or because one or more of the characters got punched, shot or stabbed.
I figured it was because nudity could be seen. When I was a teenager, I never expected to see nudity, even briefly, in a PG movie — and I remember being mildly shocked, in 1975, when I saw brief nudity in a PG movie called "Smile," which was a comedy about the kind of community–sponsored beauty pageants that were popular in those days.
Things don't seem to have changed much. The top–grossing film last year, "The Dark Knight," was given a PG–13 rating. It had plenty of scenes in which violence was implied. Drug use was implied. Sex was implied. The film wasn't graphic, but it was intense.
Likewise, the second highest–grossing film of 2008, "Indiana Jones and the Kingdom of the Crystal Skull," was rated PG–13. There was plenty of violence in the film, not all of it caused by humans, and it was more explicit than the violence in "The Dark Knight." Profanity was brief. So was substance consumption. I recall no nudity, only sexual innuendo.
Certainly, there are some stories that can't be told honestly unless smoking is included. Typically, those movies are historical films, dealing with times when less was known about the effects of smoking and tobacco consumption was far more extensive than it is today.
One such story is "Good Night, and Good Luck," which deals with an important period in modern American history. The central character, Edward R. Murrow, was a heavy smoker. His story cannot be told honestly unless he is shown smoking. That film was rated PG.
Another historical film, Oliver Stone's "JFK," shows many investigators smoking. The year of President Kennedy's assassination was the year before the surgeon general first reported a link between smoking and cancer. If you look at film footage from 1963, you'll see many people smoking.
"JFK" was rated R — but not because of smoking.
There's no doubt, though, that children emulate what they see, whether it's behavior in the movies or in their real lives. And I applaud those who want to limit their exposure to smoking in the movies.
But I think the entire ratings system should be overhauled. If a film is going to be rated R, adults deserve to know why it received that rating. Was it because of the violence? Was it due to depictions of drug use? Was it profanity? Was it nudity? Was it smoking?
The current ratings system simply doesn't provide enough information.
And, while we're taking steps to discourage exposure to smoking in the movies, it would be a good idea to revisit the idea of banning candy cigarettes as well. It is my understanding that attempts were made to do just that in 1970 and 1991, but they failed.
Recent research indicates that candy cigarettes desensitize children to the hazards of smoking and make them more likely to smoke real cigarettes when they get older. What's more, it seems to me that consuming candy cigarettes must contribute to the obesity problem in America.
Thus, banning them from the market would benefit long–term public health in a couple of ways.
Thursday, May 28, 2009
Rudy
Today is Rudy Giuliani's 65th birthday.
For nearly eight years, it has been hard, if not impossible, to think of Giuliani and not be reminded of the terrorist attacks of Sept. 11, 2001. With George Bush bouncing from one airfield to the next in Air Force One and Dick Cheney seeking refuge in a bunker beneath the White House, Giuliani was probably the most prominent politician Americans saw that day.
That's understandable. He was the mayor of New York City. At the time of the attacks, he wasn't very popular and New Yorkers were actually in the process of selecting his successor when the hijacked planes crashed into the Twin Towers, but he was expected to play a major role.
And the public's memory of that day is of a mayor who was reassuring in a crisis, whose leadership resulted in the nickname "America's Mayor."
Giuliani — who was barred by law from seeking a third term anyway, and who might well have been defeated had he been eligible to run and on the ballot that day with no attacks having taken place — cultivated the image. When Bush was nominated for a second term in 2004, Giuliani gave a speech endorsing the president, telling the delegates that, after the towers collapsed, he told Police Commissioner Bernard Kerik, "Bernie, thank God George Bush is our president."
The authenticity of the statement was disputed by many, particularly after Giuliani recommended Kerik to be secretary of Homeland Security. When unsavory elements of Kerik's background emerged, it raised doubts about both Giuliani's judgment and the vetting process in the Bush White House.
Such criticism notwithstanding, Giuliani launched his own bid for the 2008 Republican presidential nomination. In 2007, he led many polls, but his campaign fizzled in 2008. There were many reasons for this, but I've always felt that a big part of it was that, by 2008, being "America's Mayor" was not seen as sufficient qualification for the presidency — except, perhaps, among some who still saw terrorism as the nation's greatest threat.
But far more were concerned about escalating food and energy prices — and then, after the conventions, the main concern was the rapidly collapsing economy.
Actually, Giuliani didn't seem particularly concerned about terrorism in the years between the first World Trade Center attack in 1993 and the hijackings in 2001. He seldom mentioned the 1993 attack in public, and he was criticized for preparedness measures that were regarded as inadequate.
Given the nature of the 2001 attacks, perhaps it was unrealistic to think there was any way to be adequately prepared.
But there could have been little, if any, doubt after September 11 that Islamic extremists would seek to attack America again. What was uncertain was the kind of attack it would be, what the target would be or when the terrorists would strike.
I am convinced that, after the hijackings, the top law enforcement minds in America, whether they were in the FBI or the CIA, whether as part of a formal directive or undertaken informally, began trying to imagine what the next attack might be and encouraged preparations for it.
Some acted like generals, determined to fight the last war instead of trying to anticipate the next one, and they focused on preventing future hijackings and improving security in large buildings.
The next terrorist attack on American soil might very well involve hijacked airliners — but I have believed, ever since that day in 2001, that the terrorists would attack in some other way, some way that was not as heavily scrutinized as air travel had become.
Others, I am sure, sought to think outside the box. They used their knowledge of the Middle East, Islamic extremism and Osama bin Laden's previous statements — as well as their knowledge of the existing gaps in American security — as they formulated their scenarios.
Somewhere, in some file in the FBI or the CIA, I'm sure there is a plotline that closely mirrors what the next attack will be. It was written hypothetically, of course, but it may have inadvertently named the city where the attack actually will take place and what will be the next "Ground Zero." It may even have approximated the number of casualties.
That attack is still in the future. But I'm sure its outline has already been written, along with hundreds, if not thousands, of others.
And, when that next attack occurs, as it almost certainly will, I have no doubt that some investigator will discover the existence of this file and will question the director of Homeland Security about it. That investigator will want to know why more wasn't done to prepare for this attack since it clearly had been envisioned by someone long before it happened.
What will the director of Homeland Security say?
Will he/she point out that there were hundreds, perhaps thousands, of such scenarios in the agency's files, that the budget did not provide enough money or manpower to adequately prepare for them all?
Will he/she say that intelligence did not provide sufficient time to prepare, even if it was able to identify which scenario was the correct one?
I believe Joe Biden was right when, during the campaign, he said Barack Obama would face a test.
On the international stage, I do not believe Obama has faced that test yet.
I do believe that time is coming. I hope America will be ready when it arrives.
Wednesday, May 27, 2009
The Supreme Court Nominee
Barack Obama's nomination of Sonia Sotomayor to replace David Souter on the Supreme Court is attracting considerable editorial reaction.
Much of it seems to be knee–jerk and predictable:
- The New York Times calls her an "inspired choice" who would be a "trailblazing figure."
Much has been written about the quests made by previous presidents to find an Hispanic judge for the Supreme Court — in part to appeal to the fast–growing Hispanic community.
But if such choices are made with the belief that it will permanently attract a large, elusive demographic to the president's party, they are misguided. Nearly 30 years ago, Ronald Reagan nominated the first woman to the Supreme Court. Reagan did win the support of women when he sought re–election in 1984 — even though the Democrats put a woman on their ticket — but the Republicans have been losing women in most elections ever since.
For that matter, George W. Bush appointed both blacks and Hispanics to positions within his administration, but neither group has shown much loyalty to the Republican Party.
The Times finds Sotomayor's personal story moving but is quick to add that she is "more than just a distinguished member of two underrepresented groups. She is an accomplished lawyer and judge, who could become an extraordinary Supreme Court justice."
Adam Liptak writes, in the Times, that Sotomayor's judicial opinions are "marked by diligence, depth and unflashy competence," but warns that she has "issued no major decisions concerning abortion, the death penalty, gay rights or national security."
Sotomayor's track record, suggests Liptak, makes her "remarkably cursory treatment" of an employment discrimination case last year "baffling." That ruling, which many observers expect to be the centerpiece in Sotomayor's confirmation hearings, "contained a single paragraph of reasoning," Liptak writes.
The case has been appealed to the Supreme Court and its ruling is pending.
- The Washington Post is a bit more restrained but nevertheless approving of the selection.
"Senators are right to closely scrutinize Judge Sotomayor's philosophy and qualifications," writes the Post. "She has produced a rich record of opinions as an appeals court judge for the Judiciary Committee to discuss. Senators also should remember that Mr. Obama, like any president, is entitled to deference in choosing a justice."
With a solid Democratic majority in the Senate, it's hard to imagine a Democratic president encountering much difficulty winning the confirmation of a Supreme Court nominee. But unforeseen things happen all the time.
Even if something unexpected doesn't pop up during the confirmation hearings, there is plenty in Sotomayor's documented history to discuss — not just her rulings from the bench but her statements in speeches. The Post cites one from 2001: "The aspiration to impartiality is just that — it's an aspiration because it denies the fact that we are by our experiences making different choices than others. ... Justice [Sandra Day] O'Connor has often been cited as saying that a wise old man and wise old woman will reach the same conclusion in deciding cases. ... I am not so sure that I agree with the statement. First, ... there can never be a universal definition of wise. Second, I would hope that a wise Latina woman with the richness of her experiences would more often than not reach a better conclusion than a white male who hasn't lived that life."
The Post's Robert Barnes and Michael Fletcher write that Sotomayor is the "most controversial of [Obama's] potential nominees."
They also remind readers that, assuming she is confirmed, Sotomayor may not be the first Hispanic member of the Supreme Court. Benjamin Cardozo (whose 139th birthday was Sunday, by the way) was said to have ancestors from Portugal, but he never acknowledged any Hispanic lineage. Perhaps he felt being Jewish was enough of a hurdle when he was chosen to replace Oliver Wendell Holmes in the 1930s.
- As I wrote on this blog last week, Bill Schneider pointed out on CNN that survey respondents felt it was more important to have a Supreme Court nominee with judicial experience than it was to have a woman, a black or a Hispanic nominated.
As it turned out, Obama multi–tasked on this nomination. Sotomayor brings extensive judicial experience with her to the confirmation hearings — and she is an Hispanic female.
But the New York Post seems only to see the demographics.
"Once confirmed, she will join Ruth Bader Ginsburg as the High Court's second reflexively liberal, Ivy League–educated, female, former appellate jurist from the Big Apple," writes the Post. "Diversity for thee, but not for me — right, Mr. President?"
Even so, the Post makes a point when it asks, "[D]id Obama make the most of his first opportunity to push the High Court to the left?"
The Post observes that Obama could have picked someone who had a record of defending progressive principles, and, apparently, there were several such names on his list. "It's hard to imagine any of them refusing the opportunity to attempt a principled defense of affirmative action," writes the Post, but Sotomayor, who was chosen to succeed a progressive jurist, did — in the employment discrimination case I mentioned earlier.
Obama has made his position on abortion well known, but he has come across as less than supportive of gay rights or marijuana legalization, two issues that many of his supporters hoped would have a champion in the White House. Sotomayor's positions on those issues, as well as how she stands on national security issues or the death penalty, are unclear.
Even though the confirmation hearings and the Senate as a whole will be controlled by Democrats, I hope the proceedings will not be a rubber–stamp for her nomination, that we will get some idea of where she stands before she is confirmed.
If not, she may well prove to be the kind of unpleasant surprise that Souter turned out to be for George H.W. Bush — a mysterious nominee whose legal views turn out to be different from what the president anticipated.
Labels:
nomination,
Obama,
Sotomayor,
Supreme Court
Economic Forecasts
In what may be greeted in many quarters as good news, the National Association for Business Economics (NABE) reports in its Outlook survey that indicators suggest the end of the recession may be in sight, according to Julianne Pepitone at CNNMoney.com.
The news will continue to be mixed, the NABE says. The panel anticipates a rebound in economic growth in the second half of 2009, but it still expects to see a decline in economic activity for the second quarter. As far as the short term is concerned, that really isn't much of a surprise. We've been seeing a decline in gross domestic product for months now, but recent months have suggested that the decline is leveling off.
But any gains that may come in the remaining seven months of this year are not likely to offset the losses we've seen. That, by the way, is my own interpretation, not the NABE's — and it's based primarily on the 6% drop in GDP that we witnessed in January.
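For anyone who wants to see the arithmetic behind that interpretation, here is a minimal sketch of how annualized quarterly rates compound over a year. The quarterly figures in it are illustrative assumptions of my own, not NABE projections; the point is simply that a steep first–half decline followed by modest second–half growth still leaves output below where the year began.

```python
# A back-of-the-envelope sketch with illustrative (assumed) numbers:
# two quarters shrinking at a 6% annualized rate, then two quarters
# growing at a 2% annualized rate.
annualized_rates = [-0.06, -0.06, 0.02, 0.02]

level = 100.0  # index the start-of-year GDP level at 100
for rate in annualized_rates:
    # an annualized rate compounds over four quarters, so a single
    # quarter's change is the fourth root of (1 + rate)
    level *= (1 + rate) ** 0.25

print(f"Year-end GDP index: {level:.2f}")  # roughly 97.9 -- still below 100
```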
Employment will lag behind, as it typically does during a recovery, and will show signs of turning up by the early months of next year, says NABE president Chris Varvares. The panelists predict that, nationally, unemployment will level off before it reaches double digits, easing to 9.3% by the end of 2010 — still higher than the current rate of 8.9%. Presumably, it will take longer in some states than others.
And, as Pepitone writes in her summary of the NABE report, "Almost three out of four survey respondents expect the recession will end by the third quarter of 2009."
Since we're a little more than a month away from the start of the third quarter, that's good news, isn't it? Well, it seems to be — until you take into account what the rest of the respondents say. Granted, they are in the minority. But the majority opinion is not always right.
Are the remaining economists hedging their bets? That's hard to say. But Pepitone points out that 19% of the economists who were surveyed by the NABE say recovery won't begin until the fourth quarter and 7% believe it will begin in 2010. So, clearly, there are skeptics.
Personally, I'm doing what I can, which isn't much in the grand scheme of things. I'm trying to remain patient. But I'm not a trained economist. I've tried to understand the concepts that have been discussed, but it all comes back to my personal situation.
I didn't create the conditions that led to this recession. I didn't profit from the culture of greed that so many people say brought the economy to its knees.
I can only hope that things get better soon. For me, things will be better when I have a job.
Until then, it's mostly white noise to me.
Tuesday, May 26, 2009
Have the Terrorists Won?
It seems to me that truly historic days live on beyond their normal 24–hour lifespans and continue to influence our lives indefinitely, in ways that are seen and unseen, even if the public's attention has moved on to something else.
In fact, it reminds me of a pebble tossed into a pond or a lake. From that one tiny point of impact, the ripples fan out in ever–expanding circles, affecting everything they touch, until they are stopped by the distant shorelines.
Need a bigger, grander example? Think of the 2004 tsunami that began with an undersea earthquake in the Indian Ocean and led to the deaths of more than 225,000 people in nearly a dozen countries.
It was, in fact, months before the scope of the tsunami was understood. Indeed, it often seems to take the distance that time can provide before the full impact of an historic event can be comprehended.
I was thinking about this the other night while I watched a four–hour program on 9/11. That's an event that has been particularly fascinating for me because, at the time it happened, the office in which I worked had no TV set. Unlike most Americans, I didn't see the events as they unfolded.
I've only seen video footage of the planes crashing into the Twin Towers and the bodies plummeting to earth from great heights. As appalling as those images are, it's one thing to have seen them as they happened, raw and uncensored, playing out in real time, and it's quite another to see film footage of something that has already occurred.
It's kind of like watching the Zapruder film, knowing that President Kennedy's head is about to be engulfed in a bloody halo and being powerless to do anything about it. You may want to yell at the screen, "Don't make that turn!" but you know that Kennedy's limousine will drive down Elm Street past the Texas School Book Depository into history — and there isn't a thing you can do about it.
It's the same with the hijacked planes on September 11, I suppose. You can see the surveillance film of some of the hijackers being briefly detained at the security checkpoints. You can see footage of at least one of the hijackers with what appears to be a box cutter in his hip pocket, and you may feel tempted to yell at the screen, "Don't let him board the plane!" but you know he will, anyway.
And you know that the four planes will be hijacked and nearly 3,000 people will be killed on a crystal clear, early autumn morning.
Sept. 11, 2001, will always be a significant day in American history. But, once the shock from the attacks wore off and the stock market reopened and planes were allowed to fly again, life began to return to normal.
Even so, I think it can be argued that, as meticulous and methodical as the terrorists' planning was, those hijackings had consequences that the terrorists did not anticipate, consequences that continue to influence American life.
In the months after the attacks, for example, Osama bin Laden reportedly told some of his supporters that he didn't foresee the collapse of the Twin Towers. Is that credible? Bin Laden's academic credentials are unclear, but some have said he earned a degree in civil engineering. If that is so, he must have had some idea of what a fire fed by thousands of gallons of jet fuel could do to a skyscraper.
Whether bin Laden believed the towers would fall, his objective seems more certain. He was driven by a desire to bring jihad to American soil. Thus far, that has not happened. We've been told that additional terrorist attacks were thwarted by policies that were followed during the Bush administration, but we've seen no evidence to support that claim. There are those who believe al–Qaeda has been biding its time before striking again, similar to the eight–year gap that passed between the attacks on the World Trade Center.
But, while bin Laden apparently sought to engage the United States in a bloody conflict, he may not have anticipated the direction it would take.
He may not have realized how obsessed the neocons in the Bush administration were with Saddam Hussein, even a decade after the end of the Gulf War, or that they would use the terrorist attacks to justify an invasion of Iraq that continues to claim American lives and money at a time when both could be used more effectively.
But, from bin Laden's perspective, the terrorist attacks may have succeeded in achieving his goal, albeit in unexpected ways. His words may be contradictory, but I think we can agree that his goal seems to be toppling the United States. Al–Qaeda and the Islamic extremists may have a Dark Ages mentality and their objective may be predicated on the use of force, but in the 21st century, the best strategy for destroying a foe is to undermine that foe's economy.
I'm not suggesting that economists sympathetic to Islamic extremism infiltrated the American economy and proceeded to sabotage it. The greed at the top of America's economic food chain owes no allegiance to any faith — only money.
So what is the relationship between the Iraq war, now more than six years old, and the recession?
Clearly, many factors have been involved in the recession. But it can be plausibly argued that the combined cost of sustaining the wars in Afghanistan and Iraq since 2001 — more than $860 billion so far — has made things much worse than they might have been.
And that raises the question — Have the terrorists won?
Labels: Afghanistan, Iraq, September 11, terrorism
Monday, May 25, 2009
The Meaning of Memorial Day
CNN.com asks an interesting question of its visitors in today's "Quick Vote:"
"Has the true meaning of Memorial Day been forgotten?"
I'm not sure how long the question has been posted on the site. But, at nearly 12:30 p.m. (Central), about 65,000 people have voted — and nearly 70% say yes, the true meaning of Memorial Day has been forgotten.
I'm inclined to agree. The emphasis I see is on sales and the kickoffs of summer activities — the opening of community swimming pools, promotions of tourist attractions, etc. — and not so much attention given to the people who gave their lives in service to their country.
To be sure, there are some who remember the reason behind this holiday.
The Detroit Free Press reminds us of the words of Abraham Lincoln at the dedication of the cemetery in Gettysburg. The newspaper acknowledges that "we have not as a nation always measured up to our expressed ideals," but it still encourages all Americans to defend freedom "whenever and however we can."
The news we heard today of North Korea's second nuclear bomb test, conducted, as CNN.com observes, "in defiance of multiple international warnings," is therefore a timely reminder of what we have and could easily lose, and of what millions in this world do not have.
President Obama was right in saying that North Korea's actions "pose a grave threat to the peace and stability of the world." He pledged a strong response by the United States and the international community.
Today, you can pay tribute to those who made the ultimate sacrifice by pausing at 3 p.m., wherever you are and whatever you're doing, to reflect. Major league baseball games will be interrupted at 3 p.m. local time for everyone in attendance to participate, but you can participate also, even if you're spending the day at home.
That is what is being promoted by the White House Commission on Remembrance.
Have a good holiday. But, please, don't lose sight of the reason for the season.
"Has the true meaning of Memorial Day been forgotten?"
I'm not sure how long the question has been posted on the site. But, at nearly 12:30 p.m. (Central), about 65,000 people have voted — and nearly 70% say yes, the true meaning of Memorial Day has been forgotten.
I'm inclined to agree. The emphasis I see is on sales and the kickoffs of summer activities — the opening of community swimming pools, promotions of tourist attractions, etc. — and not so much attention given to the people who gave their lives in service to their country.
To be sure, there are some who remember the reason behind this holiday.
The Detroit Free Press reminds us of the words of Abraham Lincoln at the dedication of the cemetery in Gettysburg. The newspaper acknowledges that "we have not as a nation always measured up to our expressed ideals," but it still encourages all Americans to defend freedom "whenever and however we can."
It is, therefore, a timely reminder of what we have and could easily lose — and what millions in this world do not have — that we heard today of North Korea's second nuclear bomb test, which, as CNN.com observes, was conducted "in defiance of multiple international warnings."
President Obama was right in saying that North Korea's actions "pose a grave threat to the peace and stability of the world." He pledged a strong response by the United States and the international community.
Today, you can pay tribute to those who made the ultimate sacrifice by pausing at 3 p.m., wherever you are and whatever you're doing, to reflect. Major league baseball games will be interrupted at 3 p.m. local time for everyone in attendance to participate, but you can participate also, even if you're spending the day at home.
That is what is being promoted by the White House Commission on Remembrance.
Have a good holiday. But, please, don't lose sight of the reason for the season.
Labels: Memorial Day, North Korea, Obama
Sunday, May 24, 2009
Random Thoughts on Religion
It seems that, everywhere I turn lately, there is a religious angle.
Sometimes it can't be avoided. Just yesterday, I wrote about Liberty University's decision not to recognize the College Democrats. You can hardly write about that subject without, at some point, observing that Liberty is a private Baptist college that was founded by the late Jerry Falwell.
Sometimes it can be avoided. This week, I've been thinking a lot about my hometown — Conway, Ark. — which has been cast into the spotlight with Kris Allen's victory in the "American Idol" competition.
I suppose Allen makes his home in Conway these days. He wasn't born there, and he didn't go to school there until he was in college. But I assume he is living there now. And, apparently, he's been active in evangelical Christian groups in the central Arkansas area.
"American Idol" seems to have done a pretty good job of resisting whatever temptation there may have been to exploit Allen's faith — but it was mentioned quite a bit when the finale came down to a showdown between Allen and an avant–garde, heavily favored rival from California. I guess the contrasts were too appealing to pass up.
When you live in Conway, Ark., you are surrounded by religion. At least, that was my impression. Many of the people in Arkansas — nearly 40% — are Baptists, and Conway was no exception. There were a lot of Baptists in Conway, and they went to services on Sunday morning, Sunday evening and Wednesday evening — without fail.
All the other Protestant denominations, as well as the Catholics, come in under double digits in Arkansas. But the Methodists just barely missed, with 9% of the population.
I was raised in the Methodist church — and today, incidentally, is Aldersgate Day, which commemorates the day 271 years ago — on May 24, 1738 — that John Wesley experienced his conversion. It's probably of consequence only to Methodists, since Wesley founded the Methodist movement.
My father was a professor of religion and philosophy at a small Methodist college in Conway. To hold that position, he had to be an ordained minister. I can remember several occasions when he was asked to perform the wedding ceremonies for his students.
I attended the Methodist church in Conway. Because of his job, my father became acquainted with all the ministers who came to our church and the nearby churches. Eventually, we heard them all. Some gave good sermons. Some were pretty dry.
We went to church fairly regularly when I was a child, then attendance became more sporadic when I got into my teen years and my mother went back to work. For a while, I wasn't attending church at all; then I started going again, many times by myself. I felt like I was making my own decision, and it seemed to fill a personal need.
Then, when I was in college, I stopped going to church — ironically, right around the time that I talked my mother into going again. I think she continued to attend church for the rest of her life, but my lapse lasted for several years, until I found myself living and working in Little Rock, which is a short drive from Conway. In Little Rock, I resumed my church attendance, even though I was working the night shift on Saturdays and Sundays.
But, after I left Arkansas, my church participation began to lapse again. And this lapse went on for many years — until early this year, as a matter of fact. In February, I began attending a Methodist church here in Dallas. And my attendance has been pretty consistent. I've missed a few services in the last four months, but not many.
Why do I keep going? I've been wondering that myself. There are no simple answers, I suppose. I've been out of work for a while, so I guess that's played a role. If nothing else, it helps to have the kind of support network a church congregation can provide.
I don't think this is a fear–of–the–wrath–of–God kind of scenario, though. I've seen that before, and this isn't like that.
Nearly 18 years ago, a good friend of mine in Arkansas was dying of cancer. I had left Arkansas a few years earlier, but I made a couple of trips back to visit my friend. His deterioration was rapid, and I remember, during my last visit to Arkansas before he died, observing that some of my other friends had started attending church regularly. When I lived there, I said, my friends never went to church.
"A lot of us have been going since Mike got sick," one of my friends replied quietly.
That kind of event is bound to inspire some life–altering changes. I guess it depends on what you believe, what you feel in your heart, that determines whether those changes take permanent root. Some of my friends are still going to church while others lapsed into non–attendance after Mike died and the sense of urgency passed.
Mike's illness didn't bring me back to church, nor did the loss of my close companion, my dog, who died after being struck by a car one night a couple of years later. Nor did my mother's death in a flash flood the year after that.
But something has been nudging me back to church this year. Maybe, as I said, the bad economy has had something to do with it. But, while it is true that I have prayed for the end of the recession and guidance for my job search, that isn't the whole story.
I just don't know what the whole story is.
Whatever has been nudging me in this direction, there are times when I wish spirits from the other side could visit me for a few minutes, the way some do in sci–fi TV shows or movies. Maybe my mother or my grandmothers or my grandfathers — or "Aunt Bess," an older lady who was a dear friend of mine when I was growing up and had very strong religious beliefs — could shed some light on some things for me.
Part of it has to do with a friend of mine who lives in St. Louis. He had a heart attack last month and had a bypass this week. From what I've been able to learn about his condition, his doctors were always pretty confident that he would recover because, statistically, he's rather young.
But a bypass is major surgery. And any major surgery carries with it a possibility — however small it may be — that something could go wrong.
I had already been attending my church for nearly three months when my friend had his heart attack. So his condition did not inspire me to return to church. But it did keep me coming to services.
Still, I ask myself — what is it that I truly believe?
That's a hard question for me to answer. At this stage of my life, I am more inclined to believe there is a higher power than I was when I was younger. Does this higher power have "a plan" that is unfolding before us? Of that, I'm not certain. It seems more likely to me that this higher power created the earth and everything in it — or oversaw its evolution — but what we do with it is up to us.
I'd like to believe there was more than a nugget of truth in the words of George Burns, who played God in "Oh, God!" — "However hopeless, helpless, mixed up and scary it all gets, it can work. If you find it hard to believe in me, maybe it would help you to know that I believe in you."
I guess the things I've seen in my life make me hopeful that, if God exists, he's like Burns because that is the kind of God I want to believe in. When Burns was asked about the future by John Denver in the movie, he told Denver that he could speak with authority about things as they exist and about things that have already happened, but he didn't know what would happen in the future.
Considering what most denominations teach about God and the existence of a "plan," I find it oddly reassuring to think that God didn't plan what happens, that "free will" is more than just a human theory.
"If you're God," Denver said to Burns at one point, "how can You permit all the suffering that goes on in the world?"
"I don't permit the suffering," Burns replied. "You do."
The Peril of Power
Today, Arkansas has four members in its delegation in the House of Representatives, the same as it did when I was growing up there.
Representation in the House is based on the most recent census report. Thus, a delegation can grow or shrink, depending upon what the state's population did in the preceding decade. Arkansas is 32nd in population, so its delegation is one of the smallest. Its delegation was larger in the decades before I was born, but it has remained constant since the 1960 Census.
Although Arkansas' congressional delegation was small, the state accumulated legislative power through seniority in the middle of the 20th century. The same two men, John McClellan and Bill Fulbright, represented the state in the Senate from the 1940s until the 1970s and, near the end of their careers, served as chairmen of powerful Senate committees (McClellan of Appropriations, Fulbright of Foreign Relations).
And, for nearly four decades, from the presidency of Franklin D. Roosevelt until the election of Jimmy Carter, the central Arkansas district where I grew up was represented by Wilbur Mills, who would have been 100 years old today. He rose to become chairman of the Ways and Means Committee, wielding considerable power — which may have prevented him from recognizing certain facts that might have spared him some embarrassment in his later years.
In 1972, for example, Mills was persuaded by many of his supporters in Arkansas and his colleagues in Washington to run for the Democratic presidential nomination. Mills did enter a few primaries that year and tried to position himself as the friend of senior citizens by supporting an automatic cost–of–living adjustment to Social Security.
But Mills wasn't a very effective vote–getter, which may have been due to his experience — or absence of it. He had a lot of experience as a lawmaker, of course, but he seldom faced opposition back home so his electoral experience was lacking. That was a handicap in a field that included several charismatic candidates like Hubert Humphrey, Ed Muskie, eventual nominee George McGovern and, for a time, George Wallace.
Two years later, in what turned out to be his final campaign, Mills had to fight for his seat against a little–known Republican named Judy Petty. His dismal performance in presidential politics had nothing to do with it, but, again, his lack of experience as a campaigner may have played a role in the outcome.
Until about a month before the election, Mills seemed invincible — but then he was found, cut and bleeding, near the Washington Tidal Basin with a stripper named Annabelle Battistella, better known by her stage name of Fanne Foxe ("the Argentine Firecracker").
Mills came back to Arkansas and, for the first time in voters' memories, really had to campaign to keep his seat. He won re–election — but, in the decidedly Democratic year of 1974, he could only manage 59% of the vote. Not long after the election, Mills announced that he was an alcoholic and would seek treatment. He gave up his chairmanship, and his congressional career was, effectively, over.
I guess Mills' story is a cautionary tale about the arrogance of power. It is a story that powerful people still need to learn, although I have heard it argued that the mindset that led to Mills' downfall was an aberration, a peculiarity of his generation.
But I believe it is a human flaw that is not confined to a generation.
And political success can be so seductive that some still fail to recognize when the waves of history have left them high and dry (Dick Cheney comes to mind). Present recipients of its largesse must stay attuned to shifts in the electorate, whether the call is to support same–sex marriage, to legalize marijuana or anything else, if they wish to remain in its good graces (take note, Barack Obama).
While Petty did not defeat Mills, her words during the 1974 campaign are worth remembering: "The most beautiful words in the Constitution are not 'he's the chairman' or 'he's the powerful,' " she said. "It's 'we the people.' "
Saturday, May 23, 2009
Passing Gas
According to the results of a visitor poll at the CNN.com website just after 4 p.m. (Central) today, about 85% of more than 340,000 respondents have no travel plans for the Memorial Day weekend. Apparently, a lot of folks are staying close to home. They may be going to a nearby park for a picnic or a nearby beach for some swimming or a friend's house for a cookout, but if they're going anywhere, they're going places within easy driving range of home.
I wonder if many Californians are traveling this holiday weekend. Gas prices in California are among the nation's highest, and the unemployment rate exceeded 10% several months ago.
You can relax a little, though. No one is suggesting that we will witness a repeat of last summer's $4 gas prices.
Even so, California is in a lot of financial trouble, and the special election this week didn't do much to resolve it. As the Los Angeles Times wrote this week, both liberals and conservatives can apply their own special spins to the results, but it's clear that program cuts will be necessary.
The Times didn't think the results suggested a philosophical shift. Instead, the Times suggested the vote reflected the influence of "high unemployment and scarce cash."
Still, the prudent thing for a Californian to do these days is keep personal spending down. If that is what a lot of Californians are doing this weekend, I wonder what that's doing to the tourist attractions sprinkled across the Golden State. Between joblessness and gas prices, there has to be less disposable income than those attractions are accustomed to.
Speaking of gas, Bonnie Parker and Clyde Barrow were known for the banks they robbed, but they preferred to rob gas stations and stores — I guess they were willing to swap the higher yield one could expect from a bank for a greater likelihood that there would be no security officers on the premises of a store or gas station.
Bonnie and Clyde, who lived in this area before launching their criminal careers, achieved something of a folk–hero status during their brief lives, which came to an end 75 years ago today when they were ambushed near their hideout in Bienville Parish, La. The six members of the posse fired approximately 130 rounds.
Here's an interesting piece of trivia. In Bonnie and Clyde's day, gas stations were called "filling stations." At least, I know that is what they were called here in Texas. I remember my grandparents, who lived in Dallas, always called them "filling stations."
My parents, who were small children when Bonnie and Clyde were killed, grew up with that phrase, but they must have been coming of age when the terminology began to change. So my memory is that they alternated between the two phrases until, at some point, "gas station" took up permanent residence in their heads.
Labels: Bonnie and Clyde, California, economy, gas prices, history
Religion and Politics
This morning, I saw an intriguing item on the CNN Political Ticker about an action taken by Liberty University, the school in Lynchburg, Va., that was founded nearly 40 years ago by the late Jerry Falwell.
Falwell died two years ago this month. During his lifetime, he was known for his right–wing political beliefs, which were embodied in the agenda of his political organization, the Moral Majority. Liberty University seems to be upholding Falwell's agenda.
Ray Reed writes, in the Lynchburg News & Advance, that Liberty officials say they revoked recognition of the campus' College Democrats for religious, not political, reasons. But part of the problem here is that, thanks in part to Rev. Falwell's activities and statements, the line between religion and politics is blurred. Before the creation of the Moral Majority, neither major party was seen as the home party of conservative Christians.
Actually, until the 1980s, conservative Christians were not united in their political activities. Many conservative Christians were not politically active at all. But Falwell brought them into the political arena and encouraged them to embrace the Republican Party. And there they have been for three decades. It is safe to say they were not motivated by economic policy or, for that matter, global politics — unless there was a direct conflict with evangelical Christians. They used the Republican Party as their platform to bring attention to crusades against abortion and homosexuality and to promote what they considered a traditional family.
At Liberty University, the primary sticking points with the College Democrats — there may be others — seem to be the group's pro–choice stance on abortion and support for gay rights and how those positions conflict with the school's policies. The policies may be presented as representing the views of today's school administration, but they are Falwell's legacy.
Not that there was ever much of a line drawn between religion and politics in this country. In spite of the lofty language one often hears about the "separation of church and state," it always has been virtually impossible to divorce the two in a nation where schoolchildren recite a pledge of allegiance that contains the words "one nation under God" on a daily basis.
Virginia Gov. Tim Kaine, a Democrat, has asked Liberty to reconsider. A reader poll at the News & Advance website shows that more than 90% of respondents disagree with Liberty's decision.
With public sentiment running that high against Liberty, it is no surprise that bloggers are almost unanimous in expressing their outrage. Here is a small sample:
- FaithfulDemocrats.com, which bills itself as an "online Christian community," calls the move "brazenly undemocratic."
- Joe.My.God, an apparently gay–oriented blog, gives a (pardon the expression) straightforward account of the decision in its post under what appears to be a slanted headline that reads "Liberty" University Bans Democrats Because They May Support Dirty Homos.
- Sandwalk writes that the move is "just the beginning," only the first step at Liberty University.
"Mark my words," writes biochemistry Professor Larry Moran. "In a few weeks they're going to shut down the 'Liberty University Gays and Lesbians Club' and the 'Liberty University Secular Humanist Club.' And it's only a matter of time before the 'Liberty University Teletubbies Fan Club' is kicked off campus."
And, as a lifelong supporter of free speech, as well as a Democrat, I sympathize. (By the way, Professor Moran, "democrat" with a lowercase "d" refers to someone who supports democracy in general, regardless of party affiliation, while "Democrat" with an uppercase "D" refers to a member of the Democratic Party. I consider myself both, though I am a centrist, which has led many of my Democratic acquaintances to wonder which side I am on.)
But what can be done about it?
Liberty University is a private college. I am not aware of any public funding that is used to support the school. A couple of years ago, before Falwell died, Liberty had a debt of between $20 million and $25 million. But Falwell had a $34 million insurance policy, and the proceeds from that were used to pay off Liberty's debts. Since then, it has been a self–sustaining institution.
Thus, as far as I know, Liberty is privately funded. Perhaps it does receive some public funds. If it does, that would be grounds for challenging its actions.
But unless public money is involved in the operation of the school, there isn't anything that can be done if Liberty believes, as Mark Hine, the school's vice president of student affairs, stated in his e–mail to the College Democrats, that "[t]he Democratic Party platform is contrary to the mission of Liberty University and to Christian doctrine."
In America, you are free to disagree with that position. You may, as many bloggers are doing today, argue against it. But there is little else you can do about it.
Except discourage your children from going to school there.
Labels: College Democrats, Liberty University, politics, religion
Friday, May 22, 2009
When All Things Seem Possible
An old friend of mine sent me an e–mail reporting that he went to his brother's son's high school graduation last night. He didn't have much to say about the actual ceremony, but he did say that he had dinner with his family afterward. His nephew apparently wants to pursue a career in law enforcement and is planning to continue his education with that in mind.
My friend and I grew up in Conway, Ark., the hometown of recent "American Idol" winner Kris Allen. Earlier this week, I wrote about my memories of the Conway of my childhood and Allen's improbable victory at my Birth of a Notion blog, so it isn't my intention to go over that territory here ... except to make a couple of observations.
I don't know how many of my classmates went on to college. When I graduated from high school, I know there were some in my class who chose not to continue their education and went straight into the workforce. Some got married and started having kids right away.
In many ways, that was a different time — in others, it was not so different. We didn't realize it then, but we were about to encounter a severe recession that has frequently been compared to the one we face today. But when my high school friends and I walked across that stage and received our diplomas, all things seemed possible. Unemployment was a source of concern, of course — the rate at the time was around 6% — but the rate had been declining in the months leading up to graduation. So, whether our plans included college or not, the future looked bright.
Today's graduates face a different set of circumstances. There are a lot more of them, for one thing. From what I've been reading on the website of my hometown newspaper, there were nearly 600 graduates from my old high school this year. That's close to twice the number who graduated when I did.
The account of the graduation ceremony indicated that, while the graduates may have been from a different generation, they experienced the same conflicting emotions that we did. They were glad to leave, yet sad at the same time. And, in today's economy, I'm sure there's some uncertainty about what to do next.
My advice would be to go on to college or community college or trade school or whatever.
I don't know if there is a perception among today's graduates that a college degree is a ticket to a lifetime of security, as there was in my graduating class, but I think the current recession has pretty much disabused many of that notion.
From a practical standpoint, staying in college means one can continue to receive health care coverage through his/her parents' employers. Elizabeth Cohen writes about this issue for CNNhealth.com, but she approaches it from the perspective of recent college graduates, not recent high school graduates.
Even so, I think anyone who is finishing one level of schooling these days would be wise to strongly consider moving on to the next level — if only because most policies will continue to cover dependents as long as they are students. It isn't as easy as it once was to get health care benefits with a job offer.
While it is possible that current conditions are more favorable for health care reform than they were when Hillary Clinton tried to achieve it in the 1990s, that isn't a sure thing. The smart thing for young people to do these days is to remain in school and take advantage of their parents' coverage while they can.
And times may change again. An advanced education may be more valuable in the future than it seems to be today. Having a college degree may once again be the advantage that it was.
We can all hope that the recession will be over by the time today's high school graduates walk across that stage again.
Thursday, May 21, 2009
The Virginia Governor's Race
Every four years — in odd–numbered years — the voters of Virginia select a new governor. Virginia holds its gubernatorial election the year after a presidential election, so 2009 is an election year in that state.
State law prohibits the incumbent from seeking re–election so Democratic Gov. Tim Kaine cannot seek another four–year term this year. Kaine, however, won't have to look for a job when he leaves office in January; he was chosen to be the new Democratic National Committee chairman earlier this year, a job to which he can devote his full attention starting in 2010.
In recent times, the Virginia governor's race has served as something of a political bellwether in reverse. By that, I mean that, whichever party has won the presidency, the other party has won the governor's office in Virginia the following year. In 2000 and 2004, of course, Republican George W. Bush was elected president; Democrats were elected governor in 2001 and 2005. In 1992 and 1996, Democrat Bill Clinton won the presidency; Republicans were elected governor in 1993 and 1997.
Likewise, during the 1980s, following the elections of Republican presidents — Ronald Reagan in 1980 and 1984 and George H.W. Bush in 1988 — Democrats were elected governor in 1981, 1985 and 1989. In fact, the Democrat who was elected in 1989, Douglas Wilder, was the first black elected governor in the United States.
And, in 1977, the year after Jimmy Carter was elected president, a Republican was elected governor.
That is when the current streak began.
In the elections following Republican Richard Nixon's victories in 1968 and 1972, Republicans were elected governor. Prior to that, Democrats were elected governor in the years following the elections of Democratic presidents in 1960 and 1964.
In fact, the Republican who was elected governor the year after Nixon's triumph in 1968, A. Linwood Holton Jr., was the first Republican elected governor in Virginia since Reconstruction. When busing was an issue in Holton's first year in office, he enrolled his children in the mostly black public schools in Richmond. Later, in 1978, Holton unsuccessfully sought the GOP nomination for the U.S. Senate — the eventual nominee, John Warner, won the election and served five terms before retiring last year.
Holton, by the way, is still alive, in spite of undergoing surgery for bladder cancer in 2005. He's 85 years old, and he actively campaigned for Barack Obama last year.
Anyway, considering that the last eight governors of Virginia came from the party that did not win the White House the year before, logic would suggest that it is the Republicans' turn to win.
And recent polls seem to bear that out. The state's former attorney general, Bob McDonnell, has been leading his most likely Democratic opponent, Terry McAuliffe, the former Democratic National Committee chairman who also chaired Hillary Clinton's presidential campaign.
Democratic polls indicate that McAuliffe is the favorite over his two rivals, Brian Moran (the younger brother of Rep. Jim Moran) and state Sen. Creigh Deeds. But, as Kyle Trygstad writes for RealClearPolitics, none of the Democrats has been eager to play up national connections in this campaign.
I find that particularly interesting, given Virginia's history. When I was in school, Virginia was known as the "Mother of Presidents" because eight presidents (George Washington, Thomas Jefferson, James Madison, James Monroe, William Henry Harrison, John Tyler, Zachary Taylor and Woodrow Wilson) were born there, although three of them spent much of their adult lives outside Virginia.
Three of those presidents served as governor of Virginia before moving to the White House. And, although it has been more than 180 years since a future president was elected governor of the state, Virginians have, from time to time, been mentioned as possible nominees for president or vice president.
Wilder briefly sought the presidential nomination in 1992. Kaine and another former governor, current Sen. Mark Warner, both were mentioned as possible running mates for Obama.
The most recent example of a Virginian who was, at one time, regarded as a major contender was a man who has never campaigned officially in a presidential primary or caucus — at least, not yet — although he did make several trips to Iowa and New Hampshire in what were perceived to be warmups for the presidential contests in those states.
That man was Republican George Allen, who was elected governor in 1993. After leaving office in 1998, Allen went on to be elected to the Senate in 2000, defeating another ex–governor, Chuck Robb, the son–in–law of Lyndon Johnson who was seeking his third term.
Many people thought re–election would be merely a formality for Allen in 2006, and there was already plenty of talk about his chances of winning the 2008 Republican nomination.
But a funny thing happened along the way. In August 2006, during a campaign stop near the Virginia–Kentucky border, Allen spotted an Indian–American in the crowd who was recording him with a camcorder for Democratic candidate Jim Webb's campaign. Allen referred to the man as "macaca," a derogatory term for dark–skinned people that is common among French colonists in North Africa.
Allen's mother was raised in the French colonial community in Morocco, and many people speculated that Allen heard her use that word when he was growing up.
Anyway, the situation snowballed. Later in August, the Jewish periodical The Forward reported that Allen's mother probably was Jewish, an assertion Allen at first vigorously denied and then acknowledged. Then, in September, three of Allen's former college football teammates claimed they heard him use the word "nigger" on several occasions.
Other former teammates stepped forward to say they had never heard Allen use that word, but, by that time, his verifiable past had come back to haunt him. Allen was shown to have had a long interest in the Confederate flag, reportedly wearing a Confederate flag lapel pin for his high school senior class photo and displaying the flag, in one form or another, from 1967 to 2000. He also used the Confederate flag in his first statewide TV commercial when running for governor in 1993.
While he was governor, Allen declared April "Confederate History and Heritage Month" in Virginia. And he opposed the establishment of Martin Luther King Day in the state.
In the end, Webb won a narrow victory over Allen, and any hopes Allen may have had of securing the GOP presidential nomination in 2008 disappeared. In February 2008, Tim Craig wrote a speculative piece in the Washington Post in which he wondered what might have happened if Allen had never said the word "macaca."
On seemingly inconsequential things elections can turn.
In 1967, George Romney (Mitt Romney's father) was widely considered a leading prospect for the presidency. Then, in an interview about his 1965 trip to Vietnam, he described his earlier views about the war as being the result of "brainwashing," which derailed his campaign and opened the door for former Vice President Richard Nixon.
In 1972, Democratic front–runner Ed Muskie gave an emotional response to an attack on his wife that was published in the Manchester Union–Leader in New Hampshire. Muskie responded outdoors, in the snow, and some people said he was moved to tears, although film of the episode was inconclusive. Snowflakes on his face may have been mistaken for tears.
Muskie's campaign collapsed, opening the door for insurgent George McGovern, although the famed "Canuck letter," which alleged that Muskie had slurred French–Canadians (a fairly substantial voting bloc in New Hampshire) and which was actually one of the Nixon campaign's "dirty tricks," may also have played a role.
Here in Texas, Ann Richards won a close race for governor in 1990. Earlier in the campaign, she was widely expected to lose to Republican businessman Clayton Williams, even though Williams made a number of unsavory remarks during the campaign, most notably comparing bad weather to rape. "If it's inevitable," he said, "just relax and enjoy it."
What was believed to turn the tide in Richards' favor, however, was a moment that was captured on film a few weeks before the election. At a joint public appearance, Richards offered her hand to Williams, but he refused to shake it. His response was seen as uncouth, and Richards claimed a narrow victory.
Obama won Virginia by more than 6% of the vote, becoming the first Democrat in four decades to carry the state in a presidential election. Although his margin in the Electoral College would not have been significantly altered if he had not carried Virginia, one can only wonder, in hindsight, what effect Allen's presence on the Republican ticket might have had if a dark–skinned man with a camcorder had not been in one of his audiences in August 2006.
Labels: Democrats, governor, history, polls, presidency, Republicans, Virginia
Wednesday, May 20, 2009
What the Future May Hold
Chris Isidore reports, for CNNMoney.com, that the Federal Reserve has revised its economic forecast.
The Fed, says Isidore, now expects unemployment to rise to between 9.2% and 9.6%. In January, the Fed anticipated that unemployment would peak between 8.5% and 8.8%.
That revision is bad news, right? Well, yes and no.
Yes, it's bad news because it is higher than the Fed expected at the beginning of the year.
And no, because the unemployment figure in April — 8.9% — already exceeded the Fed's prediction.
Clearly, the Fed expects more jobs to be lost. But that is hardly a stop–the–presses revelation. If the Fed's prediction turns out to be correct, the unemployment rate actually will be lower than many economists have been anticipating. Those economists have been predicting a national unemployment rate in double digits.
I suppose the really bad news in today's report is that the Fed now sees more of a decline in gross domestic product (GDP). In January, the Fed thought GDP would drop 0.5% to 1.3%. The expectation now is for GDP to fall between 1.3% and 2.0%.
Again, that may be a good news/bad news kind of scenario. It certainly isn't good news that GDP will be down for the year. But that was pretty much a given, since GDP fell at an annual rate of about 6% in the first quarter of 2009. The report suggests — and the minutes of the Fed's April meeting confirm — that Fed members believe GDP will increase — albeit slightly — in the second half of the year.
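To see why a full–year decline was pretty much locked in, here is a minimal arithmetic sketch. The quarterly figures below are hypothetical except for the roughly 6% first–quarter drop mentioned above; the point is simply that chaining annualized quarterly rates together shows a steep first quarter outweighing modest growth later in the year.

```python
# Illustrative sketch only: hypothetical quarterly numbers, not the Fed's projections.
# Quarterly GDP changes are reported at annualized rates; to see the full-year effect,
# each annualized rate is converted to a single quarter's growth and then chained.

annualized_rates = {
    "Q1": -0.06,  # roughly the reported first-quarter 2009 decline
    "Q2": -0.02,  # hypothetical: a smaller decline in the spring
    "Q3": 0.01,   # hypothetical: slight growth in the second half
    "Q4": 0.02,   # hypothetical
}

level = 1.0  # index the GDP level at the end of 2008 to 1.0
for quarter, rate in annualized_rates.items():
    level *= (1.0 + rate) ** 0.25  # one quarter's share of an annualized rate

print(f"Change from end of 2008 to end of 2009: {level - 1.0:.1%}")  # about -1.3%
```

Even with growth resuming after midyear, this hypothetical path still ends the year down, right around the top of the Fed's revised range.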
Initially, the Fed's outlook drew a negative response on Wall Street: the Dow, the Nasdaq and the S&P 500 were all down between 0.39% and 0.62%.
Elsewhere today, there was some good news for consumers. Congress approved legislation making it more difficult for credit card issuers to raise fees and interest rates.
The new rules will go into effect in February.
Labels: credit cards, Fed, GDP, unemployment
Blaming Bush
I think the first time I was exposed to the word "scapegoat" was during the Watergate scandal. Former White House lawyer John Dean claimed he was being made the scapegoat for Richard Nixon's woes.
I didn't know what the word meant. The Random House Dictionary sheds some light on that, telling us that the word "scapegoat" means "a person or group made to bear the blame for others or to suffer in their place."
Apparently, the word has biblical origins. Leviticus, in the Old Testament, describes the ritual. As a part of Yom Kippur ceremonies, a goat was driven into the wilderness to die, symbolically carrying the sins of the people on its back.
Christian theology sees the story of the scapegoat as the foreshadowing of the story of Jesus and his sacrifice for humanity.
I don't know about that. In my life, most of the scapegoats I've known of were athletes who, fairly or unfairly, were blamed for their teams' failures — Bill Buckner, whose error was said to cost the Boston Red Sox the 1986 World Series against the New York Mets, or Scott Norwood, whose missed field goal led to the first of four consecutive Super Bowl defeats for the Buffalo Bills.
A far more serious example of scapegoating occurred before I was born — when Nazi propaganda blamed the Jews for Germany's problems after World War I.
I guess it is a tendency of human nature to look for someone to blame when things go wrong. Perhaps that is why Barack Obama has insisted that Americans should look to the future and not look back as they seek to deal with the many problems facing the nation and the world. It is a sentiment I agree with, to a certain extent, although I still believe, as I have written on this blog before, that Congress should investigate the decisions that led to the invasion of Iraq and the use of torture techniques in related interrogations.
I have advocated such an investigation not because I want to punish anyone (notably the former president and vice president) but because there are lessons to be learned from how those decisions were made, and I believe we can benefit from that knowledge.
But, lately, I've been sensing a real bloodlust on the part of the public, and the previous administration is at the heart of it. As I have pointed out on many occasions, I am a Democrat, and I was never a Bush–Cheney supporter. But, as I have also stated in this blog, economies are massive, complex things. Presidents can give direction from the bully pulpit, but it is unfair and inappropriate to give them excessive credit or blame for the millions of decisions that business owners must make.
And the same thing applies to the people in their administrations.
But some people are adamant about finding someone to blame.
For example, I was looking at the New York Times' website today. For the third straight time, Maureen Dowd wrote a column about former Vice President Dick Cheney. Granted, Cheney's activities recently have been unseemly, to say the least, for a former vice president, but Dowd's columns seem to be particularly vitriolic.
Dowd made no secret of her support for Obama during the campaign, even before Obama's bid for the nomination took hold with the rank and file. Well, Dowd's candidate won, and Cheney's out of office now. Cheney may be in the spotlight by his own choice, but he has no authority to speak of. It seems, to me, that it would be a good idea for Dowd to ease up now.
Dowd isn't the only one, though. On Facebook lately, members have had the option of joining a group that constantly urges people to revel in "not having George Bush as president." Recently, this group has been encouraging people to celebrate the six–month anniversary of the end of the Bush presidency on July 20. From this group's perspective, I suppose it would be expected that parties on that date — which also happens to be the 40th anniversary of the first walk on the moon — would include piñatas in the shape of Bush and Cheney's heads.
More recently, this group has been polling people, asking them whether they would prefer to have Bush back as president ... or be impaled. The latest "results" I saw indicated that 225,000 people would rather be impaled while about 1,000 would opt to have Bush back in the White House.
Talk about a push poll.
Actually, I suspect the results would be different if the choices were real rather than hypothetical.
I understand the temptation to hold Bush and Cheney responsible for all the problems America must deal with now. And, even with all the things that are on the current administration's plate, I still believe there are valuable lessons to be learned from how the previous administration made decisions that determined how foreign policy was conducted, especially regarding how a war was launched.
But some of these other things seem counterproductive to me. They may be psychologically satisfying, but they do little, if anything, to help us find our way out of this wilderness.
Labels: Cheney, George W. Bush, Maureen Dowd, Nazis, Obama, presidency, scapegoat