A place where I can write...

My simple blog of pictures of travel, friends, activities and the Universe we live in as we go slowly around the Sun.



May 21, 2014

Lost its mind?

America dumbs down

The U.S. is being overrun by a wave of anti-science, anti-intellectual thinking. Has the most powerful nation on Earth lost its mind?

By Jonathon Gatehouse

South Carolina’s state beverage is milk. Its insect is the praying mantis. There’s a designated dance—the shag—as well as a sanctioned tartan, game bird, dog, flower, gem and snack food (boiled peanuts). But what Olivia McConnell noticed was missing from among her home’s 50 official symbols was a fossil. So last year, the eight-year-old science enthusiast wrote to the governor and her representatives to nominate the Columbian mammoth. Teeth from the woolly proboscidean, dug up by slaves on a local plantation in 1725, were among the first remains of an ancient species ever discovered in North America. Forty-three other states had already laid claim to various dinosaurs, trilobites, primitive whales and even petrified wood. It seemed like a no-brainer. “Fossils tell us about our past,” the Grade 2 student wrote.

And, as it turns out, the present, too. The bill that Olivia inspired has become the subject of considerable angst at the legislature in the state capital of Columbia. First, an objecting state senator attached three verses from Genesis to the act, outlining God’s creation of all living creatures. Then, after other lawmakers spiked the amendment as out of order for its introduction of the divinity, he took another crack, specifying that the Columbian mammoth “was created on the sixth day with the other beasts of the field.” That version passed in the senate in early April. But now the bill is back in committee as the lower house squabbles over the new language, and it’s seemingly destined for the same fate as its honouree—extinction.

What has doomed Olivia’s dream is a raging battle in South Carolina over the teaching of evolution in schools. Last week, the state’s education oversight committee approved a new set of science standards that, if adopted, would see students learn both the case for, and against, natural selection.

Charles Darwin’s signature discovery—first published 155 years ago and validated a million different ways since—long ago ceased to be a matter for serious debate in most of the world. But in the United States, reconciling science and religious belief remains oddly difficult. A national poll, conducted in March for the Associated Press, found that 42 per cent of Americans are “not too” or “not at all” confident that all life on Earth is the product of evolution. Similarly, 51 per cent of people expressed skepticism that the universe started with a “big bang” 13.8 billion years ago, and 36 per cent doubted the Earth has been around for 4.5 billion years.

The American public’s bias against established science doesn’t stop where the Bible leaves off, however. The same poll found that just 53 per cent of respondents were “extremely” or “very confident” that childhood vaccines are safe and effective. (Worldwide, the measles killed 120,000 people in 2012. In the United States, where a vaccine has been available since 1963, the last recorded measles death was in 2003.) When it comes to global warming, only 33 per cent expressed a high degree of confidence that it is “man made,” something the UN Intergovernmental Panel on Climate Change has declared is all but certain. (The good news, such as it was in the AP poll, was that 69 per cent actually believe in DNA, and 82 per cent now agree that smoking causes cancer.)

If the rise in uninformed opinion were limited to impenetrable subjects, that would be one thing, but the scourge seems to be spreading. Everywhere you look these days, America is in a rush to embrace the stupid. Hell-bent on a path that’s not just irrational, but often self-destructive. Common-sense solutions to pressing problems are eschewed in favour of bumper-sticker simplicities and blind faith.

In a country bedevilled by mass shootings—Aurora, Colo.; Fort Hood, Texas; Virginia Tech—efforts at gun control have given way to ever-laxer standards. Georgia recently passed a law allowing people to pack weapons in state and local buildings, airports, churches and bars. Florida is debating legislation that will waive all firearm restrictions during state emergencies like riots or hurricanes. (One opponent has moved to rename it “an Act Relating to the Zombie Apocalypse.”) And since the December 2012 massacre of 20 children and six staff at Sandy Hook Elementary School, in Newtown, Conn., 12 states have passed laws allowing guns to be carried in schools, and 20 more are considering such measures.

The cost of a simple appendectomy in the United States averages $33,000 and it’s not uncommon for such bills to top six figures. More than 15 per cent of the population has no health insurance whatsoever. Yet efforts to fill that gaping hole via the Affordable Care Act—a.k.a. Obamacare—remain distinctly unpopular. Nonsensical myths about the government’s “real” intentions have found so much traction that 30 per cent still believe that there will be official “death panels” to make decisions on end-of-life care.

Since 2001, the U.S. government has been engaged in an ever-widening program of spying on its own—and foreign—citizens, tapping phones, intercepting emails and texts, and monitoring social media to track the movements, activities and connections of millions. Still, many Americans seem less concerned with the massive violations of their privacy in the name of the War on Terror than with imposing Taliban-like standards on the lives of others. Last month, the school board in Meridian, Idaho, voted to remove The Absolutely True Diary of a Part-Time Indian by Sherman Alexie from its Grade 10 supplemental reading list following parental complaints about its uncouth language and depictions of sex and drug use. When 17-year-old student Brady Kissel teamed up with staff from a local store to give away copies at a park as a protest, a concerned citizen called police. It was the evening of April 23, which was also World Book Night, an event dedicated to “spreading the love of reading.”

If ignorance is contagious, it’s high time to put the United States in quarantine.

Americans have long worried that their education system is leaving their children behind. With good reason: national exams consistently reveal how little the kids actually know. In the last set, administered in 2010 (more are scheduled for this spring), most fourth graders were unable to explain why Abraham Lincoln was an important figure, and only half were able to order North America, the U.S., California and Los Angeles by size. Results in civics were similarly dismal. While math and reading scores have improved over the years, economics remains the “best” subject, with 42 per cent of high school seniors deemed “proficient.”

They don’t appear to be getting much smarter as they age. A 2013 survey of 166,000 adults across 20 countries that tested math, reading and technological problem-solving found Americans to be below the international average in every category. (Japan, Finland, Canada, South Korea and Slovakia were among the 11 nations that scored significantly higher.)

The trends are not encouraging. In 1978, 42 per cent of Americans reported that they had read 11 or more books in the past year. In 2014, just 28 per cent can say the same, while 23 per cent proudly admit to not having read even one, up from eight per cent in 1978. Newspaper and magazine circulation continues to decline sharply, as does viewership for cable news. The three big network supper-hour shows drew a combined average audience of 22.6 million in 2013, down from 52 million in 1980. While 82 per cent of Americans now say they seek out news digitally, the quality of the information they’re getting is suspect. Among current affairs websites, Buzzfeed logs almost as many monthly hits as the Washington Post.

The advance of ignorance and irrationalism in the U.S. has hardly gone unnoticed. The late Columbia University historian Richard Hofstadter won the Pulitzer Prize back in 1964 for his book Anti-Intellectualism in American Life, which cast the nation’s tendency to embrace stupidity as a periodic by-product of its founding urge to democratize everything. By 2008, journalist Susan Jacoby was warning that the denseness—“a virulent mixture of anti-rationalism and low expectations”—was more of a permanent state. In her book, The Age of American Unreason, she posited that it trickled down from the top, fuelled by faux-populist politicians striving to make themselves sound approachable rather than smart. Their creeping tendency to refer to everyone—voters, experts, government officials—as “folks” is “symptomatic of a debasement of public speech inseparable from a more general erosion of American cultural standards,” she wrote. “Casual, colloquial language also conveys an implicit denial of the seriousness of whatever issue is being debated: talking about folks going off to war is the equivalent of describing rape victims as girls.”

That inarticulate legacy didn’t end with George W. Bush and Sarah Palin. Barack Obama, the most cerebral and eloquent American leader in a generation, regularly plays the same card, droppin’ his Gs and dialling down his vocabulary to Hee Haw standards. His ability to convincingly play a hayseed was instrumental in his 2012 campaign against the patrician Mitt Romney; in one of their televised debates the President referenced “folks” 17 times.

An aversion to complexity—at least when communicating with the public—can also be seen in the types of answers politicians now provide the media. The average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it’s just a touch over seven seconds and well on its way to being supplanted by 140-character Twitter bursts.

Little wonder then that distrust—of leaders, institutions, experts, and those who report on them—is rampant. A YouGov poll conducted last December found that three-quarters of Americans agreed that science is a force for good in the world. Yet when asked if they truly believe what scientists tell them, only 36 per cent of respondents said yes. Just 12 per cent expressed strong confidence in the press to accurately report scientific findings. (Although according to a 2012 paper by Gordon Gauchat, a University of North Carolina sociologist, the erosion of trust in science over the past 40 years has been almost exclusively confined to two groups: conservatives and regular churchgoers. Counterintuitively, it is the most highly educated among them—with post-secondary education—who harbour the strongest doubts.)

The term “elitist” has become one of the most used, and feared, insults in American life. Even in the country’s halls of higher learning, there is now an ingrained bias that favours the accessible over the exacting.

“There’s a pervasive suspicion of rights, privileges, knowledge and specialization,” says Catherine Liu, the author of American Idyll: Academic Antielitism as Cultural Critique and a film and media studies professor at University of California at Irvine. Both ends of the political spectrum have come to reject the conspicuously clever, she says, if for very different reasons; the left because of worries about inclusiveness, the right because they equate objections with obstruction. As a result, the very mission of universities has changed, argues Liu. “We don’t educate people anymore. We train them to get jobs.” (Boomers, she says, deserve most of the blame. “They were so triumphalist in promoting pop culture and demoting the canon.”)

The digital revolution, which has brought boundless access to information and entertainment choices, has somehow only enhanced the lowest common denominators—LOL cat videos and the Kardashians. Instead of educating themselves via the Internet, most people simply use it to validate what they already suspect, wish or believe to be true. It creates an online environment where Jenny McCarthy, a former Playboy model with a high school education, can become a worldwide leader of the anti-vaccination movement, naysaying the advice of medical professionals.

Most perplexing, however, is where the stupid is flowing from. As conservative pundit David Frum recently noted, where it was once the least informed who were most vulnerable to inaccuracies, it now seems to be the exact opposite. “More sophisticated news consumers turn out to use this sophistication to do a better job of filtering out what they don’t want to hear,” he blogged.

But are things actually getting worse? There’s a long and not-so-proud history of American electors lashing out irrationally, or voting against their own interests. Political scientists have been tracking, since the early 1950s, just how poorly those who cast ballots seem to comprehend the policies of the parties and people they are endorsing. A wealth of research now suggests that at the most optimistic, only 70 per cent actually select the party that accurately represents their views—and there are only two choices.

Larry Bartels, the co-director of the Center for the Study of Democratic Institutions at Vanderbilt University, says he doubts that the spreading ignorance is a uniquely American phenomenon. Facing complex choices, uncertain about the consequences of the alternatives, and tasked with balancing the demands of jobs, family and the things that truly interest them with boring policy debates, people either cast their ballots reflexively, or not at all. The larger question might be whether engagement really matters. “If your vision of democracy is one in which elections provide solemn opportunities for voters to set the course of public policy and hold leaders accountable, yes,” Bartels wrote in an email to Maclean’s. “If you take the less ambitious view that elections provide a convenient, non-violent way for a society to agree on who is in charge at any given time, perhaps not.”

A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released last month, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcome with the expressed preferences of median-income Americans, the affluent, business interests and powerful lobbies. They concluded that average citizens “have little or no independent influence” on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. “The majority does not rule,” they wrote.

Smart money versus dumb voters is hardly a fair fight. But it does offer compelling evidence that the survival of the fittest remains an unshakable truth even in American life. A sad sort of proof of evolution.
