Friday, December 13, 2013

A very informative format, with photographic art suggesting the necessary reflective distance




Arab Spring: 10 unpredicted outcomes


Three years on from the start of the upheaval which became known as the Arab Spring, the Middle East is still in a state of flux. Rebellions have brought down regimes, but other consequences have been far less predictable. The BBC's Middle East correspondent Kevin Connolly sets out 10 unintended outcomes.
 
1. Monarchies weather the storm

The royal families of the Middle East have had a pretty good Arab Spring so far - rather better than some of them might have feared. That's been as true in Jordan and Morocco as it's been in the Gulf. The governments that have collapsed or wobbled were more or less modelled on Soviet-style one-party states propped up by powerful security establishments.

There's no one single reason for this of course. Bahrain has shown itself ready to use heavy-handed security tactics while others have deployed subtler measures - Qatar hiked public sector salaries in the first months of upheaval. And of course the Gulf Kingdoms effectively have exportable discontent - most lower-paid jobs are done by migrant workers and if they start chafing about conditions of work or political rights they can be sent home.

It's also possible that people feel a degree of attachment to royal rulers that unelected autocrats can't match - however grand a style they choose to live in.

2. US no longer calls the shots

The United States has not had a good Arab Spring. At the outset it had a clear view of a rather stagnant Middle East in which it had reliable alliances with countries like Egypt, Israel and Saudi Arabia. It has failed to keep up with events in Egypt, which elected an Islamist, Mohammed Morsi, and then saw him deposed by the army.

No-one can blame the Obama administration for failing to keep up. It likes elections, but didn't like the result - a clear win for the Muslim Brotherhood. And it doesn't like military coups (not in the 21st Century at least) but is probably comfortable enough with a military-backed regime which wants to keep the peace with Israel.

America is still a superpower of course but it doesn't dictate events in the Middle East anymore. It's not alone in that failure - Turkey failed to pick the winning side in Egypt too and is struggling with problematic relationships with rebels in Syria.

3. Sunni versus Shia

The speed with which unarmed protests against a brutal authoritarian government morphed into a vicious civil war with sectarian overtones in Syria has shocked everyone. There are rising tensions between Sunni and Shia Muslims in many parts of the region, and Shia Iran and Sunni Saudi Arabia are now effectively fighting a proxy war in Syria.

The deepening schism between the two branches of Islam has led to startling levels of sectarian violence in Iraq too - it may yet turn out to be one of the most important legacies of these years of change in the Arab world.

4. Iran a winner

No-one would have predicted at the beginning of the Arab Spring that Iran would gain from it. When the process began, it was marginalised and crippled by sanctions imposed because of its nuclear ambitions. Now it's impossible to imagine a solution in Syria without Iranian agreement, and with its presidency under new management it's even talking to the world powers about that nuclear programme.

Saudi Arabia and Israel are both alarmed by America's readiness to talk to Tehran - anything that puts those two countries on the same side of an argument has to be pretty historic.

5. Winners are losers

Picking winners and losers in all this is tricky. Look at the fate of the Muslim Brotherhood in Egypt. When elections were held after the toppling of Hosni Mubarak, it swept into power and, after 80 years in the shadows, it finally appeared poised to remake the most populous country in the Middle East in its own image. Now it's been swept out of power again by the army and forced underground, with its senior leaders facing long prison sentences. A year ago the Brotherhood looked like a winner. Not any more.

That was bad news for the tiny, politically ambitious Gulf state of Qatar, which had backed the Brotherhood in Egypt's power struggle. In the early stages of the Arab Spring, with Qatar backing the Libyan rebels too, it appeared to have hit on a strategy for expanding its regional influence. Not any more.

6. Kurds reap benefits

The people of Iraqi Kurdistan are starting to look like winners though - and may even be on their way to achieving a long-cherished dream of statehood. They live in the northern region of the country which has oil and is developing independent economic links with its powerful neighbour, Turkey. It has a flag, anthem and armed forces too. The Kurds of Iraq may be a beneficiary of the slow disintegration of the country which no longer functions as a unitary state.

The future won't be trouble-free (there are Kurdish populations in neighbouring Iran, Syria and Turkey too) but in Kurdish cities like Irbil, people think the future looks brighter and freer. That process began before the Arab Spring of course but the Kurds are taking advantage of the mood of change sweeping the region to consolidate changes that were already under way.

7. Women fall victim

Some of the outcomes of the Arab Spring (so far at least) have been downright depressing. In the crowds in Tahrir Square at the beginning of Egypt's uprising there were plenty of brave and passionate women demanding personal freedoms alongside the political rights which were the focus of the protests.

They will have been bitterly disappointed. Stories of sexual assaults in public are frighteningly common and a Thomson Reuters Foundation poll said Egypt was the worst place in the Arab world to be a woman - behind even Saudi Arabia. It scored badly for gender violence, reproductive rights, treatment of women in families and inclusion in politics and the economy.

8. Overrated power of social media?

At the beginning of the protest movements, there was a lot of excitement in the Western media about the role of innovations like Twitter and Facebook, partly because Western journalists like Twitter and Facebook themselves. Those new social media have an important role in countries like Saudi Arabia, where they allow people to circumvent the hidebound official media and start some kind of national debate.

They had a role at the beginning of the uprisings too, but their use was confined largely to a well-educated and affluent (and often multilingual) liberal elite and their views may have been over-reported for a time. Those secular liberals after all were trounced at the ballot box in Egypt. Satellite TV remains more important in countries where many people can't read and write and don't have access to the internet.


The story of Bassem Youssef, the Egyptian heart-surgeon turned TV satirist, sums it up. He did start by putting his material out on the internet but became an international phenomenon when he switched to a TV channel. He became known as the "Egyptian Jon Stewart".

An important difference is that Mr Stewart plies his trade in the United States - Mr Youssef is going to have to tread rather carefully under Egypt's new rulers just as he did under their Islamist predecessors. Egyptians like to laugh; their leaders don't like to be laughed at. Mr Youssef is currently off the air again.

9. Dubai property bounces back

The ramifications of events in the Middle East are still felt far beyond the frontiers of the countries where they happen. There is a theory that the property market in Dubai has spiked as wealthy individuals from destabilised countries like Egypt, Libya, Syria and Tunisia seek a safe haven for their cash - and sometimes their families. The effects could be felt further afield too in property markets like Paris and London.

10. Back to the drawing board

A map of the Middle East drawn up by Britain and France in a secret carve-up halfway through World War One looks like it's unravelling. That's when states like Syria and Iraq were created in their current forms, and no-one knows whether they'll still exist as unitary states in, say, five years' time.

No-one can do much about it either - Libya showed the limits of Western intervention where British and French air power could hasten the demise of a hated old regime but couldn't make sure that it was followed by democracy. Or even stability.

One old lesson - which the world is relearning - is that revolutions are unpredictable and it can take years before their consequences become clear.

Monday, December 9, 2013

Laissez-faire freedoms

Noting the Spanish 'commune-kinda-thing' reaction to the Spanish crisis (earlier blog, October, Marinaleda), we see below a stunning historical overview by T.O. McGarity of the economic and political misère of the past 35 years. One might suggest that the somewhat radical political style of Mayor Juan and his fellow citizens of Marinaleda is a direct response to the political failures described below. It is largely an international dynamic with different faces depending upon local conditions: what in Marinaleda is cast in socialist garb becomes, in the U.S., a tea party - so much less constructive. In its communal idealism, the former at least promises a more equal, sustainable future; the latter perversely amplifies that which it protests.


============================



What Obama Left Out of His Inequality Speech: Regulation


 
AUSTIN, Tex. — President Obama’s speech on inequality last Wednesday was important in several respects. He identified the threat to economic stability, social cohesion and democratic legitimacy posed by soaring inequality of income and wealth. He put to rest the myths that inequality is mostly a problem afflicting poor minorities, that expanding the economy and reducing inequality are conflicting goals, and that the government cannot do much about the matter.

Mr. Obama also outlined several principles to expand opportunity: strengthening economic productivity and competitiveness; improving education, from prekindergarten to college access to vocational training; empowering workers through collective bargaining and antidiscrimination laws and a higher minimum wage; targeting aid at the communities hardest hit by economic change and the Great Recession; and repairing the social safety net.

But there’s a crucial dimension the president left out: the revival, since the mid-1970s, of the laissez-faire ideology that prevailed in the Gilded Age, roughly the 1870s through the 1910s. It’s no coincidence that this laissez-faire revival — an all-out assault on government regulation — has unfolded over the very period in which inequality has soared to levels not seen since the Gilded Age.
History tells us that in periods when protective governmental institutions are weak, irresponsible companies tend to abuse their economic freedom in ways that harm ordinary workers and consumers. The victims are often less affluent citizens who lack the power either to protect themselves from harm or to hold companies accountable in the courts. We are in such a period today.

The laissez-faire revival of the past 35 years was no accident. The protective statutes and liberal common-law doctrines of the late 1960s and early 1970s — what can be called the Public Interest Era — had a profound impact in such areas as occupational safety and health, environmental protection, consumer finance and the safety of food, drugs and consumer products. This legislative and judicial activism placed far more constraints on the economic freedom of corporate America than had any legal regime preceding it.

It also galvanized a “divert and delay” strategy of resistance by businesses, which lobbied against the new statutes and resisted the efforts of newly empowered regulators and plaintiffs. The laissez-faire revival, however, required more than resistance to change. It also took the determined efforts of a relatively small number of philanthropists and academics to create what I call an “idea infrastructure” around minimalist regulation, popularizing that ideology and persuading Congress, the executive branch, and the courts to scale back constraints on corporations.

Corporate activists — responding in part to a call to action by William E. Simon, a financier and architect of the modern conservative movement, who served as Treasury secretary under Presidents Richard M. Nixon and Gerald R. Ford — devoted tens of millions of dollars to the creation of right-leaning think tanks, media operations and free-enterprise centers in academia, as well as lobbying and public relations firms and “grass-roots” (but actually business-financed) organizations.
The business community launched three frontal assaults on the regulatory agencies that Congress had created over the years to protect the American public.

The first assault came toward the end of Jimmy Carter’s administration, when several newly created agencies were just beginning to hit their stride, and it reached peak intensity during the first three years of Ronald Reagan’s administration, as the Office of Management and Budget slashed agency appropriations requests, a Presidential Task Force on Regulatory Relief asked the industry to recommend rules for repeal, and business-friendly political appointees wrote lax regulations and substituted voluntary compliance programs for tough enforcement.

The second assault came in 1995, after Republicans took control of Congress promising both regulatory reform and tort reform legislation as part of a “Contract With America.” The House of Representatives passed omnibus regulatory reform legislation and radical amendments to the Clean Water Act and the Occupational Safety and Health Act. None of these radical measures, however, survived in the Senate. A backdoor attempt to attach anti-environmental riders to the Environmental Protection Agency’s appropriations bill helped lead to a veto by President Bill Clinton and two government shutdowns before the Republican leadership finally backed down in the face of strong public disapproval. This second assault did, however, result in deep cuts in agency budgets, reduced enforcement, and a noticeable drop in new regulatory protections.

The third assault came with the inauguration of George W. Bush in 2001. With the assistance of the Heritage Foundation, the president filled the top levels of the regulatory agencies with devoted deregulators. Agency budgets, which had begun to creep upward in Mr. Clinton’s second term, were slashed once again, and voluntary compliance became the preferred enforcement tool, despite its demonstrated ineffectiveness. When several deregulatory bills drafted by the Bush administration failed, it sought to achieve its goals administratively. When an agency did try to promulgate a stringent regulation — often because it was required by statute — the regulatory czars in the Office of Management and Budget rewrote the rules to make them weaker or to create generous exemptions.
(Three similar assaults on state common law courts — corresponding to periods when poor investment portfolios forced liability insurance companies to raise rates — resulted in state legislation designed to make it more difficult for victims of negligent medical providers, poorly designed products, and unsafe workplaces to sue for damages in state court or to claim workers’ compensation benefits in state administrative tribunals. Spurred on by the United States Chamber of Commerce and conservative organizations, the business community poured millions of dollars into state judicial elections in the anticipation that business-friendly judges would change the common law rules to be more favorable to defendants.)

The three assaults did not succeed in repealing the bedrock regulatory statutes and common law innovations of the Progressive, New Deal and Public Interest eras. But they were remarkably successful in disabling the institutions charged with establishing the rules of responsible corporate behavior and with holding irresponsible companies accountable for breaking those rules. By the mid-2000s, those resource-starved federal agencies that had not become thoroughly captured by the industries they regulated were at best reluctant regulators.

Three decades of deregulation and restrictions on legal liability had given companies greater freedom to innovate and expand. But irresponsible companies also had greater freedom to subject their workers to unsafe working conditions, to market predatory loans to desperate borrowers, to sell defective toys and automobiles, to discharge toxic pollutants, to invade the privacy of Internet users, to market unsafe food, drugs and medical devices, and to subject the world economy to systemic risks.

The laissez-faire culture that prevailed in both government and the private sector so deeply discounted risks to workers, consumers, the environment and the financial system that a series of crises was inevitable.

The deadly oil refinery explosion in Texas City, Tex., in 2005, the financial sector meltdown of 2007-8, the Upper Big Branch mine catastrophe in West Virginia and the Deepwater Horizon oil spill, both in 2010, multiple disease outbreaks because of contaminated peanuts, eggs, hamburgers and seafood, and dozens of motor vehicle and toy recalls were just a few of the visible consequences of the laissez-faire mentality that has pervaded the American political economy.

Less visible, but equally devastating, were the heart attacks caused by poorly regulated painkillers, the quiet desperation of millions of “underwater” homeowners who owed more in mortgage debt than their homes were worth, and the subtle but steady and irreversible increase in global temperatures as a result of carbon emissions.

The laissez-faire revival also contributed to the growing disparities in wealth and well-being that became painfully obvious during the last decade. While corporate executives, Wall Street bankers and hedge fund managers greatly benefited from the three waves of assault on regulation, the fortunes of blue-collar workers and the working poor steadily declined. Median incomes have fallen over the last decade.

The disparities brought on by the laissez-faire revival, however, go far beyond the vast disparities in income and wealth. It is of fairly small consequence to the disabled miner whose boss violated federal safety standards that the mining company’s revenues, profits and executive bonuses are on the rise. But the disparity becomes unconscionable when lax pension-protection regulations let the company spin off its “legacy liabilities” (pension and health-insurance guarantees) into an undercapitalized shell for the sole purpose of filing for bankruptcy protection.

Not all of the adverse effects of the laissez-faire revival have fallen disproportionately on the middle class and the poor. Lax regulation of airplanes is as risky for passengers in first and business class as in coach. The rich and poor suffer from the side effects of hastily approved prescription drugs. But the overall burden of deregulation is borne by those least able to carry it.

The chief executive of the giant meat producer does not have to worry about losing a finger or contracting carpal-tunnel syndrome as he attempts to extract more “efficiency” from a poultry processing plant by persuading the Department of Agriculture to allow the company to speed up production lines. The health of few rich people is at risk from the plumes of unregulated toxic emissions that migrate through neighborhoods adjacent to large petrochemical complexes. The affluent tend not to live so close to railroad tracks as to be affected by toxic gases escaping from derailed tank cars.

Wealthy people injured by defective products can afford to hire lawyers to sue the responsible companies. Not so the middle class and the poor, who must rely on attorneys working on a contingency-fee basis. The caps on damages and restrictions on liability that state legislatures have enacted at the behest of big business make it very difficult for potential attorneys to justify taking on many entirely valid claims. Unless they are severely injured and incur enormous medical expenses, ordinary people are effectively deprived of their right to recover damages in court.

The recent confluence of crises undermined the bedrock assumptions of laissez-faire minimalism. The stage was set after the 2008 elections to recapture the spirit of reform that permeated the Progressive, New Deal, and Public Interest eras and to enact fundamental changes to reduce short-term profit incentives and enhance the public good. After all, the economy had been nearly destroyed as a direct result of reckless risk-taking by financial institutions, enabled by decades of deregulation.

Unfortunately, far-reaching reforms have not been forthcoming. The business community’s idea infrastructure shifted to defensive mode and — with help from lavish corporate spending to influence elections — beat back the most significant reform proposals of the Obama administration and congressional Democrats, like the suggestion that giant banks be broken up because they are not only too big to fail, but also too big to manage or regulate.

Instead of comprehensive change, Congress settled for patch-and-repair reforms. The Dodd-Frank financial reform act and the Food Safety Modernization Act, both enacted in 2010 while Democrats still controlled both houses of Congress, were, to be sure, important attempts to fix badly broken regulatory programs. But neither statute will bring about fundamental changes in the underlying incentive structures that ultimately determine the behavior of regulated companies and industries. And the agencies charged with implementing those reforms have made only modest progress.
We are now in the midst of a fourth assault on regulation, following the 2010 midterm elections. Having failed to seize the initiative in 2009, the Obama administration has done very little to deflect that assault. Rather than rising to the defense of beleaguered regulatory agencies, the president hosted a closed-door “summit meeting” with 20 chief executives of major corporations, where he promised to work more closely with the business community. He then ordered all regulatory agencies to review all previously issued rules with an eye toward “streamlining” or eliminating as many as possible.

The business community has emerged virtually untouched from a confluence of crises that in previous eras would have resulted in profound redistributional changes. For this surprising development, the idea and influence infrastructures that conservative foundations and corporate America carefully created over a 35-year period can claim much of the credit.
That gets us back to Mr. Obama’s speech last week. While he touched on many of the macroeconomic forces driving the surge in inequality since the 1970s — skill-biased technological change, the dismantling of American manufacturing, the globalization of commerce and finance, and a “trickle-down” ideology of tax cuts for the rich — he barely mentioned regulation.
In fact, in a speech of some 6,500 words, he mentioned regulation exactly twice: once to note the contribution of “lax regulation” to financial turmoil, and the second time to argue that expanding the economic pie calls for “streamlining regulations that are outdated or unnecessary or too costly.”
We cannot know for certain whether Mr. Obama’s hesitancy to embrace robust regulation results from his own blind spots — which resemble those of his Democratic predecessors Mr. Carter and Mr. Clinton — or from a calculation that a more progressive tax system, investments in education and defending the social safety net (and his embattled health care reform) are so politically daunting that anything that might further antagonize corporate executives should be put off for another day. Most likely, it’s a combination of both.

But Mr. Obama’s failure to examine (or even mention) the laissez-faire revival was a missed opportunity. Deregulation may not be the central cause of the soaring inequality of recent decades, but it has certainly magnified its consequences, making it ever more difficult for workers and consumers to resist the rapacious predations of abusive employers and companies. The weakening of what used to be the great American middle class cannot be understood without also considering the embrace of free-market theology. By omitting this critical factor in the rise of inequality, Mr. Obama left unchallenged the argument, recited by business like a mantra, that regulation and economic expansion are inherently in tension.

Sadly, the crises resulting from deregulation will almost certainly continue until political forces realign themselves and a new social bargain is struck under which the business community’s economic freedoms are once again constrained by a government that is more willing to impose greater responsibilities on powerful economic actors and a legal system that is capable of holding them accountable for the harm that they cause. Until then, a crucial check on the seemingly inexorable advance of economic inequality will be missing.


Thomas O. McGarity, a professor of administrative law at the University of Texas, Austin, is the author of “Freedom to Harm: The Lasting Legacy of the Laissez Faire Revival.”

Friday, December 6, 2013

Stating the obvious

God Created Gravity: Why the U.S. Can't Keep Pace With Slovenia

 
Two recent headlines appearing within a few days of each other should have warranted greater attention: "School Science Lesson Claims Gravity Was Created by God" and "Best Education in the World: Finland, South Korea Top Country Rankings, U.S. Rated Average." The latter is fully explained by the former, yet not enough of us seem to make the obvious connection.
The far right can stick their collective heads in the sand and talk about American exceptionalism, but the rest of the world is getting educated in the meantime. America is indeed number one - in self-delusion. While flag wavers congratulate themselves on how awesome we are, the world looks on bemused: only six percent of American students achieved advanced levels on an international standard, behind 30 other countries. We rank 25th in math, 17th in science, and 14th in reading. We are behind Lithuania and Slovenia - two countries a majority of American students could not identify on a map.

Many factors have brought us to this sad state of affairs, but we can no longer ignore the 600-pound gorilla and trumpeting elephants in the room: religion is killing us. While our kids are being taught that god created gravity, children in Zaire are learning about Newton and Einstein. As children in Liechtenstein are being taught about the warping of space-time, American kids are learning that "people who do not believe in god" are incapable of understanding gravity.

American religiosity has become an existential threat, undermining the foundation of our future prosperity by contaminating our educational system with superstition, fable and myth. We see this with evolution, vaccines, climate change, energy policy and a host of critical issues that should be based in science but instead are hijacked by ignorance. We are 17th in the world in science, but instead of improving our education, we continue to fight battles more appropriate to the 16th century. Let's look at a few specific and tragic examples in which religion has triumphed at the expense of our educational system and with great harm to society.

Evolution

Religion is the only explanation for why evolution creates such a fuss in our society. We do not see people getting exercised about Quantum Mechanics, String Theory or the Theory of Relativity. But mention evolution and you invoke an immediate and visceral reaction. Local school boards are elected, rejected and then re-elected solely on this issue. No other scientific discovery is so deeply embedded into the fabric of American politics.

Evolution is one of the most successful, thoroughly documented scientific discoveries in human history. We can see evolution in a Petri dish. Evolution has been validated across multiple fields: anthropology, geology, genetics, embryology, bacteriology, virology and biogeography. Evolution is a fact, an undeniable, proven fact, as certain as the existence of atoms. Only some of the details of the mechanisms of evolution remain to be elucidated. Cancer is a fact, though not all the mechanisms leading to malignancy are understood. Theory does not imply uncertainty; instead, a theory is a grand idea, such as General Relativity or Evolution: a well-established set of principles that encompasses and explains a broad range of phenomena.

However, more than 75 years after the trial of State of Tennessee v John Scopes and despite incredible advances in biology, many public school boards strive to eliminate the teaching of evolution from the curriculum.

The debate about intelligent design in public schools is a uniquely American phenomenon, a quirk of our history and culture. Beyond the theocracies of the Middle East, religion permeates American politics in a way not found anywhere else in the world. No other developed country, east or west, is host to a serious political movement dedicated to the destruction of secularism, with evolution as exhibit number one.

We have to go all the way back to Italy in 1614 to find another example of a powerful political machine dedicated to the suppression of a broad scientific truth with deep implications for human understanding. That is the year in which Galileo's observations of the earth orbiting the sun were first denounced as a threat to the established authority of the Catholic Church, which claimed Galileo's doctrine to be false and contrary to the divine and Holy Scripture. We have regressed four centuries. Intelligent design is nothing but a transparent fig leaf for creationism, a child of that dark era in the 1600s. Comparing creationism or intelligent design to evolution is no different than insisting that we teach today that the sun actually orbits the earth as an alternative theory to modern astronomy. Only in the United States are such discredited views taken seriously by a large portion of the citizenry. We can and should do better. Intelligent design has no place in a science classroom - and it appears in none in any western country outside these United States.

Vaccines

Perhaps you believe that teaching that god created gravity is harmless, no big deal, nothing to be exercised about. But disdain for objective truth has real and tragic consequences... which brings us to measles and the issue of childhood immunization. Vaccines are one of the greatest achievements of modern medicine, saving hundreds of millions of lives and improving the quality of life for countless others. But because of medical illiteracy and misplaced religious zeal, some parents are, in a display of dangerous ignorance, forcing school boards across the country to accept students with no vaccination history. Consequently we recently witnessed the biggest outbreak of measles in 15 years, double the number of cases seen typically. With the success of vaccines we forget, ironically, that measles is deadly; prior to vaccinations about 5000 people died annually in the United States from the disease. In 2008 measles killed about 170,000 worldwide. With the best intentions to protect their children, parents are in fact playing a deadly game of chicken based purely on ignorance - lack of knowledge of the benefits of vaccination compared to the inaccurate, overstated and simply wrong conclusions about the dangers.

The problem is not theoretical but real and deadly. Because of one paper published in 1998 in the medical journal Lancet, subsequently withdrawn amid suspicions of scientific fraud and fully discredited by later studies, tens of thousands of parents risk their children's health by withholding critical vaccinations against terrible diseases. Rates of childhood immunization for measles (rubeola), mumps, and rubella (German measles) have yet to fully recover from the impact of this one discredited paper. And many parents still insist that vaccines cause autism, even though, with the withdrawal of the original paper, there is no evidence to support the claim. Myth has usurped fact. In many school districts, including wealthy ones in San Diego County, the number of unvaccinated children has nearly tripled since 1990. This affects everybody, not just those who choose to avoid vaccinations. Case in point: a few years ago San Diego County experienced the worst outbreak of deadly whooping cough in local history as more parents eschewed vaccination against that disease. And let's be brutally honest; we can lay the death of every child who dies of this preventable disease directly at the feet of all the parents who chose not to vaccinate their children. Unlike most diseases, which require only about 85% vaccination coverage to create herd immunity, whooping cough and measles require about 94% immunization to protect the public. Ignorance, the willingness to dismiss hard evidence when inconvenient, or inversely the readiness to reach a conclusion in the complete absence of evidence are all symptoms of scientific illiteracy growing in the nutritive soup of religiosity.
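
Those coverage figures track the standard epidemiological rule of thumb that the herd-immunity threshold is roughly 1 - 1/R0, where R0 is a disease's basic reproduction number. Here is a minimal sketch of that arithmetic; the R0 values are assumptions drawn from commonly cited ranges, not figures taken from the article above.

```python
# Sketch of the herd-immunity threshold 1 - 1/R0, where R0 is the basic
# reproduction number: the average number of people one infected person
# infects in a fully susceptible population. The R0 values below are
# assumed, commonly cited ballpark figures used only for illustration.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to stop sustained spread."""
    return 1.0 - 1.0 / r0

assumed_r0 = {
    "seasonal influenza": 1.5,           # assumed; often quoted around 1-2
    "polio": 6.0,                        # assumed; often quoted around 5-7
    "pertussis (whooping cough)": 15.0,  # assumed; often quoted around 12-17
    "measles": 15.0,                     # assumed; often quoted around 12-18
}

for disease, r0 in assumed_r0.items():
    threshold = herd_immunity_threshold(r0)
    print(f"{disease:28s} R0 ~ {r0:4.1f} -> threshold ~ {threshold:.0%}")
```

With those assumed values the highly contagious diseases come out around 93%, which is why even a small pocket of unvaccinated children is enough to break herd protection for measles and whooping cough while leaving it intact for less contagious infections.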

Climate Change

Oddly, many accept the link between autism and vaccinations with no proof, but when it comes to climate change, the demand for proof is never satisfied no matter how convincing such proof may be. Many accept the existence of ghosts with no evidence, but deny the reality of a changing climate with proof before their eyes. This differential deference to evidence is a clear indicator that much of the American public lacks the tools to evaluate issues rationally. Without science, reality becomes just an option to be rejected whenever the real world gives us inconvenient truths. In this frightening environment in which fiction becomes fact, the conclusions from years of careful research, scrutinized by competing scientists and published in peer-reviewed journals, now carry no more weight with the public than the random thoughts of a bloated pundit. Talking heads with no training now have the same authority as highly qualified experts. So global warming is dismissed as a liberal hoax in spite of overwhelming scientific evidence to the contrary. Climate and weather are mistakenly thought to be the same. So with every cold snap in winter we hear, "See, it snowed - I told you climate change was a joke." Articles noting the acceleration of climate change are ignored by the press, focused on an audience obsessed with the Kardashians; melting ice caps just can't compete. When presented with solid evidence, skeptics selectively demand more "proof" without any sense of irony that they demand no proof for virgin birth, talking snakes, 900-year-old men, Immaculate Conception and resurrection.

Wasteland

So let us come back to our low international rating in education. We debate climate change and evolution because society is still largely unable to embrace the scientific method, which is neglected in our classrooms - a neglect that perpetuates our downward spiral. Although understanding the basics of science is critical to everyday life in a technology-driven society, the subject is given only cursory treatment in most public schools. As a result, people are often poorly equipped to understand the complexities of an issue before forming an opinion about the costs and benefits of adopting or restricting a particular technology. And so we lag behind Liechtenstein.

Steeped in this wasteland of scientific illiteracy we march ever further toward a theocracy; a secular society cannot stand without deference to fact. We are in danger of becoming the Iran of the West, or a bad copy of the former Soviet Union. Under the communist dictatorship children were taught that Stalin was a hero and that capitalism was a great evil, or that Russia invented the telephone and airplane, with no regard to the truth. We are about to make the same mistake in twisting history to indoctrinate our children with stories about god and gravity.

As religiosity has ascended in American life, policy debates have become faith-based rather than being anchored in logic. Support for a policy position becomes unmoved by contradictory facts because proponents simply "believe" the position to be correct even in the face of incontrovertible evidence to the contrary. Just as there is no way to determine relative validity between religions, or to diminish faith with facts, as soon as logic is removed from policy debates, competing positions are no longer evaluated based on relative merit, but are supported as inherently right, immune to any reasonable counter arguments. This slide away from secular debate leads increasingly to polarization, greater animosity and a loss of civility because the only way to support a position is simply to assert supremacy as loudly as possible. We are reduced to childlike tantrums of "I'm right, you're wrong, I win." Without logic, there is no common basis for discussion, and no way to mediate disputes. The death of secularism is the death of civility, and nothing demonstrates this more clearly than the debate about teaching science in schools free from religion. Our international ranking suffers because we have not yet learned this lesson. Slovenia has.
 

Thursday, December 5, 2013

Old bones, new knowledge


Leg bone gives up oldest human DNA




The discovery of DNA in a 400,000-year-old human thigh bone will open up a new frontier in the study of our ancestors.

That's the verdict cast by human evolution experts on an analysis in Nature journal of the oldest human genetic material ever sequenced.

The femur comes from the famed "Pit of Bones" site in Spain, which gave up the remains of at least 28 ancient people.

But the results are perplexing, raising more questions than answers about our increasingly complex family tree.

The early human remains from the cave site near the northern Spanish city of Burgos have been painstakingly excavated and pieced together over the course of more than two decades. The site has yielded one of the richest assemblages of human bones from this stage of human evolution, in a time called the Middle Pleistocene.

To access the pit (called Sima de los Huesos in Spanish) scientists must crawl for hundreds of metres through narrow cave tunnels and rope down through the dark. The bodies were probably deposited there deliberately - their causes of death unknown.

The fossils carry many traits typical of Neanderthals, and either belong to an ancestral species known as Homo heidelbergensis - or, as the British palaeoanthropologist Chris Stringer suggests - are early representatives of the Neanderthal lineage.

DNA's tendency to break down over time means it has not previously been possible to study the genetics of such ancient members of the human family.

But the recent pace of progress in sequencing technology has astonished many scientists: "Years ago, geneticists said they wouldn't be able to find DNA that was older than 60,000 years old," said co-author Jose Bermudez de Castro, from the National Research Centre for Human Evolution (CENIEH), a member of the team that excavated the fossils.

"Of course, that wasn't true. The techniques have advanced hugely."

Siberia to Iberia




 



Researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, under the supervision of Prof Svante Paabo, have been helping drive those advances. The success reported in Nature was the result of applying techniques developed for sequencing the degraded DNA found in Neanderthal fossils to even older specimens.

Prof Paabo, the institute's director, said: "Our results show that we can now study DNA from human ancestors that are hundreds of thousands of years old," adding: "It is tremendously exciting."

Smart spiral



The scientists were able to stitch together a near-complete sequence of mitochondrial DNA, or mtDNA (the genetic material contained in the tiny "batteries" that power our cells) from the ancient femur. But comparisons of the genetic code with that from other humans, ancient and modern, yielded a surprise.

Rather than showing a relationship between the Spanish specimens and Neanderthals, which might be expected based on their physical features, the mitochondrial DNA was most similar to that found in 40,000-year-old material unearthed thousands of kilometres away at Denisova Cave in Siberia.

The Denisovans were a sister group to the Neanderthals, with distinct genetic characteristics. Identified only by DNA extracted from a tiny finger bone and tooth, they are, as some researchers have remarked, "a genome in search of a fossil" because there are no substantial remains representative of this group.

By using missing mutations in the old DNA sequences, the researchers calculated that the Pit of Bones individual shared a common ancestor with the Denisovans about 700,000 years ago.
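
The roughly 700,000-year figure is a molecular-clock style estimate. As a rough, hypothetical sketch of how such a date can be approximated from sequence differences and an assumed mutation rate (the study itself used more sophisticated phylogenetic dating, and the numbers below are chosen purely for illustration):

```python
# Back-of-the-envelope molecular-clock sketch, NOT the study's actual method.
# Idea: if two mitochondrial sequences differ at d of L compared sites, and
# mutations accumulate at rate mu per site per year along each lineage, then
# the time back to their common ancestor is roughly (d / L) / (2 * mu).

def divergence_time_years(diff_sites: int, total_sites: int,
                          mu_per_site_per_year: float) -> float:
    """Estimate time to the common ancestor: pairwise divergence / (2 * mutation rate)."""
    pairwise_divergence = diff_sites / total_sites
    return pairwise_divergence / (2.0 * mu_per_site_per_year)

# Hypothetical inputs chosen only to land near the reported ~700,000 years:
# a ~16,500 bp mitochondrial genome, ~580 differing sites, and an assumed
# mutation rate of 2.5e-8 per site per year.
t = divergence_time_years(diff_sites=580, total_sites=16500,
                          mu_per_site_per_year=2.5e-8)
print(f"Estimated time to common ancestor: ~{t:,.0f} years")
```

The estimate scales directly with the assumed mutation rate, which is one reason such dates carry wide uncertainty ranges.
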
Muddle in the middle


So there are several possibilities as to how Denisovan-like DNA could turn up in Middle Pleistocene Spain. Firstly, the mitochondrial DNA type from the pit came from a population ancestral to both the Spanish hominids and to Denisovans.

Secondly, interbreeding between the Pit of Bones people (or their ancestors) and yet another early human species brought the Denisovan-like DNA into this western population. Prof Bermudez de Castro thinks there may be a candidate for this cryptic ancestor: an earlier human species known as Homo antecessor. One million years ago, antecessor inhabited the site of Gran Dolina, just a few hundred metres away from the Pit of Bones.

Prof Chris Stringer, from London's Natural History Museum, told BBC News: "We need all the data we can get to build the whole story of human evolution. We can't just build it from stone tools, we can't just build it from the fossils. Having the DNA gives us a whole new way of looking at it."


However, he points out, mtDNA is a small and unusual component of our genetic blueprint, from which only limited conclusions can be drawn. For example, no sign of the interbreeding we now know took place between Neanderthals and modern humans remains in the mtDNA of modern people.

To get the full picture, scientists had to sequence nuclear DNA (that kept in the nuclei of cells) from Neanderthals and compare it with that in present-day populations. Likewise, the true relationships between the Pit people and other ancient populations may only be known if and when nuclear DNA is available.

This will be a challenge given the age of the Spanish fossils, but their good state of preservation - largely a product of the fairly constant temperature inside the cave - gives hope.

"That is our next big thing here, to sequence at least part of the nuclear genome from the individual in the Sima de los Huesos," Svante Paabo told BBC News.

"This will answer definitively the question of how they are related to Neanderthals, modern humans and Denisovans."

http://www.bbc.co.uk/news/science-environment-25193442

Tuesday, December 3, 2013

Nature & Culture

Something simple. With and without color.


 
 

Nature… is that which still manages to exist despite human interventions in planet earth’s biosphere. A fairly radical point of view, of course, would be to consider humans part of nature.

Culture… is mankind's deliberate and accidental rearrangements of biosphere #1 in pursuit of something believed to last for a thousand years.
____________________
 
"Simple'...? As a natural phaenomenon that still manages to survive despite human interventions, the tit is simple. It lives its own life parallel to humans and we catch it rest for a moment on the stump but we also feel in that image the next moment with the stump alone and the tit gone. Perhaps, the photograph is simple in the sense that the photographer just got lucky with a snapshot capturing the structured beauty of tit and stump which we recognize because we have a reasonably good chance ourselves of experiencing such in nature if we seek it. But, the fixture of that gorgeous tit-moment may otherwise be the result of excruciating preparations, and thus not a simple thing to achieve, and hence the image takes on the double-value of being both a representation of real nature and unnatural artifice as the moment has been fixed forever. (Or, most likely, a larger image cropped.)
 
The Scottish highland sheepherder photo has culture in the foreground at the outskirts of a foreboding natural environment dominating the frame, black and white like something half-forgotten.
 


"Rights and privacy of law-abiding citizens"

They go together:

http://www.theguardian.com/world/2013/dec/03/american-public-mind-its-own-business-survey 

The complex issue of international engagement by the U.S. On which grounds? Humanitarian? Power-politics? Imperialism? Economic expansion/security serving domestic needs? Socio-philosophical hubris?
 
From the above article:
 
- "One of the starkest findings in the survey was in response to a question about whether the US should “mind its own business internationally and let other countries get along the best they can on their own”. "
 
- "A majority of respondents – 52% – said they agreed with the statement, while just 38% disagreed. The authors of a report accompanying the survey described it as “the most lopsided balance in favor of the US ‘minding its own business’ in the nearly 50-year history of the measure”."
 

=====================================

An open letter from Carl Bernstein to Guardian editor Alan Rusbridger

Watergate scandal journalist's letter comes as Guardian editor prepares to appear before MPs over Edward Snowden leaks


Dear Alan,

There is plenty of time – and there are abundant venues – to debate relevant questions about Mr Snowden's historical role, his legal fate, the morality of his actions, and the meaning of the information he has chosen to disclose.

But your appearance before the Commons today strikes me as something quite different in purpose and dangerously pernicious: an attempt by the highest UK authorities to shift the issue from government policies and excessive government secrecy in the United States and Great Britain to the conduct of the press – which has been quite admirable and responsible in the case of the Guardian, particularly, and the way it has handled information initially provided by Mr Snowden.

Indeed, generally speaking, the record of journalists, in Britain and the United States in handling genuine national security information since World War II, without causing harm to our democracies or giving up genuine secrets to real enemies, is far more responsible than the over-classification, disingenuousness, and (sometimes) outright lying by a series of governments, prime ministers and presidents when it comes to information that rightly ought to be known and debated in a free society. Especially in recent years.

You are being called to testify at a moment when governments in Washington and London seem intent on erecting the most serious (and self-serving) barriers against legitimate news reporting – especially of excessive government secrecy – we have seen in decades.

The stories published by The Guardian, the Washington Post and the New York Times based on Mr Snowden's information to date hardly seem to represent reckless disclosure of specific national security secrets of value to terrorists or enemy governments or in such a manner as to make possible the identification of undercover agents or operatives whose lives or livelihoods would be endangered by such disclosure. Such information has been carefully redacted by the Guardian and other publications and withheld from stories based on information from Mr Snowden. Certainly terrorists are already aware that they are under extensive surveillance, and did not need Mr Snowden or the Guardian to tell them that.

Rather, the stories published by the Guardian – like those in the Washington Post and the New York Times – describe the scale and scope of electronic information-gathering our governments have been engaged in – most of it hardly surprising in the aggregate, given the state of today's technology, and a good deal of it previously known and reported and indeed often discussed "on background" with reporters by high government officials from the White House to Downing Street confident that their identities will not be disclosed.

Moreover, the Guardian – like the Times and the Post in the US – has gone to great lengths to consult with Downing Street, the White House and intelligence agencies before publishing certain information, giving time for concerns to be raised, discussed sensibly, and considered.

What is new and most significant about the information originating with Mr Snowden and some of its specificity is how government surveillance has been conducted by intelligence agencies without the proper oversight – especially in the United States – by the legislative and judicial branches of government charged with such oversight, especially as the capabilities of information-gathering have become so pervasive and enveloping and with the potential to undermine the rights of all citizens if not carefully supervised. The "co-operation" of internet and telecommunications companies in some of these activities ought to be of particular concern to legislative bodies like the Commons and the US Congress.

As we have learned following the recent disclosures initiated by Mr Snowden, intelligence agencies – especially the NSA in the United States – have assiduously tried to avoid and get around such oversight, been deliberately unforthcoming and oftentimes disingenuous with even the highest government authorities that are supposed to supervise their activities and prevent abuse.

That is the subject of the rightful and necessary public debate that is now taking place in the US, the UK and elsewhere.

Rather than hauling in journalists for questioning and trying to intimidate them, the Commons would do well to encourage and join that debate over how the vast electronic intelligence-gathering capabilities of the modern security-state can be employed in a manner that gives up little or nothing to real terrorists and real enemies and skilfully uses all our technological capabilities to protect us, while at the same time taking every possible measure to insure that these capabilities are not abused in a way that would abrogate the rights and privacy of law-abiding citizens.

There have always been tensions between such objectives in our democracies, especially in regard to the role of the press. But as we learned in the United States during our experience with the Pentagon Papers and Watergate, it is essential that no prior governmental restraints or intimidation be imposed on a truly free press; otherwise, in such darkness, we encourage the risk of our democracies falling prey to despotism and demagoguery and even criminality by our elected leaders and government officials.

With warmest regards and admiration,

Carl Bernstein