Another scandal crashes and burns…

Source: Politics USA

Author: Jason Easley

“The latest Republican Obama scandal is starting to fall apart too. The IRS didn’t just target conservative groups; it also questioned the tax-exempt status of liberal groups.

In 2012, The Chicago Tribune reported on the IRS denying tax exempt status to a liberal political group,

The IRS announced in May and June that it took the actions against two groups defined as tax-exempt under the 501(c)(4) section of the tax code. The IRS on Thursday declined comment on its tax-exempt final rulings. Tax-exempt groups raising money for both major political parties ahead of the Nov. 6 election walk a fine line between promoting “social welfare” for tax-exempt purposes and purely political interests.

A 501(c)(4) group denied tax-exempt status by the IRS would run afoul of Federal Election Commission rules and could be required to disclose its donors. Emerge America, a group which helps Democratic women seeking elected office, said it lost its tax-exempt status last October. The IRS invoked the “private benefit doctrine” barring 501(c)(4) status for any group promoting a candidate or political party. The IRS announced its final decision in May.

In June the IRS said it denied 501(c)(4) tax-exemption for an unnamed political group also under the private benefit doctrine. The IRS is barred by law from disclosing the group’s name and the group has not publicly identified itself. The group had one objective: to serve the political goals of its founder, the IRS said. A 501(c)(4) group can spend some funds on political advocacy, but electioneering cannot be its sole reason for existence or comprise a majority of its spending.

It looks like the IRS was not just targeting conservative groups, but was targeting political action groups who may have been violating the tax exemption guidelines. If the IRS wasn’t targeting conservatives, but trying to deal with the surge of dark money groups applying for tax exempt status, this story takes on an entirely different context.

Reuters has obtained part of a yet-to-be-released report from the Treasury Inspector General for Tax Administration (TIGTA) that confirms that the IRS was targeting groups on the left and right that focused their activities on advocating for expanding or limiting the size of government. The report also states that the screening process was not influenced by the Obama administration, and that none of the groups screened were denied tax exempt status.

Without the claims of a partisan witch hunt against conservative groups, this latest Republican fueled Obama scandal is set to lose all of its sizzle.

The real reason why Republicans are desperately trying to drum up a scandal here is because they don’t want the IRS forcing their dark money groups to pay taxes. The IRS is threatening their Citizens United fueled political slush fund, and Republicans want it to stop. Republicans are trying to bully the IRS into backing off.

It turns out that Obama isn’t Richard Nixon after all. He wasn’t using the IRS to attack his enemies. In their own bungling way, the IRS was trying to deal with the problems caused by the Supreme Court’s Citizens United decision. First, Benghazi crashes and burns, and now the IRS scandal could be fading fast.

Republicans will pull from their usual “Obama scandal” playbook and hold lots of hearings, but this one looks to be on the fast track to nowhere. Congressional Republicans will try their best, but the IRS “scandal” could backfire and end up making the case for why we need to get rid of Citizens United ASAP.

Emphasis Mine

see: http://www.politicususa.com/gop-scandal-falls-irs-targeted-liberals-2012.html

 

Ayn Rand USA: In 20 Years Corporate Profits Are Up 4X and Their Taxes Have Fallen by 50% — Meanwhile the Workers’ Payroll Tax Has Doubled

Corporations have decided to let middle-class workers pay for national investments that have largely benefited businesses over the years.

Source: AlterNet

Author: Paul Buchheit

Ayn Rand’s novel “Atlas Shrugged” fantasizes a world in which anti-government citizens reject taxes and regulations, and “stop the motor” by withdrawing themselves from the system of production. In a perverse twist on the writer’s theme, the prediction is coming true. But instead of productive people rejecting taxes, rejected taxes are shutting down productive people.

Perhaps Ayn Rand never anticipated the impact of unregulated greed on a productive middle class. Perhaps she never understood the fairness of tax money for public research and infrastructure and security, all of which have contributed to the success of big business. She must have known about the inequality of the pre-Depression years. But she couldn’t have foreseen the concurrent rise in technology and globalization that allowed inequality to surge again, more quickly, in a manner that threatens to put the greediest offenders out of our reach.

Ayn Rand’s philosophy suggests that average working people are ‘takers.’ In reality, those in the best position to make money take all they can get, with no scruples about their working class victims, because taking, in the minds of the rich, serves as a model for success. The strategy involves tax avoidance, in numerous forms.

Corporations Stopped Paying

In the past twenty years, corporate profits have quadrupled while the corporate tax percent has dropped by half. The payroll tax, paid by workers, has doubled.

In effect, corporations have decided to let middle-class workers pay for national investments that have largely benefited businesses over the years. The greater part of basic research, especially for technology and health care, has been conducted with government money. Even today 60% of university research is government-supported. Corporations use highways and shipping lanes and airports to ship their products, the FAA and TSA and Coast Guard and Department of Transportation to safeguard them, a nationwide energy grid to power their factories, and communications towers and satellites to conduct online business.

Yet as corporate profits surge and taxes plummet, our infrastructure is deteriorating. The American Society of Civil Engineers estimates that $3.63 trillion is needed over the next seven years to make the necessary repairs.

Turning Taxes Into Thin Air

Corporations have used numerous and creative means to avoid their tax responsibilities. They have about a year’s worth of profits stashed untaxed overseas. According to the Wall Street Journal, about 60% of their cash is offshore. Yet these corporate ‘persons’ enjoy a foreign earned income exclusion that real U.S. persons don’t get.

Corporate tax haven ploys are legendary, with almost 19,000 companies claiming home office space in one building in the low-tax Cayman Islands. But they don’t want to give up their U.S. benefits. Tech companies in 19 tax haven jurisdictions received $18.7 billion in 2011 federal contracts. A lot of smaller companies are legally exempt from taxes. As of 2008, according to IRS data, fully 69% of U.S. corporations were organized as nontaxable businesses.

There’s much more. Companies call their CEO bonuses “performance pay” to get a lower rate. Private equity firms call fees “capital gains” to get a lower rate. Fast food companies call their lunch menus “intellectual property” to get a lower rate.

Prisons and casinos have stooped to the level of calling themselves “real estate investment trusts” (REITs) to gain tax exemptions. Stooping lower yet, Disney and others have added cows and sheep to their greenspace to get a farmland exemption.

The Richest Individuals Stopped Paying

The IRS estimated that 17 percent of taxes owed were not paid in 2006, leaving an underpayment of $450 billion. The revenue loss from tax havens approaches $450 billion. Subsidies from special deductions, exemptions, exclusions, credits, capital gains, and loopholes are estimated at over $1 trillion. These tax expenditures overwhelmingly benefit the richest taxpayers.

In keeping with Ayn Rand’s assurance that “Money is the barometer of a society’s virtue,” the super-rich are relentless in their quest to make more money by eliminating taxes. Instead of calling their income ‘income,’ they call it “carried interest” or “performance-based earnings” or “deferred pay.” And when they cash in their stock options, they might look up last year’s lowest price, write that in as a purchase date, cash in the concocted profits, and take advantage of the lower capital gains tax rate.

So Who Has To Pay?

Middle-class families. The $2 trillion in tax losses from underpayments, expenditures, and tax havens costs every middle-class family about $20,000 in community benefits, including health care and education and food and housing.

Schoolkids, too. A study of 265 large companies by Citizens for Tax Justice (CTJ) determined that about $14 billion per year in state income taxes was unpaid over three years. That’s approximately equal to the loss of 2012-13 education funding due to budget cuts.

And the lowest-income taxpayers make up the difference, based on new data that shows that the Earned Income Tax Credit is the single biggest compliance problem cited by the IRS. The average sentence for cheating with secret offshore financial accounts, according to the Wall Street Journal, is about half as long as in some other types of tax cases.

Atlas Can’t Be Found Among the Rich

Only 3 percent of the CEOs, upper management, and financial professionals were entrepreneurs in 2005, even though they made up about 60 percent of the richest .1% of Americans. A recent study found that less than 1 percent of all entrepreneurs came from very rich or very poor backgrounds. Job creators come from the middle class.

So if the super-rich are not holding the world on their shoulders, what do they do with their money? According to both Marketwatch and economist Edward Wolff, over 90 percent of the assets owned by millionaires are held in a combination of low-risk investments (bonds and cash), personal business accounts, the stock market, and real estate.

Ayn Rand’s hero John Galt said, “We are on strike against those who believe that one man must exist for the sake of another.” In his world, Atlas has it easy, with only himself to think about.

Paul Buchheit teaches economic inequality at DePaul University. He is the founder and developer of the Web sites UsAgainstGreed.org, PayUpNow.org and RappingHistory.org, and the editor and main author of “American Wars: Illusions and Realities” (Clarity Press). He can be reached at paul@UsAgainstGreed.org.

Emphasis Mine

see: http://www.alternet.org/economy/ayn-rand-usa-20-years-corporate-profits-are-4x-and-their-taxes-have-fallen-50-meanwhile?akid=10427.123424.9d7q5C&rd=1&src=newsletter839254&t=5

The Science of Guns Proves Arming Untrained Citizens Is a Bad Idea

Source: Scientific American

Author: Michael Shermer

(N.B.: Consider the source)

“According to the Centers for Disease Control and Prevention, 31,672 people died by guns in 2010 (the most recent year for which U.S. figures are available), a staggering number that is orders of magnitude higher than that of comparable Western democracies. What can we do about it? National Rifle Association executive vice president Wayne LaPierre believes he knows: “The only thing that stops a bad guy with a gun is a good guy with a gun.” If LaPierre means professionally trained police and military who routinely practice shooting at ranges, this observation would at least be partially true. If he means armed private citizens with little to no training, he could not be more wrong.

Consider a 1998 study in the Journal of Trauma and Acute Care Surgery that found that “every time a gun in the home was used in a self-defense or legally justifiable shooting, there were four unintentional shootings, seven criminal assaults or homicides, and 11 attempted or completed suicides.” Pistol owners’ fantasy of blowing away home-invading bad guys or street toughs holding up liquor stores is a myth debunked by the data showing that a gun is 22 times more likely to be used in a criminal assault, an accidental death or injury, a suicide attempt or a homicide than it is for self-defense. I harbored this belief for the 20 years I owned a Ruger .357 Magnum with hollow-point bullets designed to shred the body of anyone who dared to break into my home, but when I learned about these statistics, I got rid of the gun.

More insights can be found in a 2013 book from Johns Hopkins University Press entitled Reducing Gun Violence in America: Informing Policy with Evidence and Analysis, edited by Daniel W. Webster and Jon S. Vernick, both professors in health policy and management at the Johns Hopkins Bloomberg School of Public Health. In addition to the 31,672 people killed by guns in 2010, another 73,505 were treated in hospital emergency rooms for nonfatal bullet wounds, and 337,960 nonfatal violent crimes were committed with guns. Of those 31,672 dead, 61 percent were suicides, and the vast majority of the rest were homicides by people who knew one another.

For example, of the 1,082 women and 267 men killed in 2010 by their intimate partners, 54 percent were shot by guns. Over the past quarter of a century, guns were involved in a greater number of intimate partner homicides than all other causes combined. When a woman is murdered, it is most likely by her intimate partner with a gun. Regardless of what really caused Olympic track star Oscar Pistorius to shoot his girlfriend, Reeva Steenkamp (whether he mistook her for an intruder or he snapped in a lover’s quarrel), her death is only the latest such headline. Recall, too, the fate of Nancy Lanza, killed by her own gun in her own home in Connecticut by her son, Adam Lanza, before he went to Sandy Hook Elementary School to murder some two dozen children and adults. As an alternative to arming women against violent men, legislation can help: data show that in states that prohibit gun ownership by men who have received a domestic violence restraining order, gun-caused homicides of intimate female partners have been reduced by 25 percent.

Another myth to fall to the facts is that gun-control laws disarm good people and leave the crooks with weapons. Not so, say the Johns Hopkins authors: “Strong regulation and oversight of licensed gun dealers—defined as having a state law that required state or local licensing of retail firearm sellers, mandatory record keeping by those sellers, law enforcement access to records for inspection, regular inspections of gun dealers, and mandated reporting of theft or loss of firearms—was associated with 64 percent less diversion of guns to criminals by in-state gun dealers.”

Finally, before we concede civilization and arm everyone to the teeth, pace the NRA, consider the primary cause of the centuries-long decline of violence as documented by Steven Pinker in his 2011 book The Better Angels of Our Nature: the rule of law by states that turned over settlement of disputes to judicial courts and curtailed private self-help justice through legitimate use of force by police and military trained in the proper use of weapons.”

This article was originally published with the title Gun Science.

Emphasis Mine

see: http://www.scientificamerican.com/article.cfm?id=gun-science-proves-arming-untrained-citizens-bad-idea&WT.mc_id=SA_CAT_BS_20130510

The 1 Percent’s Solution

(N.B.: the frame ‘reduce government spending’ is often code for: reduce spending on people of color who are poor.)

Source: NY Times

Author: Paul Krugman

“Economic debates rarely end with a T.K.O. But the great policy debate of recent years between Keynesians, who advocate sustaining and, indeed, increasing government spending in a depression, and austerians, who demand immediate spending cuts, comes close — at least in the world of ideas. At this point, the austerian position has imploded; not only have its predictions about the real world failed completely, but the academic research invoked to support that position has turned out to be riddled with errors, omissions and dubious statistics.

Yet two big questions remain. First, how did austerity doctrine become so influential in the first place? Second, will policy change at all now that crucial austerian claims have become fodder for late-night comics?

On the first question: the dominance of austerians in influential circles should disturb anyone who likes to believe that policy is based on, or even strongly influenced by, actual evidence. After all, the two main studies providing the alleged intellectual justification for austerity — Alberto Alesina and Silvia Ardagna on “expansionary austerity” and Carmen Reinhart and Kenneth Rogoff on the dangerous debt “threshold” at 90 percent of G.D.P. — faced withering criticism almost as soon as they came out.

And the studies did not hold up under scrutiny. By late 2010, the International Monetary Fund had reworked Alesina-Ardagna with better data and reversed their findings, while many economists raised fundamental questions about Reinhart-Rogoff long before we knew about the famous Excel error. Meanwhile, real-world events — stagnation in Ireland, the original poster child for austerity, falling interest rates in the United States, which was supposed to be facing an imminent fiscal crisis — quickly made nonsense of austerian predictions.

Yet austerity maintained and even strengthened its grip on elite opinion. Why?

Part of the answer surely lies in the widespread desire to see economics as a morality play, to make it a tale of excess and its consequences. We lived beyond our means, the story goes, and now we’re paying the inevitable price. Economists can explain ad nauseam that this is wrong, that the reason we have mass unemployment isn’t that we spent too much in the past but that we’re spending too little now, and that this problem can and should be solved. No matter; many people have a visceral sense that we sinned and must seek redemption through suffering — and neither economic argument nor the observation that the people now suffering aren’t at all the same people who sinned during the bubble years makes much of a dent.

But it’s not just a matter of emotion versus logic. You can’t understand the influence of austerity doctrine without talking about class and inequality.

What, after all, do people want from economic policy? The answer, it turns out, is that it depends on which people you ask — a point documented in a recent research paper by the political scientists Benjamin Page, Larry Bartels and Jason Seawright. The paper compares the policy preferences of ordinary Americans with those of the very wealthy, and the results are eye-opening.

Thus, the average American is somewhat worried about budget deficits, which is no surprise given the constant barrage of deficit scare stories in the news media, but the wealthy, by a large majority, regard deficits as the most important problem we face. And how should the budget deficit be brought down? The wealthy favor cutting federal spending on health care and Social Security — that is, “entitlements” — while the public at large actually wants to see spending on those programs rise.

You get the idea: The austerity agenda looks a lot like a simple expression of upper-class preferences, wrapped in a facade of academic rigor. What the top 1 percent wants becomes what economic science says we must do.

Does a continuing depression actually serve the interests of the wealthy? That’s doubtful, since a booming economy is generally good for almost everyone. What is true, however, is that the years since we turned to austerity have been dismal for workers but not at all bad for the wealthy, who have benefited from surging profits and stock prices even as long-term unemployment festers. The 1 percent may not actually want a weak economy, but they’re doing well enough to indulge their prejudices.

And this makes one wonder how much difference the intellectual collapse of the austerian position will actually make. To the extent that we have policy of the 1 percent, by the 1 percent, for the 1 percent, won’t we just see new justifications for the same old policies?

I hope not; I’d like to believe that ideas and evidence matter, at least a bit. Otherwise, what am I doing with my life? But I guess we’ll see just how much cynicism is justified.

A version of this op-ed appeared in print on April 26, 2013, on page A31 of the New York edition with the headline: The 1 Percent’s Solution.

Emphasis Mine

see: http://www.nytimes.com/2013/04/26/opinion/krugman-the-one-percents-solution.html?_r=0

 

50 Reasons You Despised George W. Bush’s Presidency: A Reminder on the Day of His Presidential Library Dedication

Source: AlterNet

Author:  Steven Rosenfeld

“On Thursday, President Obama and all four living ex-presidents will attend the dedication of the $500 million George W. Bush Presidential Library at Southern Methodist University in Dallas, Texas. Many progressives will remember Bush as a contender for the “worst president ever,” saying he more aptly deserves a multi-million-dollar prison cell for a litany of war crimes.

Amazingly, the Bush library asks visitors, “What would you have done?” had they been in this president’s shoes. The ex-president’s defenders are betting that the public will reconsider their judgments after a hefty dose of historical amnesia. Bush has been absent from political debates in recent years, instead making millions in private speeches. Today, his popularity is even with Obama’s; both have a 47 percent approval rating.

Let’s look at 50 reasons, some large and some small, why W. inspired so much anger.

1. He stole the presidency in 2000. People may forget that Republicans in Florida purged more than 50,000 African-American voters before Election Day, and then went to the Supreme Court, where the GOP-appointed majority stopped a recount that would have awarded the presidency to Vice-President Al Gore had all votes been counted. National news organizations verified that outcome long after Bush had been sworn in.

2. Bush’s lies started in that race. Bush ran for office claiming he was a “uniter, not a divider.” Even though he received fewer popular votes than Gore, he quickly claimed he had the mandate from the American public to push his right-wing agenda.

3. He covered up his past. He was a party boy, the scion of a powerful political family who got away with being a deserter during the Vietnam War. He was reportedly AWOL for over a year from his assigned unit, the Texas Air National Guard, which other military outfits called the “Champagne Division.”

4. He loved the death penalty. As Texas governor from 1995-2000, he signed the most execution orders of any governor in U.S. history—152 people, including the mentally ill and women who were domestic abuse victims. He spared one man’s life, a serial killer.

5. He was a corporate shill from Day 1. Bush locked up the GOP nomination by raising more campaign money from corporate boardrooms than anyone at that time. He lunched with CEOs who would jet into Austin to “educate” him about their political wish lists.

6. He gutted global political progress. He pulled out of the Kyoto Protocol, which set requirements for 38 nations to lower greenhouse gas emissions to combat climate change, saying that abiding by the agreement would “harm our economy and hurt our workers.”

7. He embraced global isolationism. He withdrew from the 1972 Anti-Ballistic Missile Treaty, over Russia’s protest, taking the U.S. in a direction not seen since World War I.

8. He ignored warnings about Osama bin Laden. He ignored the Aug. 6, 2001 White House intelligence briefing titled, “Bin Laden determined to strike in the U.S.” Meanwhile, his chief anti-terrorism advisor, Richard Clarke, and first Treasury Secretary, Paul O’Neill, testified in Congress that he was intent on invading Iraq within days of becoming president.

9. Ramped up war on drugs, not terrorists. The Bush administration had twice as many FBI agents assigned to the war on drugs than fighting terrorism before 9/11, and kept thousands in that role after the terror attacks.

10. “My Pet Goat.” He kept reading a picture book to grade-schoolers for seven minutes after his top aides told him that the World Trade Center had been attacked on 9/11. Then Air Force One flew away from Washington, D.C., vanishing for hours after the attack.

11. Squandered global goodwill after 9/11. Bush thumbed his nose at world sympathy for the victims of the September 11, 2001 attacks, by declaring a global war on terrorism and declaring “you are either with us or against us.”

12. Bush turned to Iraq not Afghanistan. The Bush administration soon started beating war drums for an attack on Iraq, where there was no proven Al Qaeda link, instead of Afghanistan, where the 9/11 bombers had trained and Osama bin Laden was based. His 2002 State of the Union speech declared that Iraq was part of an “Axis of Evil.”

13. Attacked United Nations weapons inspectors. The march to war in Iraq started with White House attacks on the credibility of U.N. weapons inspectors in Iraq, whose claims that Saddam Hussein did not have nuclear weapons proved to be true.

14. He flat-out lied about Iraq’s weapons. In a major speech in October 2002, he said that Saddam Hussein had the capacity to send unmanned aircraft to the U.S. with bombs that could range from chemical weapons to nuclear devices. “We cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud,” he said.

15. He ignored the U.N. and launched a war. The Bush administration tried to get the U.N. Security Council to authorize an attack on Iraq, which it refused to do. Bush then decided to lead a “preemptive” attack regardless of international consequences. He did not wait for any congressional authorization to launch a war.

16. Abandoned the International Criminal Court. Before invading Iraq, Bush told the U.N. that the U.S. was withdrawing from ratifying the International Criminal Court treaty to protect American troops from prosecution and to allow it to pursue preemptive war.

17. Colin Powell’s false evidence at U.N. The highly decorated soldier turned Secretary of State presented false evidence at the U.N. as the American mainstream media began its jingoistic drumbeat to launch a war of choice on Saddam Hussein and Iraq.

18. He launched a war on CIA whistleblowers. When a former ambassador, Joseph C. Wilson, wrote a New York Times op-ed saying there was no nuclear threat from Iraq, the White House retaliated by leaking the name and destroying the career of his wife, Valerie Plame, one of the CIA’s top national security experts.

19. Bush pardoned the Plame affair leaker. Before leaving office, Bush pardoned the vice president’s top staffer, Scooter Libby, for leaking Plame’s name to the press.

20. Bush launched the second Iraq War. In April 2003, the U.S. military invaded Iraq for the second time in two decades, leading to hundreds of thousands of civilian deaths and more than a million refugees as years of sectarian violence took hold in Iraq. Nearly 6,700 U.S. soldiers have died in the Iraq and Afghan wars.

21. Baghdad looted except for oil ministry. The Pentagon’s failure to plan for a military occupation and transition to civilian rule was on display as Baghdad was looted while troops guarded the oil ministry, suggesting this war was fought for oil riches, not terrorism.

22. The war did not make the U.S. safer. In 2006, a National Intelligence Estimate (a consensus report of the heads of 16 U.S. intelligence agencies) asserted that the Iraq war had increased Islamic radicalism and had worsened the terror threat.

23. U.S. troops were given unsafe gear. From inadequate vests for protection against snipers to Humvees that could not protect soldiers from roadside bombs, the military did not sufficiently equip its soldiers in Iraq, leading to an epidemic of brain injuries.

24. Meanwhile, the war propaganda continued. From landing on an aircraft carrier in a flight suit to declare “mission accomplished” to surprising troops in Baghdad with a Thanksgiving turkey that was a table decoration used as a prop, Bush defended his war of choice by using soldiers as PR props.

25. He never attended soldiers’ funerals. For years after the war started, Bush never attended a funeral, even though as of June 2005, 144 soldiers (of the 1,700 killed thus far) were laid to rest in Arlington National Cemetery, about two miles from the White House.

26. Meanwhile, war profiteering surged. The list of top Bush administration officials whose former corporate employers made billions in Pentagon contracts starts with Vice-President Dick Cheney and Halliburton, which made $39.5 billion, and included his daughter, Liz Cheney, who ran a $300 million Middle East partnership program.

27. Bush ignored international ban on torture. Suspected terrorists were captured and tortured by the U.S. military in Baghdad’s Abu Ghraib prison, in the highest-profile example of how the Bush White House ignored international agreements, such as the Geneva Conventions, that banned torture, and created a secret system of detention that was unmasked when photos made their way to American media outlets.

28. Created the black hole at Gitmo and renditions. The Bush White House created the offshore military prison at Guantanamo Bay, Cuba, as well as secret detention sites in eastern Europe to evade domestic and military justice systems. Many of the men still jailed in Cuba were turned over to the U.S. military by bounty hunters.

29. Bush violated U.S. Constitution as well. The Bush White House ignored basic civil liberties, most notably by launching a massive domestic spying program where millions of Americans’ online activities were monitored with the help of big telecom companies. The government had no search warrant or court authority for its electronic dragnet.

30. Iraq war created federal debt crisis. The total costs of the Iraq and Afghan wars will reach between $4 trillion and $6 trillion when the long-term medical costs for wounded veterans are added in, a March 2013 report by a Harvard researcher has estimated. Earlier reports said the wars cost $2 billion a week.

31. He cut veterans’ healthcare funding. At the height of the Iraq war, the White House cut funding for veterans’ healthcare by several billion dollars, slashed more than one billion from military housing and opposed extending healthcare to National Guard families, even as they were repeatedly tapped for extended and repeat overseas deployments.

32. Then Bush decided to cut income taxes. In 2001 and 2003, a series of bills lowered income tax rates, cutting federal revenues as the cost of the foreign wars escalated. The tax cuts disproportionately benefited the wealthy, with roughly one-quarter going to the top one percent of incomes compared to 8.9% going to the middle 20 percent. The cuts were supposed to expire in 2013, but most are still on the books.

33. Assault on reproductive rights. From the earliest days of his first term, the Bush White House led a prolonged assault on reproductive rights. He cut funds for U.N. family planning programs, barred military bases from offering abortions, put right-wing evangelicals in regulatory positions where they rejected new birth control drugs, and issued regulations making fetuses—but not women—eligible for federal healthcare.

34. Cut Pell Grant loans for poor students. His administration froze Pell Grants for years and tightened eligibility for loans, affecting 1.5 million low-income students. He also eliminated other federal job training programs that targeted young people.

35. Turned corporations loose on environment. Bush’s environmental record was truly appalling, starting with abandoning a campaign pledge to tax carbon emissions and then withdrawing from the Kyoto Protocol on greenhouse gases. The Sierra Club lists 300 actions his staff took to undermine federal laws, from cutting enforcement budgets to putting industry lobbyists in charge of agencies to keeping energy policies secret.

36. Said evolution was a theory—like intelligent design. One of his most inflammatory comments was saying that public schools should teach that evolution is a theory with as much validity as the religious belief in intelligent design, or God’s active hand in creating life.

37. Misguided school reform effort. Bush’s “No Child Left Behind” initiative made preparation for standardized tests and resulting test scores the top priority in schools, to the dismay of legions of educators who felt that there was more to learning than taking tests.

38. Appointed flank of right-wing judges. Bush’s two Supreme Court picks—Chief Justice John Roberts and Associate Justice Samuel Alito—have reliably sided with pro-business interests and social conservatives. He also elevated U.S. District Court Judge Charles Pickering to an appeals court, despite his known segregationist views.

39. Gutted the DOJ’s voting rights section. Bush’s Justice Department appointees led a multi-year effort to prosecute so-called voter fraud, including firing seven U.S. attorneys who did not pursue overtly political cases because of lack of evidence.

40. Meanwhile average household incomes fell. When Bush took office in January 2001, the median household income was $52,500. By 2008 it was $50,303, a drop of 4.2 percent, making Bush the only recent two-term president to preside over such a decline.

41. And millions more fell below the poverty line. When Bill Clinton left office, 31.6 million Americans were living in poverty. When Bush left office, there were 39.8 million, according to the U.S. Census, an increase of 26.1 percent. The Census said two-thirds of that growth occurred before the economic downturn of 2008.

42. Poverty among children also exploded. The Census also found that 11.6 million children lived below the poverty line when Clinton left office. Under Bush, that number grew by 21 percent to 14.1 million.

43. Millions more lacked access to healthcare. Following these poverty trends, the number of Americans without health insurance was 38.4 million when Clinton left office. When Bush left, that figure had grown by nearly 8 million to 46.3 million, the Census found. Those with employer-provided benefits fell every year he was in office.

44. Bush let black New Orleans drown. Hurricane Katrina exposed Bush’s attitude toward the poor. He didn’t visit the city after the storm destroyed the poorest sections. He praised his Federal Emergency Management Agency director for doing a “heck of a job” as the federal government did little to help thousands in the storm’s aftermath and rebuilding.

45. Yet pandered to religious right. Months before Katrina hit, Bush flew back to the White House to sign a bill to try to stop the comatose Terri Schiavo’s feeding tube from being removed, saying the sanctity of life was at stake.

46. Set record for fewest press conferences. During a first term defined by the 9/11 attacks, he held the fewest press conferences of any modern president and never met with the New York Times editorial board.

47. But took the most vacation time. Reporters analyzing Bush’s record found that he took off 1,020 days in two four-year terms—more than one out of every three days. No other modern president comes close. Bush also set the record for the longest vacation among modern presidents—five weeks, the Washington Post noted.

48. Karl Rove, Dick Cheney, Donald Rumsfeld. Not since Richard Nixon’s White House and the era of the Watergate burglary and expansion of the Vietnam War have there been as many power-hungry and arrogant operators holding the levers of power. Cheney ran the White House; Rove the political operation for corporations and the religious right; and Rumsfeld oversaw the wars.

49. He’s escaped accountability for his actions. From Iraq war General Tommy Franks’ declaration that “we don’t do body counts” to numerous efforts to impeach Bush and top administration officials—primarily over launching the war in Iraq—he has never been held to account in any official domestic or international tribunal.

50. He may have stolen the 2004 election as well. The closest Bush came to a public referendum on his presidency was the 2004 election, which came down to the swing state of Ohio. There the GOP’s voter suppression tactics rivaled Florida in 2000, and many unresolved questions remain about whether the former GOP Secretary of State altered the Election Night totals from rural Bible Belt counties.

Any bright spots? Conservatives will lambaste lists like this for finding nothing good about a president like W. So, yes, he created the largest ocean preserve offshore from Hawaii in his second term. And in his final year in office, his initiative to fight AIDS across Africa has been credited with saving many thousands of lives. But on balance, George W. Bush was more than eight years of missed opportunities for America and the world. He was a disaster, leaving much of America and the world in much worse shape than when he took the oath of office in 2001. His reputation should not be resurrected or restored or seen as anything other than what it was.”

Emphasis Mine

See: http://www.alternet.org/news-amp-politics/50-reasons-you-despised-george-w-bushs-presidency-reminder-day-his-presidential?akid=10363.123424.Vl7-RQ&rd=1&src=newsletter830237&t=3

 

Why Does America Lose Its Head Over ‘Terror’ But Ignore Its Daily Gun Deaths?

The marathon bombs triggered a reaction that is at odds with last week’s inertia over arms control.

Source: From the Guardian, via RSN

Author: Michael Cohen

“The thriving metropolis of Boston was turned into a ghost town on Friday. Nearly a million Bostonians were asked to stay in their homes – and willingly complied. Schools were closed; businesses shuttered; trains, subways and roads were empty; usually busy streets eerily resembled a post-apocalyptic movie set; even baseball games and cultural events were cancelled – all in response to a 19-year-old fugitive, who was on foot and clearly identified by the news media.

The actions allegedly committed by the Boston marathon bomber, Dzhokhar Tsarnaev and his brother, Tamerlan, were heinous. Four people dead and more than 100 wounded, some with shredded and amputated limbs.

But Londoners, who endured IRA terror for years, might be forgiven for thinking that America over-reacted just a tad to the goings-on in Boston. They’re right – and then some. What we saw was a collective freak-out like few that we’ve seen previously in the United States. It was yet another depressing reminder that more than 11 years after 9/11 Americans still allow themselves to be easily and willingly cowed by the “threat” of terrorism.

After all, it’s not as if this is the first time that homicidal killers have been on the loose in a major American city. In 2002, Washington DC was terrorised by two roving snipers, who randomly shot and killed 10 people. In February, a disgruntled police officer, Christopher Dorner, murdered four people over several days in Los Angeles. In neither case was LA or DC put on lockdown mode, perhaps because neither of these sprees was branded with that magically evocative and seemingly terrifying word for Americans, terrorism.

To be sure, public officials in Boston appeared to be acting out of an abundance of caution. And it’s appropriate for Boston residents to be asked to take precautions or keep their eyes open. But by letting one fugitive terrorist shut down a major American city, Boston not only bowed to outsize and irrational fears, but sent a dangerous message to every would-be terrorist – if you want to wreak havoc in the United States, intimidate its population and disrupt public order, here’s your instruction booklet.

Putting aside the economic and psychological cost, the lockdown also prevented an early capture of the alleged bomber, who was discovered after Bostonians were given the all clear and a Watertown man wandered into his backyard for a cigarette and found a bleeding terrorist on his boat.

In some regards, there is a positive spin on this – it’s a reflection of how little Americans have to worry about terrorism. A population such as London during the IRA bombings or Israel during the second intifada or Baghdad, pretty much every day, becomes inured to random political violence. Americans who have such little experience of terrorism, relatively speaking, are more primed to overreact – and assume the absolute worst when it comes to the threat of a terror attack. It is as if somehow in the American imagination, every terrorist is a not just a mortal threat, but is a deadly combination of Jason Bourne and James Bond.

If only Americans reacted the same way to the actual threats that exist in their country. There’s something quite fitting and ironic about the fact that the Boston freak-out happened in the same week the Senate blocked consideration of a gun control bill that would have strengthened background checks for potential buyers. Even though this reform is supported by more than 90% of Americans, and even though 56 out of 100 senators voted in favour of it, the Republican minority prevented even a vote from being held on the bill because it would have allegedly violated the second amendment rights of “law-abiding Americans”.

So for those of you keeping score at home – locking down an American city: a proper reaction to the threat from one terrorist. A background check to prevent criminals or those with mental illness from purchasing guns: a dastardly attack on civil liberties. All of this would be almost darkly comic if not for the fact that more Americans will die needlessly as a result. Already, more than 30,000 Americans die in gun violence every year (compared to the 17 who died last year in terrorist attacks).

What makes US gun violence so particularly horrifying is how routine and mundane it has become. After the massacre of 20 kindergartners in an elementary school in Newtown, Connecticut, millions of Americans began to take greater notice of the threat from gun violence. Yet since then, the daily carnage that guns produce has continued unabated and often unnoticed.

The same day of the marathon bombing in Boston, 11 Americans were murdered by guns. The pregnant Breshauna Jackson was killed in Dallas, allegedly by her boyfriend. In Richmond, California, James Tucker III was shot and killed while riding his bicycle – assailants unknown. Nigel Hardy, a 13-year-old boy in Palmdale, California, who was being bullied in school, took his own life. He used the gun that his father kept at home. And in Brooklyn, New York, an off-duty police officer used her department-issued Glock 9mm handgun to kill herself, her boyfriend and her one-year old child.

At the same time that investigators were in the midst of a high-profile manhunt for the marathon bombers that ended on Friday evening, 38 more Americans – with little fanfare – died from gun violence. One was a 22-year old resident of Boston. They are a tiny percentage of the 3,531 Americans killed by guns in the past four months – a total that surpasses the number of Americans who died on 9/11 and is one fewer than the number of US soldiers who lost their lives in combat operations in Iraq. Yet, none of this daily violence was considered urgent enough to motivate Congress to impose a mild, commonsense restriction on gun purchasers.

It’s not just firearms that produce such legislative inaction. Last week, a fertiliser plant in West, Texas, which hasn’t been inspected by federal regulators since 1985, exploded, killing 14 people and injuring countless others. Yet many Republicans want to cut further the funding for the agency (OSHA) that is responsible for such reviews. The vast majority of Americans die from one of four ailments – cardiovascular disease, cancer, diabetes and chronic lung disease – and yet Republicans have held three dozen votes to repeal Obamacare, which expands healthcare coverage to 30 million Americans.

It is a surreal and difficult-to-explain dynamic. Americans seemingly place an inordinate fear on violence that is random and unexplainable and can be blamed on “others” – jihadists, terrorists, evil-doers etc. But the lurking dangers all around us – the guns, our unhealthy diets, the workplaces that kill 14 Americans every single day – these are just accepted as part of life, the price of freedom, if you will. And so the violence goes, with more Americans dying preventable deaths. But hey, look on the bright side – we got those sons of bitches who blew up the marathon.”

Emphasis Mine

See: http://readersupportednews.org/opinion2/416-gun-control-/17076-why-does-america-lose-its-head-over-terror-but-ignore-its-daily-gun-deaths

 

History, culture, mistrust combined to defeat gun control effort

Source: McClatchy

(N.B.: when we use the frame ‘gun control’, our message fails; when we use ‘reduced gun violence’, our message succeeds.

It might also be noted that, just as the First Amendment does not protect the dissemination of child pornography, the Second does not protect unlimited access to firearms of any type.)

Author: David Lightman

Why is it so hard for even the most modest gun-control effort to succeed?

The easy answer is the power of the gun lobby, but the obstacles are far more complex. Growing numbers of people distrust Washington. A deeply rooted gun culture sees big government as a threat to its security, not to mention its constitutional rights. Members of Congress from conservative areas are well aware that votes on gun control, even in baby steps, are politically perilous.

Gun control advocates thought their task would be so different this week. President Barack Obama was making a passionate, heartfelt pitch unlike almost any he’d made before during his presidency. A congressional colleague, former Rep. Gabrielle Giffords, D-Ariz., who was severely wounded during the Tucson shootings two years ago, visited the Capitol to make her plea. Families of recent shooting victims visited senators and watched them vote.

But what began as an energetic effort to finally get something new on the books wound up in defeat after defeat, and on Thursday the bill was pulled. There’s no telling when it will return or what might change if it does, because switching votes is going to require changing some profoundly held views.

The biggest hurdle is overcoming the long-simmering, ever-growing public fear that government is too intrusive and incompetent. That attitude almost scuttled the 2010 health care law, as people resented government forcing them to buy coverage. People also became concerned that “death panels” would be created to determine who’d live or die.

They weren’t, but Republicans have tried nearly three dozen times to repeal the law, which will require nearly everyone to obtain health insurance next year or pay a fine. More government intrusion, they say.

The resistance to more gun control follows a similar pattern.

“There certainly is an erosion of trust and confidence in the competence of government,” said Sen. Susan Collins of Maine, one of four Republicans who voted to toughen background checks. “People often don’t trust government to protect them, and there’s a very distressing lack of any confidence government will keep its word.”

A Pew Research Center survey this month found that only 13 percent of Republicans have favorable views of the federal government, compared with 27 percent of independents and 41 percent of Democrats. Gun rights advocates argue that if Washington wants to gain some trust, it should enforce the laws that already are on the books.

“More gun laws are not the solution,” said Sen. Mike Crapo, R-Idaho. And when gun control advocates try even to tinker with gun laws, they tinker with what Sen. Roger Wicker, R-Miss., called “the depth of feeling about the Second Amendment.”

Millions of Americans grow up with guns in the home, for hunting, self-defense and other uses. “In northern Maine, guns are part of the lifestyle,” Collins said.

Learning to use a gun is as common as learning to drive a car or use hand tools, and any effort by Washington to infringe on that right is viewed with suspicion. That’s why even a mild form of gun control – expanding background checks to gun shows and online sales while exempting private transactions – got nowhere.

“People see it as the nose under the camel’s tent,” former Senate Republican leader Trent Lott said. “They ask, ‘Where does this end?’ ”

Add to this mix some raw politics. Of the five Democrats who voted against expanded background checks, three face difficult re-elections next year: Montana’s Max Baucus, Alaska’s Mark Begich and Arkansas’ Mark Pryor. North Dakota’s Heidi Heitkamp, whose state Obama lost last year by nearly 20 percentage points, joined them. So did Senate Majority Leader Harry Reid of Nevada, though he voted no only for procedural reasons.

While polls suggest that the senators’ re-elections probably won’t be won or lost on gun issues, gun interests are well-heeled and offer a simple explanation as to why the background check plan was misguided.

“Expanding background checks, at gun shows or elsewhere, will not reduce violent crime or keep our kids safe in their schools,” said Chris Cox, the executive director of the Institute for Legislative Action, the political and lobbying arm of the National Rifle Association.

Change is especially difficult in this age of polarization. An NBC News analysis found that 39 of the votes against the expanded background checks came from senators in states that Obama didn’t carry. Two of the swing votes who sided with the opponents, Nevada’s Dean Heller and New Hampshire’s Kelly Ayotte, both Republicans, were from states that Obama won.

If a gun control measure makes it to the House of Representatives, the red-blue state divide is likely to be more obvious. Republicans control 233 of the 435 seats there, and districts often are so carefully drawn that most are downright politically monolithic.

Will the gun control forces’ task get any easier? They say yes, that as people become more educated, as Obama presses harder, as the victims’ families keep up the heat, people will come around.

It won’t be that easy, said Sen. Joe Manchin, D-W.Va., who explained, “It’s very hard for someone from a gun culture to vote for gun control.” Email: dlightman@mcclatchydc.com; Twitter: @lightmandavid

Again, vote for ‘reduced gun violence’, not ‘gun control’!

Emphasis Mine

See: http://www.mcclatchydc.com/2013/04/19/189101/history-culture-mistrust-combined.html#emlnl=Weekly_Politics_Update

 

Fox News’ Audience Is Literally Dying: Is Roger Ailes’ Grand Experiment in Propaganda Doomed?

Source: The Nation, via Alternet

Author: Reed Richardson

“In the annals of Fox News, October 2012 will likely stand out as a shining moment. Buoyed by a wave of Republican optimism about Mitt Romney’s presidential campaign, the network seemed tantalizingly close to realizing one of its key ideological goals: ousting President Obama from the White House. Renewed enthusiasm among conservatives was, in turn, triggering record-high ratings for much of the network’s programming and helping it to beat not just rival news competitors MSNBC and CNN during prime time, but every other TV channel on the cable dial. What’s more, the prospect of an ascendant GOP come January meant Fox News might soon return to the era of access and prestige it enjoyed in Washington during the presidency of George W. Bush. The future looked so bright that News Corporation CEO Rupert Murdoch signed Fox News president Roger Ailes to a lucrative four-year contract extension, even though the 72-year-old Ailes’s existing contract wasn’t due to expire until 2013.

Then November arrived, and with it reality.

Fox News’s shellshocked election night coverage, punctuated by Karl Rove’s surreal meltdown upon hearing of Obama’s victory in Ohio and, thus, the election, capped off a historic day of reckoning for the network and conservatives alike. Chastened by defeat, Republican politicians and right-wing pundits have subsequently been grappling with the repercussions of the caustic tone and incendiary rhetoric their movement has adopted. This ongoing debate about whether broadening conservatism’s appeal requires new messages or just new messaging has ignored the 800-pound gorilla in the room, however. Noticeably absent from all the right wing’s public self-criticism is any interest in confronting the potent role played by the Republican Party’s single most important messenger, Fox News.

Standing at the epicenter of the network—and any new Republican Party groundswell—is Ailes. A former political operative of President Richard Nixon, Ailes has inextricably intertwined his professional and political pursuits since founding Fox News in 1996. Indeed, the network chief functions as a kind of proxy kingmaker within the party, frequently meeting with Republican politicians to offer strategic advice. He is a regular confidant of Senate minority leader Mitch McConnell, and at various times, he (or a network emissary of his) has counseled 2008 GOP vice presidential candidate Sarah Palin, New Jersey Governor Chris Christie and Gen. David Petraeus on their potential future. “Ailes,” says former Reagan White House economic adviser Bruce Bartlett, “is quite open about offering his free advice to Republicans…. If you visit New York City, you go see Roger Ailes and kiss his ring. It’s like visiting the Vatican. My guess is that there’s a lot of back-and-forth between Ailes and whoever is at the pinnacle of power in the Republican Party.”

To keep relying on a shrinking number of elderly, white and male subsets of the public, whether to win elections or win ratings, has become a strategy of diminishing returns, however. “I think that you can’t separate the problem at Fox [News] from the problem that the Republicans are going through,” Bartlett says. He can speak firsthand to this incestuous relationship, as his 2006 book, Impostor—which broke with party orthodoxy over the Bush administration’s deficit spending—quickly made him persona non grata at Fox News, he says. (Fox News did not respond to questions about his comment.) “The Republicans are trying to retool to win. That’s all they care about, and they’re trying to decide, ‘How can we be more pragmatic? How can we shave off the rough edges? How can we get rid of the whack jobs who are embarrassing us, costing us Senate seats? But at the same time, we can’t do this in such a way that it alienates our base.'” Fox News faces a similar dilemma, Bartlett contends: “It’s ‘How do we modernize? How do we attract new audiences without losing the old audience? How do we remain relevant without abandoning our traditions?'”

These are fundamental questions, and lately Fox News’s fundamentals—audience, ratings and public trust—have faltered. A 2010 study by Steve Sternberg found the network’s viewership to be the oldest (with an average age of 65) among an already elderly cable news audience. (CNN’s was 63 and MSNBC’s was 59.) By comparison, lifestyle cable channels Oxygen, Bravo and TLC were among the youngest, with an average viewer age of 42. And with MSNBC’s recent decision to plug 34-year-old rising star Chris Hayes into the coveted 8 pm slot, the average age of that network’s prime-time hosts will now be 45, while Fox News’s rotation, anchored by 63-year-old Bill O’Reilly, has an average age of 57.

Having cable news’s oldest average age for both prime-time hosts and audiences represents something of a double-edged sword for Fox in the cutthroat world of cable TV. One advantage is that older audiences are traditionally more loyal, which is why several industry experts say that Fox News is unlikely to be dislodged from its perch atop overall cable TV news ratings anytime soon. This age-loyalty effect redounds to the benefit of Fox News’s best-known prime-time hosts, Sean Hannity and Bill O’Reilly, as roughly two-thirds of their viewers are age 50 or older, according to a recent Pew State of the News Media survey.

But at the same time, there is an undeniable actuarial reality at work—or as Bartlett bluntly puts it, “Their viewership is quite literally dying.” The most lucrative advertising dollars flow to TV shows that attract viewers “in the demo,” short for “demographic”—industry parlance for people ages 25 to 54. By contrast, Fox News’s prime-time commercial breaks are blanketed with pitches for cheap medical devices and insurance companies aimed at retirees and the elderly. Perhaps not surprisingly, the network’s advertising rates have grown at a much more modest pace in recent years, according to the Pew survey. Similarly, the growth of its ad revenues has diminished every year since 2008.

Because of the relatively older age and smaller size of the cable news audience, viewership tends to be relatively stable, says Columbia University Journalism School professor and former NBC News president Richard Wald. “Its [ratings] move in very small increments.” To understand why viewers come and go, he compares a TV network’s audience to a target with concentric rings. The core audience—those who are loyal to your channel and watch frequently (and, for partisan media outlets, those who are most ideologically compatible)—is the bull’s-eye. Each concentric ring outward represents a segment of the audience that is less likely to watch because of diminished interest or less enthusiastic partisan sympathies. Dramatic ratings shifts can occur, but they tend to be driven by external events, like elections, rather than programming and thus affect all of the networks simultaneously. Most ratings fluctuations are statistical noise, Wald says, resulting from people in the outermost rings tuning in or out based on varying interest. “I would guess that [Fox News’s] numbers could change by 5, 6, 7, 8 percent and not reflect a change in the loyalty of the audience.”

But here, too, the news does not bode well. Though the network did retain its status as the top-rated cable news network in 2012—its eleventh consecutive year at number one—the steep drop in ratings that its shows have experienced since Election Day has raised eyebrows, precisely because corresponding shows on MSNBC and CNN have not experienced the same precipitous decline.

Just how much of a drop are we talking about? According to Nielsen data, Fox News’s prime-time monthly audience fell to its lowest level in twelve years in January among the 25-to-54 demographic. Daytime Fox News programming likewise saw its lowest monthly ratings in this age cohort since June 2008. Even the network’s two biggest stars, O’Reilly and Hannity, have not been immune from viewer desertion: Hannity lost close to 50 percent of his pre-election audience in the final weeks of 2012, and O’Reilly more than a quarter. The slide hasn’t stopped in 2013, either. Compared with a year ago, O’Reilly’s February prime-time ratings dropped 26 percent in the coveted 25-to-54 demographic, his worst performance since July 2008. Hannity’s sank even further, to the lowest point in his show’s history.

As Wald points out, short-term ratings snapshots can be deceptive. But in the weeks following Obama’s 2009 inauguration, Fox News’s viewership actually surged, averaging 539,000 prime-time demo viewers versus 388,000 and 357,000 for CNN and MSNBC, respectively. This past January, however, Fox could only muster 267,000 average nightly viewers—a 50 percent drop from that 2009 level, and not much more than MSNBC’s 235,000 or CNN’s 200,000.

So why are all these Fox News viewers tuning out? Some of the decline may be due to a broader cultural trend of people deciding to avoid cable TV news altogether. However, a recent Public Policy Polling survey of news media trustworthiness suggests there’s more going on than public apathy. In February, PPP found a marked drop in Fox News’s credibility. A record-high 46 percent of Americans say they put no trust in the network, a nine-point increase over 2010. What’s more, 39 percent name Fox News as their least-trusted news source, dwarfing all other news channels. (MSNBC came in second, at 14 percent.)

As might be expected, Fox News’s credibility barely budged among liberals and moderates (roughly three-quarters of whom still distrust the network) and very conservative viewers (three-quarters of whom still trust it). However, among those who identified themselves as “somewhat conservative,” the level of trust fell by an eye-opening 27 percentage points during the previous twelve months (from a net plus-47 percent “trust” rating in 2012 to plus-20 percent now). Only a bare majority of center-right conservatives surveyed by PPP say that Fox News is trustworthy.

“The people who are among the moderate-rights are actually the ones tuning out most,” says Dan Cassino, a political science professor at Fairleigh Dickinson University who specializes in studying partisan psychology. Last May, Cassino conducted a survey that found Fox News’s viewers were less informed about current political issues than those who watched no news at all. In response, the network’s public relations team mocked FDU’s college ranking in Forbes and belittled its student body as “ill-informed.” This kind of ad hominem attack symbolizes the over-the-top, pugilistic messaging style of Ailes, whose no-holds-barred political instincts have dictated the network’s direction since day one.

Ailes’s foundational idea for Fox News, explains Washington Post media critic Erik Wemple, was to package this bias under the guise of “fair and balanced” news. “It is indeed the artifice of neutrality that makes so much of what they do objectionable, or not just objectionable but noteworthy,” Wemple says. And it is effective, he adds: at a recent Value Voters conference, rock-ribbed conservatives almost involuntarily spouted the network’s motto back at him when he asked them about Fox’s coverage. It’s a maddeningly clever bit of misdirection—the network whose branding is most identified with objectivity and accuracy is, in fact, anything but.

“Fox viewers are the most misled…especially in areas of political controversy,” Chris Mooney writes in The Republican Brain, his 2012 book about the psychology of right-wing myths. The network’s singularly corrosive impact on its viewers’ understanding of reality, confirmed by numerous studies Mooney highlights in his book, is amplified by this “fair and balanced” motto, he says. It delegitimizes all other news media to create a vicious feedback loop within the right wing.

Thanks to its loyal conservative audience and its cozy relationship with the GOP leadership, Fox News has long been insulated from the consequences of its serial misinforming. “If your job is to say the most outrageous thing you possibly can and be rewarded for it, why shouldn’t you?” Cassino points out. “As long as you get ratings, you’re going to keep on doing it.” But the recent erosion in ratings and cracks in the network’s reputation, Cassino says, have created external pressure to make changes inside the network. (Neither Ailes nor anyone else at Fox News would comment when contacted for this story.)

Most notable among these post-election changes involved Fox News ridding itself of contributors Sarah Palin and Dick Morris and replacing them with former Congressman and left-wing gadfly Dennis Kucinich, former GOP Senator Scott Brown of Massachusetts, and RedState.com editor in chief Erick Erickson. To some, this personnel turnover confirmed that Fox News was embracing a more intellectually honest, ideologically diverse worldview.

But there’s less here than meets the eye. First of all, the impact an individual contributor can have on the network’s overall nature is minimal; permanent hosts like O’Reilly and Hannity drive its day-to-day brand. And in the midst of the 2012 campaign, Ailes locked up O’Reilly and Hannity as well as news host Bret Baier—the Fox News lineup from 7 through 10 pm—all the way to 2016. What’s more, one shouldn’t read too much into the cashiering of Palin and Morris, since, by all accounts, they were terrible at their jobs: the former was criticized internally for being uncooperative with programming suggestions and personally disloyal to Ailes, while the latter was guilty of humiliating the network with his ridiculous election predictions (as well as auctioning off an unauthorized personal tour of Fox News’ studios at a GOP fund-raiser). “They were only interested in promoting themselves or perhaps promoting an ideology that may not win,” says Bartlett, who singles out Palin’s lack of substance for his harshest criticism. “Totally and professionally, she’s the Lindsay Lohan of cable news.”

Indeed, Ailes’s new hires are little more than new faces plugged into a well-worn programming strategy. Kucinich fills the slot of house liberal formerly occupied by Alan Colmes, serving as a handy foil for conservatives to shout at or over. The telegenic Brown, a blue-state Republican, endorses textbook anti-woman Republican policies, but does so without giving off an overtly extremist vibe. And die-hard conservative Erickson is there to reassure the Tea Partiers and the netroots—some of whom inexplicably believe that Fox News is drifting left—that they still have a voice on the network.

Erickson is an interesting case. In February, not long after being hired by Fox, he posted a refreshingly frank essay complaining that the conservative media functions like an “echo chamber” that “peddle[s] daily outrage.” Erickson, however, was careful not to include his new employer by name. Of course, selective indignation is something of a running theme for him. After accusing Supreme Court Justice David Souter of bestiality and pederasty in 2009, it took him almost a year to apologize—waiting until after he took a prominent pundit gig at CNN. “Erick Erickson is obviously a whack job by the standards of a normal person,” says Bartlett. “But within the ranks of the right-wing wacko universe, he is far closer to the center than, say, Sarah Palin, because at the bottom, he wants to win, see, where people like Sarah Palin don’t give a fuck about winning.”

Winning, famously, is what drives Ailes, and Rove as well. In the aftermath of the election, Fox instructed Rove to lie low for several weeks. But this benching didn’t last long, and by mid-January the network had signed him to a new multi-year contract. Coincidentally, Rove launched a new project geared toward finding more electable candidates for 2014 just a few weeks later. But if the past is prologue, many of these future candidates won’t be acceptable to fellow Fox commentator Erickson. “This is perfect grist for the sort of stuff Fox loves to do: ‘Let’s have a debate between somebody on the right and somebody on the far right,'” Bartlett explains. “That suits their agenda just fine.”

In other words, the best interests of Fox News and those of the Republican Party, though inextricably connected, aren’t always aligned. The currency of the former is ratings and of the latter, votes. “There’s always a tension between the two,” says Jonathan Ladd, political science professor at Georgetown University and author of the 2012 book Why Americans Hate the Media and How It Matters. But because the GOP relies so heavily on Fox News to reach its constituents and spread its message, the network exerts its own gravitational pull on the party. “If the Republican Party wants to make an ideological shift, if they want to modify their vision on immigration, say, it matters a lot if Fox commentators and management are willing to go along with that,” Ladd points out.

Fox News clearly jumped out in front of the party on the immigration issue. Only two days after Obama’s re-election, Hannity, a hardline opponent of undocumented immigrants, came out on his radio program (which is not affiliated with Fox News) in favor of a pathway to citizenship for them. To gun-shy Republicans like Senator Marco Rubio, who had spent 2012 opposing just such a proposal, Hannity was sending an unmistakable signal: they would now have some political cover on the network if they similarly changed their public views, which Rubio quickly did. In a February article in The New Republic, Ailes, too, made a point of striking a more moderate tone toward Hispanics and said he dislikes the term “illegal immigrant,” which the Fox News Latino network no longer uses. These changes of heart, it should be noted, involve only as much courage as it takes to agree with the owner of the company. One day after Obama’s re-election and one day before Hannity’s epiphany, Rupert Murdoch had tweeted: “Must have sweeping, generous immigration reform, make existing law-abiding Hispanics welcome.”

Whether these recent, road-to-Damascus conversions are genuine or artificial may not matter much at this point, though. Hannity and many of his Fox News colleagues have invested so much time inciting animosity toward “illegals” and excoriating legislative attempts at “amnesty” that the network has acquired a reputation of harboring anti-Hispanic tendencies. In the aforementioned PPP poll on media trustworthiness, Hispanics ranked Fox News as their least credible news source, with a net four-point negative rating. (Broadcast news networks all enjoyed double-digit positive ratings.) Likewise, a National Hispanic Media Coalition survey from last fall found that Fox News hosts were more likely than those from any other network to negatively stereotype Latinos. It also noted that the network’s audience had the highest percentage of viewers with negative feelings about Hispanics and undocumented immigrants.

Jim Gilmore, the former Republican governor of Virginia and current head of the Free Congress Foundation, a conservative think tank, warned against just this type of demographic alienation in a January interview with National Review. “Shrillness and extreme language are driving away the voters who could help us build a majority,” Gilmore said. When contacted for this story, Gilmore made a point of saying that the network is “vital” to the conservative movement and added that his critique was not an implicit indictment of Fox News: “All I can say is that if they are doing anything like that and polling is reflecting it, they ought to stop it, because that would reflect badly on the Republican Party.”

That Gilmore’s willingness to confront the party’s mistakes hasn’t yet caught up to understanding what’s causing them is symbolic of the broad dilemma confronting the conservative movement right now. The unquestioning faith in Fox gives the network little incentive to undertake real change, since it allows Ailes to feel confident those prodigal conservative viewers will eventually return to the fold. While Fox still enjoys ratings victories, albeit narrower ones, conservatives have suffered significant losses at the ballot box in three of the past four national elections. And they face the prospect of even more defeats if they don’t lead their movement out of the wilderness of serial misinformation and forgo the temptation of perpetual outrage.

Arresting this descent into grand conspiracy theories and self-destructive rancor won’t be easy, though. “It makes it very difficult for guests who are being asked about Benghazi and Solyndra to talk about concrete policy issues,” Cassino notes. Gilmore, at least, acknowledges as much. “It is our burden to go on Fox News and give the right message,” he says. “If for some reason—ratings or whatever the reason is—the commentators try to drag you to a place where you ought not to be, you have to resist going there.”

John Stuart Mill, in his famous treatise On Liberty, understood that a “healthy state of political life” must necessarily include “a party of order or stability, and a party of progress or reform.” So where exactly the conservative movement goes from here becomes a critical issue, since the Republican Party isn’t about to spiral into electoral irrelevance anytime soon. Therefore, the degree to which it is grounded in reality and willing to collaborate reasonably in governance should matter a great deal to liberals, specifically, and to our democracy in general.

The devil’s bargain that Ailes struck between his network and his politics seventeen years ago, however, looks unlikely to change within the foreseeable future. Fox News remains an all-too-comfortable gilded cage for Republicans—one that showcases the party but also shelters it from the slings and arrows of honest intellectual debate. One can rigidly confine an ideology for only so long, however, before its beliefs begin to ossify and its policies atrophy. It’s an ironic twist: the more the network enables conservative ideas to stray from the mainstream, the less appealing the network’s conservative coverage becomes. And after years of deeming their codependent relationship an unalloyed good, it’s time Fox News and the Republican Party face cold reality. For both to enjoy long-term future success, each must recognize that the other isn’t its salvation; instead, they’re both part of the problem.”

Emphasis Mine

see: http://www.alternet.org/fox-news-audience-literally-dying-roger-ailes-grand-experiment-propaganda-doomed?akid=10337.123424.4hBEFS&rd=1&src=newsletter826540&t=5

11 Actions That Prove Republicans Are Intent On Making 2013 A Terrible Year For Sex

SOURCE: HuffPost

AUTHOR: Nick Wing

(N.B.: In 1984, there was the Junior Anti-Sex league…)

“Move over “war on women,” the GOP’s “war on sex” is here to invade your bedroom and reproductive system.

While some of the measures below may resemble salvos fired during the “war on women” — and others are actually carbon copies — it’s 2013, and with a new year comes new ways for Republicans to get in between your sheets, regardless of your gender or sexual orientation.

Emphasis Mine

SEE: http://www.huffingtonpost.com/2013/04/12/republicans-sex_n_3055060.html?ir=Politics&utm_campaign=041213&utm_medium=email&utm_source=Alert-politics&utm_content=Title


Margaret Thatcher Was a Privatization Pioneer, and This Is the Story of How Her Agenda Did Nothing But Make Life Worse for Millions of People

From: Michael Hudson’s blog, via alternet

Author: Michael Hudson

As in Chile, privatization in Britain was a victory for Chicago monetarism. This time it was implemented democratically. In fact, voters endorsed Margaret Thatcher’s selloff of public industries so strongly that by 1991, when she was replaced as prime minister by her own party’s John Major, only 35 percent of Britain’s voters supported the Labour Party – half the proportion registered in 1945. The Conservatives sold off public monopolies, used the proceeds to cut taxes, and put the privatized firms on a profit-making basis. Their stock prices rose sharply, making capital gains for investors whose ranks included millions of Britons who had been employees and/or customers of these enterprises.

Yet by 1997 the Conservatives were voted out of office by one of the largest margins in their history. What concerned voters were the results of privatization that Mrs. Thatcher had not warned them about. Prices did not decline proportionally to cost cuts and productivity gains. Many services were cut back, especially on the least utilized transport routes. The largest privatized bus company was charged with cut-throat monopoly practices. The water system broke down, while consumer charges leapt. Electricity prices were shifted against residential consumers in favor of large industrial users. Economic inequality widened as the industrial labor force shrank by two million from 1979 to 1997, while wages stagnated in the face of soaring profits for the privatized companies. The tax cuts financed by their selloff turned out to benefit mainly the rich.

Opinion polls showed that voters had opposed privatization at the outset (as did the press and many Conservative back benchers), but the Conservatives pointed out that Tony Blair rode to victory in part by abandoning “Clause Four” of the Labour Party’s 1918 constitution, advocating state control over the means of production, distribution and exchange. Most voters wanted tighter regulation in the public interest, but not a return to state ownership. On the other hand, they feared the prospect of selling off the post office, the BBC and the London tube (subway) system.

Nearly everyone agreed that companies were run differently in private hands than was the case under public ownership, even when the same managers remained in charge. Privatization was praised by Mrs. Thatcher and her allies – and blamed by many others – for managing these companies to generate capital gains for stockholders rather than to serve broader social ends.

Many people did not believe that essential public-sector industries should be run as commercial gain-seeking enterprises. Among the norms of public service, making a profit certainly was not one of the yardsticks used by the bureaucracy put in charge of these companies. Public-sector labor unions aimed more at maintaining employment than at producing revenue for the state as owner. The purpose of taxes, after all, was to subsidize basic services to the population.

This attitude had long been shared by many Conservatives, as well as by Labour. When Benjamin Disraeli created the Conservative Party in its modern form in the mid-nineteenth century (replacing the old royalist Tory Party), his major ideological adversary was not socialism but the free-trade liberalism that led Britain to repeal its protectionist agricultural tariffs (the Corn Laws) in 1846. Indeed, as a novelist Disraeli sought to expose the horrors of unbridled laissez faire. In Sybil, or The Two Nations, written in 1845 (three years before the Communist Manifesto), he described the rich and the poor as constituting “two nations between whom there is no intercourse and no sympathy, and . . . who are not governed by the same laws.” His novel assigned the loftiest ideals to Sybil, the daughter of a factory worker, but placed his hopes in a morally regenerate aristocracy. And in due course, Disraeli’s social welfare legislation, especially the public health system introduced from 1874 to 1881 (he said that his motto was Sanitas sanitatum, “Health, all is health”), helped the Conservative Party evolve as a nationalist and sometimes “state socialist” party, especially after World War II under Harold Macmillan in the 1960s and even Edward Heath in the ‘70s.

But it was the Labour Party that pressed for nationalization of the major industries. Fabian socialists such as Sidney and Beatrice Webb, George Bernard Shaw and other wealthy opinion-makers typified the degree to which many of Britain’s leading upper-class intellectuals supported nationalization as a cure for the ills of industrial capitalism. Indeed, the aristocracy underwent a schooling in personal economic values that resembled those of ancient Greece and Rome in their disdain for the idea that one’s life should be devoted to so lowly a purpose as commercial gain-seeking.

For about half of the postwar period, Britain’s government was controlled by the British Labour Party, which in turn was controlled by the trade unions. This gave the unions more political power than in any other country. Conversely, the Labour Party’s strength was based on the unions. Most workers employed by the public utilities and other government enterprises belonged to the Transport and General Workers’ Union (TGWU). Although the number of individual party members was relatively small, all of the TGWU’s approximately one million members were deemed to belong simply by virtue of their union membership. The union’s general secretary cast their votes as a bloc at the Labour Party’s annual convention.

Trade unions were given broad privileges in 1906, subsequently restricted by the Trade Disputes Act of 1927, passed largely in retaliation against the 1926 general strike. This act made it mandatory for union members to opt in to the payment of the union’s political levy to the Labour Party. After World War II the rule was changed so that members paid the levy unless they opted out. This had the ironic effect of placing the Labour Party’s finances more firmly in the hands of the union leaders. At the Labour Party conferences these leaders voted on behalf of all their members who had paid the levy. The TGWU thus was placed in a position to cast one million of the party’s roughly six million votes.

Labour endorsed the nationalization of industry so as to serve the interests of workers. As noted above, Clause Four of its 1918 constitution (adopted in the aftermath of World War I) called for the state to control the means of production, distribution and exchange. In 1945 the incoming Labour government nationalized the gas and electric utilities, as well as most transport lines that remained municipally or privately owned. Nearly all were run at a loss, which duly was covered by public subsidy.

World War II had been the great catalyst for faith in public ownership and national planning. Some four-fifths of Britain’s gross domestic product (GDP) was commandeered by the government. By the end of the 1940s most utilities and natural monopolies were in public ownership at the national or local level, or (as in the case of water) were held by public companies with restricted returns for the owners of their equity shares. The coal mines, gas and electric utilities, road transport and railways all were nationalized. The foundations and basic cost structure of Britain’s economy thus were shaped by these public utilities, public housing and socialized medicine, not to mention British Petroleum (BP) and, in time, the government’s North Sea oil holdings. And in due course the automotive, steel and aircraft sectors were rescued from collapse by being nationalized, henceforth to be run at heavy losses subsidized by taxpayers.

Clement Attlee’s Labour government of 1945-51 cited five reasons for nationalizing British industry. As Mrs. Thatcher’s Treasury Chancellor Nigel Lawson has summarized, the first reason was to improve industrial relations. In practice, he retorted, this meant caving in to the trade union leaders, especially inasmuch as a second objective of postwar nationalization was to ensure full employment. The effect was to inflate wage rates through make-work programs and featherbedding.

A third reason for nationalization was to maximize productivity gains, by removing absentee rentier owners from the scene. The actual result, pointed out the Thatcherites, was an uneconomic management of the labor force. Nationalization also had focused on regulating natural monopolies in the public interest – that is, by politicians – by administering prices and providing service on a basis other than profit objectives. The monetarists would argue that straight profit objectives were more efficient.

A fifth argument for nationalization had been the strongest. It was intended to replace short-term profit maximization by wider national and social priorities. But governments tend to live just as much in the short run as do corporate managers. More to the point, politicians seek to win votes by placating labor on the eve of elections. “The nationalized industries,” argued Lawson, “so far from improving industrial relations, proved the source of the biggest threat to industrial peace – doubtless because of the combination of centralized union power and recourse to the bottomless public purse.” At least, this argument was more understandable in 1979 than it had been in 1949.

If it seemed that government enterprise could succeed where private management failed, the reason was to be found largely in its claim on the public purse. The losses run up by these enterprises were financed by income taxes whose rates for business and the upper brackets were among the world’s highest, as were inheritance taxes. Indeed, many considered Britain to have been turned into Europe’s most socialist economy after 1945. Yet the objective seemed not to be the provision of efficient service at world-class levels. Public housing, originally a showpiece, deteriorated into what some called “modernist trash,” while the telephone system remained archaic. Public bureaucracies came to be seen as personal baronies whose administrators made little attempt to apply business methods or cost accounting. Yet their book cost far exceeded the stock-market valuation of private companies.

Most Conservatives acquiesced in the idea of national planning as the government increased its share of the economy from 40 per cent to over 60 per cent by the late 1970s. As Mrs. Thatcher observed, “It was, after all, none other than Harold Macmillan who in 1938 proposed in his influential book The Middle Way to extend state control and planning over a wide range of production and services.” Most social legislation since World War II was bipartisan, including the new National Health Service and the National Insurance legislation of 1946. Running a public enterprise was prestigious for many members of the upper classes. And the government was willing to bail out industries when they went bankrupt, with full compensation to investors – something that the market could not have done.

Margaret Thatcher’s Monetarist World View

Mrs. Thatcher has described how her upbringing over her father’s grocery store in the small town of Grantham shaped her impressions of how society worked. “There is no better course for understanding free-market economics than life in a corner shop.” It was an experience that inoculated her “against the conventional economic wisdom of post-war Britain,” that is, the faith in government planning and the disdain felt among the literati for entrepreneurial values. Hers was the world of “Methodism, the grocer’s shop, Rotary and all the serious, sober virtues cultivated and esteemed in that environment.”

This Babbitt-like view of the world did not prepare her to think about the economic impact of debt, a serious blind spot for nearly all monetarists. She confessed that her idea of debt management was based on balancing the family checkbook, as if this were a proper analogy for public finance and government control of the printing press and a central bank to create money at will. To Mrs. Thatcher a government deficit simply meant more debt, and hence more taxes to be paid. “Thrift was a virtue and profligacy a vice,” she wrote. Taxes were “a deterrent to work,” not the means by which vital public services were supplied. It was as if such services had no economic value. Income policies were epitomized by the undeserving poor living better on state subsidies in public council housing than hard-working families who struggled to pay their rent or meet their mortgage payments. This was a view reflecting middle-class resentment against subsidized services extended to families lower on the economic scale.

One does not learn much about macroeconomics from a store. A shopkeeper buys what already has been produced; how it is made is not of much concern. In fact, Mrs. Thatcher’s world view was naturally akin to that of Chicago School monetarism. The focus was simply on how to undercut the prices of one’s competitors, preferably by cutting taxes and the costly social welfare schemes on which they were spent.

The ideological pedigree for the Chicago School’s narrow-minded economics was provided by Friedrich Hayek and Milton Friedman. Hayek’s most famous book, The Road to Serfdom (1944), opposed any and all government planning in principle as leading inevitably to either fascist or Communist authoritarianism. When Keith Joseph gave Mrs. Thatcher a copy of this book she readily responded to his hard line. “Hayek saw that Nazism – national socialism – had its roots in nineteenth-century German social planning. He showed that intervention by the state in one area of the economy or society gave rise to almost irresistible pressures to extend planning further into other sectors. He alerted us to the profound, indeed revolutionary, implications of state planning for Western civilization as it had grown up over the centuries.” This would underlie her opposition to European unification under the Maastricht Treaty.

To most people the government appeared as the benign sponsor of the welfare state that emerged from World War II’s mobilization. But by the late 1970s the sclerosis of public industries threatened to make Britain economically ungovernable. In these circumstances the Chicago School’s anti-statism found an increasingly fertile intellectual ground.

It was natural for self-made people such as Margaret Thatcher to prefer a private-sector market economy to a state bureaucracy. Private enterprise beholden to shareholders hardly can afford patronage and cronyism. Of former Conservative Prime Minister Harold Macmillan’s broad and inclusive politics, she acknowledged disdainfully that “The traditional economic liberalism which constituted so important a part of my political make-up . . . was often alien and uncongenial to Conservatives from a more elevated social background.”

She and her supporters stood more in the tradition of the old Liberal Party, dressing up the ideas of Adam Smith in monetarist Chicago garb, seeing in government planning a road to serfdom at worst, and incompetence at best. She warned against the dangers of inflation spurred by government borrowing, but said little about private debt.

Mrs. Thatcher thus was ideologically harder than her pragmatic Conservative predecessor Edward Heath, and represented a break from her party’s traditions. She admired what the Chicago Boys had done in Chile, and would find kindred monetarist souls among Russian “reformists”. “Let us glory in our inequality,” she preached at one banquet, explaining that more inequality meant that more wealth was being created by “savers” at the top of the economic pyramid, presumably to trickle down via new direct investment. However, she recognized that such policies could be introduced in England only by an elected government. The task she set before herself was to win British voters to support her reforms voluntarily, for imposing them by armed force was out of the question.

It was taken as a matter of faith that financial gains would be invested in upgrading the enterprises once they were privatized, installing new machinery and hiring more labor to provide better service while increasing output at falling prices. Workers were invited to think of themselves as finance-capitalists-in-miniature, earning dividends and capital gains by investing their savings in the shares in these companies. This was the essence of Mrs. Thatcher’s “popular capitalism.” In her pursuit of these objectives the Iron Lady became Britain’s first prime minister to be elected for three consecutive terms, to retain this office for over ten consecutive years, and to have an “ism” named after her.

But first, she had to convince her fellow Conservatives. This became her major initial fight, within her own party.

How British Monetarism Planned the Neo-Conservative Takeover

No economic theory can be promoted successfully today without institutional sponsorship. In America, monetarist ideas were spread by policy institutes such as the Heritage Foundation, the Cato Institute and the American Enterprise Institute. Likewise in England, if the history of privatization is dominated by Margaret Thatcher, her victory was largely a product of British monetarism’s main policy institute, the Centre for Policy Studies (CPS), founded in 1974 by her mentor Keith Joseph (then a Member of Parliament). With Mrs. Thatcher as its President, the CPS used the economic philosophy of Friedrich Hayek (the “father of monetarism”) and Milton Friedman to launch the “Thatcher Interlude” that culminated in 1979 with her election as Prime Minister.

Britain could claim the Austrian-born Hayek as one of its own. He had become a British citizen in 1938, and held the Tooke Chair in economics at London from 1931 to 1950. (Ironically, Thomas Tooke was the great anti-monetarist, a century and a half earlier, in the 1830s.) To help spread his political philosophy, he helped create the Institute of Economic Affairs in 1957, the Adam Smith Institute in 1977 (serving as its first chairman), and the Social Affairs Unit in 1980.

Hayek wanted to abandon all public regulatory structures. Followed by Friedman, he argued that all attempts by government to shape markets were doomed to failure. Planning itself was wrongheaded in principle. As Nigel Lawson summarized this philosophy: “Economic planning was both impossible and unnecessary. . . . The price mechanism . . . was a much more efficient means of transmitting consumer wants and needs than the vast bureaucracies of Whitehall and the nationalized industries.”

This view of idealism as serving to strengthen state power enabled the Conservatives to take the moral high ground, Lawson continued, “by elevating private actions above public direction and dismissing ‘social justice’ as both vague and arbitrary.” The only valid idealism was to destroy the state. This could best be done by cutting off the government’s financial taproot, the ability to create the money needed to finance its budget deficits. The alternative to government bureaucracy, Lawson concluded, was to create a new political ideal for capitalism: to turn “profit” and “capitalism” into words of praise; “planning,” “government” and “taxes” became the new terms of invective.

Hayek joined the Chicago economics faculty in 1950, two years after Friedman, who spent 1953-54 in England as a visiting Fulbright Lecturer at Cambridge. At that time, he reminisced (in Capitalism and Freedom), “Those of us who were profoundly worried by the danger to freedom and prosperity posed by the growth of government, the triumph of the welfare state and Keynesian ideas, made up a small minority and we were considered eccentric by the vast majority of our intellectual colleagues.” Monetarism was deemed eccentric because it saw in government only the power to tax and oppress, not to protect and support. (Herman Kahn’s wife, Jane, likes to tell the anecdote of how Milton Friedman once replied to her when she asked whether social spending on needy children was not one type of public welfare that was well justified: “Mrs. Kahn, why do you want to subsidize the production of orphans?”) To the monetarists, all socially ameliorative spending appeared only as an economic distortion on the expenditure side, and as a burden on industry on the tax side of the tax-and-spend equation.

Mrs. Thatcher’s truculent Joan of Arc personality found a kindred soul in Alfred Sherman, CPS’s Director of Studies, whom she described as an ex-Communist who brought a “convert’s zeal” to the monetarist cause. Like so many former left-wingers, he seems never to have forgiven the working class for not following his early entreaties. And much like a spurned lover, he got his revenge as a Tory. But he retained from Marxism an awareness of economic theory’s political service as apologetics for one class or the other. He found in monetarism not so much an objective analysis of money and credit as a means of blaming inflation on government spending. Cutting off the government’s ability to run into debt would leave the power of private capital (“the market”) to take its place.

If Sherman was the ideological gadfly, Mrs. Thatcher was the master of political tactics. Her genius lay in seeing that public bureaucracies were ripe for the plucking, along with the Keynesian macroeconomic theory that served as their intellectual foundation. Most Britons believed that once a path was embarked upon, it could not be changed, to say nothing of being diametrically reversed. The denationalization of industry appeared politically impossible. Indeed, Labour governments believed they could bring one sector after another into the public domain. To Mrs. Thatcher this was the road to serfdom, and she sought to reverse the trend. She alone had the confidence to go on the offensive rather than passively decrying the trend towards larger public control of the economy. It was largely a result of her initiative that Britain, the nation with Europe’s strongest social democratic tradition and the most highly developed public sector, became the first to reverse what seemed initially to be an inexorable trend toward greater state control.

The Monetarist Attack on Full-Employment “Demand Management”

Mrs. Thatcher, Keith Joseph, Alfred Sherman and Nigel Lawson challenged the idea that economies could be managed by income policies aimed at achieving full employment. This objective, voiced by John Maynard Keynes in the 1930s in his General Theory, had become political orthodoxy throughout most of the world by the 1950s and ‘60s, and was endorsed both by Conservatives and Labour.

In America, the (“Full”) Employment Act of 1946 had replaced what Marx called the chronic “reserve army of the unemployed” by employment policies aimed at absorbing surplus labor through public spending. This policy met its Waterloo at the hands of Gardner Ackley of the Council of Economic Advisors and Robert McNamara, who tried to calculate just how much war America could afford, and indeed how much was needed to create “effective demand.”

In England, Mrs. Thatcher and her allies opposed Keynesian income policy on the ground that it supported wages (and hence, priced British goods out of world markets) simply to create “demand,” without regard for productivity. The achievement of “full-employment stability” was illusory, the monetarists accused, for it entailed monetary instability. Acting as the employer of last resort (or injecting enough “effective demand” to ensure full employment), governments created inflationary pressures by monetizing public debts. The ensuing inflation threatened bondholders and hence deterred their motivation to save, by reducing the purchasing power of their rentier income. The tacit assumption was that their “saving” would have funded new direct investment and employment rather than real estate or stock market speculation in assets already in place.

The major backers of monetarism duly became the rentier interests (banks, insurance companies and other institutional investors, as well as wealthy coupon clippers) who feared seeing the value of their bonds, loans and other claims on the economy’s wealth eroded by inflation. It was not hard for monetarists to show that their self-interest lay in backing an economic doctrine which depicted governments as being inherently inefficient, wasteful and/or corrupt, dominated by vested interests such as the labor unions. The Thatcherites argued that wherever public enterprise played a major role, it suffered from bureaucratic inefficiency and waste. Decision-making by entrenched constituencies (the labor unions in Britain, party members in the USSR and Argentina, and campaign contributors in the United States and Japan) led publicly owned companies to be managed uneconomically.

The way to stop this process was to turn off the monetary spigot which funded public spending. Contrary to Keynesian prescriptions, the monetarists argued, governments should limit their regulatory activity to control over the money supply, increasing it at a constant rate. They could do this only by not running into debt in the pursuit of full employment programs and other social spending. In sum, whereas Keynes had provided a rationale for government planning to sustain full employment, with an inflationary bias that he welcomed as leading to the “euthanasia of the rentier class,” monetarism took the side of creditors in urging fiscal austerity of the type imposed by the IMF on debtor countries.

Inverting Lenin’s view of governments as being the board of directors for the ruling class, the Thatcherites depicted government (at least Labour Governments when in power, which was about half the time under Britain’s two-party system) as the Board of Directors of the labor unions. They argued that industrialists could not manage in the face of unequal competition with the unions. Creditor-oriented monetarism thus merged with free-market economics of a particular kind. A Keynesian “market,” the Thatcherites accused, was very different from what an ideal market should be. The kind of competitive market that union leaders wanted was one of low unemployment conducive to wage-push inflation. For the Thatcherites, creating a “competitive market” and price stability became euphemisms for breaking trade union power.*

Creating a Populist Opposition to Public Spending

Monetarists recognized that in order to reduce taxes (without increasing the public debt), it was necessary to cut back public spending proportionally. This was, conveniently, part of their plan to scale down government in general. The path of least resistance was for politicians to create a backlash against government waste, and to reduce everyone’s taxes somewhat, while “simplifying” the fiscal system by shifting taxes away from wealth (especially in the finance, real estate and insurance sectors) onto consumers via sales taxes, excise taxes and the value-added tax (VAT).

The biggest problem faced by Mrs. Thatcher in pursuing this regressive fiscal policy was that most voters initially viewed the government as subsidizing essential public services, ensuring economic security and helping families in need. But voters also were taxpayers. Mrs. Thatcher played on their resentment against public subsidies to those who were less hard working (i.e., poorer) than themselves, seeking to attract voters to her cause through their perceptions of the existing system’s unfairness and visible inefficiencies. Although most came from wage-earning families and their natural sympathies lay with labor, she was able to denounce trade unions for their featherbedding and extortionate wage demands.

In sum, Mrs. Thatcher made no apology for fighting against tax-and-spend policies, trade unions and public ownership. What she challenged was nothing less than her society’s traditional value system. She appealed to the narrowest and most immediate self-interest of voters, not to their idealistic hopes. Her success is reflected in the fact that the 1980s became a decade in which income and property taxes were rolled back and governments began to be downsized not only in England but throughout the world.

Opposition to public spending – and the taxes to pay for it – was fanned by warnings about the dangers of inflation eroding the purchasing power of wages. What was not stressed was that the main source of global inflation was the United States, whose war in Southeast Asia had created a budget deficit and forced the world off gold. America quadrupled grain prices in 1971-72, and OPEC countries followed suit with oil prices. By the end of the 1970s the U.S. Federal Reserve raised interest rates to 20 percent in order to end the inflation by deterring bank lending. This plunged England and other countries into economic crises of their own. Future historians no doubt will find it remarkable that they sought to cope by curtailing their own budget deficits and money supply.

The monetarists viewed inflation as a domestic phenomenon that could be countered by cuts in public spending and general austerity. But their policies only made things worse, by collapsing employment and output. Falling tax revenues pushed government budgets even further into deficit, and rising interest rates increased rather than lowered prices. (Economists call this the Gibson Paradox.) High interest rates collapsed the stock and bond markets, leading to capital outflows and lower foreign-exchange rates. This increased the price of imports, pushing up prices accordingly. But monetarist politicians single-mindedly blamed the inflation on not following their austerity policies even more stringently and not cutting government spending by even more!

What the Thatcherites feared was not so much government as such, but the degree to which the trade union bureaucracy controlled the Labour Party. Like America, Britain was ruled by what was essentially a two-party system. And when one party remained in office so long that its vested interests overplayed their hand, the other party was voted in, and typically tried to reverse what its predecessor had done. Labour was bound to come to power every five to eight years or so. Under Britain’s “pendulum politics,” the prospect was for it to act as the arm of the trade unions that made up the bulk of its constituency, and to re-nationalize companies that the Conservatives had denationalized.

At the Centre for Policy Studies, Keith Joseph stressed in a 1976 pamphlet, Monetarism is not Enough, that monetary deflation by itself could not solve Britain’s problems. Workers had to be laid off. But Labour’s featherbedding practices blocked the needed downsizing. Indeed, union power was strongest in government departments and public enterprises. To be run efficiently, these had to be shifted to non-union labor. This perception helped promote the privatization of key public industries and government operations.

Mrs. Thatcher’s Anti-Union Strategy

After reducing taxes on wealth and fighting inflation by cutting back government, the monetarist objective was nothing less than to destroy British trade union power. Mrs. Thatcher nurtured a popular reaction against the unions, choosing her battles carefully. Biding her time so as not to alienate public opinion, she waited for the unions to overplay their hand – and then acted with tactics planned in advance from both a legal and a public relations vantage point.

By the time her tenure as prime minister ended, Mrs. Thatcher had carried through her program, hinted at already in the late 1970s (see Thatcher 1995:424f.). The 1988 Employment Act gave union members the right not to join in strikes their unions called without holding a ballot. The 1990 act, she wrote, “concluded the long process of whittling away at the closed shop,” by forbidding unions from excluding non-union workers from being hired.

Already in the aftermath of the 1974 coal strike, Edward Heath’s government had scaled back union immunities from lawsuits, making it a legal tort – that is, an actionable offense, punishable by fine – for unions to picket or boycott suppliers (or customers) of companies being struck. Monetary judgments henceforth could be levied against the unions.

Mrs. Thatcher also hit upon the strategy of insisting on union democracy as a ploy to counter hard-line union leaders. The traditional British procedure was for workers to vote for shop stewards (typically the most militant union members) to represent them in casting their votes for the union heads who in turn did the voting for strikes and also wielded power at the Labour Party’s annual convention. Mrs. Thatcher knew that it was much more difficult to frighten these activist shop stewards into submission than to intimidate the rank and file. Her idea accordingly was to insist that all major decisions, above all whether to strike, should be put to a full union vote in secret-ballot elections. Without this reform, she wrote, “the rest of our programme for national recovery would be blocked. . . . Winning the next [1979] election, even by a large majority, would not be enough if the only basis for it was dissatisfaction with Labour’s performance in office since 1974. Therefore, far from avoiding the union issue – as so many of my colleagues wanted – we should seek to open up debate. Moreover, this debate was not something to fear: the unions were an increasing liability to Labour and correspondingly a political asset to us. With intelligence and courage we could turn on its head the inhibiting and often defeatist talk about ‘confrontation.’”

As one Conservative remarked, “What other right winger would ever have had the cleverness to trust the common sense of the ordinary union member so sincerely? The union bosses were put in an impossible position. As the self-proclaimed tribunes of the workers they could not refuse democracy. They tried to use the argument of the expense of ballots to avoid them, so Maggie said, ‘That’s alright; the government will pay.’ Love her or hate her, one has to admire the accuracy of her perception.”

The unions overplayed their hand in the Winter of Discontent, 1978/79, but the time was not yet ripe for a showdown. “From 1980 we pursued a ‘step-by-step’ programme of trade union reform,” Mrs. Thatcher later reminisced. The 1982 “Tebbit Acts” removed the traditional union “immunities from common law tort action for damages, except for ‘primary’ strikes sanctioned by a majority in secret ballot,” observes one of her advisors, Patrick Minford. This legal chess game set the stage for her to checkmate Arthur Scargill’s coal miners in 1984-85 (her counterpart to Ronald Reagan’s 1981 destruction of the air traffic controllers’ union), by making union funds subject to awards for damages.

In 1981, Mrs. Thatcher gave in to the union rather than engage in a fight she felt she could not win in the public’s eye. She knew just how far she could go up against them, and her sense of timing enabled her to succeed. Her defeat of the 1984-85 miners’ strike (described in the next chapter) effectively cemented the new order. “In 1990, my last year as Prime Minister, the number of industrial stoppages was the lowest in any year since 1935.”

The decline in union power enabled the privatized companies and others to downsize their labor force. Between 1979 and 1986, union membership fell by three million persons. Two million industrial workers were put out of work, including over a million miners. “The new service industries, such as computer software and biotechnology,” Mrs. Thatcher wrote in 1995, “are in any case not easily unionized, and so not held back in the application of new techniques.”

A Conservative politician summed up matters: “The original purpose of privatization was to break up Trades Union Monopsony rather than manufacturer/utility Monopoly.” The politicians who joined Mrs. Thatcher’s inner circle focused on labor’s cost-push inflation, to the extent that British wage rates (and hence, product prices) were negotiated between strong-willed union leaders and (so Mrs. Thatcher claimed) weak-willed government bureaucrats.

The Conservatives depicted their warfare against the unions as being waged not against labor, but against adventurist opportunists using their constituencies for their own glory. Even communists such as Leon Trotsky had attacked craft unions such as the American Federation of Labor as representing particular layers of the labor force acting in their own narrow self-interest. Mrs. Thatcher subtly froze the union leaders out of the policy picture simply by ending the traditional ritual of beer and sandwiches in Downing Street. The trade union bosses found themselves cut off.

Mrs. Thatcher ended by excluding children and young adults under twenty-one from the minimum wage regulations, and finally abolishing the laws outright. These transformations of the labor market, she concluded, “allowed management once more to manage and so ensured that investment was once again regarded as the first call on profits rather than the last.” But a double standard seems to have been at work. The first call on profits seemed to be for higher salaries and stock options for senior managers. She denounced high taxes for deterring their efforts and praised high salaries for motivating them, yet what seemed to motivate manual workers was poverty and the loss of job security. Her rather vindictive world view did not recognize falling real wages as deterring productivity gains; only falling profits and dividends for the well-to-do led to inefficiency in the monetarist world view.

In the process of privatizing the large public enterprises, Mrs. Thatcher seized labor’s pension funds, wiping out company liability for the pensions saved up by their employees. It took several years for the European Community to rule her act illegal. The money belonged to the workers, not to the buyers of these companies.

But just who were these buyers? Where did workers fit into the picture, via their personal shareholdings and those of their pension funds?

“Popular” or “Peoples’ Capitalism”

Mrs. Thatcher recognized that an anti-union policy by itself would not suffice; she had to give workers something in return. What was needed was to cast monetarism’s anti-labor philosophy in a more positive rhetoric. Her solution was “popular capitalism,” an elaboration of what Anthony Eden and other earlier Conservatives had called a property-owning democracy.

The idea of getting workers to think of themselves as property owners had long been voiced by Conservative politicians. It began with the idea of them owning their own homes, bought on mortgage. Mrs. Thatcher started the process with Council house sales. No less than £24 billion of housing was sold off, a larger sum than for any other single public industry. But the privatization that really inaugurated “popular capitalism” was the sale of British Telecom in November 1984. The idea was nothing less than to win workers over to the cause of capitalism as an ideal, by turning them into stockholders in the economy’s commanding heights. This, she hoped, would shift their faith away from socialism in the future to capitalism in the present. “Privatization not only widens share ownership (desirable in itself),” claimed Lawson, “but increases employee share ownership, which previous privatizations show leads to further improved performance.” More politically to the point, giving property to citizens would create “a society with an inbuilt resistance to revolutionary change.”

Lawson hoped that workers would value their shareholdings more than they would resent their falling real wages. In any event, he added, “I give away few political secrets when I say that Governments are likely to be more concerned about the prospect of alienating a mass of individual shareholders” than they would be about offending a few dozen Conservative investment managers. Future Labour governments thus would have to hesitate before taking steps that would threaten the value of shares held by large numbers of workers.

Every attempt therefore was made to spread share ownership as widely as possible, for “the more widely the shares were spread, the more people had a personal stake in privatization, and were thus unlikely to support a Labour Party committed to renationalization. And if this forced Labour to abandon its commitment to renationalization, so much the better. For our objective was, so far as practically possible, to make the transfer of these businesses irreversible.” However, another Conservative politician has assured me that the small private investor “was never more than icing on the political cake.” In the end, it was the large campaign contributors who mattered after all, for their funding enabled the party to buy TV time and media space to attack Labour in the usual ways, which had little to do with the economic self-interest of workers as such.

Mrs. Thatcher’s ideal was for every employee and customer of British Gas, British Telecom and other major utilities to buy into them and thereby to acquire a stake in their efficient management. Workers who were not deemed redundant would find their wages supplemented by dividends (and capital gains) from the stocks they were able to buy with their savings. In good capitalist form they would become owners of the means of production, at least as minority shareholders. This prospect was supposed to gain popular support for breaking the trade unions, dismantling government protection of labor and withdrawing subsidies from public services. Politics became an exercise in the degree to which the perspective of labor’s economic self-interest could be foreshortened and sidetracked.

Lawson had proposed the term “people’s capitalism,” but Mrs. Thatcher felt that this sounded too much like the communist people’s republics, and preferred “popular capitalism.” This still sounded like General Pinochet’s “labor capitalism,” and indeed was a similar program of monetarist austerity, dressed up in populist rhetoric.

The attempt to make privatization irreversible shaped its tactics from the outset. In this respect its history in Britain is as much the story of political expediency as one of economic principles in the abstract. Mrs. Thatcher sought to protect the newly privatized status quo by endowing a coalition of beneficiaries who would form a bulwark against any future attempts by Labour to try to re-nationalize the enterprises being sold off. One constituency of “popular capitalism” was created by giving workers a stake in preserving the value of the shares they held in these enterprises. Another constituency consisted of the buyers (often the former managers) of the enterprises being sold off. Yet another was created by selling shares to foreign investors, so that any attempt to re-nationalize would have to confront not only British financial institutions and worker-shareholders, but American and other global diplomatic pressure. The strategy was to spread shareholding so widely that it could not be reversed.

This political strategy shaped the early privatizations. It led Lawson to offer shares at a fixed price rather than by auction, on the ground that small subscribers wanted to know just how much they would have to pay in order to be willing to buy. He later ruefully admitted that this political ploy led to an underwriting strategy that resulted in huge losses to the government (and unwarranted gains for the City financiers) as compared to what an open auction of shares would have yielded.

How Britain’s Public Enterprises were Strangled: The Needless Fight over the PSBR

The Thatcherites argued that private ownership would be inherently more efficient than government control, assuming that sound management depended on ownership alone. Lawson insisted that “you can no more make a State industry imitate private enterprise by telling it to follow textbook rules or to simulate competitive prices, than you can make a mule into a zebra by painting stripes on its back. There is no equivalent in the State sector to the discipline of the share price or the ever-present threat of bankruptcy.” Only the prospect of economic gains would lead enterprises to cut costs, improve service and become more businesslike in general.

One economist (John Kay 1988) pointed out that, “all State-owned corporations improved their productivity remarkably in the 1980s, whether they were privatized or not.” However, Lawson replied, “it was the process of preparing State enterprises for privatization . . . that initially enabled management to be strengthened and motivated, financial disciplines to be imposed and taken seriously, and costs to be cut as trade union attitudes changed.”

The real problem was that Britain’s Treasury refused to authorize the funds needed for investment as long as the enterprises remained in public hands. To stop the inflation that was distorting nearly all economies in the mid-1970s, monetarists had argued that it was necessary to cut budget deficits. The IMF won Labour adherence to this principle already under Denis Healey after Britain’s 1976 foreign-exchange crisis. Healey succumbed to IMF austerity in order to get loans to support the value of sterling. The ensuing impoverishment of Britain contributed to Labour’s 1979 downfall. Rather than leaning against the monetarist wind, Labour itself blocked public industries from financing modernization. Raising the required funds would have increased the Public Sector Borrowing Requirement – the PSBR. Having little idea of how to make public enterprises function efficiently, Labour fatally undercut the viability of these enterprises by letting monetarists control Treasury policy.

Monetarists argued that the way to control inflation was to control the money supply. Friedman explained that this meant in practice the control of the public debt. Monetarists accordingly made a bee-line for the Finance and Treasury ministries in every country. In Britain they were able to control the government through the PSBR, placing a stranglehold on public finances. This forced governments to choose between transferring assets to the public sector, or making do without capital investment and modernization.

The problem could have been cured by letting government departments operate as independent public agencies off the balance sheet, like America’s Tennessee Valley Authority (TVA) and other such entities. But the monetarist objective was not to make governments work better. Just the opposite: it was to claim that they could not work efficiently. Finance or Treasury departments in each country subject to IMF monetarist pressures made sure that this would be the case. This was the prelude in the 1970s, setting the stage for privatization in the ‘80s.

A double standard was at work. The private sector was assumed to be able to look after itself and not to run into debt imprudently. The financial sector accordingly was deregulated, and promptly created a crisis of irresponsible lending. One pitfall was that the PSBR failed to distinguish between productive and unproductive public debt. The idea of productive borrowing outside of PSBR constraints was rejected as being merely a reformist or even left-wing rationale to increase public borrowing and thereby increase the power of government. The last thing Mrs. Thatcher and her advisors really wanted to see was a reform that would enable public enterprises to be run more efficiently. In any event, public borrowing would not have generated revenue for directors, after labour’s wage levels had increased. Nor would it have generated the remarkable underwriting fees that resulted from privatization. The upshot was that British Telecom and the other monopolies needing technological revamping in the world of the 1980s could be modernized only by being privatized.

Privatization’s ultimate beneficiary was the City of London, the square mile of financial institutions that obtained the quickest benefits and turned the program into something rather unanticipated by Mrs. Thatcher and Mr. Lawson. The rentiers for their part seem to have perceived the Thatchers and Friedmans as pawns, an advance infantry of promoters wrapping austerity economics in populist garb – policies that otherwise would have been difficult (if not impossible) to sell to voters.

The irony was that most of Mrs. Thatcher’s friends and heroes were businessmen, manufacturers who made or dealt in products, not financial manipulators. But inevitably, her privatization policy led her to rely on the City financiers. Her autobiography and that of Nigel Lawson reflect their growing annoyance and even fury with the way in which the bank underwriters chosen to advise the government turned privatization into a vehicle to grow rich very fast. Mr. Lawson is scathing as to the City institutions’ lack of competence, exceeded only by their greed (always pointing out how much more venal their global partners were, to be sure). But once the government had chosen these institutions as its partner, the die was cast. It was unable to find a way to control the underwriters, and feared to disengage.

To the investment bankers placed in charge of underwriting over £65 billion (over $105 billion) of enterprises, at fees of over three billion pounds during 1979-97, and probably at least as much in short-term trading gains, the monetarist politicians appeared out of Britain’s ideological woodwork as well-meaning fools, political front-persons presenting privatization – and hence, City underwriting fortunes – as “popular capitalism.” As far as the City financiers were concerned, the politicians’ very disdain for the City made them all the better suited to act as political spear-carriers for a policy that turned control of the British economy over to the financiers themselves. What Margaret Thatcher provided was a populist and even idealistic legitimization for their gains.

The Winter of Discontent, 1978/79

Mrs. Thatcher was lucky. Accident – and indeed, the weather – intervened to play a fateful role. Under normal conditions Britain is warmed by the Gulf Stream bringing tropical water across the Atlantic Ocean from the Caribbean. This creates a warm westerly breeze that keeps British winters free of the ice that normally exists at such northerly latitudes (Britain is as far north as Canada). But occasionally – in the winter of 1947, sixteen years later in 1963, and again sixteen years later in 1979 – the wind blows from the east, bringing cold air from Russia and central Europe. Starting in November 1978, Britain was subjected to sharply below-normal temperatures that persisted right up to election day, May 9, 1979.

This 1978/79 winter descended precisely at the time when British labor unions chose to go on strike to demand pay raises in an attempt to keep up with the inflation. Like the rest of the world, Britain was suffering from the inflation and high interest rates emanating from the U.S. economy under the hapless Carter presidency. As high prices spread throughout the world, the inflation ate into the purchasing power of wages. The Labour Party had cut its political wrists by subjecting Britain to IMF austerity in the face of this inflation, and stifling new investment and hiring by public enterprises by letting the PSBR put a stranglehold on their financing. The strikes were directed against these public enterprises, for as noted earlier it was here that unionization was strongest.

The British are not equipped to deal with long periods of severe weather even in normal times, given its rarity. As a result of the public-sector strikes, the roads remained unsalted and were not gritted. Few drivers had snow tires for their cars (expecting winters normally to be mild). Traffic along the M6 motorway around Birmingham and other Midlands districts slowed to a crawl, grinding Britain’s industrial heartland to a standstill.

This became known as England’s Winter of Discontent. It led a majority of voters, who normally had voted for the Labour Party, to resent its alliance with the unions. As Mrs. Thatcher described the political situation, on December 12, 1978, “trade unions representing National Health Service and local authority workers rejected the 5 per cent pay limit and announced that they would strike in the New Year.”

The next three weeks brought heavy snow, gales and floods. Matters came to a head on Wednesday, January 3, when “the TGWU called the lorry drivers out on strike in pursuit of a 25 per cent pay rise. Some two million workers faced being laid off. Hospital patients, including terminally ill cancer patients, were denied treatment. Gravediggers went on strike in Liverpool. Refuse piled up in Leicester Square. . . . In short, Britain ground to a halt. What was more damaging even than this to the Labour Government, however, was that it had handed over the running of the country to local committees of trade unions.”

Mrs. Thatcher emphasized that Labour Prime Minister Callaghan “had based his whole political career on alliance with the trade union leaders. For him, if not for the country, it had been a winning formula. Now that the unions could no longer be appeased, he had no other policy in his locker. . . . The Government could not even decide whether to declare a State of Emergency.” Mrs. Thatcher for her part was not particularly eager to promote a government settlement with the unions; she preferred to mobilize public reaction against them. In fact, she worried that “The Labour Party might just be persuaded to agree to the negotiation of no-strike agreements in essential services, the payment by the taxpayer of the cost of secret ballots in trade unions and even a code of practice to end secondary picketing – though the last was doubtful. Equally, I was clear that if the Government did accept, we were honour-bound to keep our side of the bargain.” However, she made it a condition of support for the government that it should end the closed shop, thereby stripping unions of much of their power – something no Labour government would agree to do.

On January 16 she opened the debate in the House of Commons by describing how the “transport of goods by road was widely disrupted, in many cases due to secondary picketing of firms and operators not involved in the actual disputes. British Rail had issued a brief statement: ‘There are no trains today.’ . . . many firms were being strangled, due to shortage of materials and inability to move finished goods. There was trouble at the ports, adding to the problems of exporters. At least 125,000 people had been laid off already and the figure was expected to reach a million by the end of the week. The food industry, in particular, was in a shambolic state, with growing shortages of basic supplies like edible oils, yeast, salt and sugar. And all this on top of a winter of strikes – strikes by tanker drivers, bakers, staff at old people’s homes and hospitals; strikes in the press and broadcasting, airports and car plants; a strike of grave diggers.”

She reported that Labour’s George Brown had complained to her that “the unions had been falling more and more under the control of left-wing militancy.” But Prime Minister Callaghan then urged that the government make further concessions to the unions, including “exemptions from the 5 per cent pay limit, tighter price controls and extension of the principle of ‘comparability,’ under which public sector workers could expect more money. All these were intended as inducements to the unions to sign up to a new pay policy. But he signally failed to address what everyone except the far Left considered the main problem, excessive trade union power.”

Using language recalling that used to denounce weak-willed opposition to Hitler on the eve of World War II, she heaped scorn on Mr. Callaghan for “appeasing” the unions. Rather than fearing to alienate them, she urged her own party leaders to seize the opportunity to gain public favor by riding a wave of reaction against union over-reaching. British wages no longer were set by fair bargaining between workers and their employers, she claimed, but were negotiated by trade union leaders dictating terms to compliant government managers. The alternative, of course, was the kind of austerity dictated by IMF monetarists maintaining an employers’ market by imposing chronic under-employment and shifting enterprise out of the unionized public sector to newly privatized, non-unionized enterprises – precisely the kind of austerity that Keynesian incomes policies had sought to prevent.

Upon winning the general election, Mrs. Thatcher appointed loyal monetarists, who developed a more subtle alternative to the tight-money programs imposed by the IMF on hapless third world countries. A general monetary stringency would have lowered profits and stifled capital gains as well as wages. Britain’s monetarist strategy was to depress wage levels through “structural reform” or “structural adjustment.” The restructuring was achieved not by macroeconomic policies affecting the overall money supply and incomes, but by changing the legal framework and institutional structures within which markets operated. Union power was broken by changing the legal rules, while government economic power was dismantled by cutting taxes and selling off enterprises. The industries being privatized were subjected to much the same downsizing and asset stripping as private companies taken over by corporate raiders and/or leveraged buyouts in the 1980s.

How Monetarism Laid the Groundwork for Privatization

Ostensibly a theory of money and prices, monetarism became an ideology to attack government spending and organized labor. The theory’s guiding idea was that price levels could be determined by controlling the money supply – by the central bank managing the rate at which government deficits were monetized. Meanwhile, wage-push inflation could be countered by taking legal steps to break the power of unions to strike and to declare boycotts. The effect was to remove economic planning from the hands of government. The vacuum would be filled by global investment bankers. Efficient management was to take the form of maximizing stock-market gains, not the promotion of full employment and other non-market social welfare objectives.
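The quantity-theory identity underlying this guiding idea can be sketched as follows (this is the standard textbook formulation, not drawn from the article itself):

```latex
MV = PQ
```

where \(M\) is the money supply, \(V\) the velocity of circulation, \(P\) the price level, and \(Q\) real output. Monetarists treat \(V\) and \(Q\) as roughly stable, so restraining the growth of \(M\) is held to restrain \(P\). The critique developed below turns on the direction of causality in this identity and on how broadly \(M\) must be defined once private credit is counted.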

Keynes had been a monetary theorist of a different stripe. He saw that money, in the sense of spending power, comprised effectively the entire credit superstructure. Any income-yielding asset could be collateralized as the basis for credit. Indeed, credit – and in effect, purchasing power – can be created simply by companies not paying their bills. These unpaid bills became assets on the books of their suppliers (“receivables” that could be financed through the banking system). In this respect the volume of credit and near-money is virtually synonymous with the economy’s overall volume of debt. This perception forms the basis for post-Keynesian “creditary” or “balance sheet” economics, a more comprehensive alternative to monetarist doctrine. (Gardiner 1993 provides a technical discussion.)

Monetarism reveals its political bias by singling out only public debt as the source of inflation, ignoring the mushrooming private debt. This one-sidedness has proved to be its Achilles’ heel. Yet it was precisely this narrow anti-government focus that attracted Mrs. Thatcher and other libertarian politicians to monetarism in the first place.

Monetarism’s appeal is political and rhetorical, not based on sound economic evidence. (Its correlations of money and prices fail to acknowledge the arrow of causality, especially at the foreign-exchange margin. See Hudson 1992 for a detailed critique.) Controlling the public debt by reining in government can represent only part of a comprehensive system of monetary management, for in practice the money supply – the means of settling obligations – turns out to be nothing less than the overall credit supply. This in turn includes the economy’s “near-money” in the form of all marketable assets and debt instruments. Attempts to manage money, narrowly defined as government debt, are thus in vain.

The real reason why monetarists seek to control the Treasury or Finance Department and the central bank in every country is to achieve their political ends. From their position in these financial control centers, they put the brakes on government operations across the board, or promote other pet policies. Monetarist doctrine provides the ideological wrapping to present this control as a form of idealism and individualism.

Although privatization was not a centerpiece of Mrs. Thatcher’s original program, she placed members of her inner circle in charge of the financial ministries and the public enterprises first in line to be privatized, to set about preparing them for sale. In addition to helping the government budget, privatization would remove enterprises from control by the trade unions. And turning power over to privatized management would let the new owners begin economizing by downsizing their labor forces.

Emphasis Mine

see: http://www.alternet.org/margaret-thatcher-was-privatization-pioneer-and-story-how-her-agenda-did-nothing-make-life-worse?akid=10314.123424.EapCrY&rd=1&src=newsletter822988&t=19