Margaret Thatcher Was a Privatization Pioneer, and This Is the Story of How Her Agenda Did Nothing But Make Life Worse for Millions of People

From: Michael Hudson’s blog, via AlterNet

Author: Michael Hudson

As in Chile, privatization in Britain was a victory for Chicago monetarism. This time it was implemented democratically. In fact, voters endorsed Margaret Thatcher’s selloff of public industries so strongly that by late 1990, when she was replaced as prime minister by her own party’s John Major, only 35 percent of Britain’s voters supported the Labour Party – half the proportion registered in 1945. The Conservatives sold off public monopolies, used the proceeds to cut taxes, and put the privatized firms on a profit-making basis. Their stock prices rose sharply, producing capital gains for investors whose ranks included millions of Britons who had been employees and/or customers of these enterprises.

Yet by 1997 the Conservatives were voted out of office by one of the largest margins in their history. What concerned voters were the results of privatization that Mrs. Thatcher had not warned them about. Prices did not decline proportionally to cost cuts and productivity gains. Many services were cut back, especially on the least utilized transport routes. The largest privatized bus company was charged with cut-throat monopoly practices. The water system broke down, while consumer charges leapt. Electricity prices were shifted against residential consumers in favor of large industrial users. Economic inequality widened as the industrial labor force shrank by two million from 1979 to 1997, while wages stagnated in the face of soaring profits for the privatized companies. The tax cuts financed by their selloff turned out to benefit mainly the rich.

Opinion polls showed that voters had opposed privatization at the outset (as did the press and many Conservative back benchers), but the Conservatives pointed out that Tony Blair rode to victory in part by abandoning “Clause Four” of the Labour Party’s 1918 constitution, advocating state control over the means of production, distribution and exchange. Most voters wanted tighter regulation in the public interest, but not a return to state ownership. On the other hand, they feared the prospect of selling off the post office, the BBC and the London tube (subway) system.

Nearly everyone agreed that companies were run differently in private hands than was the case under public ownership, even when the same managers remained in charge. Privatization was praised by Mrs. Thatcher and her allies – and blamed by many others – for managing these companies to generate capital gains for stockholders rather than to serve broader social ends.

Many people did not believe that essential public-sector industries should be run as commercial gain-seeking enterprises. Among the norms of public service, making a profit certainly was not one of the yardsticks used by the bureaucracy put in charge of these companies. Public-sector labor unions aimed more at maintaining employment than at producing revenue for the state as owner. The purpose of taxes, after all, was to subsidize basic services to the population.

This attitude had long been shared by many Conservatives, as well as by Labour. When Benjamin Disraeli created the Conservative Party in its modern form in the mid-nineteenth century (replacing the old royalist Tory Party), his major ideological adversary was not socialism but the free-trade liberalism that led Britain to repeal its protectionist agricultural tariffs (the Corn Laws) in 1846. Indeed, as a novelist Disraeli sought to expose the horrors of unbridled laissez faire. In Sybil, or The Two Nations, written in 1845 (three years before the Communist Manifesto), he described the rich and the poor as constituting “two nations between whom there is no intercourse and no sympathy, and . . . who are not governed by the same laws.” His novel assigned the loftiest ideals to Sybil, the daughter of a factory worker, but placed his hopes in a morally regenerate aristocracy. And in due course, Disraeli’s social welfare legislation, especially the public health system introduced from 1874 to 1881 (he said that his motto was Sanitas sanitatum, “Health, all is health”), helped the Conservative Party evolve as a nationalist and sometimes “state socialist” party, especially after World War II under Harold Macmillan in the 1960s and even Edward Heath in the ‘70s.

But it was the Labour Party that pressed for nationalization of the major industries. Fabian socialists such as Sidney and Beatrice Webb, George Bernard Shaw and other wealthy opinion-makers typified the degree to which many of Britain’s leading upper-class intellectuals supported nationalization as a cure for the ills of industrial capitalism. Indeed, the aristocracy underwent a schooling in personal economic values that resembled those of ancient Greece and Rome in their disdain for the idea that one’s life should be devoted to so lowly a purpose as commercial gain-seeking.

Britain’s government was controlled for about half of the postwar period by the British Labour Party, which in turn was controlled by the trade unions. This gave the unions more political power than in any other country. Conversely, the Labour Party’s strength was based on the unions. Most workers employed by the public utilities and other government enterprises belonged to the Transport and General Workers’ Union (TGWU). Although the number of individual party members was relatively small, all of the TGWU’s approximately one million members were deemed to belong simply by virtue of their union membership. The union’s general secretary cast their votes as a bloc at the Labour Party’s annual convention.

Trade unions were given broad privileges in 1906, subsequently restricted by the Trade Disputes and Trade Unions Act of 1927, passed largely in retaliation against the 1926 general strike. This act made it mandatory for union members to opt in to the payment of the union’s political levy to the Labour Party. After World War II the rule was changed to give unions a right to opt out of paying the political levy. This had the ironic effect of placing the Labour Party’s finances more firmly in the hands of the union leaders. At the Labour Party conferences these leaders voted on behalf of all their members who had paid the levy. The TGWU thus was placed in a position to cast one million of the party’s roughly six million votes.

Labour endorsed the nationalization of industry so as to serve the interests of workers. As noted above, Clause Four of its constitution, adopted in 1918 in the aftermath of World War I, called for the state to control the means of production, distribution and exchange. In 1945 the incoming Labour government nationalized the gas and electric utilities, as well as most transport lines that remained municipally or privately owned. Nearly all were run at a loss, which duly was covered by public subsidy.

World War II had been the great catalyst for faith in public ownership and national planning. Some four-fifths of Britain’s gross domestic product (GDP) was commandeered by the government. By the end of the 1940s most utilities and natural monopolies were in public ownership at the national or local level, or (as in the case of water) were held by public companies with restricted returns for the owners of their equity shares. The coal mines, gas and electric utilities, road transport and railways all were nationalized. The foundations and basic cost structure of Britain’s economy thus were shaped by these public utilities, public housing and socialized medicine, not to mention British Petroleum (BP) and, in time, the government’s North Sea oil holdings. And in due course the automotive, steel and aircraft sectors were rescued from collapse by being nationalized, henceforth to be run at heavy losses subsidized by taxpayers.

Clement Attlee’s Labour government of 1945-51 cited five reasons for nationalizing British industry. As Mrs. Thatcher’s Chancellor of the Exchequer Nigel Lawson has summarized, the first reason was to improve industrial relations. In practice, he retorted, this meant caving in to the trade union leaders, especially inasmuch as a second objective of postwar nationalization was to ensure full employment. The effect was to inflate wage rates through make-work programs and featherbedding.

A third reason for nationalization was to maximize productivity gains, by removing absentee rentier owners from the scene. The actual result, pointed out the Thatcherites, was an uneconomic management of the labor force. Nationalization also had focused on regulating natural monopolies in the public interest – that is, by politicians – by administering prices and providing service on a basis other than profit objectives. The monetarists would argue that straight profit objectives were more efficient.

A fifth argument for nationalization had been the strongest. It was intended to replace short-term profit maximization by wider national and social priorities. But governments tend to live just as much in the short run as do corporate managers. More to the point, politicians seek to win votes by placating labor on the eve of elections. “The nationalized industries,” argued Lawson, “so far from improving industrial relations, proved the source of the biggest threat to industrial peace – doubtless because of the combination of centralized union power and recourse to the bottomless public purse.” At least, this argument was more understandable in 1979 than it had been in 1949.

If it seemed that government enterprise could succeed where private management failed, the reason was to be found largely in its claim on the public purse. The losses run up by these enterprises were financed by income taxes whose rates for business and the upper brackets were among the world’s highest, as were inheritance taxes. Indeed, many considered Britain to have been turned into Europe’s most socialist economy after 1945. Yet the objective seemed not to be the provision of efficient service at world-class levels. Public housing, originally a showpiece, deteriorated into what some called “modernist trash,” while the telephone system remained archaic. Public bureaucracies came to be seen as personal baronies whose administrators made little attempt to apply business methods or cost accounting. Yet their book value far exceeded the stock-market valuation of private companies.

Most Conservatives acquiesced in the idea of national planning as the government increased its share of the economy from 40 per cent to over 60 per cent by the late 1970s. As Mrs. Thatcher observed, “It was, after all, none other than Harold Macmillan who in 1938 proposed in his influential book The Middle Way to extend state control and planning over a wide range of production and services.” Most social legislation since World War II was bipartisan, including the new National Health Service and the National Insurance legislation of 1946. Running a public enterprise was prestigious for many members of the upper classes. And the government was willing to bail out industries when they went bankrupt, with full compensation to investors – something that the market could not have done.

Margaret Thatcher’s Monetarist World View

Mrs. Thatcher has described how her upbringing living over her father’s grocery store in the small town of Grantham shaped her impressions of how society worked. “There is no better course for understanding free-market economics than life in a corner shop.” It was an experience that inoculated her “against the conventional economic wisdom of post-war Britain,” that is, the faith in government planning and the disdain felt among the literati for entrepreneurial values. Hers was the world of “Methodism, the grocer’s shop, Rotary and all the serious, sober virtues cultivated and esteemed in that environment.”

This Babbitt-like view of the world did not prepare her to think about the economic impact of debt, a serious blind spot for nearly all monetarists. She confessed that her idea of debt management was based on balancing the family checkbook, as if this was a proper analogy for public finance and government control of the printing press and a central bank to create money at will. To Mrs. Thatcher a government deficit simply meant more debt, and hence more taxes to be paid. “Thrift was a virtue and profligacy a vice,” she wrote. Taxes were “a deterrent to work,” not the means by which vital public services were supplied. It was as if such services had no economic value. Incomes policies were epitomized by the undeserving poor living better on state subsidies in public council housing than hard-working families who struggled to pay their rent or meet their mortgage payments. This was a view reflecting middle-class resentment against subsidized services extended to families lower on the economic scale.

One does not learn much about macroeconomics from a store. A shopkeeper buys what already has been produced; how it is made is not of much concern. In fact, Mrs. Thatcher’s world view was naturally akin to that of Chicago School monetarism. The focus was simply on how to undercut the prices of one’s competitors, preferably by cutting taxes and the costly social welfare schemes on which they were spent.

The ideological pedigree for the Chicago School’s narrow-minded economics was provided by Friedrich Hayek and Milton Friedman. Hayek’s most famous book, The Road to Serfdom (1944), opposed any and all government planning in principle as leading inevitably to either fascist or Communist authoritarianism. When Keith Joseph gave Mrs. Thatcher a copy of this book, she readily responded to his hard line. “Hayek saw that Nazism – national socialism – had its roots in nineteenth-century German social planning. He showed that intervention by the state in one area of the economy or society gave rise to almost irresistible pressures to extend planning further into other sectors. He alerted us to the profound, indeed revolutionary, implications of state planning for Western civilization as it had grown up over the centuries.” This would underlie her opposition to European unification under the Maastricht Treaty.

To most people the government appeared as the benign sponsor of the welfare state that emerged from World War II’s mobilization. But by the late 1970s the sclerosis of public industries threatened to make Britain economically ungovernable. In these circumstances the Chicago School’s anti-statism found an increasingly fertile intellectual ground.

It was natural for self-made people such as Margaret Thatcher to prefer a private-sector market economy to a state bureaucracy. Private enterprise beholden to shareholders hardly can afford patronage and cronyism. Of former Conservative Prime Minister Harold Macmillan’s broad and inclusive politics, she acknowledged disdainfully that “The traditional economic liberalism which constituted so important a part of my political make-up . . . was often alien and uncongenial to Conservatives from a more elevated social background.”

She and her supporters stood more in the tradition of the old Liberal Party, dressing up the ideas of Adam Smith in monetarist Chicago garb, seeing in government planning a road to serfdom at worst, and incompetence at best. She warned against the dangers of inflation spurred by government borrowing, but said little about private debt.

Mrs. Thatcher thus was ideologically harder than her pragmatic Conservative predecessor Edward Heath, and represented a break from her party’s traditions. She admired what the Chicago Boys had done in Chile, and would find kindred monetarist souls among Russian “reformists”. “Let us glory in our inequality,” she preached at one banquet, explaining that more inequality meant that more wealth was being created by “savers” at the top of the economic pyramid, presumably to trickle down via new direct investment. However, she recognized that such policies could be introduced in England only by an elected government. The task she set before herself was to win British voters to support her reforms voluntarily, for imposing them by armed force was out of the question.

It was taken as a matter of faith that financial gains would be invested in upgrading the enterprises once they were privatized, installing new machinery and hiring more labor to provide better service while increasing output at falling prices. Workers were invited to think of themselves as finance-capitalists-in-miniature, earning dividends and capital gains by investing their savings in the shares in these companies. This was the essence of Mrs. Thatcher’s “popular capitalism.” In her pursuit of these objectives the Iron Lady became Britain’s first prime minister to be elected for three consecutive terms, to retain this office for over ten consecutive years, and to have an “ism” named after her.

But first, she had to convince her fellow Conservatives. This became her major initial fight, within her own party.

How British Monetarism Planned the Neo-Conservative Takeover

No economic theory can be promoted successfully today without institutional sponsorship. In America, monetarist ideas were spread by policy institutes such as the Heritage Foundation, the Cato Institute and the American Enterprise Institute. Likewise in England, if the history of privatization is dominated by Margaret Thatcher, her victory was largely a product of British monetarism’s main policy institute, the Centre for Policy Studies (CPS), founded in 1974 by her mentor Keith Joseph (then a Member of Parliament). With Mrs. Thatcher as its President, the CPS used the economic philosophy of Friedrich Hayek (the “father of monetarism”) and Milton Friedman to launch the “Thatcher Interlude” that culminated in 1979 with her election as Prime Minister.

Britain could claim the Austrian-born Hayek as one of its own. He had become a British citizen in 1938, and held the Tooke Chair in economics at the London School of Economics from 1931 to 1950. (Ironically, Thomas Tooke was the great anti-monetarist, a century and a half earlier, in the 1830s.) To help spread his political philosophy, he helped create the Institute of Economic Affairs in 1955, the Adam Smith Institute in 1977 (serving as its first chairman), and the Social Affairs Unit in 1980.

Hayek wanted to abandon all public regulatory structures. Followed by Friedman, he argued that all attempts by government to shape markets were doomed to failure. Planning itself was wrongheaded in principle. As Nigel Lawson summarized this philosophy: “Economic planning was both impossible and unnecessary. . . . The price mechanism . . . was a much more efficient means of transmitting consumer wants and needs than the vast bureaucracies of Whitehall and the nationalized industries.”

This view of idealism as serving to strengthen state power enabled the Conservatives to take the moral high ground, Lawson continued, “by elevating private actions above public direction and dismissing ‘social justice’ as both vague and arbitrary.” The only valid idealism was to destroy the state. This could best be done by cutting off the government’s financial taproot, the ability to create the money needed to finance its budget deficits. The alternative to government bureaucracy, Lawson concluded, was to create a new political ideal for capitalism: to turn “profit” and “capitalism” into words of praise; “planning,” “government” and “taxes” became the new terms of invective.

Hayek joined the Chicago economics faculty in 1950, two years after Friedman, who spent 1953-54 in England as a visiting Fulbright Lecturer at Cambridge. At that time, he reminisced (in Capitalism and Freedom), “Those of us who were profoundly worried by the danger to freedom and prosperity posed by the growth of government, the triumph of the welfare state and Keynesian ideas, made up a small minority and we were considered eccentric by the vast majority of our intellectual colleagues.” Monetarism was deemed eccentric because it saw in government only the power to tax and oppress, not to protect and support. (Herman Kahn’s wife, Jane, likes to tell the anecdote of how Milton Friedman once replied when she asked whether social spending on needy children was not one type of public welfare that was well justified: “Mrs. Kahn, why do you want to subsidize the production of orphans?”) To the monetarists, all socially ameliorative spending appeared only as an economic distortion on the expenditure side, and as a burden on industry on the tax side of the tax-and-spend equation.

Mrs. Thatcher’s truculent Joan of Arc personality found a kindred soul in Alfred Sherman, CPS’s Director of Studies, whom she described as an ex-Communist who brought a “convert’s zeal” to the monetarist cause. Like so many former left-wingers, he seems never to have forgiven the working class for not following his early entreaties. And much like a spurned lover, he got his revenge as a Tory. But he retained from Marxism an awareness of economic theory’s political service as apologetics for one class or the other. He found in monetarism not so much an objective analysis of money and credit as a means of blaming inflation on government spending. Cutting off the government’s ability to run into debt would leave the power of private capital (“the market”) to take its place.

If Sherman was the ideological gadfly, Mrs. Thatcher was the master of political tactics. Her genius lay in seeing that public bureaucracies were ripe for the plucking, along with the Keynesian macroeconomic theory that served as their intellectual foundation. Most Britons believed that once a path was embarked upon, it could not be changed, to say nothing of being diametrically reversed. The denationalization of industry appeared politically impossible. Indeed, Labour governments believed they could bring one sector after another into the public domain. To Mrs. Thatcher this was the road to serfdom, and she sought to reverse the trend. She alone had the confidence to go on the offensive rather than passively decrying the trend towards larger public control of the economy. It was largely a result of her initiative that Britain, the nation with Europe’s strongest social democratic tradition and the most highly developed public sector, became the first to reverse what seemed initially to be an inexorable trend toward greater state control.

The Monetarist Attack on Full-Employment “Demand Management”

Mrs. Thatcher, Keith Joseph, Alfred Sherman and Nigel Lawson challenged the idea that economies could be managed by income policies aimed at achieving full employment. This objective, voiced by John Maynard Keynes in the 1930s in his General Theory, had become political orthodoxy throughout most of the world by the 1950s and ‘60s, and was endorsed both by Conservatives and Labour.

In America, the (“Full”) Employment Act of 1946 had replaced what Marx called the chronic “reserve army of the unemployed” by employment policies aimed at absorbing surplus labor through public spending. This policy met its Waterloo at the hands of Gardner Ackley of the Council of Economic Advisors and Robert McNamara, who tried to calculate just how much war America could afford, and indeed how much was needed to create “effective demand.”

In England, Mrs. Thatcher and her allies opposed Keynesian income policy on the ground that it supported wages (and hence, priced British goods out of world markets) simply to create “demand,” without regard for productivity. The achievement of “full-employment stability” was illusory, the monetarists accused, for it entailed monetary instability. Acting as the employer of last resort (or injecting enough “effective demand” to ensure full employment), governments created inflationary pressures by monetizing public debts. The ensuing inflation threatened bondholders and hence deterred their motivation to save, by reducing the purchasing power of their rentier income. The tacit assumption was that their “saving” would have funded new direct investment and employment rather than real estate or stock market speculation in assets already in place.

The major backers of monetarism duly became the rentier interests (banks, insurance companies and other institutional investors, as well as wealthy coupon clippers) who feared seeing the value of their bonds, loans and other claims on the economy’s wealth eroded by inflation. It was not hard for monetarists to show that their self-interest lay in backing an economic doctrine which depicted governments as being inherently inefficient, wasteful and/or corrupt, dominated by vested interests such as the labor unions. The Thatcherites argued that wherever public enterprise played a major role, it suffered from bureaucratic inefficiency and waste. Decision-making by entrenched constituencies (the labor unions in Britain, party members in the USSR and Argentina, and campaign contributors in the United States and Japan) led publicly owned companies to be managed uneconomically.

The way to stop this process was to turn off the monetary spigot which funded public spending. Contrary to Keynesian prescriptions, the monetarists argued, governments should limit their regulatory activity to control over the money supply, increasing it at a constant rate. They could do this only by not running into debt in the pursuit of full employment programs and other social spending. In sum, whereas Keynes had provided a rationale for government planning to sustain full employment, with an inflationary bias that he welcomed as leading to the “euthanasia of the rentier class,” monetarism took the side of creditors in urging fiscal austerity of the type imposed by the IMF on debtor countries.

Inverting Lenin’s view of governments as being the board of directors for the ruling class, the Thatcherites depicted government (at least Labour Governments when in power, which was about half the time under Britain’s two-party system) as the Board of Directors of the labor unions. They argued that industrialists could not manage in the face of unequal competition with the unions. Creditor-oriented monetarism thus merged with free-market economics of a particular kind. A Keynesian “market,” the Thatcherites accused, was very different from what an ideal market should be. The kind of competitive market that union leaders wanted was one of low unemployment conducive to wage-push inflation. For the Thatcherites, creating a “competitive market” and price stability became euphemisms for breaking trade union power.

Creating a Populist Opposition to Public Spending

Monetarists recognized that in order to reduce taxes (without increasing the public debt), it was necessary to cut back public spending proportionally. This was, conveniently, part of their plan to scale down government in general. The path of least resistance was for politicians to create a backlash against government waste, and to reduce everyone’s taxes somewhat, while “simplifying” the fiscal system by shifting taxes away from wealth (especially in the finance, real estate and insurance sectors) onto consumers via sales taxes, excise taxes and the value-added tax (VAT).

The biggest problem faced by Mrs. Thatcher in pursuing this regressive fiscal policy was that most voters initially viewed the government as subsidizing essential public services, ensuring economic security and helping families in need. But voters also were taxpayers. Mrs. Thatcher played on their resentment against public subsidies to those who were less hard working (i.e., poorer) than themselves, seeking to attract voters to her cause through their perceptions of the existing system’s unfairness and visible inefficiencies. Although most voters came from wage-earning families whose natural sympathies lay with labor, she was able to denounce trade unions for their featherbedding and extortionate wage demands.

In sum, Mrs. Thatcher made no apology for fighting against tax-and-spend policies, trade unions and public ownership. What she challenged was nothing less than her society’s traditional value system. She appealed to the narrowest and most immediate self-interest of voters, not to their idealistic hopes. Her success is reflected in the fact that the 1980s became a decade in which income and property taxes were rolled back and governments began to be downsized not only in England but throughout the world.

Opposition to public spending – and the taxes to pay for it – was fanned by warnings about the dangers of inflation eroding the purchasing power of wages. What was not stressed was that the main source of global inflation was the United States, whose war in Southeast Asia had created a budget deficit and forced the world off gold. America quadrupled grain prices in 1971-72, and OPEC countries followed suit with oil prices. By the end of the 1970s the U.S. Federal Reserve raised interest rates to 20 percent in order to end the inflation by deterring bank lending. This plunged England and other countries into economic crises of their own. Future historians no doubt will find it remarkable that they sought to cope by curtailing their own budget deficits and money supply.

The monetarists viewed inflation as a domestic phenomenon that could be countered by cuts in public spending and general austerity. But their policies only made things worse, by collapsing employment and output. Falling tax revenues pushed government budgets even further into deficit, and rising interest rates increased rather than lowered prices. (Economists call this the Gibson Paradox.) High interest rates collapsed the stock and bond markets, leading to capital outflows and lower foreign-exchange rates. This increased the price of imports, pushing up prices accordingly. But monetarist politicians single-mindedly blamed the inflation on not following their austerity policies even more stringently and not cutting government spending by even more!

What the Thatcherites feared was not so much government as such, but the degree to which the trade union bureaucracy controlled the Labour Party. Like America, Britain was ruled by what was essentially a two-party system. And when one party remained in office so long that its vested interests overplayed their hand, the other party was voted in, and typically tried to reverse what its predecessor had done. Labour was bound to come to power every five to eight years or so. Under Britain’s “pendulum politics,” the prospect was for it to act as the arm of the trade unions that made up the bulk of its constituency, and to re-nationalize companies that the Conservatives had denationalized.

At the Centre for Policy Studies, Keith Joseph stressed in a 1976 pamphlet, Monetarism Is Not Enough, that monetary deflation by itself could not solve Britain’s problems. Workers had to be laid off. But Labour’s featherbedding practices blocked the needed downsizing. Indeed, union power was strongest in government departments and public enterprises. To be run efficiently, these had to be shifted to non-union labor. This perception helped promote the privatization of key public industries and government operations.

Mrs. Thatcher’s Anti-Union Strategy

After reducing taxes on wealth and fighting inflation by cutting back government, the monetarist objective was nothing less than to destroy British trade union power. Mrs. Thatcher nurtured a popular reaction against the unions, choosing her battles carefully. Biding her time so as not to alienate public opinion, she waited for the unions to misplay – and then acted with tactics planned in advance both from a legal and public relations vantage point.

By the time her tenure as prime minister ended, Mrs. Thatcher had carried through her program, hinted at already in the late 1970s (see Thatcher 1995:424f.). The 1988 Employment Act gave union members the right not to join in strikes their unions called without holding a ballot. The 1990 act, she wrote, “concluded the long process of whittling away at the closed shop,” by forbidding unions from excluding non-union workers from being hired.

Already in the aftermath of the 1974 coal strike, Edward Heath’s government had scaled back union immunities from lawsuits, making it a legal tort – that is, an actionable offense, punishable by fine – for unions to picket or boycott suppliers (or customers) of companies being struck. Monetary judgments henceforth could be levied against the unions.

Mrs. Thatcher also hit upon the strategy of insisting on union democracy as a ploy to counter hard-line union leaders. The traditional British procedure was for workers to vote for shop stewards (typically the most militant union members) to represent them in casting their votes for the union heads who in turn did the voting for strikes and also wielded power at the Labour Party’s annual convention. Mrs. Thatcher knew that it was much more difficult to frighten these activist shop stewards into submission than to intimidate the rank and file. Her idea accordingly was to insist that all major decisions, above all whether to strike, should be put to a full union vote in secret-ballot elections open to the entire membership. Without this reform, she wrote, “the rest of our programme for national recovery would be blocked. . . . Winning the next [1979] election, even by a large majority, would not be enough if the only basis for it was dissatisfaction with Labour’s performance in office since 1974. Therefore, far from avoiding the union issue – as so many of my colleagues wanted – we should seek to open up debate. Moreover, this debate was not something to fear: the unions were an increasing liability to Labour and correspondingly a political asset to us. With intelligence and courage we could turn on its head the inhibiting and often defeatist talk about ‘confrontation.’”

As one Conservative remarked, “What other right winger would ever have had the cleverness to trust the common sense of the ordinary union member so sincerely? The union bosses were put in an impossible position. As the self-proclaimed tribunes of the workers they could not refuse democracy. They tried to use the argument of the expense of ballots to avoid them, so Maggie said, ‘That’s alright; the government will pay.’ Love her or hate her, one has to admire the accuracy of her perception.”

The unions overplayed their hand in the Winter of Discontent, 1978/79, but the time was not yet ripe for a showdown. “From 1980 we pursued a ‘step-by-step’ programme of trade union reform,” Mrs. Thatcher later reminisced. The 1982 “Tebbit Acts” removed the traditional union “immunities from common law tort action for damages, except for ‘primary’ strikes sanctioned by a majority in secret ballot,” observes one of her advisors, Patrick Minford. This legal chess game set the stage for her to checkmate Arthur Scargill’s coal miners in 1984-85 (her counterpart to Ronald Reagan’s 1981 destruction of the air traffic controllers’ union, PATCO), by making union funds subject to awards for damages.

In 1981, Mrs. Thatcher gave in to the coal miners rather than engage in a fight she felt she could not win in the public’s eye. She knew just how far she could go up against the unions, and her sense of timing enabled her to succeed. Her defeat of the 1984-85 miners’ strike (described in the next chapter) effectively cemented the new order. “In 1990, my last year as Prime Minister, the number of industrial stoppages was the lowest in any year since 1935.”

The decline in union power enabled the privatized companies and others to downsize their labor force. Between 1979 and 1986, union membership fell by three million persons. Two million industrial workers were put out of work, including over a million miners. “The new service industries, such as computer software and biotechnology,” Mrs. Thatcher wrote in 1995, “are in any case not easily unionized, and so not held back in the application of new techniques.”

A Conservative politician summed up matters: “The original purpose of privatization was to break up Trades Union Monopsony rather than manufacturer/utility Monopoly.” The politicians who joined Mrs. Thatcher’s inner circle focused on labor’s cost-push inflation, arguing that British wage rates (and hence, product prices) were negotiated between strong-willed union leaders and (so Mrs. Thatcher claimed) weak-willed government bureaucrats.

The Conservatives depicted their warfare against the unions as being waged not against labor, but against adventurist opportunists using their constituencies for their own glory. Even communists such as Leon Trotsky had attacked craft unions such as the American Federation of Labor as representing particular layers of the labor force acting in their own narrow self-interest. Mrs. Thatcher subtly froze the union leaders out of the policy picture simply by ending the traditional ritual of beer and sandwiches in Downing Street. The trade union bosses found themselves cut off.

Mrs. Thatcher ended by excluding children and young adults under twenty-one from the minimum wage regulations, and finally abolished the laws outright. These transformations of the labor market, she concluded, “allowed management once more to manage and so ensured that investment was once again regarded as the first call on profits rather than the last.” But a double standard seems to have been at work. The first call on profits seemed to be for higher salaries and stock options for senior managers. She denounced high taxes for deterring their efforts and praised high salaries for motivating them, yet what seemed to motivate manual workers was poverty and the loss of job security. Her rather vindictive world view did not recognize falling real wages as deterring productivity gains; only falling profits and dividends for the well-to-do led to inefficiency in the monetarist world view.

In the process of privatizing the large public enterprises, Mrs. Thatcher seized labor’s pension funds, wiping out company liability for the pensions saved up by their employees. It took several years for the European Community to rule her act illegal. The money belonged to the workers, not to the buyers of these companies.

But just who were these buyers? Where did workers fit into the picture, via their personal shareholdings and those of their pension funds?

“Popular” or “Peoples’ Capitalism”

Mrs. Thatcher recognized that an anti-union policy by itself would not suffice; she had to give workers something in return. What was needed was to cast monetarism’s anti-labor philosophy in a more positive rhetoric. Her solution was “popular capitalism,” an elaboration of what Anthony Eden and other earlier Conservatives had called a property-owning democracy.

The idea of getting workers to think of themselves as property owners had long been voiced by Conservative politicians. It began with the idea of them owning their own homes, bought on mortgage. Mrs. Thatcher started the process with council house sales. No less than £24 billion worth were sold off, more than the proceeds from any other single public industry. But the privatization that really inaugurated “popular capitalism” was the sale of British Telecom in November 1984. The idea was nothing less than to win workers over to the cause of capitalism as an ideal, by turning them into stockholders in the economy’s commanding heights. This, she hoped, would shift their faith away from socialism in the future to capitalism in the present. “Privatization not only widens share ownership (desirable in itself),” claimed Nigel Lawson, her Chancellor of the Exchequer, “but increases employee share ownership, which previous privatizations show leads to further improved performance.” More politically to the point, giving property to citizens would create “a society with an inbuilt resistance to revolutionary change.”

Lawson hoped that workers would value their shareholdings more than they would resent their falling real wages. In any event, he added, “I give away few political secrets when I say that Governments are likely to be more concerned about the prospect of alienating a mass of individual shareholders” than they would be about offending a few dozen Conservative investment managers. Future Labour governments thus would have to hesitate before taking steps that would threaten the value of shares held by large numbers of workers.

Every attempt therefore was made to spread share ownership as widely as possible, for “the more widely the shares were spread, the more people had a personal stake in privatization, and were thus unlikely to support a Labour Party committed to renationalization. And if this forced Labour to abandon its commitment to renationalization, so much the better. For our objective was, so far as practically possible, to make the transfer of these businesses irreversible.” However, another Conservative politician has assured me that the small private investor “was never more than icing on the political cake.” In the end, it was the large campaign contributors who mattered after all, for their funding enabled the party to buy TV time and media space to attack Labour in the usual ways, which had little to do with the economic self-interest of workers as such.

Mrs. Thatcher’s ideal was for every employee and customer of British Gas, British Telecom and other major utilities to buy into them and thereby to acquire a stake in their efficient management. Workers who were not deemed redundant would find their wages supplemented by dividends (and capital gains) from the stocks they were able to buy with their savings. In good capitalist form they would become owners of the means of production, at least as minority shareholders. This prospect was supposed to gain popular support for breaking the trade unions, dismantling government protection of labor and withdrawing subsidies from public services. Politics became an exercise in the degree to which the perspective of labor’s economic self-interest could be foreshortened and sidetracked.

Lawson had proposed the term “people’s capitalism,” but Mrs. Thatcher felt that this sounded too much like the communist people’s republics, and preferred “popular capitalism.” This still sounded like General Pinochet’s “labor capitalism,” and indeed was a similar program of monetarist austerity, dressed up in populist rhetoric.

The attempt to make privatization irreversible shaped its tactics from the outset. In this respect its history in Britain is as much the story of political expediency as one of economic principles in the abstract. Mrs. Thatcher sought to protect the newly privatized status quo by endowing a coalition of beneficiaries who would form a bulwark against any future attempts by Labour to try to re-nationalize the enterprises being sold off. One constituency of “popular capitalism” was created by giving workers a stake in preserving the value of the shares they held in these enterprises. Another constituency consisted of the buyers (often the former managers) of the enterprises being sold off. Yet another was created by selling shares to foreign investors, so that any attempt to re-nationalize would have to confront not only British financial institutions and worker-shareholders, but American and other global diplomatic pressure. The strategy was to spread shareholding so widely that it could not be reversed.

This political strategy shaped the early privatizations. It led Lawson to offer shares at a fixed price rather than by auction, on the ground that small subscribers wanted to know just how much they would have to pay in order to be willing to buy. He later ruefully admitted that this political ploy led to an underwriting strategy that resulted in huge losses to the government (and unwarranted gains for the City financiers) as compared to what an open auction of shares would have yielded.

How Britain’s Public Enterprises Were Strangled: The Needless Fight over the PSBR

The Thatcherites argued that private ownership would be inherently more efficient than government control, assuming that sound management depended on ownership alone. Lawson insisted that “you can no more make a State industry imitate private enterprise by telling it to follow textbook rules or to simulate competitive prices, than you can make a mule into a zebra by painting stripes on its back. There is no equivalent in the State sector to the discipline of the share price or the ever-present threat of bankruptcy.” Only the prospect of economic gains would lead enterprises to cut costs, improve service and become more businesslike in general.

One economist (John Kay 1988) pointed out that “all State-owned corporations improved their productivity remarkably in the 1980s, whether they were privatized or not.” However, Lawson replied, “it was the process of preparing State enterprises for privatization . . . that initially enabled management to be strengthened and motivated, financial disciplines to be imposed and taken seriously, and costs to be cut as trade union attitudes changed.”

The real problem was that Britain’s Treasury refused to authorize the funds needed for investment as long as the enterprises remained in public hands. To stop the inflation that was distorting nearly all economies in the mid-1970s, monetarists had argued that it was necessary to cut budget deficits. The IMF won Labour’s adherence to this principle under Denis Healey after Britain’s 1976 foreign-exchange crisis. He succumbed to IMF austerity in order to get loans to support the value of sterling. The ensuing impoverishment of Britain contributed to Labour’s 1979 downfall. Rather than leaning against the monetarist wind, Labour itself blocked public industries from financing modernization. Raising the required funds would have increased the Public Sector Borrowing Requirement – the PSBR. Having little idea of how to make public enterprises function efficiently, Labour fatally undercut the viability of these enterprises by letting monetarists control Treasury policy.

Monetarists argued that the way to control inflation was to control the money supply. Friedman explained that this meant in practice the control of the public debt. Monetarists accordingly made a bee-line for the Finance and Treasury ministries in every country. In Britain they were able to control the government through the PSBR, placing a stranglehold on public finances. This forced governments to choose between transferring assets out of the public sector, or making do without capital investment and modernization.

The problem could have been cured by letting government departments operate as independent public agencies off the balance sheet, like America’s Tennessee Valley Authority (TVA) and other such entities. But the monetarist objective was not to make governments work better. Just the opposite: it was to claim that they could not work efficiently. Finance or Treasury departments in each country subject to IMF monetarist pressures made sure that this would be the case. This was the prelude in the 1970s that set the stage for privatization in the 1980s.

A double standard was at work. The private sector was assumed to be able to look after itself and not to run into debt imprudently. The financial sector accordingly was deregulated, and promptly created a crisis of irresponsible lending. One pitfall was that the PSBR failed to distinguish between productive and unproductive public debt. The idea of productive borrowing outside of PSBR constraints was rejected as being merely a reformist or even left-wing rationale to increase public borrowing and thereby increase the power of government. The last thing Mrs. Thatcher and her advisors really wanted to see was a reform that would enable public enterprises to be run more efficiently. In any event, public borrowing would not have generated revenue for company directors once labor’s wage levels had increased. Nor would it have generated the remarkable underwriting fees that resulted from privatization. The upshot was that British Telecom and the other monopolies needing technological revamping in the world of the 1980s could be modernized only by being privatized.

Privatization’s ultimate beneficiary was the City of London, the square mile of financial institutions that obtained the quickest benefits and turned the program into something rather unanticipated by Mrs. Thatcher and Mr. Lawson. The rentiers for their part seem to have perceived the Thatchers and Friedmans as pawns, an advance infantry of promoters wrapping austerity economics in populist garb – policies that otherwise would have been difficult (if not impossible) to sell to voters.

The irony was that most of Mrs. Thatcher’s friends and heroes were businessmen, manufacturers who made or dealt in products, not financial manipulators. But inevitably, her privatization policy led her to rely on the City financiers. Her autobiography and that of Nigel Lawson reflect their growing annoyance and even fury with the way in which the bank underwriters chosen to advise the government turned privatization into a vehicle to grow rich very fast. Mr. Lawson is scathing as to the City institutions’ lack of competence, exceeded only by their greed (always pointing out how much more venal their global partners were, to be sure). But once the government had chosen these institutions as its partners, the die was cast. It was unable to find a way to control the underwriters, and feared to disengage.

To the investment bankers placed in charge of underwriting over £65 billion (over $105 billion) of enterprises, at fees of over three billion pounds during 1979-97, and probably at least as much in short-term trading gains, the monetarist politicians appeared out of Britain’s ideological woodwork as well-meaning fools, political front-persons presenting privatization – and hence, City underwriting fortunes – as “popular capitalism.” As far as the City financiers were concerned, the politicians’ very disdain for finance made them all the better suited to act as political spear-carriers for a policy that turned control of the British economy over to the City itself. What Margaret Thatcher provided was a populist and even idealistic legitimization for the financiers’ gains.

The Winter of Discontent, 1978/79

Mrs. Thatcher was lucky. Accident – and indeed, the weather – intervened to play a fateful role. Under normal conditions Britain is warmed by the Gulf Stream bringing tropical water across the Atlantic Ocean from the Caribbean. This creates a warm westerly breeze that keeps British winters free of the ice that normally exists at such northerly latitudes (Britain is as far north as Canada). But occasionally – in the winter of 1947, sixteen years later in 1963, and again sixteen years later in 1979 – the wind blows from the east, bringing cold air from Russia and central Europe. Starting in November 1978, Britain was subjected to sharply below-normal temperatures that persisted right up to election day, May 9, 1979.

This 1978/79 winter descended precisely at the time when British labor unions chose to go on strike to demand pay raises in an attempt to keep up with inflation. Like the rest of the world, Britain was suffering from the inflation and high interest rates emanating from the U.S. economy under the hapless Carter presidency. As high prices spread throughout the world, the inflation ate into the purchasing power of wages. The Labour Party had cut its political wrists by subjecting Britain to IMF austerity in the face of this inflation, and stifling new investment and hiring by public enterprises by letting the PSBR put a stranglehold on their financing. The strikes were directed against these public enterprises, for as noted earlier it was here that unionization was strongest.

The British are not equipped to deal with long periods of severe weather even in normal times, given the rarity of such conditions. As a result of the public-sector strikes, the roads remained unsalted and ungritted. Few drivers had snow tires for their cars (expecting winters normally to be mild). Traffic along the M6 motorway around Birmingham and other Midlands districts slowed to a crawl, bringing Britain’s industrial heartland to a standstill.

This became known as England’s Winter of Discontent. It led a majority of voters, many of whom normally voted Labour, to resent the party’s alliance with the unions. As Mrs. Thatcher described the political situation, on December 12, 1978, “trade unions representing National Health Service and local authority workers rejected the 5 per cent pay limit and announced that they would strike in the New Year.”

The next three weeks brought heavy snow, gales and floods. Matters came to a head on Wednesday, January 3, when “the TGWU called the lorry drivers out on strike in pursuit of a 25 per cent pay rise. Some two million workers faced being laid off. Hospital patients, including terminally ill cancer patients, were denied treatment. Gravediggers went on strike in Liverpool. Refuse piled up in Leicester Square. . . . In short, Britain ground to a halt. What was more damaging even than this to the Labour Government, however, was that it had handed over the running of the country to local committees of trade unions.”

Mrs. Thatcher emphasized that Labour Prime Minister Callaghan “had based his whole political career on alliance with the trade union leaders. For him, if not for the country, it had been a winning formula. Now that the unions could no longer be appeased, he had no other policy in his locker. . . . The Government could not even decide whether to declare a State of Emergency.” Mrs. Thatcher for her part was not particularly eager to promote a government settlement with the unions; she preferred to mobilize public reaction against them. In fact, she worried that “The Labour Party might just be persuaded to agree to the negotiation of no-strike agreements in essential services, the payment by the taxpayer of the cost of secret ballots in trade unions and even a code of practice to end secondary picketing – though the last was doubtful. Equally, I was clear that if the Government did accept, we were honour-bound to keep our side of the bargain.” However, she made it a condition of support for the government that it should end the closed shop, thereby stripping unions of much of their power – something no Labour government would agree to do.

On January 16 she opened the debate in the House of Commons by describing how the “transport of goods by road was widely disrupted, in many cases due to secondary picketing of firms and operators not involved in the actual disputes. British Rail had issued a brief statement: ‘There are no trains today.’ . . . many firms were being strangled, due to shortage of materials and inability to move finished goods. There was trouble at the ports, adding to the problems of exporters. At least 125,000 people had been laid off already and the figure was expected to reach a million by the end of the week. The food industry, in particular, was in a shambolic state, with growing shortages of basic supplies like edible oils, yeast, salt and sugar. And all this on top of a winter of strikes – strikes by tanker drivers, bakers, staff at old people’s homes and hospitals; strikes in the press and broadcasting, airports and car plants; a strike of grave diggers.”

She reported that Labour’s George Brown had complained to her that “the unions had been falling more and more under the control of left-wing militancy.” But Prime Minister Callaghan then urged that the government make further concessions to the unions, including “exemptions from the 5 per cent pay limit, tighter price controls and extension of the principle of ‘comparability,’ under which public sector workers could expect more money. All these were intended as inducements to the unions to sign up to a new pay policy. But he signally failed to address what everyone except the far Left considered the main problem, excessive trade union power.”

Using language recalling that used to denounce weak-willed opposition to Hitler on the eve of World War II, she heaped scorn on Mr. Callaghan for “appeasing” the unions. Rather than fearing to alienate them, she urged her own party leaders to seize the opportunity to gain public favor by riding the wave of reaction against union over-reaching. British wages no longer were set by fair bargaining between workers and their employers, she claimed, but were negotiated by trade union leaders dictating terms to weak-willed government managers. The alternative, of course, was the kind of austerity dictated by IMF monetarists maintaining an employers’ market by imposing chronic under-employment and shifting enterprise out of the unionized public sector to newly privatized, non-unionized enterprises – precisely the kind of austerity that Keynesian income policies had sought to prevent.

Upon winning the general election, Mrs. Thatcher appointed loyal monetarists, who developed a more subtle alternative to the tight-money programs imposed by the IMF on hapless third world countries. A general monetary stringency would have lowered profits and stifled capital gains as well as wages. Britain’s monetarist strategy was to depress wage levels through “structural reform” or “structural adjustment.” The restructuring was achieved not by macroeconomic policies affecting the overall money supply and incomes, but by changing the legal framework and institutional structures within which markets operated. Union power was broken by changing the legal rules, while government economic power was dismantled by cutting taxes and selling off enterprises. The industries being privatized were subjected to much the same downsizing and asset stripping as private companies taken over by corporate raiders and/or leveraged buyouts in the 1980s.

How Monetarism Laid the Groundwork for Privatization

Ostensibly a theory of money and prices, monetarism became an ideology to attack government spending and organized labor. The theory’s guiding idea was that price levels could be determined by controlling the money supply – by the central bank managing the rate at which government deficits were monetized. Meanwhile, wage-push inflation could be countered by taking legal steps to break the power of unions to strike and to declare boycotts. The effect was to remove economic planning from the hands of government. The vacuum would be filled by global investment bankers. Efficient management was to take the form of maximizing stock-market gains, not the promotion of full employment and other non-market social welfare objectives.

Keynes had been a monetary theorist of a different stripe. He saw that money, in the sense of spending power, comprised effectively the entire credit superstructure. Any income-yielding asset could be collateralized as the basis for credit. Indeed, credit – and in effect, purchasing power – can be created simply by companies not paying their bills. These unpaid bills became assets on the books of their suppliers (“receivables” that could be financed through the banking system). In this respect the volume of credit and near-money is virtually synonymous with the economy’s overall volume of debt. This perception forms the basis for post-Keynesian “creditary” or “balance sheet” economics, a more comprehensive alternative to monetarist doctrine. (Gardiner 1993 provides a technical discussion.)

Monetarism reveals its political bias by singling out only public debt as the source of inflation, ignoring the mushrooming private debt. This one-sidedness has proved to be its Achilles’ heel. Yet it was precisely this narrow anti-government focus that attracted Mrs. Thatcher and other libertarian politicians to monetarism in the first place.

Monetarism’s appeal is political and rhetorical, not based on sound economic evidence. (Its correlations of money and prices fail to acknowledge the arrow of causality, especially at the foreign-exchange margin. See Hudson 1992 for a detailed critique.) Controlling the public debt by reining in government can represent only part of a comprehensive system of monetary management, for in practice the money supply – the means of settling obligations – turns out to be nothing less than the overall credit supply. This in turn includes the economy’s “near-money” in the form of all marketable assets and debt instruments. Attempts to manage money, narrowly defined as government debt, are thus in vain.

The real reason why monetarists seek to control the Treasury or Finance Department and the central bank in every country is to achieve their political ends. From their position in these financial control centers, they put the brakes on government operations across the board, or promote other pet policies. Monetarist doctrine provides the ideological wrapping to present this control as a form of idealism and individualism.

Although privatization was not a centerpiece of Mrs. Thatcher’s original program, she placed members of her inner circle in charge of the financial ministries and the public enterprises first in line to be privatized, to set about preparing them for sale. In addition to helping the government budget, privatization would remove enterprises from control by the trade unions. And turning power over to privatized management would enable them to begin economizing by downsizing their labor force.

Emphasis Mine

see: http://www.alternet.org/margaret-thatcher-was-privatization-pioneer-and-story-how-her-agenda-did-nothing-make-life-worse?akid=10314.123424.EapCrY&rd=1&src=newsletter822988&t=19

Proof That Obamacare ‘Rate Shock’ Is An Ugly Insurance Company Deception

Source: Forbes

Author: Rick Ungar

“Over the past few months, the nation’s largest health insurance companies have been hard at work selling a narrative claiming that the Affordable Care Act is about to result in dramatically larger premium costs for a significant number of Americans. Indeed, the warnings have become so worrisome that the massive increases they are predicting have taken on a frightening descriptor all its own—rate shock.

At the heart of the health insurers’ retelling of the Chicken Little story is a regulation promulgated by the Department of Health and Human Services a few months back limiting what a health insurer can charge a 64 year old to three times what they charge a 21 year old. Currently, the average bump for older participants is typically five times that of the younger customers—although there are examples where the increase can reach ten times what is paid by the young immortals buying coverage.

As a result of the lower premium prices that will be paid by older participants, the expectation—one created by the large insurance companies—is that the youngest participants will have to pay significantly more to make up the difference.

Now, The Urban Institute—an organization so clearly bi-partisan that even the most suspicious partisan would encounter extreme difficulty making a case for bias—is out with a study that states that the ‘rate shock’ argument is “unfounded”, particularly when applied to the millions of Americans in the individual market.

As noted in the report summary:

“Overall, we find that loosening the rate bands from 3:1 to 5:1 would have very little impact on out-of-pocket rates paid by the youngest nongroup purchasers, once subsidies are taken into account. This is not only the case for all likely purchasers, but also for two populations of particular concern: the 10 million 21-27 year olds who are currently uninsured and the 3 million who currently have nongroup coverage.”

By suggesting that the insurance company claims are merely ‘unfounded’, The Urban Institute is being quite kind as I would suggest a far harsher explanation for their scare tactics.

What the insurance industry is not telling you—as revealed by The Urban Institute study—is that the overwhelming majority of young people who would be charged a higher premium to make up for the lower premiums to be paid by their elders will either be covered by the premium subsidies offered via the insurance exchanges or eligible for Medicaid under the expansion of the program extending health coverage to those earning 133 percent above the federal poverty line.

Therefore, as clearly stated by the report, the lowered premium costs to the oldest participants in an insurance plan “would have very little impact on out-of-pocket rates paid by the youngest nongroup purchasers.”

According to the study, here are the estimates:

  • 92 percent of people ages 21 to 27 projected to buy an individual plan in an exchange in 2017 are expected to have incomes less than 300 percent of the poverty line, so they would be eligible either for Medicaid (if their state expands it) or for substantial subsidies to help pay premiums in the exchange.
  • Similarly, 88 percent of 18- to 20-year-olds projected to buy a plan in the exchange are expected to be eligible for premium subsidies or Medicaid.
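The eligibility logic those estimates rest on can be sketched roughly as follows. This is an illustrative simplification, not the study’s model: the poverty-line figure is the approximate 2013 guideline for a single person, and real subsidy rules depend on the year, household size, and state.

```python
# Rough sketch of the assistance categories the study relies on.
# Thresholds are simplified: 133% of the federal poverty line (FPL) for
# the Medicaid expansion, and exchange premium subsidies phasing out at
# 400% FPL. Actual rules are more detailed.

FPL_SINGLE = 11_490  # approximate 2013 poverty guideline, one person

def assistance_category(income, fpl=FPL_SINGLE, state_expands_medicaid=True):
    """Classify a single filer's likely assistance under the ACA."""
    pct_of_fpl = 100 * income / fpl
    if state_expands_medicaid and pct_of_fpl <= 133:
        return "medicaid"
    if pct_of_fpl <= 400:
        return "premium subsidy"
    return "unsubsidized"

# A young adult earning $30,000 (~261% FPL) would get premium subsidies,
# so a higher sticker-price premium largely isn't paid out of pocket.
print(assistance_category(30_000))  # premium subsidy
```

Under a sketch like this, most 21- to 27-year-old exchange buyers with incomes under 300 percent of the poverty line fall into one of the first two categories, which is the study’s core point.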

In addition to the above statistics, The Urban Institute study highlights the fact that of the 961,000 young adults between the ages of 21 and 27 who currently buy their own health insurance as individuals and make too much money to qualify for premium subsidies or Medicaid, a full two-thirds are 26 or younger and are in families receiving employer coverage. Accordingly, these young people can be covered under a parent’s policy, as Obamacare requires insurers to allow parents to add children under 26 to their employer-based health plans.

While any new law as significant as the Affordable Care Act creates questions and concerns, the false campaign being waged by the health insurance companies is a prime example of an industry using fear as a tool to get the government to change a regulation it doesn’t like.

There remain questions as to the impact the rate band limitations will have on businesses that provide health insurance to employees—particularly those with a younger employee base. However, the expectation is that—given the reality that businesses tend to have a ‘spread’ in the age of employees—things should average out. Under the current structure, businesses are paying less in premium contributions for younger employees but considerably more for older employees. Under Obamacare, the prices will rise at the younger end of the scale but decrease significantly for older workers.

For this reason, the primary concern has been focused on what the changes will mean for younger health insurance customers who purchase individual policies.

As The Urban Institute study makes crystal clear, the ‘rate shock’ controversy has far more to do with insurance company lobbying efforts and far less to do with the reality of what health insurance will cost for millions of young Americans.

Contact Rick at thepolicypage@gmail.com and follow me on Twitter and Facebook.

Emphasis Mine

see: http://www.forbes.com/sites/rickungar/2013/03/26/proof-that-obamacare-rate-shock-is-an-ugly-insurance-company-deception/

 

11 Most Absurd Lies Conservatives Are Using to Brainwash America’s School Kids

Now Republicans have a plan to try to recapture the youngest voters out there: Take over the curriculum in public schools, replace education with a bunch of conservative propaganda, and reap the benefits of having a new generation that can’t tell reality from right-wing fantasy.

Source: AlterNet

Author: Amanda Marcotte

“If recent elections have taught us anything, it’s that young Americans have taken a decided turn to the left. Young voters delivered Obama the election: the under-44 set voted for Obama and the over-45 set broke for Romney. The youngest voters, ages 18-29, gave Obama a whopping 60% of their vote.

Now Republicans have a plan to try to recapture the youngest voters out there: Take over the curriculum in public schools, replace education with a bunch of conservative propaganda, and reap the benefits of having a new generation that can’t tell reality from right-wing fantasy.

How well this plan will work is debatable, but in the meantime, these shenanigans present the very real possibility that public school students will graduate without a proper education. To make it worse, many of these attempts to rewrite school curricula are happening in Texas, which can set the textbook standards for the entire country simply by wielding its power as one of the biggest school textbook markets there is. With that in mind, here’s a list of 11 lies your kid may be in danger of learning in school.

Lie #1: Racism has barely been an issue in U.S. history and slavery wasn’t that big a deal.

The Thomas B. Fordham Institute reviewed the new social studies standards laid down by the rightwing-dominated Texas State School Board and found them to be a deplorable example of conservative wishful thinking replacing fact. At the top of the list? Downplaying the role that slavery had in starting the Civil War and instead focusing on “sectionalism” and “states’ rights,” even though the sectionalism and states’ rights arguments directly stemmed from Southern states wanting to keep slavery. There’s also a chance your kid might be misled into thinking post-Civil War racism was no big deal, as the standards excise any mention of the KKK, the phrase “Jim Crow” or the Black Codes. Mention is made of the Southern Democratic opposition to civil rights, but mysteriously, the mass defection of Southern Democrats to the Republican Party to punish the rest of the Democrats for supporting civil rights goes unmentioned.

Lie #2: Joe McCarthy was right.

The red-baiting of the mid-20th century has gone down in history, correctly, as a witch hunt that stemmed from irrational paranoia that gripped the U.S. after WWII. But now, according to the Thomas B. Fordham report, your kid might learn that the red-baiters had a point: “It is disingenuously suggested that the House Un-American Activities Committee—and, by extension, McCarthyism—have been vindicated by the Venona decrypts of Soviet espionage activities (which had, in reality, no link to McCarthy’s targets).” Critical lessons about being skeptical of those who attack fellow Americans while wrapping themselves in the flag will be lost for students whose textbooks adhere to these standards.

Lie #3: Climate change is a massive hoax scientists have perpetrated on the public.

The American Legislative Exchange Council (ALEC) has been hard at work pushing for laws requiring that climate change denialism be taught in schools as a legitimate scientific theory. Unfortunately, as Neela Banerjee of the L.A. Times reports, they’ve already had some serious success: “Texas and Louisiana have introduced education standards that require educators to teach climate change denial as a valid scientific position. South Dakota and Utah passed resolutions denying climate change.” Other states are taking the “teach the controversy” strategy that helped get creationism into biology classrooms, asking teachers to treat climate change like it’s a matter of political debate instead of a scientifically established fact.

The reality is that climate change is a fact that has overwhelming scientific consensus. In 2004, Science reviewed the 928 relevant studies on climate change published between 1993 and 2003 and found that exactly zero of them denied that climate change was a reality, and most found it had manmade causes. To claim that climate change is a “controversy” requires one to believe that there’s a massive conspiracy involving nearly all the scientists in the world. So, your kids are not only not learning the realities of climate change, they are also learning, if indirectly, to give credence to conspiracy theory paranoia.

Lie #4: The Bible is a history textbook and a scientific document.

Texas passed a law in 2007 pushing schools to teach the Bible as history and literature. Since that was already being done in most schools, the law was clearly just a backdoor way to sneak religious instruction into classrooms, and a report by the Texas Freedom Network (TFN) demonstrates that many districts have taken full advantage. One district treats the Bible stories like history by “listing biblical events side by side with historical developments from around the globe.” Many other schools are teaching that the Bible “proves” that the Earth is only 6,000 years old. The Earth is actually over 4 billion years old.

Lie #5: Black people are the descendants of Ham and therefore cursed by God.

Among the courses justified by the 2007 Bible law, TFN found two school districts teaching that the various races are descended from the sons of Noah. All the Bible really says about the sons of Noah is that Ham was cursed by his father so that his descendants would be slaves, but American slave owners used this passage to claim that Africans must be the descendants of Ham and therefore their slave-owning was okay by God. Make no mistake. The only reason this legend has persisted and is popping up in 21st-century classrooms is that conservative Christians are still trying to justify the enslavement of African Americans over a century ago.

Lie #6: Evolution is a massive hoax scientists have perpetrated on the public.

Creationists have an endless store of creative ways to get around the Constitution and the courts when it comes to replacing legitimate biology education with fundamentalist Christian dogma. Various states have employed an extensive school voucher system that has allowed creationist dogma to flourish. College-age activist Zack Kopplin has been chronicling the problem, and has found various schools nationwide using taxpayer dollars to teach that evolution is a “mistaken belief” and that the Bible “refutes the man-made idea of evolution.” Why do these school administrators believe that scientists are hoaxing the public by making up evolution? Kopplin found a Louisiana school principal who claimed it’s because scientists are “sinful men” seeking to justify their own immorality, and another Florida school teaching that evolutionary theory is “the way of the heathen.”

Lie #7: Sex is awful and filthy, and you should save it for someone you love.

While things are improving, even in notoriously fact-phobic states like Mississippi and Texas, “abstinence-only” education continues to persist in school districts across the nation. TFN found that nearly three-quarters of Texas high schools are still teaching abstinence-only, which is based on the fundamental and easily disproved lie that premarital sex is inherently dangerous to a person’s mental and physical health. On top of this, TFN found that many schools are still passing on inaccurate information about condoms and STI transmission, usually exaggerating the dangers in a futile bid to keep kids from having sex. Unfortunately, even Texas school districts that use curricula that educate correctly about contraception are still trying to spin abstinence-until-marriage as a desirable option for all students, even though premarital sex is near-universal in the real world. Abstinence-only may be discredited among voters, but sadly it’s still very normal in Texas, other red states, and even across the nation.

Lie #8: Dragons actually once existed. 

As much as “Game of Thrones” fans might wish otherwise, dragons are not real and have never existed. But as reported by Mother Jones, Louisiana’s notorious voucher school system has let some crazy nonsense fly in the classroom, including the claim that dragons used to roam the planet. A book being used in Louisiana classrooms titled Life Science and published by Bob Jones University Press claims that “scientists” found “dinosaur skulls” that the book suggests are actually dragons. “The large skull chambers could have contained special chemical-producing glands. When the animal forced the chemicals out of its mouth or nose, these substances may have combined and produced fire and smoke,” the book claims.

Lie #9: Gay people do not actually exist.

After being beaten back by gay rights and sexual health advocates, Republicans in the Tennessee legislature are once again trying to bring back the “don’t say gay” bill. The bill would ban a teacher from acknowledging the existence of homosexuality to students prior to the 8th grade, even if the students ask about it. Instead, it would require turning a student who confesses to being gay over to his parents, with the legislators clearly hoping that punishment will somehow make the kid not-gay. The entire bill rests on and promotes the premise that homosexuality isn’t a real sexual orientation but just the result of mental illness or confusion, and if it’s enforced, that message will come across to the students.

Lie #10: Hippies were dirty, immoral Satan-worshippers.

In the 1960s, it was common for conservatives to try to discredit the left by stoking paranoia about hippie culture and denouncing the supposed evils of rock ‘n’ roll. Forty years have passed, but in Louisiana, some school administrators are apparently still afraid that possessing a Beatles record means a young person is on the verge of quitting bathing and taking up a lifestyle of taking LSD and worshipping Satan at psychedelic orgies.

A history textbook snagged from a Louisiana school funded by the voucher program tells students: “Many young people turned to drugs and immoral lifestyles and these youths became known as hippies. They went without bathing, wore dirty, ragged, unconventional clothing, and deliberately broke all codes of politeness or manners. Rock music played an important part in the hippie movement and had great influence over the hippies. Many of the rock musicians they followed belonged to Eastern religious cults or practiced Satan worship.” It’s unclear if the book also teaches that if you play a Queen record backward, you can hear Satan telling you to smoke pot, but that kind of critical information could also be conveyed during the teacher’s lectures on the subject.

Lie #11: Ayn Rand’s books have literary value.

Idaho state senator John Goedde, chairman of the state’s Senate Education Committee, has introduced a bill that would require students not only to read Rand’s ponderous novel Atlas Shrugged, but also to pass a test on it in order to graduate. Goedde admits he is mostly not serious about the bill, instead using it as a childish attempt to piss off liberals, but it’s still the sort of item parents need to watch out for.

After all, Texas textbook standards require that an obsession with the gold standard be taught as a legitimate economic theory instead of the mad ravings of cranks that it is. We live in an era where no amount of right-wing lunacy is considered too much to be pushed on innocent children like it’s fact. Anyone who doubts that should just remember one word: Dragons.”

Emphasis Mine

See: http://www.alternet.org/education/11-most-absurd-lies-conservatives-are-using-brainwash-americas-school-kids

Shocking New Evidence Reveals Depths of ‘Treason’ and ‘Treachery’ of Watergate and Iran-Contra

New evidence continues to accumulate showing how Official Washington got key elements of two major presidential scandals of the Nixon and Reagan administrations wrong.

Source: AlterNet

Author: Robert Parry

“A favorite saying of Official Washington is that “the cover-up is worse than the crime.” But that presupposes you accurately understand what the crime was. And, in the case of the two major U.S. government scandals of the last third of the Twentieth Century – Watergate and Iran-Contra – that doesn’t seem to be the case.

Indeed, newly disclosed documents have put old evidence into a sharply different light and suggest that history has substantially miswritten the two scandals by failing to understand that they actually were sequels to earlier scandals that were far worse. Watergate and Iran-Contra were, in part at least, extensions of the original crimes, which involved dirty dealings to secure the immense power of the presidency.

In the case of Watergate – the foiled Republican break-in at the Democratic National Committee in June 1972 and Richard Nixon’s botched cover-up leading to his resignation in August 1974 – the evidence is now clear that Nixon created the Watergate burglars out of his panic that the Democrats might possess a file on his sabotage of Vietnam peace talks in 1968.

Shortly after Nixon took office in 1969, FBI Director J. Edgar Hoover informed him of the existence of the file containing national security wiretaps documenting how Nixon’s emissaries had gone behind President Lyndon Johnson’s back to convince the South Vietnamese government to boycott the Paris Peace Talks, which were close to ending the Vietnam War in fall 1968.

The disruption of Johnson’s peace talks then enabled Nixon to hang on for a narrow victory over Democrat Hubert Humphrey. However, as the new President was taking steps in 1969 to extend the war another four-plus years, he sensed the threat from the wiretap file and ordered two of his top aides, chief of staff H.R. “Bob” Haldeman and National Security Advisor Henry Kissinger, to locate it. But they couldn’t find the file.

We now know that was because President Johnson, who privately had called Nixon’s Vietnam actions “treason,” had ordered the file removed from the White House by his national security aide Walt Rostow.

Rostow labeled the file “The ‘X’ Envelope” [3] and kept it in his possession, although having left government, he had no legal right to possess the highly classified documents, many of which were stamped “Top Secret.” Johnson had instructed Rostow to retain the papers as long as he, Johnson, was alive and then afterwards to decide what to do with them.

Nixon, however, had no idea that Johnson and Rostow had taken the missing file or, indeed, who might possess it. Normally, national security documents are passed from the outgoing President to the incoming President to maintain continuity in government.

But Haldeman and Kissinger had come up empty in their search. They were only able to recreate the file’s contents, which included incriminating conversations between Nixon’s emissaries and South Vietnamese officials regarding Nixon’s promise to get them a better deal if they helped him torpedo Johnson’s peace talks.

So, the missing file remained a troubling mystery inside Nixon’s White House, but Nixon still lived up to his pre-election agreement with South Vietnamese President Nguyen van Thieu to extend U.S. military participation in the war with the goal of getting the South Vietnamese a better outcome than they would have received from Johnson in 1968.

Nixon not only continued the Vietnam War, which had already claimed more than 30,000 American lives and an estimated one million Vietnamese, but he expanded it, with intensified bombing campaigns and a U.S. incursion into Cambodia. At home, the war was bitterly dividing the nation with a massive anti-war movement and an angry backlash from war supporters.

Pentagon Papers

It was in that intense climate in 1971 that Daniel Ellsberg, a former senior Defense Department official, gave the New York Times a copy of the Pentagon Papers, the secret U.S. history of the Vietnam War from 1945 to 1967. The voluminous report documented many of the lies – most told by Democrats – to draw the American people into the war.

The Times began publishing the Pentagon Papers on June 13, 1971, and the disclosures touched off a public firestorm. Trying to tamp down the blaze, Nixon took extraordinary legal steps to stop dissemination of the secrets, ultimately failing in the U.S. Supreme Court.

But Nixon had an even more acute fear. He knew something that few others did, that there was a sequel to the Pentagon Papers that was arguably more explosive – the missing file containing evidence that Nixon had covertly prevented the war from being brought to a conclusion so he could maintain a political edge in Election 1968.

If anyone thought the Pentagon Papers represented a shocking scandal – and clearly millions of Americans did – how would people react to a file that revealed Nixon had kept the slaughter going – with thousands of additional American soldiers dead and the violence spilling back into the United States – just so he could win an election?

A savvy political analyst, Nixon recognized this threat to his reelection in 1972, assuming he would have gotten that far. Given the intensity of the anti-war movement, there would surely have been furious demonstrations around the White House and likely an impeachment effort on Capitol Hill.

So, on June 17, 1971, Nixon summoned Haldeman and Kissinger into the Oval Office and – as Nixon’s own recording devices whirred softly – pleaded with them again to locate the missing file. “Do we have it?” Nixon asked Haldeman. “I’ve asked for it. You said you didn’t have it.”

Haldeman: “We can’t find it.”

Kissinger: “We have nothing here, Mr. President.”

Nixon: “Well, damnit, I asked for that because I need it.”

Kissinger: “But Bob and I have been trying to put the damn thing together.”

Haldeman: “We have a basic history in constructing our own, but there is a file on it.”

Nixon: “Where?”

Haldeman: “[Presidential aide Tom Charles] Huston swears to God that there’s a file on it and it’s at Brookings.”

Nixon: “Bob? Bob? Now do you remember Huston’s plan [for White House-sponsored break-ins as part of domestic counter-intelligence operations]? Implement it.”

Kissinger: “Now Brookings has no right to have classified documents.”

Nixon: “I want it implemented. … Goddamnit, get in and get those files. Blow the safe and get it.”

Haldeman: “They may very well have cleaned them by now, but this thing, you need to –“

Kissinger: “I wouldn’t be surprised if Brookings had the files.”

Haldeman: “My point is Johnson knows that those files are around. He doesn’t know for sure that we don’t have them around.”

But Johnson did know that the file was no longer at the White House because he had ordered Rostow to remove it in the final days of his own presidency.

Forming the Burglars

On June 30, 1971, Nixon again berated Haldeman about the need to break into Brookings and “take it [the file] out.” Nixon even suggested using former CIA officer E. Howard Hunt to conduct the Brookings break-in.

“You talk to Hunt,” Nixon told Haldeman. “I want the break-in. Hell, they do that. You’re to break into the place, rifle the files, and bring them in. … Just go in and take it. Go in around 8:00 or 9:00 o’clock.”

Haldeman: “Make an inspection of the safe.”

Nixon: “That’s right. You go in to inspect the safe. I mean, clean it up.”

For reasons that remain unclear, it appears that the Brookings break-in never took place, but Nixon’s desperation to locate Johnson’s peace-talk file was an important link in the chain of events that led to the creation of Nixon’s burglary unit under Hunt’s supervision. Hunt later oversaw the two Watergate break-ins in May and June of 1972.

While it’s possible that Nixon was still searching for the file about his Vietnam-peace sabotage when the Watergate break-ins occurred nearly a year later, it’s generally believed that the burglary was more broadly focused, seeking any information that might have an impact on Nixon’s re-election, either defensively or offensively.

As it turned out, Nixon’s burglars were nabbed inside the Watergate complex on their second break-in on June 17, 1972, exactly one year after Nixon’s tirade to Haldeman and Kissinger about the need to blow the safe at the Brookings Institution in pursuit of the missing Vietnam peace-talk file.

Ironically, too, Johnson and Rostow had no intention of exposing Nixon’s dirty secret regarding LBJ’s Vietnam peace talks, presumably for the same reasons that they kept their mouths shut back in 1968, out of a benighted belief that revealing Nixon’s actions might somehow not be “good for the country.”

In November 1972, despite the growing scandal over the Watergate break-in, Nixon handily won reelection, crushing Sen. George McGovern, Nixon’s preferred opponent. Nixon then reached out to Johnson seeking his help in squelching Democratic-led investigations of the Watergate affair and slyly noting that Johnson had ordered wiretaps of Nixon’s campaign in 1968.

Johnson reacted angrily to the overture, refusing to cooperate. On Jan. 20, 1973, Nixon was sworn in for his second term. On Jan. 22, 1973, Johnson died of a heart attack.

Toward Resignation

In the weeks that followed Nixon’s Inauguration and Johnson’s death, the scandal over the Watergate cover-up grew more serious, creeping ever closer to the Oval Office. Meanwhile, Rostow struggled to decide what he should do with “The ‘X’ Envelope.”

On May 14, 1973, in a three-page “memorandum for the record,” Rostow summarized what was in “The ‘X’ Envelope” and provided a chronology for the events in fall 1968. Rostow reflected, too, on what effect LBJ’s public silence then may have had on the unfolding Watergate scandal.

“I am inclined to believe the Republican operation in 1968 relates in two ways to the Watergate affair of 1972,” Rostow wrote. He noted, first, that Nixon’s operatives may have judged that their “enterprise with the South Vietnamese” – in frustrating Johnson’s last-ditch peace initiative – had secured Nixon his narrow margin of victory over Hubert Humphrey in 1968.

“Second, they got away with it,” Rostow wrote. “Despite considerable press commentary after the election, the matter was never investigated fully. Thus, as the same men faced the election in 1972, there was nothing in their previous experience with an operation of doubtful propriety (or, even, legality) to warn them off, and there were memories of how close an election could get and the possible utility of pressing to the limit – and beyond.” [To read Rostow’s memo, click here [4], here [5] and here [6].]

What Rostow didn’t know was that there was a third – and more direct – connection between the missing file and Watergate. Nixon’s fear about the file surfacing as a follow-up to the Pentagon Papers was Nixon’s motive for creating Hunt’s burglary team in the first place.

Rostow apparently struggled with what to do with the file for the next month as the Watergate scandal expanded. On June 25, 1973, fired White House counsel John Dean delivered his blockbuster Senate testimony, claiming that Nixon got involved in the cover-up within days of the June 1972 burglary at the Democratic National Committee. Dean also asserted that Watergate was just part of a years-long program of political espionage directed by Nixon’s White House.

The very next day, as headlines of Dean’s testimony filled the nation’s newspapers, Rostow reached his conclusion about what to do with “The ‘X’ Envelope.” In longhand, he wrote a “Top Secret” note [7] which read, “To be opened by the Director, Lyndon Baines Johnson Library, not earlier than fifty (50) years from this date June 26, 1973.”

In other words, Rostow intended this missing link of American history to stay missing for another half century. In a typed cover letter [8] to LBJ Library director Harry Middleton, Rostow wrote: “Sealed in the attached envelope is a file President Johnson asked me to hold personally because of its sensitive nature. In case of his death, the material was to be consigned to the LBJ Library under conditions I judged to be appropriate. …

“After fifty years the Director of the LBJ Library (or whomever may inherit his responsibilities, should the administrative structure of the National Archives change) may, alone, open this file. … If he believes the material it contains should not be opened for research [at that time], I would wish him empowered to re-close the file for another fifty years when the procedure outlined above should be repeated.”

Ultimately, however, the LBJ Library didn’t wait that long. After a little more than two decades, on July 22, 1994, the envelope was opened and the archivists began the long process of declassifying the contents.

Yet, because Johnson and Rostow chose to withhold the file on Nixon’s “treason,” a distorted history of Watergate took shape and then hardened into what all the Important People of Washington “knew” to be true. The conventional wisdom was that Nixon was unaware of the Watergate break-in beforehand – that it was some harebrained scheme of a few overzealous subordinates – and that the President only got involved later in covering it up.

Sure, the Washington groupthink went, Nixon had his “enemies list” and played hardball with his rivals, but he couldn’t be blamed for the Watergate break-in, which many insiders regarded as “the third-rate burglary” that Nixon’s White House called it.

Even journalists and historians who took a broader view of Watergate didn’t pursue the remarkable clue from Nixon’s rant about the missing file on June 17, 1971. Though a few other historians did write, sketchily, about the 1968 events, they also didn’t put the events together.

So, the beloved saying took shape: “the cover-up is worse than the crime.” And Official Washington hates to rethink some history that is considered already settled. In this case, it would make too many important people who have expounded on the “worse” part of Watergate, i.e. the cover-up, look stupid. [For details, see Robert Parry’s America’s Stolen Narrative [9].]

The Iran-Contra Cover-up

Similarly, Official Washington and many mainstream historians have tended to dismiss Ronald Reagan’s Iran-Contra scandal as another case of some overzealous subordinates intuiting what the President wanted and getting everybody into trouble.

The “Big Question” that insiders were asking after the scandal broke in November 1986 was whether President Reagan knew about the decision by White House aide Oliver North and his boss, National Security Advisor John Poindexter, to divert some profits from secret arms sales to Iran to secretly buy weapons for the Nicaraguan Contra rebels.

Once Poindexter testified that he had no recollection of letting Reagan in on that secret – and with Reagan a beloved figure to many in Official Washington – the inquiry was relegated to insignificance. The remaining investigation focused on smaller questions, like misleading Congress and a scholarly dispute over whether the President’s foreign policy powers overrode Congress’ power to appropriate funds.

At the start of the Iran-Contra investigation, Attorney General Edwin Meese had set the time parameters from 1984 to 1986, thus keeping outside of the frame the possibility of a much more serious scandal originating during Campaign 1980, i.e., whether Reagan’s campaign undermined President Jimmy Carter’s negotiations to free 52 American hostages in Iran and then paid off the Iranians by allowing Israel to ship weapons to Iran for the Iran-Iraq War.

So, while congressional and federal investigators looked only at how the specific 1985-86 arms sales to Iran got started, there was no timely attention paid to evidence that the Reagan administration had quietly approved Israeli arms sales to Iran in 1981 and that those contacts went back to the days before Election 1980 when the hostage crisis destroyed Carter’s reelection hopes and ensured Reagan’s victory.

The 52 hostages were not released until Reagan was sworn in on Jan. 20, 1981.

Over the years, about two dozen sources – including Iranian officials, Israeli insiders, European intelligence operatives, Republican activists and even Palestinian leader Yasser Arafat – have provided information about alleged contacts with Iran by the Reagan campaign.

And, there were indications early in the Reagan presidency that something peculiar was afoot. On July 18, 1981, an Israeli-chartered plane crashed or was shot down after straying over the Soviet Union on a return flight from delivering U.S.-manufactured weapons to Iran.

In a PBS interview nearly a decade later, Nicholas Veliotes, Reagan’s assistant secretary of state for the Middle East, said he looked into the incident by talking to top administration officials. “It was clear to me after my conversations with people on high that indeed we had agreed that the Israelis could transship to Iran some American-origin military equipment,” Veliotes said.

In checking out the Israeli flight, Veliotes came to believe that the Reagan camp’s dealings with Iran dated back to before the 1980 election. “It seems to have started in earnest in the period probably prior to the election of 1980, as the Israelis had identified who would become the new players in the national security area in the Reagan administration,” Veliotes said. “And I understand some contacts were made at that time.”

When I re-interviewed Veliotes on Aug. 8, 2012, he said he couldn’t recall who the “people on high” were who had described the informal clearance of the Israeli shipments but he indicated that “the new players” were the young neoconservatives who were working on the Reagan campaign, many of whom later joined the administration as senior political appointees.

Neocon Schemes

Newly discovered documents [10] at the Reagan presidential library reveal that Reagan’s neocons at the State Department – particularly Robert McFarlane and Paul Wolfowitz – initiated a policy review in 1981 to allow Israel to undertake secret military shipments to Iran. McFarlane and Wolfowitz also maneuvered to put McFarlane in charge of U.S. policy toward Iran and to establish a clandestine U.S. back-channel to the Israeli government outside the knowledge of even senior U.S. government officials.

Not only did the documents tend to support the statements by Veliotes but they also fit with comments that former Israeli Prime Minister Yitzhak Shamir made in a 1993 interview in Tel Aviv. Shamir said he had read the 1991 book, October Surprise, by Carter’s former National Security Council aide Gary Sick, which made the case for believing that the Republicans had intervened in the 1980 hostage negotiations to disrupt Carter’s reelection.

With the topic raised, one interviewer asked, “What do you think? Was there an October Surprise?”

“Of course, it was,” Shamir responded without hesitation. “It was.”

And, there were plenty of other corroborating statements as well. In 1996, for instance, while former President Carter was meeting with Palestine Liberation Organization leader Arafat in Gaza City, Arafat tried to confess his role in the Republican maneuvering to block Carter’s Iran-hostage negotiations.

“There is something I want to tell you,” Arafat said, addressing Carter in the presence of historian Douglas Brinkley. “You should know that in 1980 the Republicans approached me with an arms deal [for the PLO] if I could arrange to keep the hostages in Iran until after the [U.S. presidential] election,” Arafat said, according to Brinkley’s article in the fall 1996 issue of Diplomatic Quarterly.

As recently as this past week, former Iranian President Abolhassan Bani-Sadr reiterated his account of Republican overtures to Iran during the 1980 hostage crisis and how that secret initiative prevented release of the hostages.

In a Christian Science Monitor commentary about the movie “Argo,” Bani-Sadr wrote that “Ayatollah Khomeini and Ronald Reagan had organized a clandestine negotiation … which prevented the attempts by myself and then-U.S. President Jimmy Carter to free the hostages before the 1980 U.S. presidential election took place. The fact that they were not released tipped the results of the election in favor of Reagan.”

Though Bani-Sadr had discussed the Reagan-Khomeini collaboration before, he added in his commentary that “two of my advisors, Hussein Navab Safavi and Sadr-al-Hefazi, were executed by Khomeini’s regime because they had become aware of this secret relationship between Khomeini, his son Ahmad, … and the Reagan administration.”

In December 1992, when a House Task Force was examining this so-called “October Surprise” controversy – and encountering fierce Republican resistance – Bani-Sadr submitted a letter detailing his behind-the-scenes struggle with Khomeini and his son Ahmad over their secret dealings with the Reagan campaign.

Bani-Sadr’s letter – dated Dec. 17, 1992 – was part of a flood of last-minute evidence implicating the Reagan campaign in the hostage scheme. However, by the time the letter and the other evidence arrived, the leadership of the House Task Force had decided to simply declare the Reagan campaign innocent. [See Consortiumnews.com’s “‘October Surprise’ and ‘Argo.’ [11]”]

Burying the History

Lawrence Barcella, who served as Task Force chief counsel, later told me that so much incriminating evidence arrived late that he asked Task Force chairman, Rep. Lee Hamilton, a centrist Democrat from Indiana, to extend the inquiry for three months but that Hamilton said no. (Hamilton told me that he had no recollection of Barcella’s request.)

Instead of giving a careful review to the new evidence, the House Task Force ignored, disparaged or buried it. I later unearthed some of the evidence in unpublished Task Force files. However, in the meantime, Official Washington dismissed the “October Surprise” and other Iran-Contra-connected scandals, like Contra drug trafficking, as conspiracy theories. [For the latest information on the October Surprise case, see Robert Parry’s America’s Stolen Narrative [9].]

As with Watergate and Nixon, Official Washington has refused to rethink its conclusions absolving President Ronald Reagan and his successor President George H.W. Bush of guilt in a range of crimes collected under the large umbrella of Iran-Contra.

When journalist Gary Webb revived the Contra-Cocaine scandal in the mid-to-late 1990s, he faced unrelenting hostility from Establishment reporters at the New York Times, Washington Post and Los Angeles Times. The attacks were so ugly that Webb’s editors at the San Jose Mercury News forced him out, setting in motion his professional destruction.

It didn’t even matter when an internal investigation by the CIA’s inspector general in 1998 confirmed that the Reagan and Bush-41 administrations had tolerated and protected drug trafficking by the Contras. The major newspapers largely ignored the findings and did nothing to help rehabilitate Webb’s career, eventually contributing to his suicide in 2004. [For details on the CIA report, see Robert Parry’s Lost History [9].]

The major newspapers have been equally unwilling to rethink the origins – and the significance – of the October Surprise/Iran-Contra scandal. It doesn’t matter how much new evidence accumulates. It remains much easier to continue the politically safe deification of “Gipper” Reagan and the fond remembrances of “Poppy” Bush.

Not only would rethinking Iran-Contra and Watergate stir up anger and abuse from Republican operatives and the Right, but the process would reflect badly on many journalists and historians who built careers, in part, by getting these important historical stories wrong.

However, there must come a point when the weight of the new evidence makes the old interpretations of these scandals intellectually untenable and when treasured sayings – like “the cover-up is worse than the crime” – are swept into the historical dustbin.

Emphasis Mine

see: http://www.alternet.org/tea-party-and-right/shocking-new-evidence-reveals-depths-treason-and-treachery-watergate-and-iran

 

What Obama Should Do Now

Source: Reich’s Blog

By: Robert Reich

“What should the President do now?

Push to repeal the sequester (a reconciliation bill in the Senate would allow repeal with 51 votes, thereby putting pressure on House Republicans), and replace it with a “Build America’s Future” Act that would close tax loopholes used by the wealthy, end corporate welfare, impose a small (1/10 of 1%) tax on financial transactions, and reduce the size of the military.

Half the revenues would be used for deficit reduction, the other half for investments in our future through education (from early-childhood through affordable higher ed), infrastructure, and basic R&D.

Also included in that bill – in order to make sure our future isn’t jeopardized by another meltdown of Wall Street – would be a resurrection of Glass-Steagall and a limit on the size of the biggest banks.

I’d make clear to the American people that they made a choice in 2012 but that right-wing House Republicans have been blocking that choice, and the only way to implement that choice is for Congress to pass the Build America’s Future Act.

If House Republicans still block it, I’d make 2014 a referendum on it and them, and do whatever I could to take back the House.

In short, the President must reframe the public debate around the future of the country and the investments we must make together in that future, rather than austerity economics. And focus on good jobs and broad-based prosperity rather than prosperity for a few and declining wages and insecurity for the many.”

Emphasis Mine

see: http://readersupportednews.org/opinion2/279-82/16330-focus-what-obama-should-do-now

 

Cleveland Laborfest & Forum and the Labor and New Deal Art Traveling Exhibition

The Main branch of the Cleveland Public Library (www.cpl.org) has been hosting a display of labor and New Deal (visual) art from January 18 – March 24, and on February 23 was host to Laborfest: a multimedia celebration ranging from videos and PowerPoint presentations to live drama and music.

I took the RTA downtown.


Checked out some Russian books in the old section.

The event was in the new Louis Stokes wing.

We were warmly welcomed by Leonard DiCosimo (President, Cleveland Federation of Musicians), who introduced Harriet Applegate (Executive Secretary of the North Shore Federation of Labor), who also welcomed us and introduced Patrick Gallagher (USW).

They were followed by three speakers: Prof. Ahmed White, University of Colorado Law School; Prof. Patricia Hills, Boston University; and Dr. M. Melissa Wolfe, Curator of American Art at the Columbus Museum of Art. All were well qualified to speak in their areas.

Prof. White covered the infamous ‘little steel’ strike of 1937, ranging from the Memorial Day Massacre in Chicago, in which ten strikers were killed, through actions in Ohio, to the eventual, inadequate settlements concluded in 1942. (I might add that he used the phrases ‘class struggle’ and ‘class consciousness’ positively and freely, and openly noted the contributions made by Communists and ‘fellow travelers.’) Those unfamiliar with the strike might see, for example: http://en.wikipedia.org/wiki/Memorial_Day_massacre_of_1937, or http://www.ohiohistorycentral.org/entry.php?rec=513

Prof. Hills spoke on Art and Politics in the Popular Front: The Union Work and Social Realism of Philip Evergood. (http://en.wikipedia.org/wiki/Philip_Evergood) She displayed works and covered the lives of several other labor/New Deal artists, including William Gropper, Louis Lozowick, and Hugo Gellert. Many artists of the period gravitated to the CPUSA and to the John Reed Clubs. (http://en.wikipedia.org/wiki/John_Reed_Clubs) Her talk emphasized the Popular Front era of the Party in the late 1930s, which focused on stopping fascism and expanding union organizing. (The works of these and other artists were on display in the main library building.)

Dr. Wolfe presented the life and works of Joe Jones – a worker-artist. Joe – who started out as a house painter – said he wanted to make art that would knock holes in walls, rather than merely make them pretty. Dr. Wolfe: “What did it mean to be a Communist artist, as Jones clearly decided he would be?…To be a Communist artist during the Third Period of the Communist Party – between 1928 and 1935 – meant that you were a class-conscious worker whose production – art – acted as a weapon to incite a revolution that would end Fascist structures of power and give workers control of their production…”

We then had a brief break, noshing on snacks and networking with friends, and then we were treated to live theater and live music. The former was a reading of “Capitalization,” by Marc Norwalk, presented by three members of Cleveland Public Theater;


the latter, music from the New Deal era performed by Todd Smith and the New Deal All Stars.


Enlightenment and entertainment: for what else could one ask? Senator Sherrod Brown? He and his lovely literary wife – the Progressive Pulitzer winner Connie Schultz – were there as well.


We Still Need Higher Revenues to Reduce Our Deficit

From: American Progress

By Michael Linden and John Craig

“Though conservatives like to point to the “historical average” level of tax revenue as support for their position that further deficit reduction should not include more revenue, the historical data actually prove just the opposite. If we want to reduce our budget deficit, we will need higher revenues than are currently projected.

As Congress and the White House contemplate possible approaches to deficit reduction that would replace the $1.2 trillion sequester that is set to begin in March, the arguments over revenue and spending levels have intensified. Most conservatives in Congress insist that any plan to replace the sequester must be paid for entirely by cutting spending—not by bringing in new revenue. Their position rests on the contention that, “This isn’t a tax problem. It is a spending problem.” And as proof, they often point out that revenues are already projected to rise above the historical average over the next 10 years.

They’re not wrong—at least not about the historical average. Federal receipts, as a percentage of gross domestic product, or GDP, have averaged 17.9 percent over the last 40 years. The Congressional Budget Office projects that—with the fiscal cliff deal in place and assuming that a variety of “temporary” tax breaks will be extended yet again—federal revenues will average 18.5 percent of GDP over the next 10 years. 18.5 percent is certainly bigger than 17.9 percent, so some conservatives say that this proves that we don’t need more revenue.

But what they’re missing is that 17.9 percent of GDP hasn’t been enough revenue for the last 40 years—and it certainly won’t be enough for the next 40 years. Remember, the federal budget was in the red for nearly every one of these last 40 years—and often deeply so. And the deficits were bigger when revenue was lower, smaller when revenue was higher—a fact that should surprise no one.

Take, for example, the last four years. From 2009 to 2012 federal receipts averaged just 15.4 percent of GDP—lower than at any point since 1950. Not surprisingly, record-low revenues translated into record-high deficits.

This basic relationship holds true over the past four decades. In the 40 years since 1973, 11 years saw deficits greater than 4 percent of GDP. In those same 11 years, revenues averaged 16.7 percent of GDP—well below the much-vaunted historical average. Similarly, there were 12 years in which the deficit was smaller than 2 percent of GDP. And in those years, revenue averaged 18.9 percent of GDP—much higher than the average. And, of course, in the four years in which we actually balanced the budget, revenue averaged 20 percent of GDP. (see Figure 1)

But full budget balance isn’t necessarily what we’re aiming for right now, so perhaps revenues don’t need to be increased all the way up to 20 percent of GDP. Indeed, President Obama has called for just enough deficit reduction to prevent the national debt, measured as a share of GDP, from rising. Others have called for somewhat more deficit reduction. Those goals will require deficits in the range of 2.5 percent of GDP or lower. And in the years since 1973—when the deficit was less than or equal to 2.5 percent of GDP—the federal government collected 18.8 percent of GDP on average in revenue.

While the difference between 18.8 percent and the current projection of 18.5 percent may not appear to be substantial, that 0.3 percentage-point increase over the next 10 years equates to about $640 billion in additional revenue. To put that in perspective, the president’s call to replace the sequester with half revenues and half spending cuts would equate to about $500 billion in new revenue. That would still leave us short of the “historical average” for years with low deficits.
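The arithmetic behind the $640 billion figure is easy to sanity-check. A minimal sketch, assuming a cumulative 10-year GDP of roughly $213 trillion (a figure implied by the article's own numbers, not taken directly from a CBO table):

```python
# Sanity check of the article's revenue arithmetic.
# Assumption: cumulative GDP over the 10-year budget window is roughly
# $213 trillion -- implied by the article's own figures, not a CBO citation.
CUMULATIVE_GDP = 213e12  # dollars

projected_share = 0.185    # 18.5% of GDP: current CBO revenue projection
low_deficit_share = 0.188  # 18.8% of GDP: average revenue in low-deficit years

# The 0.3 percentage-point gap, applied to a decade of output:
extra_revenue = (low_deficit_share - projected_share) * CUMULATIVE_GDP
print(f"Additional revenue over 10 years: ${extra_revenue / 1e9:.0f} billion")
```

Under that assumed GDP figure, the gap works out to just under the article's $640 billion, which is consistent with its rounding.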

And let’s not forget that what was sufficient in the past may not be sufficient in the future—a point which the historical data itself proves. In the years between 1953 and 1983 in which the deficit was smaller than 2 percent of GDP, revenues averaged 17.9 percent of GDP. But during the following three decades, in the years in which the deficit was smaller than 2 percent of GDP, revenues averaged a much-higher 19.1 percent. Our needs grew over time as our demographic, economic, and security challenges changed, so revenues that were sufficient in one generation became insufficient in the next. (see Figure 2)

This is especially true right now. The anticipated demographic shift as a result of the “baby boom” generation retiring means that there will be a larger proportion of the population relying on Social Security and Medicare in the coming years. Even with significant changes to these programs, this will mean higher costs to the federal government. If we want smaller budget deficits in the future, revenues must be higher than they have been in the past.

Yes, revenues are currently projected to rise above the historical average—but this misleading factoid proves little. Rather than showing why we don’t need more revenue, the historical data actually show clearly why we do. When deficits were small in past years, revenues were higher—higher than the historical average and higher than the current projections. Not only that, but the average revenue in low-deficit years has increased over time.

The lesson is clear and simple: If we want to reduce the deficit, we’re going to need more revenue.”

Michael Linden is the Director for Tax and Budget Policy at the Center for American Progress. John Craig is a Research Assistant in the Economic Policy department at the Center.

Emphasis Mine

see: http://www.americanprogress.org/issues/budget/news/2013/02/20/53961/we-still-need-higher-revenues-to-reduce-our-deficit/

 

Pope Benedict To Be Given Global Immunity From Catholic Church Sex Crime Prosecution

From: the freethought express

By: Dorian Staten

Pope Benedict turned a blind eye to Catholic priests molesting boys, then avoided charges due to diplomatic immunity, and will have police protection and continued immunity after he steps down as Pope. That’s one helluva lot more protection than the victims he ignored ever received.

From Reuters:

  • “His continued presence in the Vatican is necessary, otherwise he might be defenseless. He wouldn’t have his immunity, his prerogatives, his security, if he is anywhere else,” said one Vatican official, speaking on condition of anonymity.

Defenseless? Like the young boys who were raped by priests? Go on…

  • “It is absolutely necessary” that he stays in the Vatican, said the source, adding that Benedict should have a “dignified existence” in his remaining years.

Ah, yes. A ‘dignified existence.’ Just like all the victims of Catholic priest molestation who hanged themselves. Please, tell me more.

  • The final key consideration is the pope’s potential exposure to legal claims over the Catholic Church’s sexual abuse scandals.

Oh those pesky sexual abuse lawsuits. Who wants to have to deal with all that nonsense? Certainly not the high and mighty ex-Pope.

While the victims the Pope turned a blind eye to try to put the pieces of their lives back together, the Pope will be living high off the hog with protection and diplomatic immunity. It’s one old, pious, delusional man’s middle finger to every one of the Catholic Church’s sexual assault victims.

Emphasis Mine

See: http://dastaten.com/Thread-Pope-Benedict-To-Be-Given-Global-Immunity-From-Catholic-Church-Sex-Crime-Prosecution#.USAy3X3sMdC

 

How the State of the Union Worked

From: HuffPost

By: George Lakoff

N.B.: Prof. Lakoff is the master of the message, friend of the frame, and not easily pleased: he was.

“Political journalists have a job to do — to examine the SOTU’s long list of proposals. They are doing that job, many are doing it well, and I’ll leave it to them. Instead, I want to discuss what in the long run is a deeper question: How did the SOTU help to change public discourse? What is the change? And technically, how did it work?

The address was coherent. There was a single frame that fit together all the different ideas, from economics to the environment to education to gun safety to voting rights. The big change in public discourse was the establishment of that underlying frame, a frame that will, over the long haul, accommodate many more specific proposals.

Briefly, the speech worked via frame evocation. Not statement, evocation — the unconscious and automatic activation in the brains of listeners of a morally-based progressive frame that made sense of what the president said.

When a frame is repeatedly activated, it is strengthened. Obama’s progressive frame was strengthened not only in die-hard progressives, but also in partial progressives, those who are progressive on some issues and conservative on others — the so-called moderates, swing voters, independents, and centrists. As a result, 77 percent of listeners approved of the speech, 53 percent strongly positive and 24 percent somewhat positive, with only 22 percent negative. When that deep progressive frame is understood and accepted by a 77 percent margin, the president has begun to move America toward a progressive moral vision.

If progressives are going to maintain and build on the president’s change in public discourse so far, we need to understand just what that change has been and how he accomplished it.

It hasn’t happened all at once.

In 2008, candidate Obama made overt statements. He spoke overtly about empathy and the responsibility to act on it as the basis of democracy. He spoke about the need for an “ethic of excellence.” He spoke about the role of government to protect and empower everyone equally.

After using the word “empathy” in the Sotomayor nomination, he dropped it when conservatives confused it with sympathy and unfairness. But the idea didn’t disappear.

By the 2013 Inaugural Address, he directly quoted the Declaration and Lincoln, overtly linking patriotism and the essence of democracy to empathy, to Americans caring for one another and taking responsibility for one another as well as themselves. He spoke overtly about how private success depends on public provisions. He carried out these themes with examples. And he had pretty much stopped making the mistake of using conservative language, even to negate it. The change in public discourse became palpable.

The 2013 SOTU followed this evolution a crucial step further. Instead of stating the frames overtly, he took them for granted and the nation understood. Public discourse had shifted; brains had changed. So much so that John Boehner looked shamed as he slumped, sulking in his chair, as if trying to disappear. Changed so much that Marco Rubio’s response was stale and defensive: the old language wasn’t working and Rubio kept talking in rising tones indicating uncertainty.

Here is how Obama got to 77 percent approval as an unapologetic progressive.

The president set his theme powerfully in the first few sentences — in about 30 seconds.

Fifty-one years ago, John F. Kennedy declared to this Chamber that ‘the Constitution makes us not rivals for power but partners for progress…It is my task,’ he said, ‘to report the State of the Union — to improve it is the task of us all.’ Tonight, thanks to the grit and determination of the American people, there is much progress to report. …

First, Obama recalled Kennedy — a strong, unapologetic liberal. “Partners” evokes working together, an implicit attack on conservative stonewalling, while “for progress” makes clear his progressive direction. “To improve it is the task of us all” evokes the progressive theme that we’re all in this together with the goal of improving the common good. “The grit and determination of the American people” again says we work together, while incorporating the “grit and determination” stereotype of Americans pulling themselves up by their bootstraps — overcoming a “grinding war” and “grueling recession.” He specifically and wisely did not pin the war and recession on the Bush era Republicans, as he reasonably could have. That would have divided Democrats from Republicans. Instead, he treated war and recession as if they were forces of nature that all Americans joined together to overcome. Then he moved on seamlessly to the “millions of Americans whose hard work and dedication have not yet been rewarded,” which makes rewarding that work and determination “the task of us all.”

This turn in discourse started working last year. Empathy and social responsibility as central American values reappeared in spades in the 2012 campaign right after Mitt Romney made his 47 percent gaffe, claiming that 47 percent of Americans were not succeeding because they were not taking personal responsibility for their lives. This allowed Obama to reframe people out of work, sick, injured, or retired as hard working and responsible and very much part of the American ideal, evoking empathy for them from most other Americans. It allowed him to meld the hard working and struggling Americans with the hard working and just getting by Americans into a progressive stereotype of hard working Americans in general who need help to overcome external forces holding them back. It is a patriotic stereotype that joins economic opportunity with equality, freedom and civil rights: “if you work hard and meet your responsibilities, you can get ahead, no matter where you come from, what you look like, or who you love.”

It is an all-American vision:

It is our unfinished task to make sure that this government works on behalf of the many, and not just the few; that it encourages free enterprise, rewards individual initiative, and opens the doors of opportunity to every child across this great nation.

“Our unfinished task” refers to citizens — us — as ruling the government, not the reverse. “We” are making the government do what is right. To work “on behalf of the many, and not just the few.” And he takes into the progressive vision the heart of the conservative message. “We” require the government to encourage free enterprise, reward individual initiative, and provide opportunity for all. It is the reverse of the conservative view of the government ruling us. In a progressive democracy, the government is the instrument of the people, not the reverse.

In barely a minute, he provided a patriotic American progressive vision that seamlessly adapts the heart of the conservative message. Within this framework comes the list of policies, each presented with empathy for ideal Americans. In each case, we, the citizens who care about our fellow citizens, must make our imperfect government do the best it can for fellow Americans who do meet, or can with help meet, the American ideal.

With this setting of the frame, each item on the list of policies fits right in. We, the citizens, use the government to protect us and maximally enable us all to make use of individual initiative and free enterprise.

The fact that the policy list was both understood and approved of by 77 percent of those watching means that one-third of those who did not vote for the president have assimilated his American progressive moral vision.

The president’s list of economic policies was criticized by some as a lull — a dull, low energy section of the speech. But the list had a vital communicative function beyond the policies themselves. Each item on the list evoked, and thereby strengthened in the brains of most listeners, the all-American progressive vision of the first section of the speech. Besides, if you’re going to build to a smash finish, you have to build from a lull.

And it was a smash finish! Highlighting his gun safety legislation by introducing one after another of the people whose lives were shattered by well-reported gun violence. With each introduction came the reframe “They deserve a vote” over and over and over. He was chiding the Republicans not just for being against the gun safety legislation, but for being unwilling to even state their opposition in public, which a vote would require. The president is all too aware that, even in Republican districts, there is great support for gun safety reform, support that threatens conservative representatives. “They deserve a vote” is a call for moral accounting from conservative legislators. It is a call for empathy for the victims in a political form, a form that would reveal the heartlessness, the lack of Republican empathy for the victims. “They deserve a vote” shamed the Republicans in the House. As victim after victim stood up while the Republicans sat slumped and close-mouthed in their seats, shame fell on the Republicans.

And then it got worse for Republicans. Saving the most important for last — voting reform — President Obama introduced Desiline Victor, a spunky 102-year-old African-American Florida woman who was told she would have to wait six hours to vote. She hung in there, exhausted but not defeated, for many hours and eventually voted. The room burst into raucous applause, putting to shame the Republicans who are adopting practices and passing laws to discourage voting by minority groups.

And with the applause still ringing, he introduced police officer Brian Murphy, who held off armed attackers at the Sikh Temple in Oak Creek, Wisconsin, taking twelve bullets and lying in a puddle of his blood while still protecting the Sikhs. When asked how he did it, he replied, “That’s just how we’re made.”

That gave the president a finale to end where he began.

We may do different jobs, and wear different uniforms, and hold different views than the person beside us. But as Americans, we all share the same proud title: We are citizens. It’s a word that doesn’t just describe our nationality or legal status. It describes the way we’re made. It describes what we believe. It captures the enduring idea that this country only works when we accept certain obligations to one another and to future generations; that our rights are wrapped up in the rights of others; and that well into our third century as a nation, it remains the task of us all, as citizens of these United States, to be the authors of the next great chapter in our American story.

It was a finale that gave the lie to the conservative story of America, that democracy is an individual matter, that it gives each of us the liberty to seek his own interests and well-being without being responsible for anyone else or anyone else being responsible for him, from which it follows that the government should not be in the job of helping its citizens. Marco Rubio came right after and tried out this conservative anthem that has been so dominant since the Reagan years. It fell flat.

President Obama, in this speech, created what cognitive scientists call a “prototype” — an ideal American defined by a contemporary progressive vision that incorporates a progressive market with individual opportunity and initiative. It envisions an ideal citizenry that is in charge of the government, forcing the president and the Congress to do the right thing.

That is how the president has changed public discourse. He has changed it at the level that counts, the deepest level, the moral level. What can make that change persist? What will allow such an ideal citizenry to come into existence?

The president can’t do it. Congress can’t do it. Only we can as citizens, by adopting the president’s vision, thinking in his moral frames, and speaking out from that vision whenever possible. Speaking out is at the heart of being a citizen, speaking out is political action, and only if an overwhelming number of us speak out, and live out, this American vision, will the president and the Congress be forced to do what is best for all.

By all means, discuss the policies. Praise them when you like them, criticize them when they fall short. Don’t hold back. Talk in public. Write to others. But be sure to make clear the basic principles behind the policies.

And don’t use the language of the other side, even to negate it. Remember that if you say “Don’t Think of an Elephant,” people will think of an elephant.

Structure is important. Start with the general principles, move to policy details, finish with the general principles.

George Lakoff is Professor of Cognitive Science and Linguistics at the University of California at Berkeley and is the author, with Elisabeth Wehling, of The Little Blue Book.

Emphasis Mine

see: http://www.huffingtonpost.com/george-lakoff/how-the-state-of-the-unio_b_2693810.html

Five Reasons Ayn Rand Would Have Despised Paul Ryan

From: The National Memo

By: Jason Sattler

Paul Ryan may be backing away from his devotion to Ayn Rand, the woman who inspired him to enter politics. But there are some things that the 20th century’s most prominent prophet of selfishness would have probably appreciated about the Republican’s soon-to-be nominee for vice president. (N.B.: not written yesterday).

In fourteen years in Washington, D.C., Ryan passed only two bills—one naming a U.S. post office in his hometown, the other giving arrow makers a tax break. This abject uselessness on behalf of the American people is about as close as an elected official can get to “going Galt.” Being a star member of the most unproductive Congress in 65 years might also have impressed the author who saw the only purpose of government as protecting citizens from physical violence.

Rand might also admire Ryan’s desire to eventually zero out nearly every program that helps the poor and his desire to help rich people become richer with massive tax breaks. But there’s much about the Congressman from Wisconsin that she certainly would consider abhorrent. As Rand scholar Jennifer Burns said, “If Mr. Ryan becomes the next vice president, it wouldn’t be her dream come true, but her nightmare.”

Here are five reasons why Ayn Rand would have quickly shrugged off Paul Ryan.

Jack Kemp was a favorite of Ronald Reagan. The ex-football star, Congressman, and 1996 running mate of Bob Dole gave Paul Ryan his first job in politics as a speechwriter. A prime requirement of such a job would be the ability to praise the Gipper slavishly and constantly, something Ryan has been doing ever since. Ryan says that Republicans need to offer the kind of “boldness and clarity that Reagan offered in the 1980s.” Rand would disagree. She hated Reagan with a boldness and clarity that few liberals can match. In 1976 she wrote, “I urge you, as emphatically as I can, not to support the candidacy of Ronald Reagan. I urge you not to work for or advocate his nomination, and not to vote for him. My reasons are as follows: Mr. Reagan is not a champion of capitalism, but a conservative in the worst sense of that word—i.e., an advocate of a mixed economy with government controls slanted in favor of business rather than labor.”

A “conservative in the worst sense of that word” may be the single finest phrase she ever wrote.

Paul Ryan is as anti-abortion rights as any modern politician can be. He authored the Protect Life Act, which would deny an abortion even when needed to save the mother’s life. Rand’s stand on abortion rights was equally firm in the opposite direction. In Of Living Death, Rand wrote, “Abortion is a moral right—which should be left to the sole discretion of the woman involved; morally, nothing other than her wish in the matter is to be considered.” The idea that a woman possesses ownership of her own body even after one of her eggs has been fertilized is certainly one concept of freedom that has not been transmitted to those on the right like Ryan, who publicize her philosophy.

In his first speech as Mitt Romney’s running mate, Paul Ryan, a practicing Catholic, said “Our rights come from nature and God, not from government.” He clearly hoped to soothe any doubters on the religious right who might worry that he is too influenced by Rand’s writings. A militant atheist, Rand believed the source of all rights came from simply existing. “The source of man’s rights is not divine law or congressional law, but the law of identity. A is A—and Man is Man,” she wrote. About faith, a fundamental aspect of Catholicism, Rand wrote: “Faith is the worst curse of mankind, as the exact antithesis and enemy of thought.” It isn’t hard to believe that Rand would consider Ryan to be a walking manifestation of that enemy.

Paul Ryan’s great-grandfather started a company called Ryan Incorporated Central that has been contracting with the government for over a century. Ryan himself famously used his Social Security survivor’s benefits to pay for his college, which was easy to do considering that his father also left him a substantial share of his estate. And since he began serving in Congress back in 1999, Paul Ryan has been enjoying government health care. Ayn Rand preached self-reliance and her heroes were always self-made—unlike Ryan and Romney, both of whom enjoyed extraordinary financial stability and connections coming out of college. These luxuries made Ryan insensitive to the troubles faced by typical Americans and the need for a safety net, which Ryan likes to call a “safety hammock.”

Some people are born on third base and think they hit a triple. Ryan is standing on third base wondering why the batboy is being so lazy. Not exactly a heroic stand.

For all her ranting about the limits of government and the need to be independent, Ayn Rand benefited from Medicare. After decades of smoking, she needed surgery for lung cancer. And where did she turn? The evil of collectivism. Her supporters argue that “she paid into [the Medicare system] her entire life. Why shouldn’t she accept the benefits?” I agree. But all the people under 55 who would get a vastly different version of Medicare under Ryan’s plan have paid their dues, too. Lao Tzu said, “Watch your character, for it becomes your destiny.” Whatever Ayn Rand’s beliefs or intentions, her character provided a real testament to the virtues of a government that promotes its citizens’ general welfare.

———————————————————————————————————————————————————–

Emphasis Mine & my comments

see: http://www.nationalmemo.com/five-reasons-ayn-rand-would-have-despised-paul-ryan/