HHS and Soft Totalitarianism, by George Weigel.

The Obama administration’s recently announced HHS regulations, which would require Catholic institutions to subsidize health insurance coverage that provides sterilization, abortifacient drugs, and contraceptives, should be located within the context of the administration’s three-year-long effort to define religious freedom down.

As the administration has demonstrated in its international human rights policy, it regards religious freedom as a kind of privacy right: the right to freedom of worship, which the administration seems to regard as analogous to any other optional, recreational activity. No serious student of religious freedom, however, takes the redefinition of religious freedom as freedom-to-worship seriously. For if that redefinition were true, there would be “religious freedom” in Saudi Arabia, so long as the “worship” in question were conducted behind closed doors. And that is manifestly absurd.

The HHS regulations announced on January 20 are one domestic expression of defining-religious-freedom-down. The administration does not propose to, say, restore the 1970 ICEL translations of the prayer-texts of the Mass; that, even HHS might concede, would be a violation of religious freedom. But the administration did not think it a violation of religious freedom for its Equal Employment Opportunity Commission to try to overturn the longstanding legal understanding which held that religious institutions have a secure First Amendment right to choose their ministers by their own criteria—until it was told that it had gone way over the line in January’s Hosanna-Tabor Supreme Court decision (a judicial smackdown in which the administration’s own Court nominees joined).

Now, with the HHS “contraceptive mandate” (which, as noted above, is also a sterilization and abortifacient “mandate”), the administration claims that it is not violating the First Amendment by requiring Catholic institutions to provide “services” that the Catholic Church believes are objectively evil. That bizarre claim may well be another constitutional bridge too far. But the very fact that the administration issued these regulations, and that the White House press secretary blithely dismissed any First Amendment concerns when asked whether there were religious freedom issues involved here, tells us something very important, and very disturbing, about the cast of mind in the Executive Branch.

It is no exaggeration to describe that cast of mind as “soft totalitarianism”: an effort to eliminate the vital role in health care, education, and social service played by the institutions of civil society, unless those institutions become extensions of the state. As my colleague Yuval Levin has pointed out, it’s the same cast of mind that gave us Obamacare (which massively consolidates the health insurance industry into a small number of players who function like public utilities) and the Dodd-Frank financial sector reform (which tries to do to banks what Obamacare did to insurance).

The social doctrine of the Catholic Church emphasizes the importance of the mediating institutions of civil society in living freedom nobly and well. John Paul II coined the phrase “the subjectivity of society” to refer to these institutions, which include the family, religious communities, and voluntary organizations of all sorts. In Centesimus Annus, the late pope taught that, among their many other contributions to the common good, these institutions are crucial schools of freedom in which the tyrants that all of us are at age two are turned into democrats: the kind of people who can build free and virtuous societies.

It seems increasingly clear that the Obama administration does not share this vision of a richly textured democracy, in which civil society plays an important, independent role. Rather, it sees only the state and the individual, honoring the institutions of civil society insofar as they can be turned into simulacra of the state. Those with a sense of the ironies of American history will find it, well, ironic that it should be the Catholic Church—long held suspect for its alleged anti-democratic tendencies—that is now cast in the role of chief defender of the fundamental principles of democracy. But that is the task that Catholics have been given.

It is a task in which we dare not fail—for our sake, and for the future of American democracy.

George Weigel is Distinguished Senior Fellow of the Ethics and Public Policy Center in Washington, D.C.

Complete article and comments here


If Christianophobia Is an Opportunity, by Marco Respinti.

“In the Islamic world, Christians are being massacred for their religious faith. We are witnessing a spreading genocide that should provoke alarm at the global level.” We know this. But that the denunciation should appear on the cover of Newsweek is real news.

The American weekly has devoted a lengthy report to the subject under the famous and weighty byline of Ayaan Hirsi Magan Ali. Born in Mogadishu, the daughter of a Somali warlord, and “reborn” in the Netherlands, Ayaan became famous when, on November 2, 2004, the Dutch director Theo van Gogh, for whom she had written the screenplay of the short film Submission, was murdered by Mohammed Bouyeri, a Muslim killer of Moroccan origin. Since then Ali has lived under police protection; she has moved to Washington, where she works for the neoconservative American Enterprise Institute for Public Policy Research, and she makes no secret of her implacable aversion to Islam. Harder to digest, however, is her rather secularist critique of religion.

Ayaan certainly reveals nothing earth-shattering when she recalls the savage massacres of Boko Haram in Nigeria, the slaughters that stain Sudan with Christian blood, the continuing ordeal of Egypt, whose “young hopefuls” saw fit to inaugurate the “race to democracy” by massacring 23 Copts in the Church of the Saints in Alexandria on January 1, 2011 (the photograph of the blood-spattered Christ, which Newsweek also chose for its cover, is famous), the anti-Christian violence in Iraq, the intolerable situation in Pakistan, and Saudi Arabia, guardian of the “holy places” of Islam, which forbids with more than zealous rigor the construction of any Christian house of worship. But the real point is that this diligent and useful little exercise appears with great prominence on the cover of a weekly that is hardly devoted to Christian apologetics, written by someone who is hardly a missionary (readers who do not know English can rely on the partial translation that the Corriere della Sera published on February 13 in its foreign pages, page 17, without so much as a mention on the front page).

This happens, we would suggest, for three reasons. The first is that the evidence of the anti-Christian massacres is so great and so compelling that no one, least of all a team of front-rank professionals like the one that produces Newsweek, can afford to keep missing the story.

The second is the glaring failure of the so-called “Arab springs,” embraced uncritically by everyone but now turned (and it was very easy to foresee this from the start) into the exact opposite of what the do-gooders had hoped for. That the Copts should miss the days of Hosni Mubarak, who nevertheless oppressed them, rather than resign themselves to the “Salafist street,” and that the Assyrians should prefer Bashar Assad, whom they do not love and who does not love them, to the “rebels,” is as utterly paradoxical as it is highly significant.

The third, last, and in the long run perhaps most fruitful reason is that a certain world (the one that Newsweek at least photographs well; the one, to be clear, that thinks of its own advantage, of “magnificent and progressive destinies,” and of attacking the Church whenever it can; in short, a certain secular-secularist and radical-chic world) is realizing that only Christians are a seed of civilization. That only one culture generates the true humanism of rights and liberties. That if the Islamic model were to triumph in the Middle East, Africa, and Asia, the world as we have known it would end, and little good could be expected of whatever replaced it. In short, that if, in the places where they are a harassed and persecuted minority, we lose the Christians as interlocutors of our own world, divided as it is between post-Christianity and the new evangelization, that is, as the pillars and architraves of islands of society that are authentically “on a human scale and according to God’s plan” (Blessed John Paul II), in places where instead of God an erroneous idea reigns and man therefore dies, then everything is lost.

In the days when Lebanon was torn apart by four invading armies and the Christians were paying the price, the cynical and liberal then-U.S. Secretary of State Henry Kissinger thought the ideal solution was for the Maronites, a sign of contradiction but the only condition of true peace, to leave the country and resettle, for example, in Canada, where space is hardly in short supply. The hand that Newsweek has now decided to extend to the fight against “Christianophobia” seals the definitive end of that wretched idea, an act that deserves credit on the facts, whatever its motives.

Two fundamental things follow from this. The first is that, now that even the liberal blessing has arrived, there are no more excuses for tolerating the slaughter any longer. The second is that the liberals’ defense of Christians, however instrumental, is in any case a golden opportunity to begin re-evangelizing even the worst part of the West. For as long as peoples and individuals keep crossing the sea to come to us, while no one travels in the opposite direction (even Newsweek realizes this), we will hold an immeasurable apologetic advantage over everyone.

From La Bussola Quotidiana

Complete article and more stories here


Nature, nurture and liberal values, by Roger Scruton.

Biology determines our behaviour more than it suits many to acknowledge. But people—and politics and morality—cannot be described just by neural impulse.

Beyond Human Nature by Jesse Prinz (Allen Lane, £22)
Incognito by David Eagleman (Canongate, £20)
You and Me: the Neuroscience of Identity by Susan Greenfield (Notting Hill Editions, £10)

Human beings are diverse and live in diverse ways. Should we accept that we are diverse by nature, having followed separate evolutionary paths? Or should we suppose that we share our biological inheritance, but develop differently according to environment and culture? Over recent years scientific research has reshaped this familiar “nature-nurture” debate, which remains central to our understanding of human nature and morality.

For much of the 20th century social scientists held that human life is a single biological phenomenon, which flows through the channels made by culture, so as to acquire separate and often mutually inaccessible forms. Each society passes on the culture that defines it, much as it passes on its language. And the most important aspects of culture—religion, rites of passage and law—both unify the people who adhere to them and divide those people from everyone else. Such was implied by what John Tooby and Leda Cosmides called the “standard social science model,” made fundamental to anthropology by Franz Boas and to sociology by Émile Durkheim.

More recently evolutionary psychologists have begun to question that approach. Although you can explain the culture of a tribe as an inherited possession, they suggested, this does not explain how culture came to be in the first place. What is it that endows culture with its stability and function? In response to that question the opinion began to grow that culture does not provide the ultimate explanation of any significant human trait, not even the trait of cultural diversity. It is not simply that there are extraordinary constants among cultures: gender roles, incest taboos, festivals, warfare, religious beliefs, moral scruples, aesthetic interests. Culture is also a part of human nature: it is our way of being. We do not live in herds or packs; our hierarchies are not based merely on strength or sexual dominance. We relate to one another through language, morality and law; we sing, dance and worship together, and spend as much time in festivals and storytelling as in seeking our food. Our hierarchies involve offices, responsibilities, gift-giving and ceremonial recognition. Our meals are shared, and food for us is not merely nourishment but an occasion for hospitality, affection and dressing up. All these things are comprehended in the idea of culture—and culture, so understood, is observed in all and only human communities. Why is this?

The answer given by evolutionary psychologists is that culture is an adaptation, which exists because it conferred a reproductive advantage on our hunter-gatherer ancestors. According to this view many of the diverse customs that the standard social science model attributes to nurture are local variations of attributes acquired 70 or more millennia ago, during the Pleistocene age, and now (like other evolutionary adaptations) “hard-wired in the brain.” But if this is so, cultural characteristics may not be as plastic as the social scientists suggest. There are features of the human condition, such as gender roles, that people have believed to be cultural and therefore changeable. But if culture is an aspect of nature, “cultural” does not mean “changeable.” Maybe these controversial features of human culture are part of the genetic endowment of human kind.

This new way of thinking gained support from the evolutionary theory of morality. Defenders of nurture suppose morality to be an acquired characteristic, passed on by customs, laws and punishments in which a society asserts its rights over its members. However, with the development of genetics, a new perspective opens. “Altruism” begins to look like a genetic “strategy,” which confers a reproductive advantage on the genes that produce it. In the competition for scarce resources, the genetically altruistic are able to call others to their aid, through networks of co-operation that are withheld from the genetically selfish, who are thereby eliminated from the game.

If this is so, it is argued, then morality is not an acquired but an inherited characteristic. Any competitor species that failed to develop innate moral feelings would by now have died out. And what is true of morality might be true of many other human characteristics that have previously been attributed to nurture: language, art, music, religion, warfare, the local variants of which are far less significant than their common structure.

Continue reading the article at Prospect


The Real Margaret Thatcher Story, by Daniel Yergin.

In the late 1970s, inflation neared 20% and the ruling party still wanted to own the means of production. Enter the grocer’s daughter.

A movie producer once shared with me a maxim for making historical films: Faced with choosing between “drama” and “historical accuracy,” compromise on the history and go with drama.

That is certainly what the producers of “The Iron Lady” have done. The result is a masterly performance by Meryl Streep as former British Prime Minister Margaret Thatcher. But the depiction of Mrs. Thatcher in the movie misses much of the larger story. That story—the struggle to define the frontier between the state and the market, and the calamities that happen when governments live beyond their means—is directly relevant to the debt dramas now rocking Europe and the United States.

After World War II, Britain created the cradle-to-grave welfare state. A clause in the constitution of the Labour Party, which came to power in 1945, called for the government to own the “means of production.” That ended up ranging from coal mines and the steel industry to appliance stores, hotels and even a travel agency.

The postwar “mixed economy” became a recipe for economic decline. Inflexible labor rules and competition for power among unions led to endless strikes and disruption of the economy. Workers in state-owned companies were essentially civil servants, which gave them enormous clout when they struck. The system stifled innovation, adaptation and productivity, making Britain uncompetitive in the world economy. Yet the spending and debt to support an expansive welfare state went on.

By the late 1970s, enormous losses had piled up at state-owned companies, debts that had to be covered by the British Treasury. A desperate Britain needed to borrow money from the International Monetary Fund. Inflation was heading toward 20% as the deficit mounted and strike after strike disrupted economic life. Even grave diggers walked off the job. The continuing deterioration of the country seemed inevitable. Some predicted that Britain would soon be worse off than communist East Germany.

Enter Margaret Thatcher, the daughter of a grocery-shop owner who had begun her own professional life as a food chemist but decided she preferred politics to working on ice cream and cake fillings. In 1975, she was elected leader of the Conservative Party. Four years later, she became prime minister, determined to reverse Britain’s decline. That required great personal conviction, which Meryl Streep brilliantly captures.

Mrs. Thatcher came with her own script, a framework provided by free-market economists who, even a few years earlier, had been regarded as fringe figures. One telling moment that “The Iron Lady” misses came when a Conservative staffer called for a middle way between left and right and the prime minister shut him down mid-comment. Slapping a copy of Friedrich Hayek’s “The Constitution of Liberty” on the table, she declared: “This is what we believe.”

Years later, when I interviewed her in London, Mrs. Thatcher was no less firm. “It started with ideas, with beliefs,” she said. “You must start with beliefs. Yes, always with beliefs.”

As prime minister, she encountered enormous resistance, even from her own party, to putting beliefs into practice. But she prevailed through difficult years of painful budget cuts and yanked the government out of loss-making businesses. Shares in state-owned companies were sold off, and ownership shifted from the British Treasury to pension funds, mutual funds and other investors around the world. This set off what became a global mass movement toward privatization. (“I don’t like the word” privatization, she said when we met. What was really going on, she said, was “free enterprise.”)

But it was labor relations that were the great drama of the Thatcher years. The country could no longer function without labor reform. This was particularly true of the government-owned coal industry, which was being subsidized with some $1.3 billion a year. The Marxist-led coal miners had great leverage over the entire economy because coal was the main source for generating Britain’s electricity. Everyone remembered the paralyzing 1974 National Union of Mineworkers strike, which blacked out the country and brought down a Conservative government.

By the 1980s, it was clear that another confrontation was coming as Mrs. Thatcher’s government prepared to close some of its unprofitable mines. The strike started in 1984 and was marked by violent confrontations. But the Iron Lady would not bend, and after a year the strike faltered and then petered out. Thereafter, the road was open to reformed labor relations and renewed economic growth.

In 1990, discontented members of the Conservative Party did what the miners could not and brought Mrs. Thatcher down. The Iron Lady, in their view, had become the Imperious Lady of domineering leadership. They revolted, forcing her to resign.

Margaret Thatcher and Ronald Reagan were intellectual soul mates, but the movie does not touch on arguably their greatest collaboration, which was ending the Cold War. Mrs. Thatcher’s visit to Poland in 1988, for instance, gave critical impetus to the Solidarity movement in its struggle with the Communist government.

But the difference between how Reagan and Mrs. Thatcher are seen today is striking. The bitterness of the Reagan years has largely been forgotten. Not so with Margaret Thatcher. She remains a divisive figure. Her edges were sharp, as could be her tone. Moreover, she was a woman competing in what had almost completely been a man’s world.

Yet her true impact has to be measured by what came after, and there the effect is clear. When Tony Blair and Gordon Brown took the leadership of the Labour Party, they set out to modernize it. They forced the repeal of the party’s constitutional clause IV with its commitment to state ownership of the commanding heights of the economy.

They did not try to reverse the fundamentals of Thatcherite economics. Mr. Blair recognized that without wealth creation, the risk was redistribution of the shrinking slices of a shrinking pie. The “new” Labour Party, he once said, should not be a party that “bungs up your taxes, runs a high-inflation economy and is hopelessly inefficient” and “lets the trade unions run the show.”

The lesson that governments cannot permanently live beyond their means had been learned. When economies are growing in good times, that lesson tends to be forgotten. Yet the longer it is forgotten, the more painful, contentious and dangerous the relearning will be. That is the real story of the Iron Lady. And that is the story in Europe—and the U.S.—today.

Mr. Yergin is author of “The Quest: Energy, Security, and the Remaking of the Modern World” (Penguin 2011) and chairman of IHS CERA.

