The Raspberry Patch now continues with The Age of Spectacle manuscript rollout project. Regular readers will note that our Friday-to-Friday flow was interrupted on Sunday past by a relatively short post—less than 2,000 words—about the stunningly rapid fall of the Asad regime in Syria. It took me about 45 minutes to draft, since it was a follow-on to an essay I had published four and a half years ago, adumbrated by some wee-hours insomnia-induced thought.
I enjoyed doing the piece for several reasons, one of which was that it took me back to a long phase of my so-called career when I used my university training, think-tank apprenticeship, and government experience to write on the Middle East quite often, both before and during my time as a thought magazine editor. Here in The Raspberry Patch I have written only once on this portfolio, on Lebanon back on January 14. So Sunday’s post joins it as the sum so far of this Substack’s writing of that kind.
Alas, I did commit one brainfart typo in “A Bold Plan for the New Syria…,” writing Irbid when I meant Idlib. I’ve fixed that now, and one other similarly produced typo (the missing word “not” in the next sentence). I’ve never been to Idlib but I have been to Irbid, which is in Jordan, not Syria. Funny how the brain works and occasionally misfires, isn’t it? I thank my friend Claire Berlinski for pointing out the typo; that, among other things, is what friends are for. Friends are also for commenting substantively and helping make pieces better—and here thanks are due to some old friends and colleagues: Fred Hof, who has forgotten more about Syria than I ever knew; Aaron David Miller; Dov Zakheim; and a few others.
Finally by way of throat clearing, let me report that my wife and I attended a Labor Chorus concert at the Ethical Society building down on 16th Street on Saturday night. We had a fine time. Since my Dad was a Teamster, and nary a discouraging word about Jimmy Hoffa could be uttered in our home, I have to admit that I have always felt a lot more cozy and fuzzy in the company of trade union grime-and-grit leftists than I ever have among country-club conservative types. I’m reasonably sure I always will, too.
But as a kind of PS to this past Friday’s post and a bridge to what follows below, I have to note that this group assembled in the Ethical Society building was redolent of gender-tinged wokeism and tinged with identity politics tics here and there, reminding me that as someone left-of-center on political economy issues but more conservative on culture war issues I rarely if ever feel entirely at home in any group. I noted several references in the song lyrics that evening to Martin Luther King, Jr. that clashed with the ill-fitting identity politics aura in the room. Despite some apologists trying to tell us that Dr. King didn’t really mean all those things he said back when—the “not by the color of his skin but by the content of his character” theme—I think it’s clear that he meant every word. So it puzzles me how the contradiction between King’s insistence on the ideal of a harmonious color-blind society and identity politics’ determination to achieve the exact opposite via its conflict-only, zero-sum template goes so often unnoticed, or suppressed by the self-inflicted tyranny of the ideological doxy du jour.
Well, bearing that in mind, on we go.
Chapter 8. Cognitive Gluttony Meets Race and Gender, part 2
Ripleys on the Left
Lest anyone think that only adolescent fantasists on the Right can “Do a Ripley,” the woke Left is no slacker here. But the “Ripleys” of the illiberal Right, undertaken in a mode of militant nostalgia, and those of the illiberal Left, undertaken in a mode of votive-act utopian yearning, are different in that and other ways. MAGA conflict entrepreneurs lie, regularly and shamelessly, because it keeps working for them. Most woke entrepreneurs do not lie; most really believe what they are saying despite the internal contradictions and abundant negative evidence surrounding them. At best they lie to themselves as, yet again, ideology bests reality. Delusion is not the same as malice aforethought.
This difference matters, morally and otherwise. But to the non-woke the effect is much the same in terms of the psychology of spectacle: to listen to the magical gymnastics required in these delusions is to be smacked in the face with a “hey, you don’t experience that every day” moment, and for a second or two you don’t know for certain if the delusional gymnast is being serious or is trying to pull your leg—A or not-A. Let some fairly recent examples illustrate the point, several being about the deeply dug misanthropies of identity politics.
On January 22, 2023, a 72-year-old mentally imbalanced Vietnamese immigrant opened fire on a Lunar New Year celebration in Monterey Park, California, killing eleven people. Two days later another attack, at Half Moon Bay, California, killed seven. The second shooter was a Chinese immigrant. Initial reactions to both shooters were premised on the assumption that the shooters were “white” anti-Asian racists. When it turned out that this premise was false, all sorts of mental gyrations followed.
For example, an Asian-American “wellness” reporter for USA Today wrote: “. . . this time the tragic shootings might not have been out of racism. But that doesn’t negate the constant harassment, violence and hatred we battle on a daily basis.” In other words, we were wrong in our first reactions as to what happened, but that still somehow proves the rule despite the exception. “Might” not have been out of racism?
The only way it might have made sense to conditionalize the non-white-racist character of the attacks would have been in reference to intra-Asian racism. Vietnamese and Chinese do not exactly love and snuggle with one another, and, to put it mildly, neither do Vietnamese and Cambodians. But that could not have been what the writer was thinking: although Sino-Vietnamese enmity is a hoary fact of life, such racism is definitionally impossible in the woke faith, for only “White” people can be racist.
The New Yorker’s Michael Luo admitted that he, too, “immediately saw the specter of anti-Asian violence” despite, as it turned out, its not being there. Why?
I thought about a massacre that had taken place about a hundred and fifty years earlier in Los Angeles, just a few miles west of the Monterey Park shooting. On the evening of October 24, 1871, an angry mob, bearing knives, pistols and clubs, surrounded the city’s Chinese quarter and began dragging our terrified residents . . . . Had I been paranoid? Too quick to believe that a racial motivation might be the cause? I returned to the history in front of me.
Paranoid to rely on an event from October 1871? Maybe not. Jews of East European ancestry are quick to fear pogroms where they don’t exist, too, even after a century or more. Too quick to hit on a racial motivation? Obviously, yes, and returning to contemplate history does not change that. Journalists as well as scholars are obliged to establish the basic facts before rushing to and helping to spread confusions.
The mainstream media quickly dropped these stories when the racist frame was found clearly not to fit, in stark contrast to the days-long effort to uncover a racist motivation behind the Atlanta “spa massacre” of March 2021, since that shooter was a Caucasian. There was none to uncover.[1]
Not long after the two California mass shootings came the killing of Tyre Nichols by five Afro-American policemen in Memphis. Again the initial reaction was that this was racism, the cops again killing an innocent Afro-American. It fairly soon became known, however, that all five policemen were also Afro-American, and that the Memphis police chief was Afro-American, too. That logically should have made the white supremacist frame a clear non-fit, but for some it didn’t. We soon beheld a whole herd of two-headed carnival calves, mooing more or less the same tune.
The woke have a ready answer when a non-white person does something murderous, or merely supports policies unpopular among that ethnic group: “multiracial whiteness.” An invention of New York University Assistant Professor Cristina Beltrán, “multiracial whiteness” comes down to minority status immigrants or others wanting to be part of the majority so badly that they will associate with “whiteness” even against their own identity-group interests—as Professor Beltrán defines them. So if the “White” racist majority is aggressive, exclusionary, and domineering, these misfit Latinos and “Blacks” join up because they want to be aggressive, exclusionary, and domineering, too.[2]
That, precisely, was the bizarre explanation mooed forth after Tyre Nichols’s killing in order to save the ideologically mandated premise of “White racism.” Former Congressman Mondaire Jones asserted: “If you think the Memphis police officers had to be white in order to exhibit anti-Blackness, you need to take that AP African American Studies course Ron DeSantis just banned.” The Atlantic’s Jemele Hill wrote:
I need so many people to understand this regarding Tyre Nichols. Several of the police officers who murdered Freddie Gray [in Baltimore] were Black. The entire system of policing is based on white supremacist violence. We see people under the boot of oppression carry its water all the time.
So, too, Ta-Nehisi Coates, who ascribed the death of a friend at the hands of an Afro-American policeman to “the real culprit: The dream of acting white, of talking white, of being white.” That, claimed Coates, is what killed his friend Prince Jones, “as sure as it murders black people in Chicago with frightening regularity.”
It is not prima facie insane to imagine Afro-Americans wanting so much to be like “White” people that they will adopt behaviors and attitudes alien to their own interests and dignity. Something like that happened occasionally among Jews in Nazi concentration camps who, acting as capos, were more complicit than necessity demanded in the mass murder of their brethren. But to compare the situation of Afro-American police in Memphis to the situation of Jews in Nazi concentration camps is a bit of a stretch, to put it gently. And one would think that Ms. Hill would have wanted to speak with at least some of the five Memphis policemen before reaching any conclusion about what they thought they were doing. Apparently, the idea never crossed her mind.
At least Jelani Cobb, writing in the New Yorker, came close to a fitting analogue to the death camps when speaking about conditions under slavery: “The most pernicious effects of American racism,” he noted, “were to be seen in what happened in the absence of white people, not in their presence.” He was referring to the hierarchy within slave populations that sometimes mimicked the capo-like behaviors of some Jews during World War II. “Boss” slaves could be just as vicious to other slaves as white workers and owners, and Cobb knows that in the antebellum South some free Afro-Americans were themselves slave-owners, and not always particularly compassionate ones.[3]
But most of today’s woke invocations of Afro-American police “acting white, talking white, being white” amount to magical thinking, in perfect alignment with the mythic law of metamorphosis, under which anything can turn into anything else if, under the twin law of consanguinity, it “feels” like it must be that way, all in order to preserve the unfalsifiable premise at the bottom of this sort of adolescent ideological thinking. Such explanations are at a complete loss to explain the rise of Afro-Americans and other “blacks” to high office and high positions in law, business, education, and academia except to accuse them all of being race-traitors who down deep just want to be white. This is where unfalsifiable ideological thinking ultimately leads: to rank absurdity.
Speaking of which, the woke Left is all for DEI virtue-signaling in homage to its RUE (radical undifferentiated egalitarianism) secularized theology. But what is DEI exactly? It is a form of luxury thinking, a little like a votive act, that makes excessively guilty-feeling white people feel less guilty, and also just happens to provide jobs for the mostly middle-class cadres who now populate what has become a DEI industry in universities and the corporate world. It amounts to both tokenism raised to the status of a religious principle and, perhaps not incidentally, an ingenious form of self-dealing.
No evidence exists that any DEI program has ever actually helped a disadvantaged person who couldn’t find help, if needed, somewhere else. The entire fad is thus much worse than affirmative action, which, at least in the beginning when it was meant to be a temporary jump-start to a more level playing field, did actual good.[4] That was because affirmative action was devoted to advancing merit to its proper and fair level, while DEI pretends that merit does not exist for any practical purpose since in the woke mentality no one can be better than anyone else, and all advancement has to be in group terms, not individual terms, anyway.
It is actually worse than that; it is, predictably, counterproductive by its own lights. A revelatory essay on the DEI experience of the University of Michigan by Nicholas Confessore in the New York Times Magazine is important for two reasons.[5] First, Confessore shows how expensive the program has been—more than $250 million over a decade—and how many university positions are bound up with it—241, the vast majority of the positions occupied by females. But, more important, he shows how badly the effort has backfired: It has failed to increase the number of minorities by making campus more friendly to them; it has instead created what he calls a culture of grievance thanks to which minorities on campus are more self-isolating than ever. Of course they will be: Create a mindset for young people that stresses how awful white people (and males) are, and an evoked set will form that drives students to first define and then see.
The second reason the essay matters is that it appeared in the New York Times, a bastion of woke subculture. The fact that the Gray Lady has allowed a sane, fact-based centrist critique of DEI is a hopeful sign.
What ought to be energizing the Left are the oligarchic distortions of the economy and the natural and subsequent plutocratization of our politics that grow worse day by day on account of the gigantism-begetting technology of the digital era—what we have described as the Net Effect. We now endure financialized rentier economic structures, linked to a globalized oligarchical elite, that bear little resemblance to the shopkeeper capitalism of Adam Smith’s time. Connection to these global structures is out of reach for most Americans. The entire set-up is to no small degree beyond democratic accountability, as well, and the result has been, in effect, the ongoing undermining of the American dream for those who need it most to come true: opportunity open to all, with only the sky the Horatio-Algerist limit, for those willing to work hard and follow the rules. The manifest result has been the skewing of social mobility along lines of education and class, with non-college educated whites, and especially white males, being particularly harmed by the main trends.
Understanding that reality takes work and patience, and doing something about it takes real organizing and results-oriented activism. But none of that is the woke Left’s forte. It also requires that the Left actually care about disadvantaged people of all skin tones, not fawn over some while calling the rest “deplorables.” The stark truth is that typically college-educated woke types do not actually give a shit about poor people unless their skin tones happen to fit with their exotic self-flagellating “white magic” theology. It is as Eric Hoffer wrote in 1970: “Scratch an intellectual and you find a would-be aristocrat who loathes the sight, the sound, and the smell of common folk.”[6] Alas, in the past roughly three-quarters of a century we have gone from the economically grounded, communalist (but not communist) Old Left to the stoned out, narcissistic New Left to the utterly clueless, but equally well-intentioned, woke Left. None of that is genuinely progressive and none of it resembles progress.
Readers of a certain age and experience will recognize that the temperamental signature of the New Left more than half a century ago has segued into the temperamental signature of wokeness: Overwhelmingly other-directed personality types (in Lonely Crowd terms) insisting on elevating their personal predilections above all else. What does this amount to? A kind of flamboyant narcissism in a crowd of other flamboyant narcissists, or, put a bit differently, competitive narcissistic exhibitionism. Oxford Dictionaries chose “selfie” as its winning new word of 2013, but that was only because smartphones with built-in cameras were still fairly new a dozen years ago. Clearly, the impulse that gave rise to the neologism had been marinating in the culture for quite a while.
This narcissistic exhibitionism, the more spectacular the better—which has traveled a more or less straight line from Yippies trying to levitate the Pentagon in October 1967 to Naked Athena on the streets of Portland, Oregon in June 2020—is precisely what old Left pro-trade unionists like Eugene Genovese most despised about the original hippie/yippie phenomenon. As the son of a Teamster who started college in the fall of 1969, I thought Genovese had it spot on. I still do. When politics goes basically hollow, as ours has, any damn silly thing might come to inhabit the institutions that once seriously did the people’s business. And what has come to inhabit them is the grotesquely unserious obsession with near-mindless entertainment, the zenith of which is spectacle. Under such conditions a narcissistic shaman can even get elected President of the United States.[7] Twice. And so now, after the second time, we are about to test John Fowles’s credo from The Aristos: “I believe in the essential sanity of man.”[8]
More Sex
We talked about “Sex Magic” in Chapter 2, but we must return to the subject, both because some people just seem never to get enough of it and for other reasons besides. This is glibly put, but it is not meant unseriously, as we will see soon enough.
One cannot help but notice how much of the energy of leftwing culture warriors has been devoted over the past thirty years or so to gender, sexuality, and the literal appurtenances thereof. Let’s resume our analysis with a little history.
This chapter in American social history started in the fabled Sixties, as did many things, with first-stage feminism, the deceptively mild initial symbol of which was the spectacle of public bra-burning. This was around the same time that bare breasts started appearing in mass-circulation magazines, but only sparingly and subtly, as though in a Find-Waldo mode (of course, not yet invented). In those still-innocent times teenage boys could be found paging expectantly through fresh copies of Life and Look in search of the Holy Boob of the week or month.
So it was about spectacle from the get-go, but certainly not only about spectacle: The grievances feminists expressed were and remain real. Those realities have been thrown headlong into public space and, perhaps not coincidentally, avidly marketed ever since. We now have in our cultural archive theater and musical scripts about vaginas (“The Vagina Monologues”) and menopause (“Menopause: The Musical”) among other artifacts of sexual spectacle. Is there anything wrong with artists taking a cultural fad to the bank? Perhaps not, but something important has been going on since the feminist upheaval, and it is apparently not well understood. To lay it out we need to return for a moment to David Riesman’s reference to inner- and other-directed personality types.
To simplify a bit, inner-directed personalities internalize rules concerning right and wrong, and when they violate a rule they feel guilt privately, as well as dishonored before others if their violation becomes known. Other-directed personalities recognize rules created and upheld by others, and when they violate a rule they feel shame. The distinction matters. Aside from sociologists like Riesman, cultural anthropologists far earlier distinguished between honor and shame societies and tracked the practical differences the distinction made in modal behaviors, including in such seemingly peripheral matters as sense of humor.[9] As American society moved away from more of an inner-directed culture to an other-directed one, guilt about having sex out of wedlock (and for some people even within wedlock, evidently) gradually transformed into shame. An inner-directed personality feels guilt about what is established in social mores as a sexual transgression whether others not in the room at the time become aware of it or not. Not so with other-directed personalities: One only feels shame over some act if others do find out about it.
Sexual liberation in the Sixties, contrary to received wisdom, was not about the abolition of inhibition as such or of bourgeois institutions. It was, deep down, about the abolition of shame, especially for women who wished to indulge their natural sexual appetites on a par with men (we return to this theme below when we discuss the real emotional depth to the abortion debate). With guilt a problem for only a shrinking remainder of women, getting rid of shame was the next liberation frontier, and it proved no heavy lift to get there.
It happened fast, too. A long-forgotten illustration of the essence here resides in Roger Ebert’s review of Alex De Renzy’s 1971 film “The History of the Blue Movie.” “It is a melancholy landmark in the disintegration of our age,” wrote Ebert, “that genuine hard-core pornographic stag movies are now assembled into documentaries for young couples to see on Saturday night dates.” (I actually took a date to see that film, in Georgetown, in 1971; I confess it.) Ebert sensed that something was very wrong about this. “Somehow,” he staggered forward,
a stag film should mean more than that; it should be surrounded by the heady excitement of a forbidden thrill. . . . But no. That era is as long dead . . . . We live in an age so compulsively permissive that I sometimes wonder whether anyone under 21 would know a forbidden thrill if he felt one.
Then Ebert stumbled: “Norman Mailer was on the right track in ‘The Armies of the Night’ when he protested against those who would remove the guilt from sex: Without guilt, he wrote, sex would lose half the fun.”[10] No: De Renzy’s arsenal of compulsive permissiveness was aimed not at guilt. By 1971 sexual guilt was already headed to the cultural cemetery, and Mailer, already an old man from the prior epoch—born in January 1923, so an ancient 48 years old at the time—missed the mark. The Sixties counterculture aimed instead at killing sexual shame, and made much progress.
Sexual shame still exists of course, but in transmuted form. In 1971, sexual shame was still mainly about immodesty, about not wanting to be seen in an other-directed culture as promiscuous. Thanks in part to the magical rectangle’s relatively recent impact in spreading pornography throughout American society—many other societies, too—shame is now less about immodesty than unattractiveness. The ensemble of the perfect life by entitlement now includes for many younger people having a sexually attractive body as defined by mass-media celebrity standards. One result, aided by CGI- and now AI-enhanced apps, is a plague of body dysmorphia that is sending mainly women to plastic surgeons demanding to be given their favorite celebrities’ lips, cheeks, butts, or boobs.
Salacity is rapidly becoming obsolete in this third decade of the 21st century, giving way to needless, childish embarrassment in the face of an exceedingly narrow definition of mere appearance-attractiveness. Think the massive increase in visible tattooing and body piercing is an unrelated coincidence? In a culture being overtaken by spectacle, appearance invariably trumps substance (personality having long since replaced character), and now always trumps later. How ironic that feminism set out to abolish the artificial, commercialized, straitjacketing “Playboy” metrics of female beauty only to advance, inadvertently, a cultural trend that has produced the opposite outcome. Flatten the significance of personality, let alone character, and appearances are all that remain.
The eclipse first of sexual guilt and then of sexual shame bears implications beyond the realm of sexuality. It is at the root of the defenestration of social authority, for if a rule elicits no internal assent to its good sense, let alone its necessity, then it can only be enforced by external suasion and the threat of penalty. That turns authority and all who administer it into an adversary instead of a partner in the maintenance of civil order. The erosion of shame has thus spread to the psychology of crime: If people feel no sense of guilt or shame over breaking laws that clearly harm others’ interests and wellbeing, or so little of either that an exculpatory pretext for breaking laws is but a subjective twitch away, then, all else equal, there will be lots more lawbreaking. (All else is not equal, and never is, in case you were wondering.)
What has the eclipse of shame to do with spectacle? Just this: Shamelessness removes all restraints to the escalation of the shock bar as spectacle entrepreneurs, commercial and political, seek the next “wow now” of the attention economy. The relationship is reciprocal and mutually reinforcing. Shamelessness enables the shock bar to rise, and displays of shamelessness, as part and parcel of the retinue of media culture, deepen the license for shamelessness as the generations roll onward. A downward spiral to moral depravity is a possible consequence, and that is not so small a step toward an anti-civic life that repeats Lord of the Flies moments at social scale. The shamelessness of the MAGAt world is obviously the most salient example before us. And, not surprisingly, it affects the excruciating debate we have about abortion.
Abortion: Serious Issues, Specious Arguments, Sunken Roots
Related orthogonally to the adolescent obsession with sex, gender, and genitalia is the liberal as well as the woke Left’s bizarre attitude toward abortion. How so?
The Left is on the whole communitarian and collectivist, as all sensible leftwing movements should be to offset the distortive hyper-individualism that inheres in extremist libertarian conceptions of human social nature. It also cherishes affinity communities as passable substitutes for traditional extended families—also fine when those are the only available options. But when it comes to abortion, the typical left-of-center line is hyper-individualist: This is a decision, this line insists, that must be made only by the woman, and maybe also her doctor. The biological father? The woman’s parents and siblings? Her grandparents, aunts and uncles? Her clergyman or clergywoman? No, according to this radical individualist view none has any say in the matter, even though the decision to have an abortion affects the entire community of intimates within which nearly every woman is networked.
What justification is proffered for this view? Usually none: Most of the time it just boils down to the assertion that a woman owns her own body and so may do with it whatever she wants. This assertion is rarely examined; to most who assert it, it seems flatly self-evident. But what reasoning lies behind the assertion when one is forced to produce it? By what logic does a woman, or a man for that matter, own her or his own body?
The meaning of the verb to own anything material or physical breaks down into three basic categories. One owns something if one purchased it, if one designed and fabricated it, or if one was the beneficiary of a gift from another person or persons. Subcategories exist, as well: For two examples, a loan is a temporary gift, and ownership of something can be joint with other people. So now back to a woman’s body: Did she buy it? Did she design and make it? Was she gifted it or loaned it by some other person or persons? Did she even ever ask for it? Did she own her own body legally before age 18? The obvious answers: no, no, no, no, and no. Whence, then, the claim to ownership?
All Abrahamic—Jewish, Christian, and Muslim—exegetes in moral reasoning have concluded over many centuries, as have the sages of other faith communities worldwide, that a person’s body is a gift on loan from the Creator, a temporary if exclusive gift, and that certain responsibilities attend the care of that gift along with right to use and enjoy it for as long as the spirit of life inhabits it. Only in a time and among people for whom religious law and tradition hold near zero suasive power could a claim to owning one’s own body be considered non-risible. But that just demonstrates how deep the switchout of America’s stories, harking back to Chapter 2, really is. John Adams has something to say to us yet again:
We have no government armed with power capable of contending with human passions unbridled by morality and religion. Avarice, ambition, revenge, or licentiousness would break the strongest cords of our Constitution as a whale goes through a net. Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.[11]
For Adams or any of his Founder contemporaries to argue that a woman or a man owned her or his own body, and so could do with it anything desired, would never have occurred in a still-religious age yet to discover the lure of the hedonistically inclined imperial-I.
Rabbinic Judaism certainly aligns with this view, which is why tattoos, exotic piercings, and above all suicide are violations of Jewish law: No one may deface or destroy what God has created and given as a loan. The concept of the sanctity of human life, the inner pivot of all human rights principles, resides in this universal body-loan arrangement: One may not harm another person wantonly not because that person owns his or her body, but because to desecrate any living human body is to deny its ultimate origin and so is an affront to God.[12]
For this reason, halachic Judaism rejects abortions unless compelling reasons, mostly related to the welfare of the mother, are in play. Exceptions to the basic rule therefore exist and always have, for historic reasons having to do with the Hadrianic persecutions of the early 2nd century CE. The Orthodox rabbinical view, essentially unchanged for some two millennia, is similar to the views of virtually all Islamic jurists and many Christian clergy, as well.[13]
The woke Left rejects all this. It does sense that violence against innocents is categorically wrong, but it lacks any logical ethical basis for its own view since its postmodern premise rules out the existence of non-relative moral reasoning. It rejects both revelation and natural law as sources of such reasoning, and there are no other sources. It is reduced to “feeling” that a given act is right or wrong, for that person at that time. As a font of social mores for an entire community, this is not a helpful formulation if a community, and not mere personal license, is what one actually aspires to.
Evidence? Well, note that if a pregnant woman seeks an abortion because she senses, correctly in the hypothetical case to hand, that the conditions for the child to be loved, cared for, and made secure do not exist, she still has the option of giving birth and turning the child over for adoption. But of course that presumes community; that option works more often than not only because a community exists. If this pregnant woman does not even consider the adoption option, it becomes clear what her true motives are: They are not about the child; they are about herself.
Hence a sign in a front yard in Silver Spring, Maryland, proclaiming “Abortion is a human right” is a great applause line for some; it even sounds heroic, almost. But what is the logic in asserting that the right to extirpate an unborn child is a human right that overrides the right of the fetus to be born? Couldn’t one just as logically say that, having been conceived, “the right to be born is a human right”? That sounds at least as compelling as a moral principle to most people, if not more so, since any given fetus represents a unique, never-to-be-replicated conjunction of two human spirits that, for religious people, are created in the image of God. To affirm the former right while giving no thought whatsoever to the latter possibility in what is, after all, an empirically contested and open-ended debate about the true beginning of human life, is yet another example of facile adolescent reasoning. It’s therefore quite appropriate for a yard sign or a bumper sticker, because all that can be written there is all there is to write—it’s the full extent of the poor excuse for thinking it reflects.
It can repay effort to plumb where these novel attitudes come from. Let’s start with recent events and head back in time. The overturning of Roe v. Wade in June 2022 shocked many people, but it did not send me into paroxysms of regret. I strongly oppose draconian limitations on abortion, but as a matter of constitutional law it was always a stretch too far to find a right to privacy in the Constitution that could conceivably apply to abortion. Insofar as any such right exists at all, it is a relatively recent jurisprudential addition, thanks to Justice Brandeis, occasioned by the proliferation of cameras in the early decades of the 20th century. Cameras have nothing to do with abortion. As already argued, there is simply no remit in the Constitution for any Federal view of abortion; I dare anyone to show me the text that even remotely suggests otherwise to any reasonable reader. Moreover, while few people seem to be aware of it in this country, no other Western democracy, not even Canada, has ever enshrined abortion as a constitutional right. The United States is the lone outlier on this.
Even though I am not and have never been a Republican—and certainly won’t become one anytime soon—I agree with Clarence Thomas’s statement that Roe v. Wade was way ahead of the social consensus on abortion in 1973, that it imported the culture wars into American politics and helped toxify them, and that it has contributed to the divisiveness of American politics ever since. Many people thought mistakenly that Roe v. Wade settled the abortion question for all time more than half a century ago, but part of what it did was create a backlash that led the Republican Party to focus on controlling judicial appointments so as to get that decision overturned. It just drove the abortion culture war deeper into the roiled soul of American society, and spread it sideways from there. The sum of the matter? Not good.
[1] See Andrew Sullivan, “When the Media Narratives Meet Reality,” The Weekly Dish, February 3, 2023. The quotes above from USA Today and the New Yorker are also drawn from this essay.
[2] Cristina Beltrán, “To understand Trump’s support, we must think in terms of multiracial Whiteness,” Washington Post, January 15, 2021.
[3] On this point see the historical novel by Edward P. Jones, The Known World (Amistad/Harper Collins, 2003). One wonders if any trade publisher would dare publish this book today, since its truthful historical backdrop clashes with woke ideology.
[4] A recent Harvard study shows that race is less predictive of low socio-economic status than it used to be, and affirmative action may have contributed something to that shift. See Raj Chetty, Will Dobbie, Benjamin Goldman, Sonya R. Porter, and Crystal S. Yang, “Changing Opportunity: Sociological Mechanisms Underlying Growing Class Gaps and Shrinking Race Gaps in Economic Mobility,” NBER Working Paper, No. 32697 (July 2024). A summary of the study with clear graphics may be found in German Lopez and Ashley Wu, “Who Can Achieve the American Dream? Race Matters Less Than It Used To,” New York Times, July 25, 2024. The NYT account focuses more on the good news—the declining salience of race—than it does on the bad news—the evidence of widening inequality of opportunity.
[5] Confessore, “What to Know About the University of Michigan’s D.E.I. Experiment,” New York Times Magazine, October 16, 2024.
[6] Hoffer, The True Believer, p. 111.
[7] Again, not that all these contributing causes happened suddenly or all at once. As we have been at pains to make clear, the affluence/decadence piece goes back at least to the late 1950s. The end of modernity piece goes back to the mid-1960s. The deep-literacy erosion piece goes back to the point of television saturation in the mid-1950s. Only the cyberlution piece is recent—mid-1990s at the earliest. This is why John Ganz’s new book, When the Clock Broke: Con Men, Conspiracists, and How America Cracked Up in the Early 1990s (Farrar, Straus & Giroux, 2024) doesn’t reach back far enough. Despite Ronald Reagan’s supposedly successful two-term presidency, the clock really stopped, or seriously ran down at least, in the early 1980s. Yes, Reagan was elected Governor of California before he became President, but that says more about the California electorate than it does about Reagan’s B-movie actor grasp of American politics and national security policy, which on any close inspection was shockingly superficial and abstract.
[8] John Fowles, The Aristos: A Self-Portrait in Ideas (Little, Brown, 1964), p. ix.
[9] What is simplified in a text can be better explained in a footnote. Suffice it to say, people can feel some guilt and some shame simultaneously; and philosophers have strained for centuries to define and distinguish the two in many languages. In recent decades their efforts have not always been edifying, as with the widespread assertion that guilt is social and shame is internal to the self. This is the opposite of how the terms are used here, and it seems to me that this view is mistaken.
[10] Ebert, “Reviews: History of the Blue Movie,” Chicago Sun-Times, August 25, 1971.
[11] Letter, October 11, 1798, to the officers of the Massachusetts militia.
[12] This is also why using human corpses as props in a supposed artistic display strikes many observant Jews, and others, as disgusting and morally reprehensible. I refer specifically to “Body Worlds”—Körperwelten in the original German—which toured various cities in 2002 and thereafter.
[13] The fact that American Jews overwhelmingly support abortion-on-demand—something like 75 percent to 25 percent—reflects two facts: Jews are in the main well-educated and urban, so they have views like those of other such individuals, regardless of faith community; and most American Jews do not respect Jewish law or even know what it says about abortion and other sensitive moral issues. If they belong to synagogues or other explicitly Jewish organizations, those institutions have essentially rejected the authority of Jewish law and tradition. They are Jewish in ethnic terms, but not in religious terms as that phrase has been generally understood for the past dozen centuries. A textbook example is Hannah Holzer, “Jews overwhelmingly support abortion rights; overturning Roe violates my freedom of religion,” Sacramento Bee, June 27, 2022, p. 7A. Ms. Holzer writes, inter alia, “The liberal denomination of Judaism I belong to, Reform Judaism, puts personal ethics above all else. To be a Reform Jew, you don’t have to keep kosher, observe the Sabbath or even believe in God.” She is right; but calling this Judaism in any form is like calling a creed that lacks transubstantiation, confession, and belief in the divinity of Jesus a genuine form of Catholicism: it is not, and cannot be.