The Age of Spectacle, No. 32
Chapter 7: The Neuroscience of Spectacle: Research and Implications, part 4
The Raspberry Patch is posting early this week thanks to incipient Thanksgiving travel disruptions. The plan is to commence rolling out Chapter 8 of The Age of Spectacle next Friday, December 6.
The only other note before getting to the last section of Chapter 7 is that some recent more or less sleepless nights have led me to once more rejigger the project’s structure. For that reason I’m again adding the working extended table of contents outline of the book at the bottom of this post. For those with an interest, comparing the current version with the one posted this past Friday, November 22, will show that not only have some of the subsections changed places and names, but the larger structure of Part II has also been altered.
Chapter 7: The Neuroscience of Spectacle: Research and Implications, part 4
McLuhan Was Wrong, and Right
. . . Light entering our eyes is hooked, via the optic nerve, directly into our endocrine system. This is, at another level, why blue light tends to wake us up and orange light tends to make us drowsy. But never mind for now any part of our body except our brain. As already noted, visual stimuli and the light/wavelength characteristics they embody affect the arousal state of our brains. The persistent watching of mediated images, whether on television, movie screens, or smaller screens like the ones on smartphones, especially in a darkened room, puts us in alpha, and alpha tending toward theta the longer we dwell in it. When we are in alpha, and more so as we move toward theta, we cannot bring our focused attention to bear on what is being conveyed to us. We are primed in that mental state to feel a lot, but to think not so much.
And so what? Well, first, this is why advertisers have so loved television as a medium all these years: Snake oil can be made to smell like essence of lilac. Second, remind yourself that, these days, conspiracy theories and cultic nonsense of every kind, from Frazzledrip to QAnon, are peddled mainly through social media, and social media uses screen technology. Hear any bells ringing yet? Perhaps some alarms going off?
Does this mean that the medium of communication, regardless of any particular content, shapes the brain’s reception of the message? That if the state of mind induced by the technology of transmission is, say, alpha tending to theta, then the recipient of the message, especially when he or she is alone and so singularly focused on the screen, will likely be unable to bring his or her critical faculties to bear on evaluating the truth-value of the message or the perhaps not-obvious intentions of the sender? Yes, it does mean that. Which brings us to Marshall McLuhan.
When McLuhan famously intoned “the medium is the message” he was being somewhat clipped and cryptic, deliberately so, for he was marketing himself in an age already long on entertainment. You had to actually read the books to get his message, which was ironic because, as right as McLuhan was about the essence, he was wrong about the implications in his day.
McLuhan got his temperature valences backwards. He thought “static” print was “hot” and “moving” images “cool.” No: Print is coldblooded; it ignites the frontal cortex because that is where our literacy “wiring” is thickest, which is precisely why reading McLuhan was the useful experience it was. Images stimulate the older parts of the brain, and are redolent with emotion. They are very hot, and they also come at us very fast compared to print. Speed doesn’t only kill, as the old warning goes; it singes, too. We need time to process the written word; vivid images hit us with a matter-of-fact immediacy print can’t touch.
McLuhan was also wildly wrong about the larger, meta-social implications of media technology. It was only 1964; digital technology was not even a phrase, but in Understanding Media McLuhan saw paradise in the spread of first-generation electronic technology:
Today, after more than a century of electric technology, we have extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned. Rapidly, we approach the final phase of the extensions of man—the technological simulation of consciousness, when the creative process of knowing will be collectively and corporately extended to the whole of human society, much as we have already extended our senses and our nerves by the various media.
It seems not to have occurred to McLuhan to ask what the human “hive” he foresaw would mean for individual freedom, the very plinth of constructive creativity, and for the sense of self that was and remains the hallmark of post-primitive human civilization. As for sensory overload, the distortive power of mediated sensory input, and the possibilities of technoference and addictions, not a single word.
Yet as wrong as McLuhan was, he learned enough from his mentor Harold Innis to have the right instinct. Humans really are the self-completing animal, as Kenneth Burke wisely called us.[1] We really are autogenic and hence in that sense free within certain parameters, parameters wide enough so that over long enough stretches of time the small ambit of our freedom can produce large differences in the human condition. It works through a structurally recursive element: We invent new environments for ourselves and the new environments in turn reinvent us. As Winston Churchill put it—if you read the word “buildings” as referring generally to all human fabrications, not just to habitable structures—“We shape our buildings, and afterwards our buildings shape us.” So McLuhan understood that we invent increasingly sophisticated “extensions of man,” his signal phrase, and that these extensions in turn shape the way we think and behave, a process nowadays called for short epigenetics.
The question today, as already suggested, is first whether epigenetic recursivity is moving too fast for our social and political institutions to track and adapt to, and second whether deferring to corporate interests to monopolize the design characteristics of digital technology is a good idea. It is now settled neuroscience that, as Arko Ghosh wrote, “the brain is continuously shaped by touchscreen use,” specifically that swiping a phone screen changes the mapping of the hand’s representation in the brain’s sensory cortex. The younger the brain the larger the change, all else equal. This feature on the phone has no positive functional purpose. Its only purpose is to maximize dopamine masturbation in the intermittent-reward loop in order to effect addiction, whose core purpose is commercial—to maximize dwell time on the screen so that advertisers can sell people things they mostly don’t need and otherwise never would have concluded they want.
If those were the only issues before us, sleep would come easier by night. But two others need be considered: how the corporate hijacking of our visual cortex affects modal brainwave conditions, and how the shadow effect of a technology-induced modal brainwave mentality working at scale affects the functionality of our inherited political institutions.[2]
Screen-delivered fiction aside for a moment, the brain frequency impact of machines impinges on what is supposed to be non-fiction, as well. A cascade of three standard reasons helps to explain why Americans no longer trust their leaders and associated institutions as much as they once did. The first reason is that the elites have too often screwed up and don’t deserve to be trusted, and the third is that Americans’ well-documented hemorrhaging of social trust in general has bled over into politics.[3] The second reason, the one most relevant to the technology piece we are limning here, is that elite screw-ups are now much harder to hide thanks to the ubiquity of unfiltered, disintermediated “right now” media.
It’s not that political elites are necessarily less adroit than they once were; it’s that shortfalls become immediately known, and known in exaggerated, funhouse-mirror style thanks to a cultural gluttony for spectacle magnified by clickbait and variously biased broadcast media. So let’s say a cabinet member does something wrong, morally or otherwise. If we read about it in a newspaper or magazine and consider the details hopefully provided, we do so in the cold winter of our emotions. Our brains are probably in high alpha tending to low-beta, perhaps even higher if the subject or personage is of particular salience to us. But if we see mostly images on the television-news version of infotainment, our brains are likely in alpha or low alpha, depending on the time of day and whether we are alone or with others. We do not reflectively take it in as information so much as first emote and then, maybe, consider the matter calmly and at greater length thereafter.
Let us now skip to the side a step, adding some endocrinology to our neurobiology. Precisely because our optic nerve connects to our endocrine system, we react microchemically to various stimuli. When we are deep into our screens playing videogames or jousting with the “Spelling Bee,” the technology, making full use of addictive intermittent-reward structures, is not just affecting our brainwave modality. It is also stimulating squirts of dopamine in the brain’s own reward circuitry. We squirt few hormones when we read a news article, even one about wrongdoing politicians; we squirt buckets when we see the faces of the wrongdoers in living color looking uneasy or arrogant on our screens. The immediate image cascade is indistinguishable from watching a fictive scripting, and we are conditioned to care mainly about emotion when we watch such scriptings because they try at least to be artful, and emotional texturing is, after all, what art is all about.
Experience paired with emotional up-pitches also tends to get imprinted more readily in memory. This is no different in the present tense than in the past; we remember best events from our youth that were particularly joyous or particularly traumatic. The difference is that our memories are now assaulted by artificially emotionalized images, spectacle-ized images.
It is therefore a bit shocking when friends and acquaintances debate before my very eyes and ears which television news and commentary shows are the most or the least biased, or the most or the least trustworthy. “It’s a screen,” I sometimes interpose. “It’s put you in low alpha, and anyway it’s infotainment, not objectively curated news. You cannot understand, let alone master, any significant political or public policy issue from screens alone, regardless of the source’s lesser or greater bias. You must read about it, with your brainwaves preferably somewhere in beta.” These episodes where my typically calm demeanor goes missing rarely end well.
Brain Shadows
One has to wonder whether not especially well-educated people, through no fault of their own most of the time, and perhaps people whose day-jobs do not typically require sustained cognitive tasking, are more vulnerable to the sirens of conspiracy theories because their screen-fed brains tend to get stuck, so to speak, in an alpha tending toward theta brainwave frequency. All else equal, it might be that, at least to some extent, the technological medium presupposes not only the message but, more broadly, the neurological receptivity capacity or parameters of the would-be recipient. If machines and their human users can synch up mental metabolisms, led by the machine’s, perhaps human brains can be routinized or habituated so that a person’s evoked set tends to generate expectations and sensitivities that function within a truncated spectrum of brainwave possibilities compared to those whose daily work demands a lot of beta and even gamma states of mental activity, and who do not spend so much time with screen entertainments.
This is not a way of saying that stupid people are rubes for both commercial and political entrepreneurs to harvest. The point is not about anyone’s raw intelligence, let alone about “deplorables,” a radioactive word that should never have been uttered from a dais. It’s the other way around: It is about what certain routinized behavioral choices induced by technological availability do to everyone’s brain, regardless of raw intelligence.
If someone’s brain has become naturalized to low alpha rather than to mid- or high-beta because of what they do at work, or because they also don’t deep-read when not at work—or perhaps ever read anything beyond bumper stickers, menus, and grocery lists—incoming sensory data may be filtered in such a way that this person is receptive mainly or even only to content better suited to low alpha-like modes and moods than to mid-beta-generating content. They may even be predisposed to disattend beta-like content, as in “don’t bother me with boring facts or multifactorial arguments while I’m enjoying this wild tale or these mesmerizing images.”
It is clear, after all, that the human brain matures and reaches its potential only with experience. As already noted, an infant’s brain dwells more or less in theta for its first year of life. Only when the infant starts to stand upright and walk, and is thus able to diversify its behavioral repertoire, does the brain develop in the integrated manner of which it is capable. It then acquires via experience the ability to be in different modal brainwave states. Now, imagine a young person on the savannah or in the woodlands of Africa or Europe ten thousand years ago. Simply by dint of mimetic learning and survival requirements, people knew how to concentrate attention on something. From purposeful glancing one moves from alpha to beta in order either to acquire dinner or to avoid becoming dinner for some predatory creature. One concentrates when one is building something—a tool, a weapon, a trap, even a toy. One concentrates and shifts into and back out of beta throughout the day, or else.
Now imagine an affluent young denizen of contemporary America: What does he or she routinely concentrate on? Work, no doubt, if the person’s work requires beta-level concentration brain states; and hobbies if that person engages in music, woodworking, sewing, gardening, and so on. But what about young people whose patterns of behavior rarely if ever put them into beta, and who build nothing with their hand/brain/body trifecta? Again, a human brain acquires and maintains a repertoire of being in an array of brainwave states from practicing being in them, from extended experience, in other words. If some people rarely if ever do things that require a beta brainwave state, they may sharply erode or even lose the capability of going into and sustaining that state.
As we know too, aged people sometimes naturally lose the ability to concentrate as their brains change metabolism over time, and so they lose the ability to go into and stay in beta; this often leads to varying degrees of memory loss and in some cases to dementia. Why would anyone much younger follow and feed such a path on purpose? Because they do not realize what they are doing, so deeply at ease with passive screen entertainments that it never occurs to them, and it never occurs to them because they no longer can manage to concentrate on much of anything, including their own health and mental state.
There may be a particularly tremulous implication here for the large number of able American males who have absented themselves in recent years from the work force.[4] The number is large enough that they have won the accolade of a social science acronym: NILFs, standing for Not in Labor Force. No mystery is at work here; it is in part at least an outcome of the dematerialization phenomenon discussed in Chapter 1. The labor profile has shifted for technological reasons toward symbol manipulation and away from material resources manipulation, toward services and away from manufacturing, and toward capital intensiveness in production techniques and away from labor-intensive techniques. The American political order then piled on plutocratically-inflected decisions to favor capital over labor and to favor outsourcing over domestic labor-intensive manufacturing innovation. The Age of Brawn has given way to the Age of Braininess, and a lot of American men who grew up in subcultures long on the patriarchal nobility of physical labor got lost at sea change, so to speak. Cultural outcroppings of these changes, of which feminism is perhaps the most important, have piled a psychological burden on top of the economic one. None of these changes have been caused by anything directly related to neurology or brain anatomy, but their effects very much do intersect with both.
What do able-bodied guys who abjure even looking for work do with their waking hours? According to one analysis, such cohorts spend around seven and a half hours a day socializing, relaxing, and . . . watching screens. That is more than four hours a day more than working women spend, almost four hours more than working men, and at least an hour more than men still looking for work. It’s actually worse than that. “The rhythm of life for a great many of the prime-age men in America currently disengaged from work,” write Nicholas Eberstadt and Evan Abramsky, “is defined not simply by days and nights sitting in front of screens—but sitting in front of screens while numbed or stoned.”[5]
What do they see on their screens? Little reliable empirical data exist on what demobilized men choose to watch, but a cultural logic of sorts can be readily imagined. Of course there is the ubiquity of free porn, and there are dark versions of it, too, in which women are depicted not just as appealing but as both alluring and sinister, and so deserving of punishment. Alas, in a post-feminist culture where patriarchy and mansplaining have become the dirtiest words on offer, as formerly pithy obscenities have been banalized beyond meaningfulness, some immobilized men seek virtual revenge against women for their seemingly—to them—systematic emasculation and humiliation.
There is a policy implication here. Whatever they watch, work-demobilized men probably do not need the additional incentive to drift further toward the societal fringe that a generous guaranteed-income payment would provide. Having to get up and go to work in the morning prevents many people from truly tossing their futures to the wind. For many, work remains the best and sometimes the last link to self-discipline in their lives. Lose that or forsake it and it is too easy to get sucked into “the zone” of self-abnegation and bitter, addictive hedonism. That tends not to apply so readily to prime-age men who are married, whether happily or maybe not so much. But that’s partly the point: A significant percentage of NILFs are not married, or are not married anymore.
The bottom line, however, insofar as the neurological fallout goes is this: More than seven hours a day of entertainment screens, most of the time likely spent solo, may well lock the brain into a vulnerable and impressionable low-alpha/theta brainwave la-la-land, just waiting for some ideological entrepreneur to come along and pick the lock. The level of raw intelligence of the potential victims here, again, is not determinative. It would be nice to know in some empirical detail, for example, how many of the January 6, 2021 insurrectionists were NILFs, and by what messaging media they were recruited to the “Stop the Steal” cause.
We are unlikely ever to find out, unfortunately. But it is a commonplace, probably an accurate one, that loners—not all of whom are on the autism spectrum from childhood—tend disproportionately to fall for demagogues, grifters, and conspiracy theorists. They tend to gravitate to any kind of community that can relieve the ache of loneliness—which we have already discussed in passing in Chapter 1—as a consequence of affluence. Loners talk to themselves and to their unhearing screens for lack of others with whom to interact—except maybe for Alexa—so there is no peer reality check dynamic in their lives, and no social penalty for mouthing crazy stuff.
Atomized, isolated individuals often stand on the verge of depression; “Loneliness,” wrote John Milton in the Tetrachordon, “was the first thing which God’s eye named not good.” Said Saul Bellow’s character Herzog, echoing Hannah Arendt’s analysis of totalitarianism in The Origins of Totalitarianism, “a terrible loneliness throughout life is simply the plankton on which Leviathan feeds.” Bellow (Arendt, too) was right. Isolated, atomized individuals have good reason to want to escape from reality: reality, after all, is not their friend.
Even worse, perhaps, reality is messy, impenetrable. Isolates cannot bear, wrote Arendt, “its accidental, incomprehensible aspects” because they lack the empathetic company with which to bear it. To double back on the deep literacy point, it is hard to imagine anyone lonelier than the literal isolate who cannot even find spiritual companionship and its attendant compassion with others through the written word.
America is now lonelier than ever, and burgeoning screen addiction, because it imposes opportunity costs in the form of diminished face-to-face interaction, is one of the reasons. More Americans live alone today than at any time in our history, whether because of high divorce rates, extended life expectancies, broken relationships, or other reasons. Loneliness sires not just depression but also desperation and escapism. In retrospect, the surprising popularity of “pet rocks” may have been a harbinger of when the gag turned into something strange. But more than just a gag is afoot: When chronological adults can get a load of the fake alpha masculinity of someone like Donald Trump and, at one level, know it’s fake but not care, so thoroughly blurred have fiction and reality become for them, then we are well beyond a gag and thick into the tragi-surreal.
The neurological basis of the surge of the surreal has as its focal point the ongoing but accelerating—now cyberlutional—coevolution of man and machine. A shadow effect, recently arisen and since deepened through the saturation of cyber-gadgets in our cognitive environment, links screen addictions to the flexibility of the brain’s step-shifts between states. As already suggested, those who induce in themselves through their chosen behaviors an alpha/theta brainwave normality often cannot shift up to low-beta, let alone to high-beta or gamma, as easily as those whose attention and visual gaze are exercised more frequently, and unmediated, in nature. That upshift is what our evolved novelty bias typically produces when it is left unskewed by artificial behavior modes like screen watching. Psychological comfort zones in due course become defined by in-dwelled states of mind—not “locked” exactly, but inclined to a point of habituated inflexibility. Many people, perhaps most in affluent settings, do what is easy when they can get away with it, when they have food, shelter, clothing, and medicine and tend not to worry that any of those necessities is about to disappear. Given that level of minimalist affluence, depressed and humiliated men will not so often push themselves up and off the sofa, especially when no woman or dependent child is around to witness, appreciate, and benefit from it.
So it’s not just that the screens’ blue light inhibits downshifts into theta and so into healthful sleep. This is hardly esoteric knowledge anymore and anyway should be way down the priority list of concerns about what our magic rectangles, larger and smaller, are doing to our brains. Higher on that list is that screens systematically retard or inhibit upshifts into cognitive alertness at all ages, and obviously for women as well as for men. They want to dump everyone into alpha neutral and, again, that works because realistic-seeming yet unreal visual images ask little to nothing of the viewer. Remember, they are neither ideas nor myths nor metaphors. They do not help you think. They just are.
Reading ink and paper copy does just the reverse, not the same way as a walk in the woods but just as effectively. It pushes us up into more rapid brainwave patterns, even when we are reading non-fiction and certainly when we are reading an adventure novel or a sci-fi thriller. Reading forces us to devise our own mental images built from the symbolization processes inherent to written language. It obliges us, in other words, to think.
If there is now a general cultural proclivity toward low-alpha/theta induced by our cyber-gadget saturation, and if low-alpha/theta for adults is indeed the brainwave “zone” in which spectacle inducement, as well as conspiracy theory transmission, happens most readily, then it probably explains much of the mentality shift we observe. Pending more empirical research, it might also in turn explain how the digital tsunami is affecting American political culture more specifically—a matter to which we will return anon. But this is the basic neuroscience, much of it settled and some of it speculative, behind the Age of Spectacle.
No Need to Exaggerate
James Thurber once wrote that “You might as well fall flat on your face as lean over too far backwards.” Too true; now that some of us have begun to take digital technology seriously for its dangers as well as for its benefits, some are leaning over too far backwards to find fault that may not exist.
Some observers, for example, clearly not scientists, have blamed “excessive use of personal electronics and social media” for lowering American median IQ, as reported by a 2023 Northwestern University/University of Oregon study, and thus, according to Sabrina Haake, making people stupid enough to support Donald Trump even when it contradicts their own interests.[6] Unpacking what is wrong with this statement will take a moment.
First, the Northwestern/Oregon study did not insist on a causal link between digital hyperstimulation and falling IQs, and for good reason. Falling IQ test scores may be over-determined by many causes: increasing anxiety and depression levels in test subject age cohorts; increasing sleep deprivation only partly ascribable to digital addictions; eroded reading comprehension levels due to lowered educational standards and too much teaching-to-the-test pedagogy; higher percentages of test subjects whose native tongue is not English; and several other possible factors. Verbal SAT scores fell significantly back in the day, more than forty years ago, when television, especially color television, had saturated the American market; that is not the same thing as IQ, but such a decline would show up on IQ tests as well, and it happened decades before the cyberlution hit.
Second, to ascribe support for Trump to stupidity alone combines two errors in one: stupidity, while real enough, is ontologically distinct from both ignorance and mental illness. Ignorance of the relevant connecting dots in complex political matters seems a far better source of voter delusion than garden-variety stupidity.
Moreover, it is not as though the regnant elites, including Democratic Party ones, have been serving the interests of less well-off and less well-educated Americans, or as though voters can finely parse mostly information-free campaigns at a time when political rhetoric has been colonized by advertising language. The penchant for insurgent and “change” candidates reflects a general discontent with the political status quo, and assessing what is and is not in a given voter’s interests becomes mixed up with emotional impulses having nothing to do with remembering, say, what the Trump tax cut was all about and whom it helped and hurt some years down the road, years now long since passed in a present-oriented culture. Ms. Haake gives average voters too much credit for even knowing what factors to parse to determine whether a given candidate’s views and likely behavior in office are or are not aligned with their interests.
A similar leap of confusion describes contentions that smartphones and screens account wholly or mainly for the dramatic rise in juvenile and teen myopia, especially prevalent in Asia. Data from Singapore, South Korea, Taiwan, and China are indeed attention-arresting. In China, for example, data show that myopia among juveniles and teens has over the past half century leapt from around 20 percent of 18-year-olds—about the percentage in Western countries today—to around 80 percent now. To ascribe this solely or mainly to the cyberlution, however, is not justified. Some genetic factors may be at play, but even more likely are two other factors: the data from half a century ago are not reliable, since huge numbers of rural Chinese never had their eyes examined as youths, or at all; and myopia brought on by too much close focusing and too little time spent outside looking at distant objects can result from the close study of printed material as well as from looking at screens. High levels of myopia have been verified in Orthodox Jews for a few centuries now, and that had everything to do with intense close-focus study of texts and nothing to do with digital gadgets that, obviously, did not yet exist.
Things are bad enough: This is no time to add false worries to the ambit of our anxiety.
The Age of Spectacle:
How a Confluence of Fragilized Affluence, the End of Modernity, Deep-Literacy Erosion, and Shock Entertainment Technovelty Has Wrecked American Politics
Foreword [TKL]
Introduction: A Hypothesis Unfurled
The Cyberlution
The Republic of Spectacle: A Pocket Chronology
The Spectocracy Is Risen
Why This Argument Is Different from All Other Arguments
Opening Acts and the Main Attraction
Obdurate Notes on Style and Tone
PART I: Puzzle Pieces
1. Fragilized Affluence and Postmodern Decadence: Underturtle I
Government as Entertainment
The Accidental Aristocracy
The Deafness to Classical Liberalism
The Culture of Dematerialization
Affluence and Leadership
Neurosis, Loneliness, and Despair
Wealth and Individualism
Hard Times Ain’t What They Used to Be
Affluence Fragilized
Real and Unreal Inequality
The Net Effect
Dysfunctional Wealth
Searching for the Next Capitalism
2. Our Lost Origin Stories at the End of Modernity: Underturtle II
What Is a Mythopoetical Core?
Aristotle’s Picture Album
Faith, Fiction, Metaphor, and Politics
The American Story, a First Telling
How Secularism Was Birthed in a Religious Age
Regression to the Zero-Sum
Industrial Folklore
Bye, Bye Modernity, Hello the New Mythos
Mythic Consciousness and Revenant Magic
Sex Magic
Word Magic
Business Magic
Progress as Dirty Word, History as Nightmare
Attitudes and Institutions Misaligned
3. Deep Literacy Erosion: Underturtle III
Trending Toward Oblivion
The Reading-Writing Dialectic
The Birth of Interiority
A Rabbinic Interlude
You Must Remember This
Dissent
The Catechized Literacy of the Woke Left
Reading Out Tyranny
Chat Crap
4. Cyber-Orality Rising: Underturtle III, Continued
The Second Twin
Structural Mimicry and Fantasized Time
Losing the Lebenswelt
Podcast Mania
The Political Fallout of Digital Decadence
Zombified Vocabulary
Democracy as Drama
Where Did the News Go?
Optimists No More
Foreshadowing a Shadow Effect
5. The Cultural Contradictions of Liberal Democracy: An Under-Underturtle
A Big, Fat, Ancient Greek Idea
The American Story Again, This Time with Feeling
Footnotes to Plato
Some For Instances
Jefferson à la Carte
Revering the Irreverent
The Deep Source of the American Meliorist State
The Great Morphing
Myth, Magi, and Immaturity
The Wages of Fantasy
Pull It Up By the Roots
The Crux
PART II: Emerging Picture
6. “Doing a Ripley”: Spectacle Defined and Illustrated
Astounding Complexes and Technical Events from TV to Smartphones
Tricks, Illusions, and Cons
Fakers, Frauds With Halos, and Magnificos
Projectionist Fraud as a Way of Life
Old Ripleys, New Ripleys
Fake News
Trump as Master of Contrafiction
Conspiracy Soup
Cognitive Illusions
Facticity Termites
Conditioning for Spectacle
To the Neuroscience
7. The Neuroscience of Spectacle: Research and Implications
Brain Power
Seeing the Light
Surfing Your Brainwaves
Suffer the Children
The Screen!
Easy Rider
The Graphic Revolution, Memory, and the Triumph of Appearances
McLuhan Was Wrong, and Right
Brain Shadows
No Need to Exaggerate
8. Cognitive Gluttony: Race and Gender
Cognitive Gluttony Racialized
Ripleys on the Left
More Sex
Abortion: Serious Issues, Specious Arguments, Sunken Roots
Beyond Feminism
I’m a Man, I Spell M-A-N
9. Saints and Cynics: The Root Commonalities of Illiberalism
Different Birds, Same Feathers
The Touching of the Extremes
From Left to Right and Back Again
Spectacle in Stereo
The Right’s Crazy SOB Competition
The Irony of Leveling
Human Nature
10. Spectacle and the American Future
You Are the Tube and the Tube Is You
The Nightmare on Pennsylvania Avenue
Bad Philosophy, Bad Consequences
Is Woke Broke?
Myth as Model
The AI Spectre
The Futility of Conventional Politics
A Few National Security Implications
Meanwhile…
Who Will Create the Garden?
Acknowledgments
[1] The reference is to Kenneth Burke, Language as Symbolic Action: Essays on Life, Literature, and Method (University of California Press, 1966).
[2] Another personal note: I used to hate “The Smurfs.” Every episode was the same in two ways: Whenever an opportunity arose in the script to teach a child a new vocabulary word the writers used the verb “to smurf”; and every episode eventually depicted some characters in a state of zombified dreamlike hypnosis. Little did I realize at the time how prophetic this second characteristic was, and how it doubled back to account for the emaciation of the typical viewer’s vocabulary.
[3] Robert Putnam’s classic “Bowling Alone” essay of 1995 in the Journal of Democracy, and his book of the same name from 2000 (Simon & Schuster), did not extend his general sociological observations into politics as far as Francis Fukuyama’s 1995 book Trust (Free Press).
[4] See Richard V. Reeves, Of Boys and Men: Why the Modern Male Is Struggling, Why It Matters, and What to Do About It (Brookings Institution Press, 2022).
[5] Eberstadt and Abramsky, “What Do Prime-Age ‘NILF’ Men Do All Day? A Cautionary on Universal Basic Income,” Institute for Family Studies, February 8, 2021.
[6] Sabrina Haake, “Our falling IQ shows in the polls,” The Haake Take (substack), May 4, 2024.