Age of Spectacle No. 16
Chapter 4. Underturtle III: From Deep Literacy to Cyber-Orality, part 4
The Raspberry Patch continues on today, now in August 2024, precisely as expected, with part 4 of Chapter 4. By way of general introduction there is relatively little to say. The manuscript continues to change, including parts already rolled out. The subject, alas, is so capacious, and the world so willingly delivers examples and instructive surprises to my attention, that refusing to mind them would constitute an unnatural act. I am not into unnatural acts.
I have, as promised last time, popped my extended outline into this post at the end, which allows me to orient occasional readers not only to which chapter we are on but which parts of that chapter, as well. This time, unfortunately, the post cuts off in the middle of a subsection so as not to violate Substack’s length parameters. Also note: The extended outline is also likely to change, somewhat at least, as my thinking continues to develop.
I hope to keep true to the once-a-week schedule for The Age of Spectacle rollout, but two contingencies may interfere. First, in a few weeks we’ll be off to Franklin, Pennsylvania, for a camping weekend, where the annual Rock ‘n River extravaganza takes place. That’s the Pennsylvania state stone-skipping championship competition, if you don’t know. Our eldest son, Gabriel, won the 2019 competition, and got his picture in the Washington Post holding the trophy on which his name was to be newly inscribed. (Winners do not get to keep the trophy; they get a box of chocolate fudge from the local ladies instead.) Just a few weeks ago a guy came up to Gabriel in a grocery store in Ardmore, Pennsylvania, and recognized him from that nearly five-year-old photo. You can’t make stuff like that up.
Second, starting yesterday I am back on payroll for a six-month stint at the S. Rajaratnam School of International Studies (RSIS) at Nanyang Technological University (NTU) in Singapore, where I lived on fellowship from July 2019 to July 2020. I am a Washington “stringer” consultant for RSIS for the U.S. election season and its pre-inauguration aftermath. The RSIS Commentary series is online and free to all, so if you seek my upcoming analyses of U.S. politics, they’ll be there, not here in The Raspberry Patch. I don’t think my obligations to RSIS-NTU will deflect my Age of Spectacle schedule, but I could end up mistaken about that. We’ll see.
Otherwise, we here in the DC area are now under drought warning, and it continues to be very hot in extended stretches of days rather than, as used to be the case, in spikes of only a few days at a time. Weather conditions have definitely affected our garden. With strategic watering the squash, eggplant, peppers, and many of the herbs show every sign of loving life. The peas and blueberries have long since given up the ghost, however. Tomatoes do not ripen well on the vine unless evening temperatures fall below about 65 degrees, and runner bean blossoms seem not to want to proceed to fruiting. The seasonal second coming of raspberries is progressing—more robustly for the everbearing type than for the primocane type—but even there the heat seems to be taking a toll. Lettuces, dill, and some stir-fry greens rush to bolt prematurely, and only some flowers—Tithonia diversifolia (Mexican sunflower), for example—thrive under these conditions as others wither and die, or barely hold on.
Even so, despite the weather conditions, it remains true that gardening is the slowest of the performing arts. Beauty is in the eye of the beholder, sure; but beauty of radical-awe caliber is in the eye of the fortunate, experienced, participating, and above all patient beholder. Just like reading this manuscript, gardening is good for you. Very good for you.
Chapter 4: Underturtle III: From Deep Literacy to Cyber-Orality, part 4
. . . . It is hard to disagree with Ong’s conclusion that, “without writing, human consciousness cannot achieve its fullest potential.”[1]
Dissent
Hard but not impossible. Not everyone has agreed with that conclusion and some still don’t. To be fair and clear, it behooves us to do a “Team B” critique of claims for deep literacy. As valuable as deep literacy is likely to seem for anyone reading these words, several arguments from different starting points push back against claiming, or asking, too much for and from deep literacy.
One ancient starting point is associated, somewhat ironically, with none other than King Solomon. It was he who reputedly wrote—hence the irony—near the end of Ecclesiastes: “My son, be admonished: Of making many books there is no end, and much study is a weariness of the flesh.” Solomon was not saying do not learn to read or do not read; he was merely saying don’t overdo it and get some aerobic exercise from time to time.
It was Socrates, across the Mediterranean some four centuries later, who feared that the written word would kill the dialectical charms of oral language, invite willful misunderstanding, and make memory lazy. In this he had later allies back in Solomon’s neighborhood among the rabbis, who wrote down the Oral Law only in the 2nd century CE, when they feared its demise at a time of existential crisis. Similar views of written language also populate Hindu attitudes and those of several other faith traditions.
A more modern starting point of dissent links back to the Rousseauian idealism of the noble savage. Ancient tribes did not have alphabets or writing, yet they supposedly thrived in harmony with the natural world. They may not have created anything so fine by way of governance as liberal democracy, but they didn’t need protective principles of governance when their leaders were, we are meant to assume, wise, humble, civic-minded, and directly approachable. J.-J. Rousseau did not, of course, invent de novo the anthropologically bizarre idea of pre-literate humans as an ideal of eco-moral rectitude. An authority no less exalted than Thomas Bulfinch tells us that Cadmus, a Phoenician, introduced the alphabet to the Greeks, thus siring literacy and in due course civilization “which the poets have always been prone to describe as a deterioration of man’s first estate, the Golden Age of innocence and simplicity.”[2]
A fine depiction of this idyllic meme is the 2019 Bhutanese film Lunana: A Yak in the Classroom. A young civil servant from the capital is sent to teach a small clutch of children in the remote village of Lunana, population 63. He arrives not wanting to be a teacher, wanting instead to go to Australia and perhaps never return. He shows up in Lunana, too, wired with headphones, rude, self-absorbed, oblivious to natural beauty, and on balance insufferable. Months later he leaves a changed man, yet one still unaware of the full implications. Soon finding himself singing on a club stage in Sydney, he lowers his guitar, drops his song, sits as though enchanted for a long moment, and then begins quietly singing a yak ballad of the high Himalayas that he learned from a young village woman. That is where the film ends, but we know it’s not where the story ends. It ends back in Lunana.
Literacy, as the line of “noble savage” reasoning goes, disciplines the imagination to excess by leading it away from presentational symbology toward more limited linear kinds of perception. Look at what happens to children as they learn to read: They gain new left-hemisphere cognitive capacities, but they also lose some of the right-hemispheric wild and joyous imaginative spontaneity of their early childhood. Reading may expand one’s theory of mind by dint of mastering abstractions, but illiteracy may point one toward sensory immersion in the world, and from there, some believe, to the portal of the Creator.
This is not a silly idea. It could be, after all, that absolute truth exists ab initio but cannot be revealed in words. As the Hasidic master said: “God hides himself from the mind of man but reveals Himself to the heart.” Our minds are happiest with distinctions, which literacy helps us achieve. Our hearts are happiest with oneness, which literacy strains to address, even in the most sublime poetry. Our politics, therefore, are never and can never be entirely at ease, for politics requires metaphor, and metaphor encompasses and compresses thought into feeling. Politics multiplies serial-symbol-friendly “how” questions aplenty, but as it does so it cannot avoid posing presentational-symbol-friendly “why” questions, let alone any deeper unifying “why” question spun from political theology.[3]
Learning to read, in this rendition of human nature, is therefore at best a necessary evil. At worst, it is a tipping point in the arc of human history that has led to environmental rapine and put the future of the planet itself in jeopardy. Somewhere in between, it is blamed for a direction of human cultural evolution that leaves much, too much for some, to be desired. Thus the neurophysiologist Joseph Bogen wrote in 1975 that Western society is “a scholastized, post-Gutenberg-industrialized, computer-happy exaggeration of the Graeco-Roman penchant for propositionizing.”[4] Kevin Systrom agrees, using different terms: “People have always been visual—our brains are wired for images. Writing was a hack, a detour. Pictorial languages are how we all started to communicate—we are coming full circle.”[5] Systrom is co-founder of Instagram, so that would be his view, wouldn’t it?
A third starting point, and for our purposes the last to be admitted here, is the plaint that reading counterfeits the world and that reading novels, in particular, raises our expectations to a height so extravagant that disappointment and depression can be the only outcomes. An otherwise obscure English literature professor named Peter Thorpe, probably hoping to become less obscure (but without forfeiting tenure), wrote a book back in 1979 entitled Why Literature Is Bad for You. “If we become too involved in the beautiful imitation,” he hypothesized, “we can begin to lose touch with the real thing.” Thorpe, too, had an ancient Greek precursor: The myth of Cephalus and Procris warns against the overuse of metaphor lest it lead to the tragedy of lost love. Thorpe, who passed away in 2006, no doubt knew the myth.
The Second Twin
Thanks to the digital disruption or tsunami, ever more of us are addicted to distraction, and particularly to distraction in the form of head-turning spectacle. Mark what the phrase really implies beyond its bumper-sticker-length assertion.
Addictions, if detected and accepted as such by the addicted, can be treated and, with any luck and some patience, overcome. A gambling addict, for example, can see evidence of addiction in a dwindling bank account and an accrual of debt, and often enough in an array of riled-up family relationships. Someone addicted to distraction, on the other hand, has a harder task seeing time and timelines disappear, for lack of any solid empirical referent that something is amiss; and as soon as a rare sign of a problem begins to dawn on the victim, whoosh, it’s gone thanks to the next distraction. In short, addiction to distraction often evidences a virtually closed loop with no way out. It doesn’t help that so many people are now affected that addiction’s symptoms seem perfectly normal, but they are actually anything but.[6]
This is important. The origin of any and every addiction is an illusion, a kind of fiction, that some marvelous pleasurable reward may abide in one’s future. A person sets off in pursuit of this reward only to find that the faster he or she runs the more elusive the reward becomes. With substance addictions the usual response is to use more of the substance to sustain the same level of pleasurable hope that the ultimate reward can be grasped. With behavioral addictions like gambling, thrill-seeking, and sex, the usual response is to do whatever it is faster, deeper, and more often than before. Since the ultimate reward is indeed fictional, however, it will never be attained.
The same goes, but slightly differently, with addiction to digital technology, where the fiction is honed to near statistical perfection by algorithmic teasing. As Jaron Lanier put it, “The algorithm is trying to capture the perfect parameters for manipulating a brain, while the brain . . . is changing in response to the algorithm’s experiments,” using what look like GAN (generative adversarial network) methodologies, the same methodologies that, interestingly, create deepfakes. But, continues Lanier, “because the stimuli from the algorithm doesn’t mean anything, because they are genuinely random, the brain isn’t responding to anything real, but to a fiction. That process—of becoming hooked in an elusive mirage—is addiction.”[7]
Again, the difference is that substance and other behavioral addictions involve physical, material elements that can be seen, felt, and even photographed. The addict can thus objectify these elements so as to dodge them next time around en route to transcending the addiction. But cyberaddictions have few if any external empirical referents for the addict to beware of, and no one else can see them either. For any practical purpose, they are all packed into an individual headspace, making them more insidious than the run-of-the-mill addiction even as they tend to be less physically dangerous.
Some observers see another difference: Cyberaddictions do not habituate like substance and ordinary behavioral addictions. They are, it is averred, ever novel and endlessly attractive. One reason is that the push notifications that sparkle and the dings that sound from iPhones do not anticipate only one class of rewards. They can signal a new email, or a new text message, or a new YouTube video, or a calendar notification, or a recorded voicemail—and many if not most of these possibilities raise the chance of a social connection, something most other addictions are powerless to dangle before us. Even incidental, passing social connections seem to be more salient psychologically than most of us realize. So unlike the treadmill psychology of substance addictions in particular—that “oh no, here I go again” feeling—the thrill is never gone (apologies to B.B. King) from cyberaddictions.
Addiction is known to change the brain permanently: Addictions that commandeer the dopamine network reshape that network for good. So now note that, according to the British psychologist Aric Sigman, by the time the average American kid is 8 years old he or she “will have spent more than a full year of 24-hour days on recreational screen time.”[8]
So far below the radar is the actual generic problem of cyberaddiction that minor sidebars of the danger often take pride of place in the media. One example is a recent special Foreign Affairs volume titled “Technology and Power.” One of the lead essays is by former Google CEO Eric Schmidt, who delivers up perhaps the biggest BFO (blinding flash of the obvious) of the decade: Technological innovation will have a major impact on international power distribution. Really, Eric, you think? Not one of the essays in the Foreign Affairs collection even mentions possible effects of technological innovation on what business types call “human capital,” in the sense that the technology as adapted might produce negative effects on human reasoning capabilities.
A better, because more granular, example of the same thing is a Financial Times article by Camilla Cavendish with the breathtaking title “Humanity is sleepwalking into a neurotech disaster.” When I saw that headline I rejoiced that finally someone else had gotten a grip on the nature and magnitude of the problem before us. Alas, it was not to be.
Ms. Cavendish, a journalist, used the article to discuss the work of a Duke University neuroscientist and law professor named Nita Farahany, and to draw attention to Farahany’s recent book The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. The book is basically about privacy and autonomy, not sanity and imbecilization, in the face of new technological developments. Yes, all sorts of technological devices are coming into existence with the potential to eavesdrop on your brain, and perhaps to direct or limit what your brain focuses on. Ms. Cavendish’s article gives several examples and points to the book’s detailing of many more.
Given the nature of addiction to distraction, this is no trivial matter, but these claims are overwrought. Farahany thinks that neurotech could be the nexus that allows a central point to become the manager of all our interactions with technology—a “universal controller,” in other words.[9] We already readily give away our privacy data in return for entertainment crumbs, a foolishness so common that there are songs about it.[10] Now we are about to give away our “brain data” for similarly evanescent amusement and convenience, and that forebodes an entirely different and vastly more dangerous tubful of trouble.
But all this amounts to is Alexaphobia cubed: In this addled view, Alexa is not merely your radio music impresario; “she” is enabling some anonymous remote listener to hear what you are saying in your living room, or even your bedroom, and that presumably gives the listeners, whoever they may be, potentially crippling power over you. Imagine how much worse off we will be if anonymous and remote others can monitor your very brain waves before you so much as utter a single word, even to yourself, about what you will do next.
Nonsense: Fears about nanobots reading our minds and surveilling us 24/7, and “replacing independent thought and judgement with automated control,” as Cavendish quotes a recent Chatham House paper saying, would require the would-be objects of control to knowingly surrender their autonomy before anyone can hope to abuse them. No one makes anyone eat toxic fast food, or become morbidly obese from that and sheer laziness, or waste their lives playing video games, or engage Alexa at all for that matter. All anyone has to do to prevent visitations of neurotech abuse is to pull the relevant plug. Worrying about future neurotech while failing to notice the already occurring general degradation of high cognitive functioning is like worrying about getting sunburnt next month while you’re on a sailboat miles from land in the middle of a pop-up storm.
In societies plagued by totalitarian regimes, real and aspirational ones, yes, neurotech directed from above, with little to no hope of general transparency, could become an instrument of control worthy of Orwell’s 1984. That is bad, and it justifies writing neuro-rights into law, as happened recently in a newly drafted Chilean constitution. But such sinister hidden control from above, lurid deep-state conspiracy theories notwithstanding, is simply not going to happen in a place where an innocent experimental Canadian robot called hitchBOT, traveling down the eastern seaboard from Boston, reaches Philadelphia and is summarily beheaded, otherwise torn to shreds, and set afire.[11]
Not to make light of vandalism, which in this case seems to have had no Unabomber-like anti-tech philosophy behind it, or any philosophy at all. But a serious point lurks in this tale from the city of brotherly love: “Liberty lies in the hearts of men and women,” said Judge Learned Hand in 1944; “when it dies there, no constitution, no law, no court can save it; no constitution, no law, no court can even do much to help it.”[12] If Americans en masse are ever found willing to trade their liberty for the newest shiny-object-like adult toy, then it will be clear that they care even less for liberty than they lately seem to care about democracy. Esau sold his birthright for a bowl of porridge; if We the People do something similar, we deserve whatever becomes of us.
Structural Mimicry and Fantasized Time
Something more profound politically than not seeing the dangers of deep-literacy erosion for individual intellectual development is at work as well. With the erosion of deep literacy among elites as well as ordinary folk, and the disorienting press of cyberaddiction on our capacity to defend ourselves from ideological simplifications, conspiracy mongering, and intellectual demobilization, a reversion to pre-rational modes of thought is gaining traction. We see it vividly in the regression of scripture-based religion into pre-literate modes of religion, and in some cases back even to the modes of mythic consciousness. That, ultimately, is where the revenant zero-sum mentality described above comes from.
Pre-deep literate religious traditions rely on spectacle to capture the attention of the faithful, and spectacle is long on emotion and awe, short on reason and logic. Just compare the typical decorations in a traditional Catholic church, let alone a major cathedral, with those of any low-church Protestant place of worship, and the difference is impossible to miss.[13] This may be why, in an age of a galloping spectacle mentality, classical scripture-based forms of Protestantism are giving way to exotic, spectacularized forms of “big concrete tent” Evangelicalism and Pentecostalism, and even to Catholicism.
Moreover, not all religious traditions go about the binding of irrationality and the generation of in-group morale in the same way. In the American context, Protestantism of the two-pronged Anglo sort (Anglican and Calvinist being the two prongs) has been most definitive, and Protestantism is a scripture-based, deep literacy-based faith. By contrast, Catholicism is a pre-deep literate tradition, which is why a literate priesthood came to interpose itself between the godhead and the masses of the faithful. It is also why one observer believes that, in a post-literate age of spectacle, Catholicism may make gains relative to Protestantism, whose demands on parishioners concerning the written word may be more than many will care to bear.[14]
As already noted, many believe that woke politics is at least in part a new form of Protestantism in America, part of a fourth, mostly de-churched Great Awakening. It, too, goes long on spectacle. Spectacle is the common denominator of the “strange rites” Tara Isabella Burton describes among new forms of religion for a godless world, not all of them obsessed with politics.[15]
But why? Why is there such an appetite for the unreal, and for spectacle in particular? Plenty of possibilities suggest themselves, some fairly anodyne. Large chunks of American society experience a sleep deficit, so perhaps we crave the psychic catharsis of dreaming that we get too little of because we are too worried, too caffeinated, too sugar-addled, and too stunned by blue light from our smartphone screens. Maybe some younger age cohorts are trying to make up for innocence lost on account of close-hovering parents who prevented them from getting their necessary share of free imaginative play. Perhaps it is as simple as one or the other of these. But I doubt it.
Maybe it’s the cyberlution, but in a different and deeper way than most are accustomed to thinking about it. Here we alight on what might be the mother of all shadow effects.
The long-wave culture shift away from developmentalism and toward entertainment, and from relatively placid entertainment to pulse-raising spectacle, clearly has sources beyond technology and its associated techniques. But it has been mightily enabled if not induced by technology over the past several decades. Our man-made environment is not, on the whole, human-scale friendly. It makes us smaller by atomizing our interactions and creating isolation and loneliness. It also creates anxiety, so it is understandable that many will be incentivized to escape that anxiety by recourse to fantasy, the higher the graphic quality the better for the purpose.
As we are here at pains to point out, and as we do in more detail below, not just Americans but Westerners and many others have experienced a quantitative-become-qualitative change in the nature of the sensory inputs we experience. This change has gradually rewired brain circuitries to the point that by now most Americans, well educated and not, do not think as they once did about a range of subjects, including politics. This is because immersion in mediated imagery, whether for purposes of indulging in fantasy entertainment or other pursuits as seemingly anodyne as social media keeping-in-touch, has a shadow effect on cognition generally. That effect reflects the simple facts that the human mind is promiscuously associational, and that most of the promiscuity is pre-conscious, so that we rarely discern the process at work on ourselves. We can, however, more readily see it objectified in others if we try.
Importantly, the shadow effect of which we speak works not just or even mainly as a function of mimicked content but rather as a structural version of the cognitive dynamic that produces the evoked set in our heads. As we turn repeatedly to a particular interpretive template we reinforce the neural networks associated with it, and strengthen them accordingly. We literally rewire our brains every time we do it, and exactly that is the actual source of spectacle-related shadow effects so deep in our mental functioning that we typically do not notice them happening.
Something else we have stopped noticing, despite its yawning difference from the natural human attitude before the graphic revolution—to again invoke Boorstin’s 1961 locution—is what technology has done to our perception of time. It has fantasized it.
When we watch fictive scriptings on television or in the movies—and increasingly movies watched on television via streaming thanks to our astonishing technology-powered affluence—we are messing with time in a way our forebears did not because they could not. In a sense, we can induce in ourselves a hybrid consciousness between relaxed wakefulness and dreaming almost anytime we like, and this necessarily involves screwing around with linear timelines—just as is the case in dreams and mythic cognitive syntax—and putting ourselves in theta brainwave states (elaborated below in Chapter 9).
At the risk of stating the obvious, consider that since there is little live television save for some talk shows, sports announcing, and the (supposed) news, the drama, sitcom, or documentary we are watching is already finished, completed to the best ability of its producers, directors, and actors. It is as such disembodied from the flow of natural time. But we are not thus disembodied as we are watching, presumably without knowing how the plot develops and concludes—unless we’re watching a well-enough remembered re-run. We are watching in natural time, and this creates a time parallax: We’re watching something now that comes from back when.
Television, as it developed over two decades—a fairly short time as major human cultural developments go—in a sense trained us to be at ease with such forms of wide-awake time-manipulation despite the radical perceptual discontinuity it marked from the previous roughly 200,000-year history of anatomically modern humans. What had been unnatural became so common and well assimilated that it hardly drew notice. This upending of linear time was, it now seems obvious, a kind of structural portal to the unreal, and from the unreal, with a little splash of willful creativity, to the surreal—to the 24/7 spectacle now at our beck and call.
The Age of Spectacle: How a Confluence of Fragilized Affluence, the End of Modernity, Deep-Literacy Erosion, and Shock Entertainment Technovelty Has Wrecked American Politics
Foreword [TKL]
Introduction: A Hypothesis Unfurled
Technovelty
The Republic of Spectacle: A Pocket Chronology
A Spectocracy, If We Can Keep It
Why This Argument Is Different from All Other Arguments
Opening Acts and the Main Attraction
The Path Forward
Obdurate Notes on Style and Tone
PART I: Puzzle Pieces
1. The Analytical Status Quo: Theories of American Dysfunction
Meritocracy Awry
Populism
Polarization
Segmented Economies
Perforated Moral Communities
Institutional Decay
Social Trust Depletion
Industrial Folklore
The Digital Tsunami
Fear and Delusion in the 21st Century Funhouse
2. Underturtle I: Fragilized Affluence and Postmodern Decadence
Government as Entertainment
The Accidental Aristocracy
The Agora’s Deafness to Classical Liberalism
The Culture of Dematerialization
Affluence and the Changing Image of Leadership
Neurosis, Loneliness, and Despair
Wealth and Individualism
Hard Times Ain’t What They Used to Be
Affluence Fragilized
3. Underturtle II: Our Lost Origin Stories at the End of Modernity
Masking Motives
Virtue and Character
Aristotle’s Picture Album
Faith, Fiction, Metaphor, and Politics
The American Story
How Secularism Was Birthed in a Religious Age
Regression to the Zero-Sum Mean
Bye, Bye Modernity
Mythic Consciousness and Revenant Magic
Progress as Dirty Word, History as Nightmare, Equality as Godhead
Real and Unreal Inequality
Attitudes and Institutions Misaligned
4. Underturtle III: From Deep Literacy to Cyber-Orality
Where Did the News Go?
Podcast Mania
Fakery Cubed: The Chat Claptrap
The Reading-Writing Dialectic
The Birth of Interiority
A Rabbinic Interlude
Remember This
Dissent
The Second Twin
Structural Mimicry and Fantasized Time
Losing the Lebenswelt
The Political Fallout of Digital Decadence
Zombified Vocabulary
The Catechized Literacy of the Woke Left
Democracy as Drama
Reading Out Tyranny
Optimists No More
5. The Net Effect
Futurology Agonistes
Business Mega-Cycles
The COVID Visitation
Electronic Knitting
Declassé Goes Global
Bad Guys, First Movers, and Business Consolidations
Social Media and Artificial Intelligence as Net-Effect Phenomena
Gigantism and Plutocracy
Offshoring at Global Scale
Cybersecurity as a Productivity-Growth Externality?
Bank on It
Debt Becomes Her?
Risk versus Uncertainty
Dysfunctional Wealth
Searching for the Next Capitalism
6. The Cultural Contradictions of Liberal Democracy
A Big, Fat, Ancient Greek Idea
Footnotes to Plato
Opinion Gluttony
The New Children’s Crusade
Revering the Irreverent
The Wages of Fantasy
Pull It Up By the Roots
The Great Morphing
An Ohio Coda
PART II: Emerging Picture
7. We Do Believe in Magic
Fear and Delusion in the 21st Century Funhouse
Culture, Politics, and Regression to the Mythic Consciousness
Word Magic Redux
Our New/Old Stories
Myth, Magic, and Childishness
Addiction to AI-Spectacle as the Ultimate Danger?
Mythic Politics
8. “Doing a Ripley”: Spectacle Defined and Illustrated
Working Definition
Tricks
Illusions
Cons
Fakers and Frauds With Halos
The Magnificos
MAGA: Projectionist Fraud as a Way of Life
Old Ripleys, New Ripleys
Trump as Idiot-Savant Fraudster
Conspiracy Soup
Facticity Termites
Conditioning for Spectacle
To the Neuroscience
9. The Neuroscience of Spectacle
Glancing
Seeing the Light
Eye-to-Eye
Surfing Your Brainwaves
McLuhan Was Wrong, and Right
The Graphic Revolution, Memory, and the Triumph of Appearances
Structural Shadows
Surfing a New Wave
Toward Some Informed Speculations
Suffer the Children
10. The Mad Dialectic of Nostalgia and Utopia in the Infotainment Era
Ripleys on the Left
The Hylton-Brown Case
From Left to Right and Back Again
The Root Commonalities of Illiberalism
Spectacle Gluttony
Gratuitous Harm in Black and White
The Touching of the Extremes
The Wrongness of the Right
Now Sex
Beyond Feminism
The Irony of Leveling
Abortion: Serious Issues and Silly Arguments
The Imperfect Perfect
Vive la Difference?
Human Nature
11. Spectacle and the American Future
Bad Philosophy, Bad Consequences
Astounding Complexes from TV to Smartphones
Up from the Television Age
The Crux
Cognitive Illusions
Another Shadow Effect
Myth as Model
The AI Spectre
A Sobering Coda
12. What Our Politics Can Do, What We Must Do
A Brief Parade of Human Stupidities
Where Are Our Leaders?
The Zero-Sum Unfolded
On the Other Hand
Some Real Ideas
A Small Idea Whose Time Has Come
Moving the Overton Window
Four Hard Pieces
Flesh on the Bones
Housework
Making Subsidiarity Happen
Another Modest “To Do” List
Who Will Create the Garden?
Meanwhile
Index
[1] Ong, Orality and Literacy, p. 14.
[2] Bulfinch, The Age of Fable (Heritage Press, 1942), pp. 300-301.
[3] The distinction between serial and presentational symbols is clearly delineated in Susanne K. Langer, Philosophy in a New Key: A Study in the Symbolism of Reason, Rite and Art (Harvard University Press, 1942). Langer was a student and translator of Ernst Cassirer.
[4] Bogen quoted in the useful essay by Kit Wilson, “Reading Ourselves to Death,” The New Atlantis, Number 68 (Spring 2022), pp. 73-79. Wilson’s title is obviously a talk-back-to jibe at Neil Postman’s Amusing Ourselves to Death: Public Discourse in the Age of Show Business (Random House, 1985).
[5] Systrom quoted in the insightful article by Jemima Lewis, “Nuance is the first victim of images’ victory over words,” The Telegraph, March 23, 2023.
[6] The word addict comes from the Latin addictum, which originally meant the time that an indentured servant (literally an addict) had left to serve his or her master.
[7] Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now (Henry Holt and Company, 2018), p. ??.
[8] Aric Sigman, quoted in Cytowic, p. TK.
[9] Cavendish, “Humanity is sleepwalking into a neurotech disaster,” Financial Times, March 5, 2023.
[10] Note, in particular, Vienna Teng’s perhaps too lovely “The Hymn of Acxiom,” released in 2013.
[11] According to the Associated Press, hitchBOT was about the size of a small child, bore a GPS tracker, could carry on a limited monotone conversation, and snapped a photo every 20 minutes. It wore yellow wellies; the words “San Francisco or Bust” adorned its forehead. For details see Sarah Kaplan, “Hitchhiking robot’s cross-country journey comes to tragic end in Philadelphia,” Washington Post, August 3, 2015.
[12] From a speech delivered at “I Am An American Day,” Central Park, New York, May 20, 1944.
[13] Other examples of the same basic point are plentiful. The evolution of Judaism from a pre-literate, pre-exilic Temple-based sacrificial cult to rabbinic Judaism is all about the injection of mass deep literacy over a period of a few centuries after the Babylonian Exile (586 BCE). Literacy is also a likely major means of conveyance from pre-literate forms of Hinduism and Taoism to Buddhism. These are complex subjects studied by many scholars over many years, so the gist can only be hinted at in a footnote.
[14] See Marty Mac, “The coming crisis of Protestantism,” Marty’s Mac ‘n’ Cheese (Substack), June 18, 2021. The same reasoning applies to the gains that Evangelical megachurches have made at the expense of mainline “high church” Protestantism in recent decades: The “prosperity gospel” doesn’t require deep literacy, or really any literacy; understanding and fully participating in an Episcopalian service does.
[15] See Burton, Strange Rites: New Religions for a Godless World (PublicAffairs, 2020).