Post-Extinction Event Project No. 2: Creating a Cybermedia Temperance Movement
Post-January 20 AoS Chronicle, No. 19
Last Friday TRP presented the first of its four major Post-Extinction Event Projects: “A Voluntary National Service/Baby Bond Program.” The project description was based on and aligned with the tripartite process described in TRP’s June 27 post, “The Garden: A Second of Two Hypothetical American Post-Constitutional Futures” (Post-January 20 AoS Chronicle, No. 17): namely, a process of first developing a vision, distilling core principles from it, and then developing projects, for future implementation in a likely altered political context, that align with both.
In the case of the Voluntary National Service/Baby Bond Program, the principles that would be served by its creation include: staunching the hemorrhaging of social trust in our national society; helping to assure maximum possible equality of opportunity; acting on the truth that human beings are built by and flourish through the works of their own hands; and bolstering America’s Tocquevillean intermediate institutions that bridge the civil society gap between individuals and the state. So a Voluntary National Service/Baby Bond Program would align with and promote four of our five core principles.
We are reviewing the structure of our Post-Extinction Event essay series for two reasons: so that new subscribers since last week—and there are quite a few!—will understand better what they have dropped in on midstream; and so that regular readers, too, can refresh their memories amid so many worthy distractions that most TRP readers expose themselves to day by day, week by week. So now, without further ado…
On the Brink
My Niskanen Center colleague Brink Lindsey, purveyor of the excellent Substack The Permanent Problem, has penned two superb essays. A very recent one, June 9’s “America’s Internal Brain Drain,” we discussed, quoted, and cited in “The Abyss and the Garden: A Tale of Two Post-Constitutional Extinction Events,” Post-January 20 AoS Chronicle, No. 16, on June 20.
Brink is more optimistic than I am that, for example, an AI-infused tutoring set-up would solve the scaling problem for one-on-one supplementary education at the K-12 level. But we agree on more fundamental propositions, specifically that the loosing of current and future versions of ChatGPT into the wild for use among young people who have yet to do the work of building the edifice of their own capacity for intellection would be a counterproductive disaster. It would doom its users to permanent intellectual puerility. Evidence to that effect has emerged since June 20 in the form of a useful New York Times feature: Evan Gorelick, “A.I. in the Classroom,” New York Times, July 9, 2025. As good or better is a review of studies on AI in the classroom, published under the title “Will AI Make You Stupid?” in The Economist on July 16, 2025.
No one should be surprised by this analysis. Back in 2014 Los Angeles Unified School District leaders thought it would be wonderful if every student in the city had an iPad onto which could be loaded a series of lessons created and supplied by a private vendor. The idea, apparently, was that the iPads could act like a one-on-one tutor—this nearly a decade before ChatGPT could be touted as providing a similar educational boost. The public rationale for the purchase of the machines and the lessons was that this would be good for educational uplift in a system where way too many students were achieving far below state and national standards. Some really believed this, even most perhaps. But educational advancement was not the only, or probably even the real, reason for the proposition: School district managers calculated that despite the huge upfront costs for the iPads and the lessons the program would eventually save the district money by enabling it to make do with fewer teachers and hence a lower payroll-and-benefits HR cost structure.
As it happened, the LA purchase deal got caught up in accusations of insider-dealing corruption, expensive delays, and broken promises by the main contractors; the whole project was summarily abandoned in a pool of recrimination, innuendo, and pointlessly diminished resources. This was unfortunate for two reasons, the first and lesser reason being that most of the scandal accusations were probably true, and that was hurtful.
But the far greater problem was that the program should have been abandoned, indeed never should have been proposed and approved in the first place, because it would have been hugely counterproductive to the education process had it come to pass. The program would have failed on the merits, and so wasted tens of billions of taxpayer dollars, because those talking at us (not really to us) from screens never make eye contact with us, do not know us, do not care about us, and never will. They therefore cannot truly teach us anything of significance, no matter what may be claimed, and the younger the student the more damage would be done by way of lost opportunity costs for actual educational progress.
As Charles Taylor has observed: “The crucial condition for human learning is joint attention,” and, he might have added, environments rich in embodied cognition. Sitting still in front of an iPad, staring at a two-dimensional screen in a three-dimensional world, amounts to trying to affect the position of a shadow by doing things to the shadow. Education doesn’t work like that. Yet by cognitive default too many of us anthropomorphize digital gadgets and other machines as though they were functionally the same as human interlocutors who do all of these fully attentive things. They are not, they do not, and they never will.
Not only is too much screen time for kids not useful for any important seminal educational purpose, too much—and at very young ages any—screen time is flat-out harmful to brain and sensory development. There is no longer any question that long hours spent watching screens physiologically harm brain development in young children, notably the critical process of neuronal myelination, and that is over and above—actually it is behind—all the behavioral damage screens do, not just in childhood but into adolescence and beyond. To wit: How is the next generation supposed to avoid harmful behavioral patterns in regard to screen use if their brains are diminished when young by those very patterns of excessive screen exposure? Hence the nasty recursivity of the problem.
That alone is reason enough to demand, just for one particularly infuriating example, that manufacturers of car seats for kids that feature a holder for a smartphone to entertain the little tots while on the road immediately cease and desist from making, marketing, and selling those products. They constitute child abuse. If the manufacturers refuse to stop, those products should be banned from public sale, just as a range of chemical toxins are banned from public sale.
Not only is the evidence of screen exposure damage in children overwhelming, it is now being conveyed beyond the laboratories to the general reading public, so that school officials and others in positions of authority no longer have any excuse for their own ignorance. Note Jonathan Haidt’s 2024 book The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, which garnered much attention. Note the worthy earlier book by Jean Twenge and W. Keith Campbell, The Narcissism Epidemic: Living in the Age of Entitlement (Simon & Schuster, 2009). Most recently we have Nicholas Carr’s Superbloom: How Technologies of Connection Are Tearing Us Apart (W.W. Norton, 2025). Ironically, however, the sharp decline of deep literacy is obviously affecting the audiences for these books and essays, the result being that those who most need to understand their contents are the least likely to read them.
Also note: It was not until June 17, 2024 that then-U.S. Surgeon General Vivek Murthy proposed placing warning labels, presumably to be read by parents, on digital gadgets used by children. It is scandalous that it took the U.S. Federal Government that long to do anything, and then do something so feeble and feckless that it was risible. Take a wild guess what was going on there…..
Long before this past June 9 essay, Brink’s treasure chest had already yielded the model on which this essay is at least loosely based. Back on September 26, 2023 he posted “The Need for a Media Temperance Movement.” Let’s briefly review it before striking out beyond.
“The Need for a Media Temperance Movement” begins by describing the extent and nature of America’s alcohol addiction in the late 18th and throughout the 19th century. (Brink unfortunately doesn’t mention the lesser-known but hardly trivial laudanum addiction plague that mostly affected women.) He shows how a temperance movement centered in the churches, initiated and supported by the moral exertions of the Second Great Awakening, succeeded in making a major dent in alcohol consumption habits—more or less the same sort of virtue cascade, as I have called it in contradistinction to a scoundrel cascade, was taking place in Victorian Britain at about the same time and for much the same reasons (he doesn’t mention that either, but you can’t say everything you might wish to say in just one essay…though, darn me, Lord knows I often try). Brink also shows how other movements, like Abolition, were coupled in American minds with alcohol addiction—as in, slavery is a kind of moral addiction, hurtful to all concerned but denied with proportionate vigor by slavers.
From this pocket history Brink drew a core lesson:
[A] society’s “moral capital” is not just some inheritance from the pre-modern past that is inevitably drawn down as the old traditions fade. Broad-based moral regeneration can occur under the conditions of modernity, and it can be rapid and dramatic. . . . [t]he temperance movement shows us how a free society can respond to the challenges of addictive activities that subvert individual autonomy.
Education and moral suasion, Brink concludes, can do the trick.
His essay then proceeds to detail the harms done by excessive screen-borne media consumption, and to derive targets for its amelioration. Four sources of harm are chosen: social media; solitary consumption of media; the crowding out of deep literacy; and media (mis)coverage of politics as infotainment. Brink describes these harms capaciously and carefully; I’ll not repeat his account here, but you should read it in full (and that’s not because he’s generous to my previous work in that essay). But here is one of the keys to the mechanics of harm done by screen exposure, which deserves some self-reflective inventory-taking: “The more habituated you get to this undemanding, frictionless substitute for genuine social interaction, the more difficult and burdensome the real thing can start to seem.”
Implications here, which Brink mostly left aside for the moment in which he was writing, certainly include matters of sex, love, marriage, and parenting. The acquired autism symptoms that self-enforced isolation can cause can radically raise anxiety and lead to addictions of several sorts, as well as to a range of long-lasting personality distortions and affectations. They can in that regard create incels from otherwise normal young males. And in an environment in which homosexuality is not only being protected from bigotry and abuse, which is certainly all to the good and long overdue, but is also being touted by popular media and entertainment culture as a lifestyle to be proud of, they can move those in the birth-fixed gray zone of the hormone distribution sweepstakes to suppose that a homosexual life is easier to find and keep than a heterosexual one. Maybe this twitch of direction shift in the American gay rights mindset goes back to the popular 1978 film “La Cage aux Folles”; I don’t really know, I’m no expert on such things.
But to the extent this is true and demonstrable, it seems unfortunate for two reasons. First, biological parenthood and grand-parenthood are normal organic parts of human life—social, spiritual, and emotional—and are core constituent parts of the trigenerational species characteristics special to Homo sapiens. (Here see the wondrous and much under-appreciated 1987 book by David Gutmann, Reclaimed Powers….if you are like me, you’ll be shocked by how many obvious facets of human culture you never noticed before Professor Gutmann showed you how to see them.) Missing out on parenthood and grandparenthood not because of what it is fair to call biochemical necessity, but for more or less arbitrary reasons due to passing social deformities, is in no way “gay.” Second, again to the extent that it is true and demonstrable, it is contributing more than trivially to a birth dearth in the United States, which has formidable future socio-economic downsides associated with it.
So if those are the four main problems—too much social media; too much loneliness, isolation, and depression; too much crowding out of reading; and too much disorganization of We the People’s stock of political knowledge and common sense—it follows that a cybermedia temperance movement would have to find ways to reduce social media use, especially among the young; get people out of their dark screen-abiding basements to interact with others in the sunlight—or even at a bar, for heaven’s sake; get people reading more again, despite that being an inherently solipsistic activity most of the time; and do something about media infotainment pollution.
Well, how do we do this? Any of it? By raising public awareness of the dangers of the cybermedia status quo, and by stigmatizing it as addictive. We need, said Brink, to make sure people understand the cognitively compromised nature of virtual experience—I refer to it in the Age of Spectacle manuscript as counterfeited consciousness—especially if it comes, as it usually does, at the expense of direct experience. “We should feel a twinge of discomfort,” Brink wrote, “every time we switch off the real world to tune in the mediated one.” We should feel, he concluded, “as if we’d stepped into a disreputable dive bar in the middle of the day.”
Well, as Age of Spectacle manuscript readers know, I’m all four paws in with that, all of that. But….but.
Can’t Help Myself, I’m a Fool…..
But….cyberaddiction is not just a metaphor to toss about, as many people seem to think. It is clinically real, and precisely because it is real better education and moral suasion alone will not meet the challenge it poses to American culture and, downstream, to our political life. Let’s review briefly some Age of Spectacle material on cyberaddiction, from Chapter 4, to get a sense of how really hard it is to break the back of the problems Brink, and of course many others, have identified.[1]
Willpower alone rarely cures addictions. Adjurations to willpower more often lead to psychic exhaustion and relapse. One does not cure an addiction; one transcends it. As Marc Lewis shows in his brilliant 2015 book The Biology of Desire, addicts conquer their problems, which he calls not “recovery” but rather “personality development beyond addiction,” when they become able to project a storyline outward—and recursively back inward—that gives them a vision of themselves in a better future. Fine; the problem with cyberaddictions is that they are, in this very respect, different and more difficult to manage than substance addictions.
Yes, even behavioral addictions, if detected and accepted as such by the addicted, can be treated and, with any luck, much discipline, and some patience, be overcome. A gambling addict, for example, can see evidence of addiction in a dwindling bank account and an accrual of debt, and often enough in an array of roiled family relationships. Someone addicted to distraction, on the other hand, because of cybernetic gadgets that use “captology” to reward distraction—like doomscrolling—has a harder task for lack of any solid empirical referent that something is amiss; as soon as a rare sign of a problem begins to dawn on the victim, whoosh, it’s gone thanks to the next distraction as time and timelines melt into an amnesic miasma. In short, addiction to distraction often evidences a virtual closed loop with no way out. It does not help that so many people are now affected that addiction’s symptoms seem perfectly normal, but they are actually anything but.
This is important. The origin of any and every addiction is an illusion, a kind of fiction, that some marvelous pleasurable reward may abide in one’s future. A person sets off in pursuit of this reward only to find that the faster he or she runs the more elusive the reward becomes. With substance addictions the usual response is to use more of the substance, or use it more often, to sustain the same level of pleasurable hope that the ultimate reward can be grasped. With behavioral addictions like gambling, thrill seeking, and sex the usual response is to do whatever it is faster, deeper, and also more often than before. Since the ultimate reward is indeed fictional, however, it will never really be attained and at a certain point the addict will stop not because of recovery but because of physical or mental breakdown.
Something roughly similar goes, but again differently in its detail, with addiction to digital technology. In digital addictions the fiction is honed to near statistical perfection by algorithmic teasing. As Jaron Lanier put it, “The algorithm is trying to capture the perfect parameters for manipulating a brain, while the brain . . .is changing in response to the algorithm’s experiments” using what look like GANs (generative adversarial network) methodologies, the methodologies that create deepfakes. (Here please see my “Disinformed,” Inference: An International Review of Science, 5:3 (Fall 2020)…..because I worked really hard on this.) So yes, a cyberaddiction is a kind of addiction that, more cannily than other kinds, involves faking yourself out. But, continues Lanier, “because the stimuli from the algorithm doesn’t mean anything, because they are genuinely random, the brain isn’t responding to anything real, but to a fiction. That process—of becoming hooked in an elusive mirage—is addiction.”[2] Maybe kind of like getting hooked on a conspiracy theory? Wait….anon….
Again, the difference is that substance and other behavioral addictions involve physical, material elements that can be seen, felt, and even photographed. But cyberaddictions have few if any external empirical referents for the addict to beware of. For any practical purpose, they are all packed into an individual headspace, making them more insidious than run-of-the-mill addiction even as they tend to be less physically dangerous—unless of course you’re a phombie who drifts unaware into speeding traffic. So no one is literally addicted to a smartphone; we are addicted to an ensemble of behaviors mediated by the smartphone, and that is a distinction with a difference.
Some observers see yet another important difference: Cyberaddictions do not habituate like substance and ordinary behavioral addictions. They are ever novel and endlessly attractive because the push notifications that sparkle and the dings that sound from iPhones do not anticipate only one class of rewards. They can signal a new email, or a new text message, or a new YouTube video, or a calendar notification, or a recorded voicemail—and many if not most of these possibilities raise the chance of a social connection, something most other addictions are powerless to dangle so predictably before us. Even incidental, passing social connections seem to be more salient psychologically than most of us realize. So unlike the treadmill psychology of substance addictions in particular—that “oh no, here I go again” feeling—the thrill is not gone (apologies to B.B. King), not ever at least from this side of the grave, from cyberaddictions.
Finally on this point of review, addiction changes the brain permanently. Addictions that commandeer the dopamine network reshape it forever by making its receptor cells more sensitive. So now note that, according to the British psychologist Aric Sigman, by the time the average American kid is 8 years old he or she “will have spent more than a full year of 24-hour days on recreational screen time.”[3] That being the case, and dopamine being mostly agnostic as to what evokes its presence, it may be that screen addictions—particularly in childhood as the brain is still fast changing and growing—act as gateways to substance and behavioral addictions later on in life.
In other words, cyberaddictions are invitations to a cornucopia of intersecting doom loops. A simple example: loneliness and isolation cause anxiety in most people. (Loneliness can look deceptively similar to solitude, but it isn’t.) When someone gets anxious he or she will resort to a behavior that relieves the anxiety, which can mean turning back to familiar dopamine-dosing behaviors on a smartphone, which makes the addict more solitary and more anxious, and which down the road can lead to more substantial escapism aids like alcohol or pills, which will only create more anxiety, leading…well, you get the point.
From Health Crisis to Constitutional Crisis
Why are we reviewing the physiology of cyberaddictions? Because—I fear to be too blunt here, but I’ll be brave—because cyberaddiction in the United States today is so acute and widespread as to constitute a public health crisis, and as such it is a direct prelude in no trivial way to the constitutional crisis in which we now live. How so?
Because cyberaddiction—particularly Brink’s third and fourth categories of harm, crowding out deep reading about political matters by screen-borne infotainment—rewards surrealist fantasies by evoking massive dopamine and oxytocin flows. It rewards people for choosing to watch screens and not use their precious time to read and so to think, and as such it rewards ignorance and invites simpleminded conspiracy theories to displace coming to grips with reality. This is how tldr (that book or essay is too long so I didn’t read it) sires tcdu (too complex, don’t understand) and so leads non-reading screenheads to wildly oversimplified and misleading caricatures of social and political realities.
The rise in the number and political salience of conspiracy theories (anon is now….) is a direct consequence of cyberaddictions piled onto the erosion of deep literacy in American society. The twain have, as we have argued before, created a New Orality-dominated culture that is hauling with it a regression to the preliterate mythic consciousness, and in that consciousness conceptual abstractions attendant on liberal democracy simply make no sense to a lot of people. What is real to the de-literacied are the concrete and the other-worldly magical, but little to nothing in between. To put it in simple form, that’s how the price of eggs trumped the future of democracy in the November 5, 2024 election (pun totally intended).
Now, the re-raised prominence of the other-worldly magical happens to constitute the perfect growth medium for the conspirification (as Donovan O. Schaefer has termed it) of society. Again in the language of the Age of Spectacle project, conspiracy theories are extensive-narrative, protracted and socially reinforced, A/not-A structured astounding complexes. The twin Trumpian Big Lie about November 2020 and January 6, 2021 is the quintessentially spectacular American political lie of our time—really of all time by American history standards. It is the two-headed carnival calf of old carried to unprecedented prominence by the mass cultural infusions enabled by 21st-century high-tech cybergadgets, specifically by the combination of hyperconnectivity and disintermediation that those gadgets enable.
More specifically, massive hyperconnected exposure to high-graphic and often fantasy-based mass entertainment at a time of deep literacy erosion has routinized the brains of what has become a politically dominant plurality of mostly non-deep-reading Americans to expect certain forms of political narrative and not others. Their evoked set expects not only simplified, reality-TV-ish storylines that cannot possibly capture or account for real-world circumstances, but also, without realizing it, a temporal fit between a conspiracy theory narrative and the typical adventure/detective TV offering. These days, with streaming video and binge-watching the new norm for many, “Game of Thrones”-scale ornamentation sets the standard for what conspiracy narratives need to be like to land. The QAnon stuff is an apt case in point. Conspiracy theories, in other words, must become ornate in proportion to imbibed entertainment fare to fully resonate within the target’s brain.
Finally in this regard, conspiracy theory stuff also tends to elicit a similar range of brainwave modalities (usually low-alpha to theta) as the two-dimensional entertainment fare that screenheads marinate in. Note, too, in this regard that the means of technological transmission that bring most Americans their so-called news these days—talking cable heads and social media—are identical to the means of technological transmission of conspiracy theory stuff, whether Alex Jones’s Infowars hooey or a range of loony stuff on the dark web. That match-up of communication transmission motifs magnifies the shadow effect of the technology, an addictive technology, on political perceptions.
Nothing New?
Clearly, the spectacalization of American politics has been going on for a while now, and the role of communications technologies in it is nothing new. Daniel Boorstin pioneered the scholarly genre with a 1961 book called simply The Image. In 1982 Walter J. Ong showed in his now-classic masterpiece Orality and Literacy what a reversion to oral forms of communication by dint of new technology might mean for culture and politics. In 1985 Neil Postman famously scoured the subject in his prophetic Amusing Ourselves to Death, and in 1998 Neal Gabler updated and elaborated the point in Life: The Movie.
But all these analysts, and of course others, were grappling mainly with the impact of television as a transmission device of social, cultural, and political content. The internet age is much more dangerous. Television was and is not truly addictive, particularly when it is experienced non-solipsistically; at most it could mesmerize kids into staring at test patterns when they should have been in bed asleep. Yes, it was bad enough, back in the day, to have a 24,000-volt cathode ray tube aimed at your head. But now we have what would have been described a mere quarter century ago as miniaturized supercomputers aimed algorithmically at our brainstems. (I am not exaggerating and you know it; so you want to shout out “yikes!” about now, don’t you? Well, you go right ahead; it’s OK….)
So look back now at what has happened to American politics since the dawn of the internet age in the mid-1990s, and since the saturation of smartphone ownership in the United States in about 2006-7. With each passing year political life has become more polarized, less civil, and altogether less functional; it has also become more laced with know-nothing rightwing populism, woke ideological fantasies festooned with hints of magical efficacy, and major-party policy brain death in between the two. In the first of the past three presidential cycles We the People, then still pretty newly drunk on cyber-spectacle, elected an encyclopedically ignorant man with an obvious and serious personality disorder to inhabit the Oval Office and gain control over rather a lot of nuclear weapons. The hangover from that experience was painful enough, alongside the COVID-19 experience, that we then resorted to an aged mediocrity whose judgmental error rate doomed his hapless successor. Then we got drunk again on even more alluring cyber-spectacle and re-elected as President a cross between a cultic shaman and a circus barker. Dysfunctional? Sure, but entertaining? Well, that depends—does it not?—on what We the People have gotten used to by way of entertainment, and how.
Speaking of which, the entire progression of the American Republic of Entertainment, as Gabler archly called it in 1998, is rich with irony—of a decidedly frustrating variety.
In the television age it took critics at least two decades to realize that it was not just the content of commercial television that might be a problem, but the technological delivery vehicle of that content as well. It was only in 1977 that Marie Winn published The Plug-In Drug; less impactful but as important was Jerry Mander’s 1978 book Four Arguments for the Elimination of Television. Winn and Mander stood then in a line of, or slightly off to the popularizing side of, a scholarly apostolic succession that arguably began with Charles Horton Cooley, danced sideways to José Ortega y Gasset’s formulation of the “reason of unreason,” returned to North America with Harold Innis, continued with his University of Toronto colleague Marshall McLuhan, included Daniel Boorstin and Walter J. Ong, both aforementioned, then segued to George Gerbner, with his “mean world syndrome” and “industrial folklore” research project at the Annenberg School at Penn, then on to Postman’s Amusing Ourselves to Death and Gabler’s Life: The Movie.
The point of all this is that we know damned well by now, and we have known for years, that the medium of transmission—the how factor—as well as the content—the what factor—is critical for how the technology affects society, including as a subset of phenomena how a given conspiracy theory will land. Yes, thirty or so years ago we as a political community finally accepted the fact that the very nature of a communications technology, not just its content, could drag American political culture in ways and into places we never imagined. And the frustration? Well, as soon as we finally got to that point, right at the portal of the age of the internet and the digital tsunami, what did we do? We willfully forgot or conveniently ignored everything we had strained to learn over the previous three decades.
And why did we, most of us anyway, do that? Because visions of dollar signs followed by integers and many, many zeros with no decimal points in sight lured our leaders, corporate and political alike, into a willful stupidity so wide and deep that progeny, if we have any that can still think, will wonder how such a thing could have happened.
Yes, technological optimists and macroeconomists sans sociological filters multiplied like dazed mosquitoes in a rainy June. Alas, too many of us failed to B-team the optimists and reprove the statistically autistic macroeconomists. Without prejudice aforethought we should always B-team technological optimists; as Archy said to Mehitabel, “An optimist is a guy without much experience.” And if Don Marquis is too much of a literary leap for you, take cover with Philip Rieff, who once gently chided the legacy of Professor Cooley as follows: The power of communications networks to create “a larger togetherness,” he wrote in 1962, “reflected a deeper disinclination to take into account the demonic in man.”[4] One could always count on Professor Rieff to identify a deathwork when he saw one, and sometimes even when he didn’t.
Our elites are guilty of many things these past three to four decades—hubris, ignorance, insularity, moral obloquy, graft and self-dealing, and more besides. But really, nothing compares to the harm they have done through sins of omission: from letting avaricious techno-feudalists plunder the halls of government and the commons alike, to letting a new mega-emanation of “the malefactors of great wealth” run roughshod over the nation, turning its citizens as best they have been able into Eloi with debit cards who could not recognize their own best interests if their lives depended on it, and they sort of do.
How has the American elite gotten away with such yawning stupidities? Probably because the rest of the American demos wallows in even shallower waters. Again: The most impactful conspiracy theory in American history, the Trumpian twinned Big Lie, landed with a hellacious thud about five and a half years ago and now dominates the civic scene of a great power like no other Big Lie has since the 1930s. It is therefore amazing that even some otherwise intelligent people, let alone the American hoi polloi, wonder if we might sooner or later experience a constitutional crisis in the United States, when we are obviously already stuck right in the middle of one. If we want to wonder, we need to wonder not about a crisis but about a constitutional extinction event ahead, which, if it happens, will end one crisis only to begin another.
What after-crisis, exactly? Around the corner of the coming constitutional extinction event we stand on the verge not of our economy being inundated by crypto-currency but of our culture and politics being suffocated by crypto-government, namely by large algorithmically armed corporations stripmining our sanity, with the grasping outstretched hands of our venal political class palms up in complicit approval. As is often the case, the better sci-fi writers among us have already anticipated the essence: Neal Stephenson’s The Diamond Age from 1995 was first past the post on this score, and Naomi Alderman’s 2023 novel The Future is lately bringing up the rear.
Certainly the second Trump Administration will not throw on any brakes to arrest this process. All thirteen of its tenured billionaires in office, the dark lord of Silicon Valley himself glowering from the shadows close behind, see the same dancing dollar signs, even more of them actually, than their internet/smartphone/social media predecessors saw thirty years ago.
Meanwhile too, and this has to merit a mention again despite being discussed in recent posts, our elites seem determined to run a frenetic no-holds-barred AI race with China, not least in military-related fields, when—as wise and wily Melissa puts it—the finish line of that race is no place even near where we should want to be. Not that AI does not promise many positive applications for collecting, analyzing, and monitoring information flows in domains from traffic control to greater pharmacological precision in treating illness to regenerative agriculture, long may it advance and prosper. But it is one thing to let professional healers and managers have at AI-infused technology on behalf of the commonweal, another to give it to generals and admirals who may not really grasp the ontological nature of the machines they so crave to have and hold, and another still to let any conspiracy-addled nitwit use AI for purposes from the trivial to the unmentionable to the plainly dangerous. If American authorities had been so glibly market-directed with nuclear energy in 1945-46, instead of creating the Atomic Energy Commission to study and regulate it, we might not even still be here—and AI is no less revolutionary a technology now than nuclear energy was in 1945.
Pick Up the Pieces
So finally on to pragmatics: What can we actually do to rein in the cybermedial derangement of our political order and culture? If education and moral suasion are not nearly enough, then what is or would be enough?
You’re probably not going to like, or want to accept, my answer: It’s too late to avoid a constitutional extinction event, which I see coming at and just after the November 2026 midterms. The process of deliteracization combined with mounting cyberaddiction has gone too far to expect We the People, in our current degraded condition, to resist the growing onslaught of Trumpenproletariat thymotic nihilism that the strategically placed anarcho-libertarian minority among us will use as distraction to tighten their grip on what has increasingly become an ensemble of predatory gigantist corporate looting machines.
I do not foresee, however, any post-democratic form of order that will be stable under crypto-governmental conditions. I expect instead the United States to fragment into state sovereignties with perhaps some regional alignments taking fairly quick shape, say in New England and the Pacific Northwest for “blue” states, the Deep South perhaps for “red” states. We already see some hints of this, with cores of influential people in some “blue” states talking about withholding revenue from the Treasury lest it be used to finance an authoritarian due-processless ICEish police state.
Here is, possibly, the rosy rub on that, however: By and large, the major “blue” states pay into the Treasury more than they receive back in benefits. In effect, “blue” state populations and economies are subsidizing relatively poorer “red” state populations and economies, and have been doing so for years. This corresponds to relatively better-educated populations subsidizing relatively less-well-educated populations, which ought to surprise no one aware of the role of symbol-manipulating human capital in a post-industrial economy. If most or all the larger “blue” states secede financially from the Federal monetary system, they will be increasingly well off compared to those in “red” states that are the beating heart of the MAGA populist support base. But that beating heart may develop A-fib in a hurry: We already see with “red” state nervousness over the massive Medicaid cuts in the “big, beautiful bill” no small dollop of disillusionment and erosion in that support base.
It is not possible today to do at the level of the Federal government what needs to be done to create an effective cybermedia temperance movement, namely (you’ve been waiting patiently, I know): (1) break up the Big-Tech Five—Alphabet (Google), Amazon, Apple, Meta, and Microsoft—at the point of an empowered Brandeisian antitrust spear and redistribute their functional assets into a set of public-private arrangements, managed by the FCC, of the sort we already use for all other critical infrastructure, namely, a public utility model; (2) expose internet content providers to legal remedies for damages their content causes; (3) regulate and as necessary criminalize, within new FCC guidelines, hate speech and bullying content on the internet; (4) fully integrate cyber-education and addiction treatment into what passes today for our educational and healthcare systems; (5) prohibit screen exposure for all children aged three and under; (6) empower all U.S. public schools to ban smartphones from school property and transportation vehicles; and (7) re-focus K-12 education away from mandatory testing and toward full deep-reading literacy as a prerequisite for high school graduation. Yes, this goes beyond education and moral suasion. I know that.
Of course, none of these things is possible, or even close to possible, in the age of Trump 2.0. But all of them might be done on a state-by-state and perhaps region-by-region level after the fragmentation of the country following a constitutional extinction event, if the citizens of smaller, default-subsidiarity self-governing units wish them to be done. Once the country sobers up from its massive and still burgeoning cyberaddictions, as it did from its rampant alcoholism in the 19th century, a new integrative federal arrangement may become possible. If enough Americans can agree on the vision of a great ensemble of gardens, and on the core principles necessary to bring that vision to reality, it is or will in due course become possible. We might have to divorce from each other before we can truly, happily, remarry. Fragmentation does not therefore presuppose an unhappy ending, only a rough adventure, at least at first.
Obviously, a lot of questions remain unasked, let alone unanswered, about what a fragmentation scenario would imply for a whole host of critical areas we have not yet even mentioned—not least national security policy, and internal security for newly sovereign states. Look buddy, this is only one essay, so chill, please. Don’t be angry; read this poem instead:
Phombie, O Phombie
Thou who art in thrall to the captological sharks coming to call,
each time you stare into empty electron shells and spaces,
and see the melting of compassion ghosts’ faces,
breadcrumbing your own life, the soul falls away
commodified in tweets that litter the fray
of shouting unconnected dots pulsating with empty egos’ wads
long since shot, O when will you stop
shattering your precious repose with skitterings of nervous energy,
as mites of cyberfrass steal away your sleep by the hues of
flickering blue lights, your mind’s reaming, clean sweep leaving
a stunned maudlin wakefulness in place of sweet dreaming?
O phombie, O phombie, you’re not yet dead, only dazed;
so rise from the basement of your heart’s long lost ways,
dance into the sunlight, your arms raised in joy,
your hands free and empty of addictive cybertoys—
Your future is bright, no more endless nights,
if you’ll only forego cruel self-deceptions, and yes,
once again live, even thrive, in all three dimensions.
[1] For just two examples: Michel Desmurget, Screen Damage: The Dangers of Digital Media for Children (Polity Press, 2022), and Adam Gazzaley and Larry D. Rosen, The Distracted Mind: Ancient Brains in a High-Tech World (MIT Press, 2017).
[2] Lanier, Ten Arguments for Deleting Your Social Media Accounts Right Now (Henry Holt and Company, 2018).
[3] Sigman quoted in Richard Cytowic, Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload (MIT Press, 2024), p. 157.
[4] Rieff, introduction to Charles Horton Cooley, Social Organization: A Study of the Larger Mind (Schocken, 1962), p. xv, quoted in Carr, Superbloom, p. 15.