Don’t be seduced by AI nostalgia — it’s a trap!



I don’t often argue with internet trends. Most of them exhaust themselves before they deserve the attention. But a certain kind of AI-generated nostalgia video has become too pervasive — and too seductive — to ignore.

You’ve seen them. Soft-focus fragments of the 1970s and 1980s. Kids on bikes at dusk. Station wagons. Camaros. Shopping malls glowing gently from within. Fake wood paneling! Cathode ray tubes! Rotary phones! A past rendered as calm, legible, and safe. The message hums beneath the imagery: Wouldn’t it be nice to go back?

Eh ... not really, no. But I understand the appeal because, on certain exhausting days, it works on me too — just enough to make the present feel a little heavier by comparison.

And I don’t like it. Not at all. And not because I’m hostile to memory.

I was there, 3,000 years ago

I was born in 1971. I lived in that world. I remember it pretty well.

How well? One of my earliest, most vivid memories of television is not a cartoon or a sitcom. No, I’m a weirdo. It is the Senate Watergate hearings in 1973, broadcast on PBS in black and white. I was 2 years old.

I didn’t understand the words, but I sort of grasped the tone. The seriousness. The tension. The sense that something grave was unfolding in full view of the world. Even as a toddler, I vaguely understood that it mattered. The adults in ties and horn-rimmed glasses were yelling at each other. Somebody was in trouble. Before I knew anything at all, I knew: This was serious stuff.

A little later, I remember gas lines. Long ones. Cars waiting for hours on their even or odd rationing days while enterprising teenagers sold lemonade. It felt ordinary at the time, probably because I hadn’t the slightest idea what “ordinary” meant. Only later did it reveal itself as an early lesson in scarcity and frustration.

The past did not hum along effortlessly. Sometimes — often — it stalled.

Freedom wasn’t safety

I remember my parents watching election returns in 1976 on network television. I was bored to tears — literally — but I remember my father’s disappointment when Gerald Ford lost to Jimmy Carter. And mind you, Ford was terrible.

This was not some cozy TV ritual. It was a loss of some kind, plainly felt. Big, important institutions did not project confidence. They produced arguments, resentment, and unease. It wasn’t long before people were talking seriously about an “era of limits.” All I knew was Dad and Mom were worried.

I remember a summer birthday party in the early 1980s at a classmate’s house. It was hot, but she had an awesome pool. I also remember that my lungs ached. That day, Southern California was under a first-stage smog alert. The air itself was hazardous. The past did not smell like nostalgia. It smelled like leaded exhaust and cigarette smoke.

I don’t miss that. Not even a little bit.

Yes, I remember riding bikes through neighborhoods with friends. I remember disappearing for entire days. I remember my parents calling my name when the streetlights came on. I remember spending long stretches at neighbors’ houses without supervision. I remember watching old movies on Saturdays with my pal Jimmy. I remember Tom Hatten. I remember listening to KISS and Genesis and Black Sabbath. That freedom existed. It mattered. It was fun. But it lived alongside fear, not in its absence.

Innocence collides with reality

I don’t remember the Adam Walsh murder specifically, but I very much remember the network television movie it inspired in 1983. That moment changed American childhood in ways people still underestimate. It sure scared the hell out of me. Innocence didn’t drift into supervision — it collided with horror. Helicopter parenting did not emerge from neurosis. It emerged from bona fide terror.

And before all of that, my first encounter with death arrived without explanation. A cousin of mine died in 1977. She was 16 years old, riding on the back of a motorcycle with a man 11 years her senior. She wasn’t wearing a helmet. The funeral was closed casket. I was too young to know all the details. Almost 50 years on, I don’t want to know. The age difference alone suggests things the adults in my life chose not to discuss.

Silence was how they handled it. Silence was not ignorance — it was restraint.

Memory is not withdrawal

This is what the warm and fuzzy AI nostalgia videos cannot possibly show. They have no room for recklessness that ends in funerals, or for freedom that edges into life-threatening danger, or for adults who withhold truth because telling it would damage rather than protect.

What we recall as freedom often presented itself as recklessness ... or worse.

None of this negates the goodness of those years. I’m grateful for when I came of age. I don’t resent my childhood at all. It formed me. It taught me how fragile stability is and how much of adulthood consists of absorbing uncertainty without dissolving into it.

That’s precisely why I reject the invitation to go back.

The new AI nostalgia doesn’t ask us to remember. It asks us to withdraw. It offers a sweet lullaby for the nervous system. It replaces the true cost of living with the comfort of atmosphere and a cool soundtrack. It edits out the smog, the scarcity, the fear, the crime, and the death, leaving only a vibe shaped like memory.

Here’s a gentler hallucination, it says. Stay awhile.

The cost of living, then and now

The problem, then, isn’t sentiment. The problem is abdication.

The temptation today isn’t to recover what was lost but to anesthetize an uncertain present. Those Instagram Reels don’t draw their power from people who remember that era clearly but from people who feel exhausted, surveilled, indebted, and hemmed in right now — and are looking for proof that life once felt more human.

And who could blame them? Maybe it was more human. But not in the way people today would like to believe. Human experience has never been especially sweet or gentle.

Human nostalgia, as opposed to the AI-generated kind, eventually runs aground on grief, embarrassment, and the recognition that the past demanded something from us and took something in return. Synthetic nostalgia can never reach that reckoning. It loops endlessly, frictionless and consequence-free.

I don’t want a past without a bill attached. I already paid it. Sometimes I think I’m paying it still.

A warning

AI nostalgia videos promise relief without effort, feeling without action, memory without judgment.

That may be comforting, but it isn’t healthy, and it isn’t right.

Truth is, adulthood rightly understood does not consist of finding the softest place to lie down. It means carrying forward what we’ve lived through, even when it complicates our fantasies.

Certain experiences were great the first time, Lord knows, but I don’t want to relive the 1970s or ’80s. I want to live now, alert to danger, capable of gratitude without illusion, willing to bear the weight of memory rather than dissolve into it.

Nostalgia has its place. But don’t be seduced by sedation.

Editor’s note: A version of this article appeared originally on Substack.

OpenAI’s Sam Altman: Tech savior or tomorrow's supervillain?



Spend enough time around Silicon Valley these days, and you’ll hear a surprising thing — the V-word, villain, used to describe what would seem to be one of their own. Not every tech lord, venture capitalist, and founder sees OpenAI’s Sam Altman, the creator of ChatGPT, as a for-real bad guy, but more do than you might expect. The feeling is palpable, now that Altman speaks openly of raising $8 trillion, that today’s villain is well on his way to becoming tomorrow’s supervillain.

It’s an attitude arrestingly close to “doomer” status — the pessimism toward the onrushing future that most techies decry in the name of a-rising-tide-lifts-all-boats optimism about innovation-driven progress. But even without diving into the progress debate, Altman’s uncanny rise as the rare figure the Valley regards as ethically suspect raises significant questions about what can stop humanity’s human villains from accelerating us into a specifically spiritual catastrophe.

A fascinating piece of evidence is the euphoria surrounding OpenAI’s latest prompt-to-video product. Sora is a feature that turns text into AI-generated videos. A series of sample clips triggered a wave of soyfacing and blown minds rivaling Apple Vision Pro testers’ comparisons of that device to some kind of religious experience. “Hollywood-quality” ... “Hollywood beware” ... “RIP Hollywood” ... “This is the worst this technology will ever be.” You can probably spend an hour on X just working through techland assessments of Sora’s impending impact.

There are skeptics, of course. Lauren Southern, who couldn’t get ChatGPT to “generate text with the word ‘libs’ in it,” mocked Sora’s prospects for sinking “woke Hollywood,” predicting “an age of censorship and gov curation the likes of which we’ve never seen before.”

The deeper issue is what exactly we mean by “Hollywood” — a matter akin to what exactly we mean by “the media.” These abstractions refer to corporations, of course, and in that sense, yes — Sora and its inevitable clones might render corporate mass entertainment obsolete, replacing it with products that come straight from the regime itself.

But here we are again talking about abstractions. Hollywood, the media, and the regime are not simply organizations and baskets or networks of organizations, but people, specific flesh-and-blood human beings, with various spiritual lives in varying degrees of distress.

Innovations like Sora don’t just raise questions about which group of people will seize or inherit control of these video and narrative creation tools. They raise questions about whether the automation of content will cause more of us to believe that our spiritual health demands a turn away from worshipful or obsessive attitudes toward narrative altogether.

The dominance of Hollywood, Madison Avenue, and government propaganda arose amidst the televisual forms of communications technology that digital tech has leaped over. The people filling the image-mongering ranks and narrative-shaping executive offices of Los Angeles, New York, and Washington, D.C., came of age and rose to mastery in a world where whoever controlled the means of dream production held sway and whoever dreamed the biggest and best dreams earned an ethical right to rule.

But that state of affairs wasn’t simply determined by the formative influence of televisual tech. Fundamentally, it arose from the temptations that always bedevil us and threaten our spiritual health — not just the sparkling promise of evil and its earthly rewards but our dreams, senses, and passions.

Of course, it’s not our ability to see, smell, and taste, our imaginative and recollective faculties, or our capacity to desire that are evil. It’s that when spiritually undisciplined, all these attributes — which we so frequently idolize, trust, and artificially push to extremes — lead us badly astray into delusion, distraction, addiction, and perversion.

The rise of tools like Sora holds up an uncanny mirror to the idol factories already within our hearts and minds, giving us a shocking vision of an infinite firehose mindlessly filling up every cranny of our awareness with everything we could ever lust after, everything we could ever describe, all we could fear, all we could imagine, all we could forget — all without us having to lift a finger.

After all, today’s text-based prompting will “eventually” give way, as Mark Zuckerberg recently and offhandedly remarked about Meta’s Apple Vision Pro competitor, to “a neural interface.” The face of our digitally manifested “collective consciousness” isn’t that of an autistic new Enlightenment. It’s schizoid pandemonium.

It all strongly implies that the antidote to Altman isn’t a law or an Iron Man-style superhero but a return to confront the soul sickness lurking in all our hearts, along with a sober new willingness to take on the discipline of fighting for our spiritual health.

That’s not a very amaaaaazing elevator pitch for the next generation of content creation. Yet if we want to hang on to a future rich with human art worth making and sharing, our path won’t run broadly through a mania of mind-blowing machines but through the quiet, narrow passage of the divine.

How you survive the impending AI takeover (you’ve never heard this one before)



One thing most people agree on is that an artificial intelligence takeover is inevitable. Whether it will benefit society, however, remains divisive.

Author, professor, and activist Jon Askonas joins James Poulos to discuss the harrowing implications of artificial intelligence when it comes to our future and what we must do when the takeover arrives.

Skeptics write AI off as inherently evil, while proponents believe it will solve all our problems and essentially save us.

But Jon and James do not fall into either camp.

Rather, they believe that thriving in a world dominated by AI will require a unique approach that neither entirely rejects nor submits to technology.

They also agree that people, especially Christians, must accept that AI is not just a super-science; it’s also a deeply spiritual matter.

“It's a powerful technology that will be used in spiritual warfare for good and for evil … but it’s still part of creation and so, like any part of creation, has to be grasped for its good uses,” Jon explains.

James agrees, adding, “One of the things that really sort of bums me out the most about this whole experience we’re going through is people who look at technology … as an evil god.”

The best way to survive the impending AI takeover is to “pray and pay attention to the world that surrounds you … cultivate [technology] and curate it intentionally as a site of spiritual warfare,” adds Jon.
