Bill Gates demands a new religion for humanity



It’s a mask-off moment. On the “Possible” podcast, co-hosted by LinkedIn co-founder Reid Hoffman, Bill Gates insisted humanity would need a new religion or philosophy to cope with the reality of AI and the technological conquest of the world.

In his final comment on the episode — which Hoffman calls a “tour de force” — Gates reflects at length on the spiritual situation he believes unbridled tech is coercing us into.

“The potential positive path is so good that it will force us to rethink how should we use our time,” he says. “You know, you can almost call it a new religion or a new philosophy of, okay, how do we stay connected with each other, not addicted to these things that’ll make video games look like nothing in terms of the attractiveness of spending time on them.”

On the surface, Gates seems to be advancing a claim plenty of people can agree with — the idea that the coming virtual world will be so tempting to disappear into that only a deep source of spiritual authority will be enough to remind us that we’re still best off sharing life together as the human beings we are.

But it’s not just the virtual world he’s talking about. “So it’s fascinating that we will, the issues of, you know, disease and enough food or climate, if things go well, those will largely become solved problems. And, you know, so the next generation does get to say, ‘Okay, given that some things that were massively in shortage are now not, how do, how do we take advantage of that?’”

Here’s where things get tricky. You might have wondered already why Gates, if he feels so sure that we need cosmic protection against becoming cyber zombies, doesn’t immediately reach for a religion that already exists and flourishes — especially Christianity, which still dominates American faith identification and significant segments of public life.

Well, his assumption is that tech will make obsolete at least some of the words of Christ, such as “you have the poor with you always,” as in always there for you to help and serve. Now one might say that if physical sickness and hunger are “solved problems,” many might still (or especially) suffer from mental and spiritual illness and thirst. But even that logic is not what Gates is interested in. He’s more concerned about sports.

Yes, sports. “You know, do we ban AI being used in certain endeavors so that humans get some — you know, you know, like you don’t want robots playing baseball, probably,” he stammers. “Because they’re, they’ll be too good. So we’ll, we’ll keep them off the field. Okay. How broadly would you go with that?”

Maybe so “broadly” that we’d want to focus on ensuring people aren’t led into the darkness of worshipping their machines or automating their religion? Perhaps that’s something we need to do already, not after the machines and their self-appointed masters — no matter how well intentioned — drag us to a place where our given humanity is almost unrecognizable.

“We are so used to this shortage world that, you know, I, I, I hope I get to see how we start to rethink the, these deep meaning questions,” Gates concludes. But for all his ostensible futurism, he blinds himself to the present — where some tech-savvy Christians are carrying on the work of years in making plain that the tools we need to ensure that we don’t wipe ourselves out with awesome wonders are already at hand … because they are the same yesterday, today, and forever.

Why religion will save us from automated warfare in the digital age



The technology now exists to render video games computationally in real, playable time, with an AI model generating the gameplay rather than a traditional game engine — a first achieved with the classic pixelated first-person shooter Doom.

Don’t yawn — this isn’t just a footnote in the annals of nerd history. Elon Musk promptly chimed in on the news in the replies to promise, “Tesla can do something similar with real world video.”

The military applications of this latest leap forward are obvious enough. A person at a terminal — or behind the wheel — enters a seamless virtual environment every bit as complex and challenging as a flesh-and-blood environment … at least as far as warfare goes. Yes, war has a funny way of simplifying or even minimizing our lived experience of our own environment: kill, stay alive, move forward, repeat. No wonder technological goals of modeling or simulating the given world work so well together with the arts and sciences of destruction.

But another milestone in the computational march raises deeper questions about the automation of doom itself. Coinbase CEO Brian Armstrong announced that the company has “witnessed our first AI-to-AI crypto transaction.”

“What did one AI buy from another? Tokens! Not crypto tokens, but AI tokens (words basically from one LLM to another). They used tokens to buy tokens,” he tweeted, adding a 🤯 emoji. “AI agents cannot get bank accounts, but they can get crypto wallets. They can now use USDC on Base to transact with humans, merchants, or other AIs. Those transactions are instant, global, and free. This,” he enthusiastically concluded, “is an important step to AIs getting useful work done.”
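
For readers who want to picture what such a machine-to-machine payment actually involves, here is a minimal sketch, assuming a Python environment with the web3 library, a Base RPC endpoint, and a funded wallet whose private key the paying agent controls. The endpoint, addresses, and key below are placeholders for illustration, not details from Armstrong’s announcement, and the flow shown is just a standard ERC-20 token transfer.

```python
# Illustrative sketch only: an "agent" wallet sending USDC on Base via web3.py.
# All addresses, keys, and the RPC endpoint are placeholders.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://mainnet.base.org"))  # any Base-compatible RPC provider

# Minimal ERC-20 ABI covering only the transfer() function
ERC20_ABI = [{
    "name": "transfer",
    "type": "function",
    "stateMutability": "nonpayable",
    "inputs": [{"name": "to", "type": "address"}, {"name": "amount", "type": "uint256"}],
    "outputs": [{"name": "", "type": "bool"}],
}]

# Substitute the official USDC contract address on Base
USDC = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")
usdc = w3.eth.contract(address=USDC, abi=ERC20_ABI)

agent = w3.eth.account.from_key("0x" + "11" * 32)  # key held by the paying agent (placeholder)
payee = Web3.to_checksum_address("0x0000000000000000000000000000000000000001")  # the other agent's wallet

# USDC uses 6 decimals, so 1_000_000 base units == 1 USDC
tx = usdc.functions.transfer(payee, 1_000_000).build_transaction({
    "from": agent.address,
    "nonce": w3.eth.get_transaction_count(agent.address),
})

signed = agent.sign_transaction(tx)
tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)  # rawTransaction in older web3.py versions
print("broadcast:", tx_hash.hex())
```

The point of the sketch is simply that nothing in this flow requires a bank account, a business day, or a human signature — which is what Armstrong is getting at.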

In the fractured world of bleeding-edge tech, “doomerism” is associated with the fear that runaway computational advancement will automate a superintelligence that destroys the human race.

Perhaps oddly, less attention flows toward the much more prosaic likelihood that sustainable war can soon be carried out in a “set it and forget it” fashion — prompt the smart assistant to organize and execute a military campaign, let it handle all the payments and logistics, human or machine, and return to your fishing, hiking, literary criticism, whatever.

Yes, there’s always the risk of tit-for-tat escalation unto planetary holocaust. But somehow, despite untold millions in wartime deaths and nuclear weapons aplenty, we’ve escaped that hellacious fate.

Maybe we’re better off focusing on the obvious threats of regular, ordinary world war in the digital age.

But that would require a recognition that such a “thinkable” war is itself so bad that we must change our ways right now — instead of sitting around scaring ourselves to death with dark fantasies of humanity’s enslavement or obliteration.

That would require recognizing that no matter how advanced we allow technology to become, the responsibility for what technology does will always rest with us. For that reason, the ultimate concern in the digital age is who we are responsible for and answerable to.

As the etymology of the word responsible reveals (it comes from ancient terminology referring to the pouring out of libations in ritual sacrifice), this question of human responsibility points inescapably toward religious concepts, experiences, and traditions.

Avoiding World War Autocomplete means accepting that religion is foundational to digital order — in ways we weren’t prepared for during the electric age typified by John Lennon’s “Imagine.” It means facing up to the fact that different civilizations with different religions are already well on their way to dealing in very different ways with the advent of supercomputers.

And it means ensuring that those differences don’t result in one or several civilizations freaking out and starting a chain reaction of automated violence that engulfs the world — not unto the annihilation of the human race, but simply the devastation of billions of lives. Isn’t that enough?

Unfortunately, right now, the strongest candidate for that civilizational freakout is the United States of America. Not only have we faced the biggest shock in how digital tech has worked out, but we also have the farthest to fall, in relative terms, from our all-too-recent status as a global superpower. We are now governed by people who seem hell-bent on preserving their power regardless of the cost — people who are also getting first dibs on the most powerful AIs in development.

Scary as automated conflict indeed is, the biggest threat to the many billions of humans — and multimillions of Americans — who would suffer most in a world war isn’t the machines. It’s the people who want most to control them.