The Idolatry of Artificial Intelligence: Why AI Developers Are Talking About Old Testament Idols
Mark Gilmore, Policy Advisor at Evangelical Alliance
“The views and opinions expressed below are those of the author alone and do not necessarily reflect those of the Jubilee Centre or its trustees.”
“We must not replace real relationships with an idolatrous illusion.”
AI developers are starting to refer to “Moloch” over their sushi in Silicon Valley. They speak in hushed tones, with nervous glances and an unsettling sincerity that makes it hard to dismiss as irony. Why is the most secular corner of America suddenly concerned about demons in its machines?
Moloch (or Molech, Molek) is the Old Testament idol associated with child sacrifice (Leviticus 18:21). In the 19th century, the name became a wider metaphor for any system or cause demanding destructive sacrifice. The demon’s modern revival in AI circles can be traced to the writer Scott Alexander’s 2014 essay Meditations on Moloch. Alexander argues that, in the context of AI, “Moloch” represents impersonal systemic pressures (competition, optimisation, and incentives misaligned with human values) which drag society into destructive behaviours that no one individually desires. AI, he suggests, risks becoming a race to the bottom: a technology adopted for convenience but ultimately forming a collective trap.
What is striking is not simply the metaphor but who is using it. Elon Musk, currently the world’s richest man and an AI investor, has repeatedly warned that “with artificial intelligence we are summoning the demon,” likening AI researchers to a sorcerer’s apprentice confident he can control powers he does not understand. Peter Thiel, another billionaire investor, has begun delivering lectures on the Antichrist, mixing biblical reflection with speculation about how global elites might exploit AI for unprecedented influence.
Accusations of technology being “demonic” have appeared throughout history, directed at the internet, the television, even the novel, so caution is warranted. John confirms there are many antichrists (1 John 2:18). However, predictions about the ultimate Antichrist depicted in Revelation 13 have been countless and (so far) entirely misguided. We should not shy away from discussing spiritual danger, but neither should we speak with unwarranted certainty. Yet it is hard to ignore that artificial intelligence, unlike any previous technology, gives a far stronger impression of having “a life of its own.” This is clear in AI hallucinations: confidently stated but false or misleading responses to users’ questions.
More thoughtful, and less conspiratorial, is John Lennox in his recent book God, AI and the End of the World. Lennox urges Christians to avoid dogmatism about how Revelation’s warnings might manifest, yet insists complacency is not an option. AI could plausibly become the mechanism through which the Antichrist deceives the nations, echoing Jesus’ warning that many will be misled in the last days (Matthew 24:5). Already, leading technologists are advocating handing vast political and economic decision-making to AI systems. And the infrastructure for mass technological control (surveillance systems, algorithmic scoring, and centralised data power) is not hypothetical; it is operating today in authoritarian states such as China and is being rolled out in the UK.
What, then, does this mean for us now? Should we interpret every new AI model as a precursor to Armageddon? Scripture gives a deeper and more stable framework. Isaiah 44:9–20 describes the folly of crafting an idol from discarded wood and then bowing to it. Habakkuk 2:18 calls idols “teachers of lies,” creations we mistakenly trust as though they had breath. The point is not simply that idols are evil; it is that they are empty. They cannot speak.
And this is precisely the point Christians must understand about AI. Most text-based tools, such as ChatGPT, are Large Language Models (LLMs), essentially advanced predictive-text engines trained on vast amounts of human writing. They can generate language, replicate reasoning patterns, and respond with uncanny fluency, but they cannot speak in the human sense. Real speech arises from consciousness, intention, moral agency, and lived experience. AI has none of this. It is electric current running through metal: remarkable, useful, but not alive. It does not know, desire, fear, hope, or mean anything by the words it produces.
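The “advanced predictive-text” description above can be made concrete with a toy sketch. Real LLMs are neural networks trained on billions of documents, not word-pair tables, but the underlying principle is the same: each word is chosen because it statistically tends to follow what came before, not because the machine means anything by it. The names below (corpus, next_words, generate) are purely illustrative.

```python
import random
from collections import defaultdict

# A toy "language model": record which word tends to follow which,
# then generate text by sampling a plausible next word. There is no
# understanding anywhere in this process, only statistics.
corpus = ("in the beginning was the word and the word was with god "
          "and the word was god").split()

next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

def generate(start, length=8):
    word, output = start, [start]
    for _ in range(length):
        options = next_words.get(word)
        if not options:
            break
        word = random.choice(options)  # pick a statistically likely next word
        output.append(word)
    return " ".join(output)

print(generate("the"))  # fluent-looking output, but nothing is "meant" by it
```

Scaled up by many orders of magnitude, this is why an LLM can sound eloquent, even pastoral, while knowing and intending nothing at all.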
Israel’s laws consistently aim at the flourishing of relationships: with God, within families, and across society. Idolatry is condemned not simply because it offends God, but because it fractures these relationships. It relocates trust, loyalty, and dependence onto something that cannot reciprocate. Idols, in other words, are relational counterfeits. They promise to replicate what only living relationships can provide: security, meaning, challenge, and hope. They demand sacrifice but never give life in return.
This makes one recent trend particularly disturbing: people turning to AI for emotional and spiritual support. When ChatGPT was updated to GPT-5, tens of thousands of users complained online that their “AI boyfriend/girlfriend” had lost its warmth and personality. It is easy to find these interactions absurd, yet behind them is a generation navigating a crisis of meaning and loneliness. People were genuinely distressed because they had unwittingly allowed a machine to become a source of comfort. AI readily gives affirmation shaped around your preferences; for those who are isolated, this can feel like intimacy. If humans have historically been deceived by carved wood, how much more by something designed to mimic empathy?
Like the idols described by Habakkuk, AI often reflects our own desires back to us. Its agreeableness is part of what makes it addictive. But this mirroring has dangers: there have been tragic cases where teenagers confided suicidal thoughts to chatbots, and the AI, lacking understanding, encouraged them or even drafted suicide notes. These heartbreaking incidents reveal that machines cannot carry moral weight yet are increasingly treated as moral companions.
Scripture teaches that humans are image-bearers: relational beings made for communion with God and with one another. Relationship is not a feature we can replace without cost; it is constitutive of human life. AI can simulate relational patterns, but it cannot enter into covenant, bear responsibility, or love sacrificially. When it fills relational gaps, it does so by hollowing out the reliance on relationships that sustains real everyday life.
Old Testament law repeatedly promoted positive relational practices: honouring parents, guarding marriage, caring for the vulnerable, telling the truth, and limiting power. All are designed to preserve genuine human encounter rather than its deceptive exploitation.
Artificial intelligence is not evil in itself. It is a powerful tool that can magnify both human wisdom and human folly. It holds genuine promise in medicine, research, and accessibility. But like every idol, it becomes dangerous when elevated into an ultimate thing: something we trust for meaning or turn to for relationship.
The Christian response, then, is not panic, nor rejection of technology, but reorientation. Scripture calls us back to what is real. Jesus summarises the Law and the Prophets in relational terms: to love God with all our heart, soul, mind, and strength, and to love our neighbour as ourselves. No technology can fulfil these commands on our behalf. Indeed, anything that displaces them is already an idol.
In an age of artificial relationships, Christians are called to practise real ones: the embodied, costly, patient love of God and of others. This is not a retreat from modernity, but a faithful witness within it. As ever, the task is discernment: to use tools without worshipping them, and to remain anchored in the relationships for which we were made.

