Thou Shalt Not Kill – Abraham Kuyper and the AI Revolution


New York, New York by Mario Hains (CC-BY-SA-3.0).

In November 2022, OpenAI released ChatGPT to the public, and within five days, the chatbot had acquired one million users. By January 2023, it had become the fastest-growing consumer application in history, with over 100 million monthly active users. The artificial intelligence could write essays, debug code, compose poetry, and explain complex scientific concepts—all within seconds. Teachers panicked about academic integrity. Writers worried about their livelihoods. Artists watched in horror as AI image generators like DALL-E and Midjourney produced stunning artworks in the style of living artists, sometimes winning competitions against human creators. The technology press celebrated a new era of productivity and innovation, while labor economists warned of mass unemployment. In boardrooms across Silicon Valley, executives spoke excitedly of “efficiency gains” and “scaling without headcount.” What few acknowledged openly was a troubling reality: human workers were becoming, in the language of optimization, redundant.

This moment bears an uncanny resemblance to another technological revolution that reshaped the landscape of human work more than a century ago. When Abraham Kuyper delivered his address “Christianity and the Class Struggle” in 1891, Europe was reeling from the social upheaval wrought by steam power and mechanization. The Industrial Revolution had fundamentally altered the relationship between capital and labor, and Kuyper watched with alarm as workers were reduced to what he called “machines of flesh that can be retired or scrapped when they break down or have worn out” (332-333).

The Industrial Revolution’s Terrible Arithmetic

Kuyper witnessed firsthand how technological advancement, for all its promise, had created what he termed a “deep-seated social need.” He described the transformation starkly: “The incredible revolution wrought by the improved application of steam power and machine production… has freed capital almost completely from its earlier dependence on manual labor. The workingman’s muscle power and his resourcefulness and traditional skills in many ways have turned into dead capital; his value now lies almost entirely in servicing machines according to set instructions” (332-333).

The parallels to our present moment are striking. Just as steam cranes in Kuyper’s day could do the work of twelve men with only three, today’s AI systems can produce in seconds what would take human writers, programmers, or artists hours or days to create. The “magical operation of iron machines” that Kuyper condemned has been replaced by the equally magical—and equally problematic—operations of neural networks and large language models. The technology has changed, but the underlying dynamic remains disturbingly similar: human workers are being valued not for their inherent dignity as image-bearers of God, but for their efficiency relative to machines.

Kuyper’s diagnosis of this devaluation was theological before it was economic. He wrote that “to mistreat the workman as a ‘piece of machinery’ is and remains a violation of his human dignity. Even worse, it is a sin going squarely against the sixth commandment, thou shalt not kill, and this includes killing the worker socially” (57). This is the phrase that should haunt every technology executive, every venture capitalist, and every software engineer working on automation: the reduction of workers to mere factors of production is not simply an economic inefficiency or a political problem—it is murder. Social murder, but murder nonetheless.

The French Revolution’s Children

To understand why both the Industrial Revolution and today’s AI revolution produced such acute social crises, Kuyper insisted we must examine their philosophical roots. For Kuyper, the fundamental problem was not technology itself but the worldview that shaped its deployment. The French Revolution had enthroned what he called an “atomistic” individualism that destroyed organic social bonds and replaced them with “the monotonous self-seeking individual, asserting his own self-sufficiency” (34). This revolution, Kuyper argued, “could not but become the cause of a deep-seated social need” because it represented “possession of money as the highest good” while “set[ting] every man against every other” in ruthless competition (35).

This diagnosis illuminates our present crisis with remarkable clarity. Contemporary AI development operates within what we might call “neo-atomistic” capitalism—a system that fragments workers into individual “human resources,” measures human value in terms of productivity metrics, and celebrates “disruption” as an unqualified good. The language of Silicon Valley reveals the worldview: workers are “FTEs” (full-time equivalents), relationships are “networks,” and human creativity is merely “content” to be optimized and monetized. As Derek Schuurman observes, Kuyper’s concept that “there is no neutral space” directly refutes “the assumption that an artifact is ‘just a neutral tool.’” Technology, like all human cultural activity, embodies the values and worldview of its creators.

The ideology driving much of AI development mirrors what Kuyper identified as the “mercantile gospel of ‘laissez faire’” (35), which proclaimed that unleashing individual competition would benefit all. Just as the industrial capitalists of Kuyper’s era argued that mechanization would ultimately create prosperity for everyone, today’s tech leaders promise that AI will “augment” human workers and create new opportunities. Yet Kuyper warned that such promises ring hollow when “the bourgeoisie makes a display of its luxury which creates a false want in the poorer classes” (36-37) while actual material conditions deteriorate. The billionaire founders of AI companies assure us that artificial general intelligence will usher in an age of abundance even as they lay off thousands of employees and concentrate wealth at unprecedented levels.

Work as Divine Calling, Not Economic Function

Against this reductionist view of human labor, Kuyper articulated a rich theology of work rooted in Genesis and affirmed throughout Scripture. Work, he argued, is neither punishment for sin nor merely a means to economic ends—it is a divine calling woven into the fabric of human nature itself. “To work every day that God gives us, to accomplish something that makes up for the length of that day… that is a divine ordinance,” Kuyper wrote. “It applies to human beings not just after the fall but also before it” (376).

This theological foundation transforms how we must evaluate AI’s impact on employment. If work is merely an economic transaction—labor exchanged for wages—then replacing human workers with more efficient AI systems might seem rationally justified. But if work is, as Kuyper insisted, integral to human flourishing and dignity, then the wholesale displacement of workers constitutes a profound spiritual and social harm, regardless of economic efficiency gains.

Kuyper emphasized that work serves multiple purposes beyond material provision. It “bans idleness, suppresses capriciousness, teaches us to submit to a certain discipline in life,” and is “one of the most powerful factors God uses to counteract the bursting forth of sin” (260). Work provides structure, meaning, and opportunity for human beings to exercise their creative capacities in ways that reflect their status as image-bearers of God. When AI systems eliminate not just dangerous or degrading work but meaningful work—the work of writing, creating art, solving problems, and exercising skill—they threaten these deeper dimensions of human flourishing.

Contemporary research in AI ethics bears out Kuyper’s concerns in striking ways. Erin Holmberg, applying Kuyperian principles to generative AI, notes that “human workers must never be devalued to the level of a computer” because “humanness is not based on ability or skill; humanness is based on image-bearer status” (37). A living artist whose style can be mimicked by AI has not lost their value because the AI can produce similar outputs. Their dignity and worth remain unchanged because they are made in the image of God—something no algorithm can claim, regardless of its sophistication.

The Government’s Role: Justice, Not Charity

Kuyper was no romantic Luddite calling for the destruction of machines, nor was he a statist socialist demanding government control of the means of production. Instead, he articulated what we might call a sphere sovereignty approach to the social question—one that assigned distinct but complementary roles to government, society, and the church. This framework offers crucial guidance for responding to AI-driven displacement.

The government’s role, Kuyper insisted, is to uphold justice, not to manage society or replace civil institutions. “God the Lord unmistakably instituted the basic rule for the duty of government,” he wrote. “Government exists to arrange His justice on earth, and to uphold that justice” (57). Critically, justice requires that government not favor one sphere over another: “What it may therefore do in no case is to grant such assurance of justice to one sphere and withhold it from another” (58).

Applied to our context, this means government has a legitimate role in regulating AI development and deployment to protect workers from unjust treatment. Kuyper argued that “a code for commerce” must be matched by “a code for labor”—the government must “help labor obtain justice, and also for labor there must be created the possibility of independently organizing and defending its rights” (58). Contemporary proposals for AI regulation that ensure transparency, accountability, and protection for workers’ rights align with this Kuyperian framework. Government should not attempt to stop technological progress, but neither should it passively allow capital to “absorb more and more capital” until workers face the “brazen law of iron necessity” (36n19).

However, Kuyper was equally adamant that material assistance from government should be minimized: “unless you would enervate the position of the laboring class and destroy its natural dynamic, always limit the material assistance of the state to an absolute minimum. The continuing welfare of people and of nation, and so too of labor, lies only in powerful individual initiative” (58). This principle suggests skepticism toward proposals for universal basic income as the primary response to AI displacement. While safety nets may be necessary, Kuyper would argue that the solution must lie in enabling workers to participate meaningfully in economic life, not in making them permanent dependents of state largesse.

Common Grace and Christian Innovation

One might worry that Kuyper’s critique of industrial capitalism and contemporary AI development leads to technological pessimism or withdrawal. Quite the opposite. Kuyper’s doctrine of common grace provides a framework for affirming the genuine goods that technology can produce while maintaining critical vigilance about its direction.

Kuyper observed that “in modern times… a rich science is blossoming. Although being conducted almost exclusively by people who are strangers to the fear of the Lord, this science has nevertheless produced a treasury of knowledge that we as Christians admire and gratefully use” (52-53). He marveled at technological achievements: “Medical science… has been the instrument for relieving much suffering, for curbing many diseases, and for disarming much latent evil before its outbreak. Natural science has armed us in extraordinary ways against the destructive power of nature” (97).

Applied to AI, this means Christians should neither reflexively oppose the technology nor uncritically embrace it. Schuurman notes that AI has “exciting new possibilities” in medicine, where it aids in drug development and disease identification; in environmental monitoring; and even in Bible translation work. AI protein-design tools are accelerating medical research. Machine learning systems can help doctors diagnose diseases more accurately. These applications represent genuine goods enabled by common grace—gifts that should be received with gratitude and stewarded wisely.

Yet common grace does not absolve us of responsibility for ensuring technology serves human flourishing rather than human diminishment. Kuyper distinguished between the “structure” of creation (the good gifts God has embedded in the natural world) and the “direction” those gifts take (toward obedience or rebellion) (59). Technology itself is part of the created order’s potential, but the direction it takes depends on the worldviews, values, and choices of those who develop and deploy it. AI can be directed toward healing or toward exploitation, toward augmenting human dignity or undermining it.

The Idol of Mammon and the Engineering Mentality

Kuyper understood that technology’s danger lies not merely in its material effects but in its spiritual impact—in what it teaches us to value and trust. He warned that technological mastery breeds a dangerous self-sufficiency: “The universal dominion that we have achieved over the powers of nature has stimulated humanity’s feeling of power and thus has significantly weakened humanity’s feeling of dependence. Therefore, it had to lead to a dampening of religious life” (46).

This warning resonates powerfully with what we might call the “engineering mentality” prevalent in Silicon Valley—the assumption that all problems are technical problems amenable to technical solutions. When AI researchers speak of “solving intelligence” or “optimizing” human potential, they reveal a technocratic hubris that imagines human beings and human societies as systems to be debugged rather than as mysterious bearers of divine image to be respected. The transhumanist dreams of some AI researchers—uploading consciousness, achieving digital immortality—represent precisely the kind of technological idolatry Kuyper feared.

Moreover, Kuyper identified how technology disrupts the rhythms and practices necessary for spiritual health. Writing in the early twentieth century, he lamented how “electrical wires connect cities and towns to each other… The telegraph overwhelms you with urgent messages. The telephone distracts your attention from your work. It is no longer possible to walk calmly through big cities” (50-51). These words, written before smartphones and social media, describe our present distraction with eerie prescience. Replace “telegraph” with “social media” and “telephone” with “iPhone,” and Kuyper could be writing about 2025.

The loss of what Kuyper called “calm and quiet in which the pious life used to flourish” (46) has only intensified in our age of algorithmic feeds and AI-generated content designed to maximize engagement. Kuyper warned that “your minds are inundated, occupied by the sheer amount of knowledge and information about all sorts of things that bombard them day after day” (62). The solution, he suggested, is to “protect the freedom of our mind and force it to concentrate on what matters” (61-62)—to move “from the many, the varied, and endlessly distinct to the coherence of all things, penetrating through to the One from whom everything comes” (63).

Palingenesis: The Heart of the Matter

Ultimately, Kuyper insisted that no technological or political solution would suffice without spiritual regeneration—what he called “palingenesis” (27), using the biblical term for rebirth (18). He warned against intellectualism, the temptation to produce “thinking and reasoning heads on a stake” (22) rather than heads with “a heart under them and with two legs brought in motion by real life.”

This emphasis on personal transformation alongside structural critique distinguishes Kuyper’s approach from both naive technophilia and reactionary technophobia. Christians cannot simply regulate AI into submission or wish it away. They must, Kuyper would argue, ensure that their engagement with technology flows from “a right and living relation to the King” (32). Without this foundation, even the most sophisticated ethical frameworks for AI development remain mere “lists” that “aren’t worth a dime” (14) without a living faith.

This personal dimension does not diminish the urgency of structural reform—indeed, it intensifies it. Kuyper’s pastoral heart drove his political activism. He saw in the suffering of workers not merely an economic problem but a spiritual crisis that demanded Christian response. “When I behold the demoralization which comes up behind this need, and hear a raucous voice which, instead of calling on the Father in heaven for salvation, curses God, mocks His Word, insults the Cross of Golgotha,” he wrote, “then I stand before an abyss of spiritual misery which almost arouses my human pity more than the most biting poverty” (50).

Applied to our moment, this means Christians working in AI cannot compartmentalize their faith from their professional work. The software engineer training the next generation of language models, the product manager deciding which features to prioritize, the investor choosing which startups to fund—all stand coram Deo, before the face of God, accountable for how their work affects human flourishing. No technical sophistication excuses the engineer from asking whether their system will enable bosses to extract more productivity from exhausted workers. No innovation excuses the entrepreneur from considering whether their product will deepen addiction or erode authentic human connection.

The Organic Society vs. Atomistic Competition

Perhaps Kuyper’s most important contribution to our thinking about AI is his insistence on the organic nature of human society against individualistic atomism. He argued that “our national society is… not a heap of souls on a piece of ground, but rather a God-willed community, a living, human organism. Not a mechanism put together from separate parts… but a body with members, subject to the law of life; that we are members of each other, and thus the eye cannot get along without the foot, nor the foot without the eye” (41).

This organic vision stands in stark contrast to the prevailing Silicon Valley ideology, which treats society as a collection of individual users to be optimized, monetized, and disrupted. When tech platforms use AI to maximize engagement metrics, they treat human beings as isolated atoms whose behavior can be predicted and manipulated without regard for their embeddedness in families, communities, and traditions. When companies use AI to eliminate jobs in pursuit of efficiency, they ignore the organic connections between work and human identity, between employment and community stability, between productive capacity and social cohesion.

Kuyper’s organic vision suggests that the proper response to AI-driven displacement is not simply to retrain individuals for new jobs (though this may be necessary) but to strengthen the institutions and relationships that constitute genuine community. This means supporting families, churches, unions, guilds, and local associations—the Burkean “little platoons” of civil society that give life meaning beyond consumption and production. It means resisting the gig-ification of work that treats human beings as interchangeable units to be summoned by algorithms. It means recognizing that when a factory closes or an industry is automated away, what is lost is not merely individual income but a whole ecosystem of relationships, skills, and shared life.

Toward a Christian Response

What, then, should Christians do in response to the AI revolution? Kuyper’s framework suggests several concrete actions.

First, Christians must challenge the underlying worldview that reduces human worth to economic productivity. This requires both proclamation and demonstration. In their preaching and teaching, churches must emphasize the inherent dignity of every human being as an image-bearer of God, regardless of their economic utility. In their practice, churches must create communities where worth is measured not by achievement or efficiency but by faithful discipleship and genuine relationship. This may mean churches deliberately create spaces for slow, inefficient, human-to-human ministry that resists the siren call of optimization.

Second, Christians must advocate for justice in the sphere of law and public policy. Following Kuyper’s principle that government should ensure justice for all spheres, Christians should support regulations that protect workers from exploitation, ensure transparency in AI systems that affect employment, and hold technology companies accountable for the social consequences of their products. This might include supporting stronger labor organizing rights, requiring impact assessments before deploying job-replacing AI, or taxing automation in ways that fund transitions for displaced workers.

Third, Christians must practice what Kuyper called “divine pity”—a suffering-with that moves beyond sentiment to sacrificial action. When AI displaces workers, the Christian response cannot be to shrug and invoke “creative destruction.” It must be to come alongside those who suffer, to use whatever resources and influence we have to create new opportunities for meaningful work, and to challenge systems that treat human beings as expendable. As Kuyper put it, “To the poor man, a loyal handshake is often sweeter than a bountiful largess. A friendly word, not spoken haughtily, is the mildest balsam for one who weeps at his wounds” (62).

Fourth, Christians working in technology must exercise their vocations with prophetic discernment. They must ask not simply “Can we build this?” but “Should we build this?” They must consider not only the immediate functionality of their systems but their long-term social consequences. They must resist the temptation to hide behind claims of neutrality or inevitability. And when necessary, they must be willing to speak truth to power, even at professional cost—refusing to build systems that dehumanize, leaving companies whose practices they cannot in good conscience support.

Fifth, Christians must cultivate the practices and disciplines that protect against technological idolatry. Kuyper’s warnings about distraction and the loss of contemplative space are even more urgent in our age of algorithmic manipulation. This might mean observing digital sabbaths, creating tech-free zones in homes and churches, teaching children to read deeply rather than skim superficially, and defending the value of activities—worship, play, conversation, art—that have no economic productivity but are essential to human flourishing.

The Hope That Remains

Kuyper closed his 1891 address with a recognition of uncertainty about the future—“These are hidden things which also at this congress we leave to the Lord our God”—combined with a call to faithful action in the present: “while we await whatever may come, there remains for us His revealed injunction, to do also at this congress whatever our hands find to do, and do it with all our might” (64).

This combination of eschatological humility and practical determination offers guidance for our response to AI. We do not know how this revolution will unfold. The technology may prove more transformative than we imagine, or its limitations may become apparent. Economic dislocation may be severe, or new opportunities may emerge that we cannot yet foresee. Social conflict may intensify, or new forms of solidarity may arise.

What we do know is that Christ is Lord over every square inch of reality, including the data centers of Silicon Valley and the neural networks processing our words and images. We know that human beings retain their dignity as image-bearers regardless of whether machines can perform their jobs. We know that God has woven certain ordinances into the fabric of creation—ordinances about work, family, justice, and community—that we violate at our peril. And we know that the same God who brought forth beauty and order from primordial chaos can redeem even our technological overreach for His purposes.

Kuyper’s final prayer resonates across the decades: “that even though this rescue should be delayed, and even though the stream of unrighteousness would have to rise still higher, that it may never be possible to say of the Christians… that through our fault, that through the lukewarmness of our Christian faith, whether in higher or lower classes, the rescue of our society was hindered, and the blessing of the God of our fathers was forfeited” (64).

As artificial intelligence reshapes the landscape of human work and social relations, may it not be said that Christians stood silent while workers were reduced to machines of flesh, that we blessed disruption without mourning displacement, or that we embraced efficiency while abandoning the least of these. Instead, may it be said that we remembered the sixth commandment—that we refused to participate in social murder—and that we worked with all our might to ensure that even in an age of intelligent machines, human dignity, divine calling, and organic community were preserved and honored. For the question that haunted Kuyper’s age and haunts ours remains the same: will we serve Mammon, or will we serve God? Will we build a society that treats persons as resources to be optimized, or as image-bearers to be honored? The technology has changed, but the choice has not. ♦


Anders Liman is a J.D./M.T.S. candidate at Emory Law and Candler School of Theology. He holds an M.A. in technology ethics & policy from Duke University, as well as a B.S. and an M.C.S. in computer science from North Carolina State University.


Recommended Citation

Liman, Anders. “Thou Shalt Not Kill – Abraham Kuyper and the AI Revolution.” Canopy Forum, December 19, 2025. https://canopyforum.org/2025/12/19/thou-shalt-not-kill-abraham-kuyper-and-the-ai-revolution/.
