The Privatization of the Divine
Why Secularism Constructs the Myth of God’s Nonexistence
Ali Ishtiaq
4/23/2025 · 15 min read


“You can’t go on ‘explaining away’ forever. You will find that you have explained explanation itself away. You cannot go on ‘seeing through’ things forever. The whole point of seeing through something is to see something through it.” (C. S. Lewis)
“And establish the religion and do not be divided therein...” (Quran 42:13)
In today’s public square, debates about God are rarely about, well… God. They’re more like turf wars over who gets to define reality, police morality, and write the rules of polite society. They dress up as noble quests for metaphysical truth, but more often than not, they function as subtle power plays—strategic attempts to house-train the transcendent. It’s not about disproving God with airtight syllogisms. No, the real goal is bureaucratic: file God under “personal opinion” and quietly escort Him out of the conversation.
This is why modern skeptics don’t typically go after theology itself. That would be too much work. Instead, they shift the goalposts. The new move is to psychoanalyze the believer. Faith? That’s just a coping mechanism, a cultural quirk, or something you inherited like grandma’s tea set—precious, maybe, but irrelevant to policy. Enter what Charles Taylor calls the immanent frame—a socially constructed echo chamber where everything must make sense without referring to anything above, beyond, or divine. Even belief in God is now treated as a lifestyle choice—like yoga, but with more fasting.
Taylor digs deeper in A Secular Age, showing how we’ve gone from the “porous self” of the past—open to meaning from the cosmos—to the “buffered self” of modernity, wrapped in existential bubble wrap. In the pre-modern world, belief wasn’t a menu item—it was the plate everything was served on. Now? Belief is just one more subjective flavor in the frozen yogurt shop of meaning. Optional, non-binding, and definitely not allowed to lecture anyone else.
But Taylor merely sets the stage. It’s Alasdair MacIntyre who walks in and flips the table. In After Virtue, he exposes how Enlightenment thinkers, having severed ethics from its metaphysical roots, tried to keep the show going with moral puppets that had no strings. The result? Emotivism. Morality becomes a vibe. Rationality, a TED Talk. And metaphysics? Well, that’s been rebranded as “personal growth.”
We see the same trick in metaphysical debates: belief in God isn’t rejected by reason—it’s made unintelligible by swapping inference for therapy-speak. The sacred becomes symbolic. The transcendent becomes “your truth.” And causality? That gets selectively shelved whenever it threatens to lead to a Creator.
Edward Feser, in The Last Superstition, calls out this philosophical sleight of hand. Everyone’s a firm believer in causality—until it starts pointing in a divine direction. “Nobody doubts causality when explaining why a book is on a table,” he notes, “but apply the same logic to the origin of the universe and suddenly it’s all quantum fuzz and creative nothingness.” It’s not skepticism. It’s strategic amnesia.
Because let’s be real: the entire scientific method is married to the idea that causes lead to effects. If a scientist claimed that a chemical reaction “just happened,” they’d be laughed out of the lab. But when it comes to explaining everything—why there’s something rather than nothing—suddenly, the rules don’t apply. Contingency magically explains itself. Nothing gets promoted to Creator-in-Chief. This is not logic; it is a refusal to follow logic to its natural conclusion.
In this climate of intellectual dodgeball, theories aren’t born of insight—they’re born of sheer panic. Take, for example, the infamous directed panspermia hypothesis, cooked up by none other than Nobel laureate Francis Crick. Faced with the mind-boggling complexity of life on Earth, Crick didn’t turn to a divine cause—he turned to aliens. That’s right: life is just too improbable to have started here, so maybe some intergalactic courier service dropped it off. Problem solved.
Apparently, when the choice is between acknowledging a transcendent Creator and imagining an interstellar Amazon delivery of DNA, the latter is considered the “scientific” route. One might think that a discipline as rigorous as science—especially when tearing apart metaphysical claims—would have revoked Crick’s Nobel and handed him a juice box and a seat in kindergarten for suggesting such sci-fi speculation. But no. Instead, his cosmic leap was met with straight-faced intrigue. And just when you think it couldn’t get more ironic, Richard Dawkins—atheism’s loudest megaphone—nodded in agreement during an interview, saying:
"It could come about in the following way. It could be that at some earlier time somewhere in the universe a civilization evolved, by probably some kind of Darwinian means, to a very, very high level of technology, and designed a form of life that they seeded onto perhaps this planet. Now that is a possibility, and an intriguing possibility. And I suppose that it's possible that you might find evidence for that if you look at the details of biochemistry and molecular biology. You might find a signature of some sort of designer… And that designer could well be a higher intelligence from elsewhere in the universe." https://youtu.be/BoncJBrrdQ8
So let me get this straight—Richard Dawkins, champion of militant atheism and sworn enemy of all things theistic, hears the idea of a Designer and says, “Well, as long as it's not God, I'm listening.” Apparently, if life on Earth shows signs of design, the acceptable conclusion is not a Creator with intention and will, but a hyper-advanced alien civilization with a knack for genetic engineering. You see, divine fingerprints are intolerable—but alien fingerprints? Now that’s “intriguing.” It's a bit like refusing to believe your house was built by an architect, insisting instead that a nomadic race of invisible bricklayers from another galaxy just dropped by one afternoon. Honestly, it’s starting to feel like the motto of modern secularism is:
“Any designer will do… as long as He isn’t divine.”
Even more amusing is how such galaxy-hopping hypotheses brazenly violate science’s own cherished commandments—like inference to the best explanation and Occam’s razor. But don’t worry, those rules are only sacred until they start pointing beyond the material realm. At that point, the razors are blunted, and the inferences politely escorted out the back door. Because if a believer dares suggest a Creator behind life, it’s instantly dismissed as the dreaded “God of the gaps” fallacy—an unforgivable sin against scientific orthodoxy. But if a secular scientist dreams up interstellar bioengineers to dodge that very conclusion, suddenly it’s “visionary,” “provocative,” and “worth exploring.” Bold imagination, they say—just don’t let it wear a robe or part the Red Sea.
The double standard couldn’t be more obvious. This isn’t skepticism at work—it’s a highly selective form of irrationality dressed up in a lab coat. The modern debate about God, you see, isn’t really about God at all. It’s about whether we’re still allowed to follow reason when it dares to leave the laboratory and knock on Heaven’s door. This isn’t philosophical humility. It’s intellectual hide-and-seek—with the Divine as the one player perpetually disqualified.
In this context, endless debate over axiomatic knowledge does not lead to greater clarity, but to the erosion of foundational principles that once gave meaning, coherence, and direction to civilization. When truths that were affirmed across centuries are dragged into the arena of perpetual skepticism, they are gradually reduced to mere opinions. Such debates turn metaphysical inquiry into psychological comfort-seeking, and rationality into cultural negotiation. Instead of pursuing truth, we negotiate preferences. And in doing so, we replace the inferential clarity of “2 + 2 = 4” with the relativistic absurdity of “2 + 2 = whatever helps me sleep at night.”
Besides, when the old axioms get dethroned, they don’t just vanish into thin air like a magician’s rabbit. No, they’re replaced—quietly, sneakily—by a new set of axioms, often smuggled in wearing the mustache and trench coat of “neutrality,” “reason,” “progress,” or everyone’s favorite buzzword: “tolerance.” So no, we haven’t outgrown dogma—we’ve just traded one for another, now with a sleeker logo and better PR. These new beliefs are just as unprovable, just as normative, but now enforced with the zeal of a Netflix algorithm and the subtlety of a sledgehammer.
And therein lies the real con: it’s not the existence of new dogmas that’s troubling—it’s the smug pretense that we’ve outgrown them altogether. The old beliefs were at least honest about being sacred. The new ones? They deny their own dogmatic nature while expecting unquestioning obedience. That’s not progress. That’s just gaslighting with a PhD.
This, in truth, exposes the political engine purring beneath modern atheism. It’s less a metaphysical conclusion and more a PR campaign—one designed to quietly uninstall the theological firmware running public life. Modern secularism isn’t wrestling with God’s existence because it cares about the answer. It’s just trying to change the locks on who gets to define reality.
As Alasdair MacIntyre points out, the real issue isn’t a lack of reason—it’s the clash of civilizations dressed up as debate. In the fragmented babel of modern discourse, we don’t argue to find truth—we argue to mask the fact that no one agrees on what truth even means anymore.
So, the real question isn’t “Does God exist?” It’s: “Who gets the mic?” Or more bluntly: “Who has the authority to say, ‘This is how things are’ in public?” The endless hair-splitting over God’s existence is less about discovering the divine, and more about paralyzing conviction—especially the kind of conviction that carries moral and political consequences.
Søren Kierkegaard saw it coming. He warned that modernity’s favorite way to dodge faith is to debate about it—endlessly. Not to honor it, but to keep it at arm’s length. After all, if God remains a topic for panel discussions and not a force that disrupts your life, then you can go on pretending neutrality is a virtue.
From an Islamic perspective, this is actually a reversal of the burden. The Qur’an does not argue for God’s existence as if He were one hypothesis among others. Rather, it moves the reader from ẓann to yaqīn—from supposition to certitude—by inviting reflection on the signs within the self and the cosmos. The question is not “Does God exist?” but “Will you not then see?” (Qur’an 51:20–21). The burden is not on God to prove Himself, but on the human being to recognize what is already manifest. “Is there any doubt about God, the Originator of the heavens and the earth?” (Qur’an 14:10).
In this light, the endless God debates of modernity appear less like intellectual curiosity and more like moral evasion. They allow secular society to appear fair-minded while subtly ensuring that belief in God never emerges as a serious alternative to secular norms. This is the political function of metaphysical agnosticism: to allow the marketplace of ideas to seem open while quietly closing the door on transcendent truths.
The Triumph of Preference: From Revelation to MTD
If modern debates about God serve as proxies for political containment, then their success can be measured not by the number of atheists produced, but by the kind of believers they leave in their wake. In this regard, secularism has won not by defeating religion outright, but by reshaping it into a culturally acceptable, depoliticized, and psychologically manageable form. Religion has not disappeared—it has been domesticated. And perhaps the clearest diagnosis of this phenomenon comes through the concept of Moralistic Therapeutic Deism (MTD).
MTD is a spiritual smoothie first blended by sociologist Christian Smith in his study of American teens, but it is now clearly the house blend in religious cafés worldwide. MTD is the religion you get when God gets hired as a part-time life coach. It’s moralistic because the only real commandment is “Be nice.” It’s therapeutic because the ultimate goal is self-esteem, not salvation. And it’s deistic because God is basically on sabbatical—available for emergencies, sure, but otherwise not one to micromanage your moral messes.
What makes MTD alarming is that it didn’t creep in through atheist manifestos—it grew right out of churches, mosques, and synagogues. This isn’t religion under attack. This is religion hitting the gym, getting a secular makeover, and coming out with a new mantra: “God wants me to feel good, and that’s all that matters.” Gone is the demanding, transcendent Lord of history. In His place stands a cheerful, nonjudgmental life advisor whose job is to rubber-stamp your choices and sprinkle your path with affirmations.
Of course, this wasn’t an accident. It’s the result of decades of cultural conditioning that trained us to treat truth like a flavor and moral obligation like a mood. Sure, philosophers still squabble over God’s existence in ivory towers, but out here in the wild? The real win has been the privatization of truth. God didn’t leave the building. He was politely escorted out, rebranded as a decorative accessory for the spiritually inclined.
MTD isn’t so much a religion as it is religion’s ghost—stripped of its spine, demands, and transcendence, reassembled to suit the inner child of consumer individualism. It keeps the iconography, borrows the lingo, but swaps the soul. In this new model, God doesn’t command or challenge. He just vibes.
Even within Muslim societies, we now witness a growing internalization of secular assumptions about religion—especially among the educated youth. Many young Muslims today genuinely believe that faith is entirely personal and private, something that should not be discussed, debated, or asserted in public spaces. Paradoxically, while they feel free to engage in discussions on politics, culture, and economics, they often treat religion as an exception—a realm where no truth claims should be made, because, in their view, every individual's belief is equally respectable, regardless of how contradictory, irrational, or baseless it may be.
For them, tolerance means unconditional veneration of all beliefs—not because they have evaluated them philosophically or theologically, but because they have adopted the postmodern dogma that all opinions are valid simply because they are sincerely held. Yet this very framework is self-refuting: were someone to argue that not all beliefs should be tolerated, the relativist would refuse to tolerate or respect that very belief, thereby betraying the incoherence of their own relativism.
This mindset is reinforced by MTD, which reduces faith to a tool for inner peace, self-care, and emotional wellbeing. When religion becomes primarily about feeling better rather than becoming better, what we see is not a neutral shift in emphasis but the triumph of a secular, therapeutic model of life over a truth-centered, God-centered worldview.
Belief without Teeth: Rise of the Passive Believer
All this clever declawing of truth doesn’t end in a fiery apocalypse of belief—it ends with a shrug. Enter: The Passive Man. Not a villain, not a heretic, just a very agreeable casualty of epistemic gentrification. He hasn’t abandoned truth; he’s just been gently house-trained to believe it’s a bit rude to bring it up in public. Truth, to him, is like wearing pajamas to a board meeting—totally fine in your own space, but wildly inappropriate to bring out in public.
This isn’t your classic dystopia of boot-stomping censorship. No, this is the velvet-gloved reprogramming of thought where certainty is rebranded as extremism, conviction as bigotry, and moral clarity as a red flag for your HR file. In this world, the greatest faux pas isn’t being wrong—it’s being sure.
The Passive Man is the star student of this civilizational syllabus. He learns early that serious claims about truth are just a bit… tacky. Grand metaphysical statements? Oppressive. Universal moral frameworks? Colonial. Public declarations of faith? Yikes—keep it down. So instead of resisting, he adapts. He internalizes the vibe: “Believe whatever you like—just don’t believe it matters.”
This isn’t passivity by accident. It’s the feature, not the bug. A strategic flattening of the human spirit where truth gets filed under “personal preference,” and deep belief is treated like an embarrassing hobby. Speak from certainty and suddenly you're not passionate—you’re problematic.
Here’s the trick: they don’t need to burn books or silence dissent. They just redefine the rules of the intellectual playground. Instead of arguing against religious or philosophical claims, they slap on some spicy labels—dogmatic, fundamentalist, dangerous—and exile them from the grown-up table of discourse. Voilà! No debate required.
And where is this masterpiece of ideological origami being manufactured? The modern university. Once a crucible for truth, now a vibe-check center for relativism. Here, you don’t defend a worldview—you dissect it. You don’t argue for truth—you study its cultural context. Every serious claim is deconstructed, pluralized, and neutered before it can ruffle the feathers of pluralism.
The results? Two-for-one special:
1. Bright young minds, who could’ve been champions of timeless wisdom, now treat their faith like a quirky fashion choice—something to wear, not something to build a civilization around.
2. Anyone who dares to think universal truths might actually be, well, universal, gets tagged as a zealot, not a thinker.
So, the university becomes not a marketplace of ideas, but a customs office of ideology—declaring all strong convictions as suspicious imports. The only orthodoxy allowed? Relativism. And it polices the borders of belief with academic jargon and diversity statements.
But let’s not pretend this is just an ivory tower issue. This is a civilizational reshuffle. Whoever shapes the minds of a generation, shapes the horizon of what that generation can imagine as good, true, and worth dying for. And if all they’ve been taught is that truth is just an opinion in a fancy outfit—then the future belongs not to the brave or the wise, but to the comfortably numb.
Peripheralizing the Central – Pakistan as a Case Study
Nowhere is the secular project’s strange success more paradoxically on display than in Pakistan—a country born waving the banner of La ilaha illallah, now nervously shushing it like an embarrassing relative at a diplomatic dinner. Mention the Qur’an or Hadith in a policy memo or a parliamentary session, or try quoting Imam Ghazali—one of the greatest Islamic scholars, whose works formed the bedrock of Ottoman governance for over six centuries—and watch the bureaucrats shift uncomfortably, as if you had just suggested legalizing dragons. Apparently, citing the intellectual giant who helped shape the longest-lasting Islamic dynasty in history is now considered too “out of touch” for a five-year policy framework.
In the bureaucratic jungle, the unspoken rule is simple: keep it neutral, keep it Western, and for heaven’s sake, keep God out of your footnotes. Meanwhile, in elite schools and universities, the curriculum is dipped in Western epistemology with Islam cordoned off into the padded cell of “Islamiyat”—a subject so defanged and decontextualized it might as well be titled Folklore & Festivities 101. Here, Islam is served up as a cultural appetizer: harmless, hollow, and never spicy enough to cause conviction.
The result? A strange pantomime where public officials sprinkle “Insha’Allah” into speeches like garnish but recoil at the idea of divine guidance shaping actual policy. Pakistan becomes the perfect case study in how secularism doesn’t need to persecute religion—it just needs to politely escort it out of relevance.
The consequences are visible in high-definition. A generation of educated Muslims now sees amr bil maʿrūf wa nahi ʿanil munkar (enjoining good and forbidding evil) not as the cornerstone of socio-moral life, but as the neighborhood uncle who won’t stop texting “forwarded as received.” It’s not just that these principles are ignored—they’re seen as downright rude, the religious equivalent of manspreading on a crowded train.
To speak of Islam as truth, let alone as something that ought to shape public life, is now viewed as poor taste, even aggression—as if asserting moral universals is the new terrorism. And so, the secular project succeeds not by forcing Muslims to leave their faith, but by getting them to whisper it politely and apologize for the volume.
And here lies the brilliance of the strategy. Islam, the only remaining faith that presents itself as intellectually verifiable and morally binding, is neutralized not by argument but by reclassification. Like a wild animal sedated before transport, its claws of rational certainty and teeth of civilizational power are dulled—not by force, but by discourse. “You’re welcome to believe it,” the world says, “just don’t act like it’s true.”
This distortion hits at the very heart of Islam’s mission. The Qur’an doesn’t merely invite belief—it demands propagation of truth and resistance to falsehood:
“Invite to the way of your Lord with wisdom and beautiful preaching, and argue with them in the best manner.” (Qur’an 16:125)
“They want to extinguish the light of Allah with their mouths, but Allah will perfect His light, even though the disbelievers hate it.” (Qur’an 61:8)
These are not optional suggestions—they are commands rooted in the conviction that Islam is not just personally fulfilling, but ontologically true. The notion that such a truth should be hidden, privatized, or relativized would have been unthinkable to pre-modern Muslims. But it is now the default stance among many—precisely because they have unconsciously absorbed the secular framing.
Islam isn’t looking for keyboard warriors or medieval cosplay—it’s not in the market for reactionaries. But let’s be clear: it also can’t run on the energy of polite head-nodders. What it calls for are believers who’ve got backbone, character, brains, and ihsan (excellence)—people who know how to pray and how to speak, who are physically healthy and mentally sharp, who embody faith not as a weekend hobby but as a full-time worldview. Muslims who understand that Islam isn’t just to be practiced—it’s to be proclaimed, argued, lived, and yes, legislated. Out loud. In public. With poise, not apology.
The real danger to this vision isn’t intellectual defeat—it’s psychological domestication. Because once truth gets demoted to taste, daʿwah becomes impolite, and jihād becomes lunacy. The perfect Muslim for the modern world? Someone who prays five times a day... quietly... at home... preferably without wi-fi. One who keeps their beliefs locked up like family heirlooms: too sacred to share, too awkward to explain. In this script, the only good Muslim is the invisible one—ritualized, privatized, de-fanged, and house-trained. But Islam was never meant to be a decorative faith. It’s a civilizational project. And civilizational projects don’t run on silence.
So, to conclude, the crowning glory of modern secularism isn’t that it proved timeless truths wrong—it’s that it made people stop caring whether they’re true at all. As long as truth doesn’t mess with their weekend plans or disrupt their Wi-Fi signal, it’s deemed irrelevant. Why refute eternity when you can just scroll past it?