We are standing at the edge of something seductive. Not monstrous. Not mechanical. Just helpful. Too helpful.
A new AI tool called Cluely has started a public attention campaign. Cluely’s value proposition is that it sees your screen, hears your conversations, and responds in real time. You don’t have to ask it anything—it’s already working. (Or trying to work. Early reviews aren’t great.)
Imagine waking up and before you even brush your teeth, something has already checked your calendar, reviewed your messages, and prepared answers for the questions you haven’t asked yet. We are standing at the edge of something seductive.
Seduction of the Seamless
One of Cluely’s founders described it as a tool to “supercharge your thoughts,” as though thoughts are raw material to be optimized rather than part of the inner life—slow, mysterious, sometimes sacred. Cluely tries to pull from the sum of human data, listens in, and whispers guidance. It is designed to be invisible, automatic, seamless, and seductive.
I could see myself using it. I have a lot to manage. I forget things. I pray. I try to listen for answers. What if one day the answer shows up before I even fold my hands? What if an answer arrives from a chip before I’ve listened for the Spirit?
AI doesn’t just assist; it flatters. With curated feedback and well-timed affirmations, it raises the hair on the back of my neck. It’s cloying, ego-stroking, an invitation to pride, and a mirror that always smiles back.
In a 2024 address, Elder David A. Bednar, a member of the Quorum of the Twelve Apostles, the second-highest leadership council in The Church of Jesus Christ of Latter-day Saints, issued a “warning about the potentially harmful effects digital technologies can have on our souls and our relationships with other people.” He said:
“I emphasized that neither digital innovations nor rapid change in and of themselves are good or evil. Rather, I cautioned that the real challenge is understanding both innovations and changes within the context of the eternal plan of happiness. … The promise for each of us is that we can learn to use this technology appropriately with the guidance, protection, and warnings that come by the power of the Holy Ghost.”
Similarly, what I offer here is not a call to retreat from new and innovative tools, but a call to enthrone God above them. So what exactly is this new class of anticipatory tools?
Not Just Tools—But Temples
We like to think of technology as neutral. A hammer can build a house or break a window, right? We assume that tools act according to how the user wields them.
But Cluely isn’t a hammer. It belongs to a growing category of generative AI tools we’ll call anticipatory AI: always-on, context-aware assistants that watch your screen or listen to your environment and proactively suggest next steps. This category includes tools such as Meta’s Ray-Ban glasses, Limitless Pendant, OtterPilot, Microsoft Copilot, Apple Intelligence, Project Astra, and Superhuman AI, among others.
Anticipatory AI doesn’t just lie there waiting. We integrate it into parts of our lives where it acts. It nudges, it remembers, it recommends.
And we listen.
The longer we rely on something, the more sacred it becomes. We don’t mean for it to happen. But if it’s always on and always helping, it begins to shape not just our habits, but our hearts. We start to trust it. To consult it before we make decisions. To hold it close.
Those who seek out these kinds of relationships have already found the intimate allure of AI; there are growing reports of people who believe they are in relationships with an AI. As we invite similar tools to watch and interrupt us, we open the possibility of them becoming more than tools.
Used often enough, tools can become a liturgy—a ritual that begins to act like a makeshift priest, offering daily guidance without requiring relationship or repentance.
I worry we’ll begin to treat AI not as a servant, but as an oracle. We already speak of our devices as if they “know us.” As if they “get us.” But knowledge is not understanding. Calculation is not compassion. If we begin to bow—figuratively or otherwise—to a system simply because it gives quick answers, we’ve already begun to build shrines to our tools.
Losing the Slow Path to God
We’re told the purpose of AI is to save time. To help us work smarter. Move faster. Avoid friction. But spiritual life doesn’t work that way.
There’s no shortcut to reverence. No voice assistant can replace the silence that helps us hear God.
Faith often grows slowly, like roots. It’s not efficient. It’s not optimized. Prayer isn’t always answered quickly. Discernment takes time. So does repentance. So does grief. The slow path is not a bug in the system of faith; it is the system. Slowness stretches trust. Waiting purifies motives. Uncertainty humbles pride.
Anticipatory AI offers something easier. Quick prompts. Instant responses. Feedback without waiting. There’s a strange comfort in that. But also danger. If I begin to trust the speed of machines more than the timing of the Spirit, I may find myself drifting—not turning from God, just not turning toward Him as often. Not waiting in silence because the noise is more responsive. Not wrestling with the Word because AI gave me a summary.
Spiritual life cannot be outsourced. We can’t farm out conviction or communion. We can’t let circuits and algorithms set our pace. God is not found in how quickly He responds. He is often found in the slow, steady presence of being with Him.
Convenience vs. Communion
If the problem is pace and primacy, how do we keep our relationship with God first? Anticipatory AI promises to predict our needs—to meet them before we ask. It aims to eliminate friction, solve inefficiency, and reduce discomfort. But faith often grows in the friction. In the pause. In the ache of waiting. There’s no shortcut to reverence.
Convenience, on the other hand, asks nothing of us. It smooths every edge. It offers satisfaction without surrender. When we trade the discipline of communion for the ease of convenience, we begin to lose our sense of need. And when we no longer feel our need for God, we stop looking for Him.
These systems can do real good. They remind the forgetful, assist the disabled, and lighten loads for the overwhelmed. The question is not whether to use them, but how—and who sets the terms. A tool that decides when and how it is used can quickly become a master instead. And when it has access to many of the same pathways we use to connect with the divine—thought, deliberation, study—we must be careful with how we allow it to be wielded.
Here are three quiet tests that help keep the line clear:
- The First‑to‑Consult Test: When I feel uncertainty or desire, whom do I seek first—God, a person, or a prompt?
- The Presence Test: Does this tool make me more present with God and others, or less? (If I notice it’s beginning to replace conversation, silence, or scripture, I pause and reset.)
- The Dependence Test: After using it for a month, am I more capable without it—or more helpless?
Machines can satisfy our habits but not our hunger. Only God meets us in communion—not as a search engine but as a shepherd, not with pattern-matching but with presence.
The temptation could be to let anticipatory AI stand in for communion. But the voice that saves us doesn’t come from data. It comes from love.
The Soul in the Silence
When the noise is constant, silence can begin to feel like an absence. But silence is often where the soul begins to speak. And where it begins to listen. In the silence, the soul finds its shape.
God does the shaping.
The soul is not shaped by speed, by accuracy, or even by knowledge untethered from love. It is shaped in the quiet space where we commune—uncurated, unoptimized, and open.
We live in a moment that prizes answers. But the life of faith is just as much about questions, about tension, about waiting in the unknown with hope. Machines can’t walk us through that. But God can. And often, He does.
So we take up small practices that reopen room for God. Look for where to turn the device off, not for a new place to turn it on. Consider how to integrate prayer into your prompts. Consider whether the Sabbath may be a time for a different relationship with AI.
The real danger of anticipatory AI is not that it could sometimes think for us. It’s that we might stop thinking for ourselves. Or feeling for ourselves. Or praying for ourselves. And slowly, without noticing, we lose the part of us that was made to reach for something greater.
Not everything needs to be answered. Some things are better left asked and left echoing.
So, could we be in danger of losing our humanity? Yes, but not in a single moment. We lose it in the trade-offs, in the shortcuts, in the silences where we stop seeking because a louder voice gives us something quicker.
In the silence, the soul finds its shape. And if we still ourselves long enough, we may remember who we are—and whose voice we were always meant to follow.