Here is the future of your loyalty program: a customer opens ChatGPT, says “order me laundry detergent,” and an AI agent scans every retailer, compares every offer, ignores every points balance, and buys the optimal product in under four seconds. Your program never enters the conversation. Your points never get checked. Your brand never gets considered.
This is not a thought experiment. McKinsey calls it agentic commerce, and it is already here. ChatGPT launched Instant Checkout in September 2025. Google announced its Universal Commerce Protocol in January 2026. AI-driven referral traffic to retailers has surged 1,200% year over year. And the market is expected to hit $20.9 billion in AI-mediated retail spending this year alone.
If your loyalty strategy is built on points-for-purchase, you are standing on a trap door. And AI just found the lever.
The 81% Problem
Deloitte’s 2026 Retail Industry Global Outlook surveyed 330 retail executives, 86% of them at companies generating over $1 billion in annual revenue. The finding that should be keeping every loyalty leader awake: 81% believe generative AI will weaken brand loyalty by 2027. Not “might.” Not “could.” Will.
Half of those same executives expect the entire multi-step shopping journey to collapse into a single AI-driven interaction within two years. No browsing. No comparison shopping. No app-opening, point-checking, offer-scanning ritual that traditional loyalty programs depend on to stay relevant.
Think about what that means. The entire points-based loyalty model is predicated on a human being who remembers they have points, actively chooses your brand to earn more, and mentally calculates whether the reward is worth the effort.
AI agents obliterate every single one of those steps. They do not forget points balances. They do not feel sunk-cost attachment. They do not care about your tier status. They optimize on price, speed, fit, and availability. Period.
The loyalty industry spent three decades building an elaborate emotional theater around what is fundamentally a discount mechanism. AI agents do not attend theater.
Why Points Programs Were Always Fragile
The dirty secret of points-based loyalty is that it was never really loyalty at all. It was bribery with extra steps. Points create switching costs, not emotional attachment. And switching costs only work when the customer is the one doing the switching.
When an AI agent shops on behalf of a human, switching costs evaporate. The agent has no inertia. It has no habit. It has no vague sense that “I should probably use my Kroger points.” It evaluates every transaction from zero, every time. Your carefully engineered earn-and-burn loop becomes invisible infrastructure that no one interacts with.
This is not a bug in AI. It is the natural consequence of building loyalty programs that never actually created loyalty. They created behavioral routines masquerading as preference. And routines are the first thing automation eliminates.
Bain reports that 70% of consumers are already at least somewhat comfortable with an AI agent making purchases on their behalf. The humans are volunteering to hand over the decisions. What happens to your program when they do?
The Thing AI Cannot Optimize Away
Here is where this gets interesting, and where most of the industry commentary gets it wrong. The standard response to AI disruption is “personalize harder.” Use AI to make your offers more relevant, your recommendations more precise, your emails more timely.
This is the loyalty equivalent of bringing a faster horse to a car race. You are using the same tools as your competitors to defend a model that those very tools are dismantling.
AI is extraordinary at optimizing transactions. It will always find the better price, the faster delivery, the more relevant product match. Competing on transactions in an AI-mediated world is competing on a dimension where the machine will always win.
But there is one thing AI shopping agents fundamentally cannot do: they cannot make a human feel something. They cannot create a sense of identity. They cannot manufacture the thrill of a meaningful choice, the tension of a time-limited challenge, the satisfaction of mastering a system, or the pride of achieving status through skill rather than spend.
That is not a transaction. That is a game. And games are the only loyalty mechanism that survives the AI apocalypse.
Gamification Is Not What You Think It Is
Let me be precise here, because “gamification” has been so thoroughly abused by the marketing industry that most executives hear the word and think of spin-the-wheel pop-ups and meaningless badge collections. That is not gamification. That is decoration.
Real gamification, the kind rooted in decades of video game design, is the science of engineering voluntary, repeated human engagement through meaningful choices, escalating challenges, and consequential outcomes. It is the reason people spend 30 hours a week in games they are not paid to play. It is the reason Duolingo has 116 million monthly active users practicing vocabulary through streak mechanics and leaderboard competition.
The distinction matters enormously in the AI context. A spin-the-wheel overlay is just another transaction. An AI agent can spin it, claim the coupon, and move on. But a system that asks a customer to make a meaningful choice, to reveal a genuine preference, to invest cognitive effort in a challenge that reflects their identity? That requires a human. Not because the AI lacks the processing power, but because the engagement has no value unless a person experiences it.
This is the critical insight the loyalty industry needs to internalize: the value of gamified engagement is inseparable from the human experience of it. A discount has value whether a person or a bot redeems it. A game only has value when a person plays it.
Measuring What Survives: SNES vs. NPS in an AI World
If gamification is the survival strategy, you need a way to measure whether your engagement actually works. And here is where the industry’s favorite metric, NPS, becomes dangerously misleading.
NPS asks one question after the fact: “Would you recommend us?” In a world where AI agents are making purchase decisions, recommendation becomes irrelevant. The customer is not recommending you to a friend. They are (or are not) engaging with you directly. NPS measures sentiment. It does not measure engagement health.
Steve’s Net Engagement Score (SNES) was built specifically to measure the quality of engagement in real time. The formula: SNES = (Interesting Choices x Consequence x Time Pressure) / Raw Clicks. Interesting choices are contextually meaningful decisions that reveal genuine preferences. Consequence is the weight those choices carry. Time pressure creates urgency that drives instinctive participation. Raw clicks, the denominator, represents mindless interactions that dilute engagement quality.
[Figure: The SNES spectrum. Points-based programs sit squarely in the AI Kill Zone; gamified engagement lives where machines cannot follow.]
The SNES spectrum runs from clickbait (below 1) through basic digital interaction (around 1, think Google search) into the gamification zone (5 to 150) and up to full video game engagement (50+). A points-based loyalty program that asks you to scan a card and collect stars? That scores near the bottom. A gamified experience that asks you to make real choices with real stakes on a deadline? That is where engagement becomes AI-proof.
Because here is the key: an AI agent can scan a loyalty card. It cannot make an interesting choice. It cannot feel consequence. It cannot experience time pressure. SNES measures the exact qualities that make engagement uniquely human.
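To make the arithmetic concrete, here is a minimal sketch of the SNES formula in Python. The component values and their scales are illustrative assumptions of mine, not defined by the formula itself; only the ratio structure comes from the definition above.

```python
def snes(interesting_choices: float, consequence: float,
         time_pressure: float, raw_clicks: float) -> float:
    """Steve's Net Engagement Score:
    (Interesting Choices x Consequence x Time Pressure) / Raw Clicks.
    Higher scores mean denser, more meaningful engagement per click."""
    if raw_clicks <= 0:
        raise ValueError("raw_clicks must be positive")
    return (interesting_choices * consequence * time_pressure) / raw_clicks

# Illustrative, made-up inputs on arbitrary unit scales:

# A scan-and-collect points program: one trivial choice, no stakes,
# no urgency, spread across ten mindless taps.
points_program = snes(interesting_choices=1, consequence=1,
                      time_pressure=1, raw_clicks=10)   # 0.1 -- AI Kill Zone

# A gamified challenge: several meaningful decisions, real stakes,
# a deadline, and few wasted clicks.
gamified = snes(interesting_choices=8, consequence=5,
                time_pressure=3, raw_clicks=6)          # 20.0 -- gamification zone
```

Note what the denominator does: adding mindless clicks to the same experience drags the score down, which is exactly why a spin-the-wheel overlay scores near the bottom no matter how many interactions it racks up.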
What AI-Proof Loyalty Actually Looks Like
So what does a loyalty program look like when it is designed to survive agentic commerce?
First, it stops pretending that transactions are the relationship. Transactions become the backdrop, not the main event. The program earns attention and identity investment between purchases, not just during them.
Second, it creates experiences that require human participation to have value. Challenges that ask for genuine decisions. Competitions that reflect real skill or knowledge. Community dynamics that reward contribution and reputation. Progress systems where advancement feels earned, not bought.
Third, it generates psychographic data that no AI agent can replicate. When a customer makes a meaningful choice in a gamified environment, they are telling you who they are, not just what they bought. That data is qualitatively different from transaction history, and it becomes the moat that keeps your program relevant even when an AI is handling the purchasing.
This is the design philosophy behind platforms like PUG Interactive’s Picnic, which treats every customer interaction as an opportunity for consequential choice rather than passive point accumulation. The architecture is built around making brands playable: presenting customers with interesting, high-stakes decisions that generate genuine engagement and rich preference data simultaneously. When an AI agent handles the commodity purchase, the human still has a reason to show up. Not for the discount. For the game.
The Uncomfortable Truth
Here is the part that will make some loyalty leaders uncomfortable: AI is not killing loyalty. It is revealing that most loyalty programs were never actually creating it.
If your entire value proposition to a customer can be replicated by a bot comparing prices across five retailers in 200 milliseconds, you did not have loyalty. You had a coupon with a login screen. You had a points ledger that customers tolerated because the friction of switching was higher than the friction of staying.
AI removes that friction. And when the friction disappears, so does the illusion.
The programs that survive, the ones that thrive, will be the ones that built something a machine cannot replicate: genuine human engagement. The kind that comes from being challenged, being recognized, being asked to make choices that matter. The kind that makes customers feel like participants in something worth their attention, not just targets in a transaction funnel.
The industry is scrambling to figure out how to make loyalty programs “AI-agent compatible.” That is exactly the wrong question. The right question is: what kind of engagement is so fundamentally human that AI compatibility is irrelevant?
The answer has been sitting in game design for fifty years. The loyalty industry just has to stop ignoring it.