Your Survey Data Is Lying to You

Why 95% of Your Customer Experience Is Invisible

Most organizations don't suffer from a lack of data.

They suffer from accepting the wrong constraints.

The core insight

Customer experience already exists in your operational data—transaction times, supervisor interventions, staffing patterns—for 100% of customers. Surveys only capture 5%. Most organizations are optimizing the wrong layer.

An Unexpected Case Interview

Yesterday, during what was supposed to be an informational interview with the leader of Customer Experience at a major Canadian retailer, the conversation took an unexpected turn. As we explored where my skills could be useful on his team, he suggested we switch gears and treat it like a case interview.

His question was simple:

"If you had to design a customer experience survey, what five questions would you ask?"

I didn't answer right away.

Instead, I started asking questions of my own: When does this survey run? Who actually answers it? How is it used downstream? What decisions are leaders hoping to make from it?

Eventually, we converged on five familiar drivers of satisfaction: availability, cleanliness, helpfulness of staff, price, and checkout speed.

It was a good exercise. I genuinely enjoy this kind of framing work because it forces you to step into both perspectives at once—the customer's lived experience and the organization's need to structure that experience into something measurable.

Then the conversation moved to dashboards.

How would I report this to executives? What KPIs would I track? How would leadership consume it?

That's where I hesitated.

Not because I haven't built hundreds of executive dashboards before—I have—but because it became clear that the hardest part of this problem had nothing to do with dashboard design.

It had everything to do with how the problem itself was being framed.

Looking beyond dashboard constraints to see operational reality

As we talked, I heard the boundaries: the tools in use, the data considered "in scope," the data implicitly treated as irrelevant.

I didn't push back in the moment—it wasn't the place. But when I got home, I couldn't stop thinking about it.

Not how to improve the survey. But whether the organization was being asked to optimize the wrong layer altogether.

The Hidden Constraints Nobody Questions

Before jumping into solutions, I always pause and ask a different question:

What assumptions are we treating as facts?

In this case, three constraints were quietly shaping everything—and blocking the path to real operational intelligence.

1. The Response-Rate Illusion

Only a small fraction of customers respond to surveys—and they are not representative.

Very satisfied customers respond to say thank you. Very dissatisfied customers respond to complain. The middle majority doesn't respond at all.

This creates self-selection bias. Leadership ends up reacting to extremes while flying blind on the normal, everyday experience.

Improving response rates from 5% to 8% doesn't fix this. It just makes the illusion feel more scientific.
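
To make the illusion concrete, here's a toy simulation. Every number in it is invented for illustration, none of it is real survey data: a population whose satisfaction averages 3.25 on a 5-point scale, where the extremes are assumed to be six times more likely to respond than the middle.

```python
# A toy simulation of survey self-selection. All numbers here are
# invented for illustration; nothing is drawn from real survey data.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: satisfaction on a 1-5 scale, true mean 3.25.
population = rng.choice([1, 2, 3, 4, 5], size=100_000,
                        p=[0.05, 0.10, 0.50, 0.25, 0.10])

def survey_mean(base_rate):
    # Assumed response model: the extremes (1s and 5s) are six times
    # more likely to respond than the middle majority.
    boost = np.where(np.isin(population, [1, 5]), 6.0, 1.0)
    p_respond = np.clip(base_rate * boost / boost.mean(), 0.0, 1.0)
    responded = rng.random(population.size) < p_respond
    return population[responded].mean()

print(f"true mean: {population.mean():.2f}")
for rate in (0.05, 0.08):
    print(f"survey mean at ~{rate:.0%} response rate: {survey_mean(rate):.2f}")
```

Run it and both response rates report roughly the same distorted mean, around 3.4 against a true 3.25. More responses, same distortion.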

2. Tool-Defined Thinking

During the conversation, I heard statements like:

"We're limited to Qualtrics."

"We have Microsoft Copilot, so we can use that."

These sentences matter.

Not because those tools are bad—they aren't—but because the tooling had quietly become the boundary of imagination.

Instead of asking "What's the best way to understand customer experience?" the problem was implicitly framed as "What can Qualtrics tell us?" and "What can we build inside these dashboards?"

That difference is subtle—and enormous.

3. The Insight Blind Spot

The most revealing constraint wasn't technical at all. It was conceptual.

Despite running millions of transactions across hundreds of stores, consumer insight was defined almost exclusively as survey feedback.

Operational data—POS logs, transaction duration, supervisor interventions, staffing levels—was treated as irrelevant to "experience."

That's where the real opportunity lived.

Survey data represents only a small fraction of total customer experience signals

Reframing the Problem

Later that evening, I found myself doing what I often do after these conversations: pulling the thread.

What if the problem wasn't the survey at all?

What if customer experience already shows up in behavior—and we're just not looking at it that way?

Customer experience doesn't only exist in opinions. It exists in operational reality.

A two-minute transaction with no intervention likely felt smooth. A fifteen-minute transaction with multiple overrides likely didn't. Repeated supervisor calls during peak hours signal systemic friction. Self-checkout slowdowns during rush periods point to training or technology issues.

These signals already exist. For 100% of customers. Inside systems the retailer already owns.

The issue wasn't lack of data. It was how the data was framed.

The Operational Intelligence Approach

Once reframed, the solution becomes architectural rather than tactical.

Phase 1: Operational Structuring

Start with the complete universe of POS transactions and enrich them with context: time of day, weekday, holidays, promotions, checkout type, transaction duration, basket complexity, supervisor interventions, voids, corrections, store traffic, and staffing levels.

No new data collection. Just smarter structuring.
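
As a sketch of what that structuring could look like, here's a minimal pandas version. Every column name below (start_ts, intervention_count, item_count, and so on) is a hypothetical placeholder, not any retailer's actual schema.

```python
# A minimal sketch of Phase 1 in pandas. All column names are
# hypothetical placeholders, not an actual retail schema.
import pandas as pd

def enrich_transactions(pos: pd.DataFrame, staffing: pd.DataFrame) -> pd.DataFrame:
    out = pos.copy()
    out["start_ts"] = pd.to_datetime(out["start_ts"])
    out["end_ts"] = pd.to_datetime(out["end_ts"])

    # Temporal context: when the transaction happened.
    out["duration_s"] = (out["end_ts"] - out["start_ts"]).dt.total_seconds()
    out["hour"] = out["start_ts"].dt.hour
    out["weekday"] = out["start_ts"].dt.day_name()
    out["is_peak"] = out["hour"].between(17, 19)  # assumed 5-7pm window

    # Friction and complexity context already sitting in the logs.
    out["had_intervention"] = out["intervention_count"] > 0
    out["basket_complexity"] = pd.cut(
        out["item_count"], bins=[0, 5, 20, float("inf")],
        labels=["small", "medium", "large"])

    # Staffing levels joined by store and hour (assumed table layout).
    return out.merge(staffing, on=["store_id", "hour"], how="left")
```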

Phase 2: Behavioral Segmentation

Cluster transactions based on how they behave, not what customers say: fast/smooth, normal, friction, high-complexity (not negative), and peak-hour stress patterns.

These clusters act as behavioral proxies for experience. A "friction transaction" signals a likely frustrated customer, whether they filled out a survey or not.
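
Here's one way that step could look, picking up the enriched table from the previous sketch. Five clusters mirror the segments above, but in practice both the feature set and the number of clusters would be validated rather than assumed.

```python
# A sketch of Phase 2: cluster transactions on behavior, not opinions.
# `enriched` is the (hypothetical) output of the Phase 1 sketch.
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

features = enriched[["duration_s", "intervention_count", "void_count",
                     "item_count", "is_peak"]].astype(float)

scaled = StandardScaler().fit_transform(features)
enriched["segment"] = KMeans(n_clusters=5, n_init=10,
                             random_state=0).fit_predict(scaled)
```

Inspecting the cluster centers is what lets you attach the human labels: fast/smooth, normal, friction, high-complexity, peak-hour stress.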

Phase 3: Response Bias Modeling

Now bring surveys back in—but in a different role. Instead of treating them as truth, use them to calibrate behavior: Which segments respond more often? Do friction transactions over-index in survey data? How should insights be weighted to reflect reality?

Surveys stop being the foundation. They become a correction layer.
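
A sketch of that calibration step, assuming surveys have been matched back to transactions via a hypothetical responded_to_survey flag:

```python
# A sketch of Phase 3: measure how response rates vary by behavioral
# segment, then derive weights that pull survey insight back toward
# the full population. `responded_to_survey` is a hypothetical flag.
overall_rate = enriched["responded_to_survey"].mean()

segment_stats = (enriched
                 .groupby("segment")
                 .agg(transactions=("segment", "size"),
                      response_rate=("responded_to_survey", "mean")))

# Inverse-propensity weights: segments that over-index in survey data
# get down-weighted; under-represented segments get up-weighted.
segment_stats["weight"] = overall_rate / segment_stats["response_rate"]
print(segment_stats)
```

Segments that over-index in survey data get weights below one, under-represented segments get weights above one, and survey findings are read through those weights instead of at face value.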

Phase 4: Synthetic, Decision-Ready Insight

This is where the output changes entirely.

Instead of:

"5% of respondents are unhappy with checkout speed."

Leadership gets:

"12% of all transactions last month showed friction indicators, concentrated between 5–7pm in Stores A, C, and F. Primary drivers include insufficient supervisor coverage and self-checkout timeout issues. Recommended actions: add one supervisor per store during peak hours, deploy software patch v3.2, and refresh promotional override training before the next campaign."

That's not reporting. That's operational intelligence.
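
For what it's worth, the aggregation behind that kind of statement is not exotic. Here's a sketch, assuming the clusters from Phase 2 were given readable labels in a hypothetical segment_label column:

```python
# A sketch of the aggregation behind Phase 4. `segment_label` is a
# hypothetical column mapping cluster ids to human-readable names.
friction = enriched[enriched["segment_label"] == "friction"]

friction_rate = (friction.groupby(["store_id", "hour"]).size()
                 .div(enriched.groupby(["store_id", "hour"]).size())
                 .rename("friction_rate")
                 .sort_values(ascending=False))

# The worst store/hour combinations become candidates for targeted
# action: supervisor coverage, software fixes, training refreshes.
print(friction_rate.head(10))
```

The narrative and the recommendations still take judgment and domain context. What the data does is hand you the shortlist.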

From vague survey metrics to specific, actionable operational intelligence

Why This Matters

Immediate impact: Insight coverage expands from 5% to ~100%. Bias is modeled instead of ignored. Root causes become visible. Actions are specific and localized.

Strategic advantage: Intelligence scales automatically. More data improves the model over time. Cross-functional silos break down. Issues surface before complaints spike.

Most importantly, leadership stops guessing.

This Isn't a Retail Problem

The same restructuring pattern applies everywhere:

SaaS: usage patterns replace NPS as early churn signals.
Telecom: network behavior surfaces experience before complaints.
Healthcare: wait times and compliance predict satisfaction.
B2B: account behavior reveals health long before renewal risk.

The industry changes. The pattern doesn't.

For solo consultants and independent operators navigating similar challenges—where you're wearing multiple hats and drowning in context-switching—the underlying principle is the same: clarity before systems. You can't automate what you haven't structured.

The Real Difference

Most organizations optimize inside constraints. They ask: "How do we get better survey data?" "How do we improve response rates?" "How do we build a better dashboard?"

A smaller number pause and ask a more uncomfortable question:

"What if this constraint isn't real?"

That shift—from optimization to restructuring—is where step-change transformation happens.

Not because of better tools. Not because of more data.
But because the organization stops confusing measurement with understanding.

Operational intelligence isn't about replacing surveys. It's about recognizing that customer experience already exists—continuously, behaviorally, and at scale—inside the systems organizations run every day.

Once you see that, surveys stop being the foundation. They become context.

What constraints is your organization treating as facts?

I'd love to hear what layer you think is being optimized when restructuring might be the real opportunity.

— Raf Alencar
