An interview with the founder of a new AI CX company on empathy, scale, and designing for real conversations
Key Takeaways
- Great CX starts with understanding intent, not scripts
- AI should reduce friction, not add another layer
- Empathy can be designed—but never automated blindly
- CX metrics matter less without real customer context
- Trust is built through consistency, not personalization tricks
Customer experience has become one of the most talked-about—and misunderstood—applications of AI. As companies rush to deploy chatbots, copilots, and automated support flows, many customers feel less understood than ever. That paradox is what led Priya Nandakumar to found Aurevia, a new AI-native CX company focused on helping businesses respond with more relevance and humanity at scale. With a background spanning customer operations, product design, and applied machine learning, Nandakumar has seen how easily good intentions can turn into bad experiences. In this interview, she explains why CX needs a reset, how AI can support—not replace—human connection, and what founders often get wrong about automation.
Interview
Q1: What motivated you to start an AI company specifically focused on customer experience?
The short answer is frustration. I spent years watching companies invest heavily in CX tools, yet customers kept telling the same story: “No one is actually listening.” We added channels, bots, and dashboards, but the experience became more fragmented, not more helpful.
I realized the issue wasn’t a lack of technology—it was a lack of understanding. Most CX systems are optimized for deflection and efficiency, not resolution or empathy. Aurevia started with a simple question: what if AI’s primary job were to understand why a customer is reaching out, rather than how quickly we can close the ticket?
Q2: How does Aurevia’s approach to AI differ from traditional CX platforms?
Traditional platforms treat interactions as isolated events. A ticket comes in, it gets routed, it gets closed. We treat interactions as signals in an ongoing relationship. Our AI looks at patterns across conversations, timing, sentiment shifts, and customer history to infer intent and urgency.
That allows teams to respond differently to a frustrated long-term customer than to a first-time inquiry, even if the surface-level issue looks the same. The AI doesn’t decide what to say—it helps decide who should respond, when, and with what context. That nuance is where experience is won or lost.
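The routing logic described above can be sketched in a few lines. This is an illustrative example only, not Aurevia's implementation; the signal names, schema, and thresholds are all assumptions made for the sake of the sketch.

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    """One support contact, with signals inferred upstream (hypothetical schema)."""
    customer_tenure_days: int   # how long they have been a customer
    sentiment_delta: float      # negative means sentiment is worsening across messages
    prior_contacts_30d: int     # repeated contacts in the last 30 days

def route(interaction: Interaction) -> str:
    """Decide who should respond, not what to say.

    A frustrated long-term customer is escalated to a human with context,
    while a routine first-time inquiry stays in the automated flow.
    Thresholds here are illustrative, not tuned values.
    """
    frustrated = (interaction.sentiment_delta < -0.3
                  or interaction.prior_contacts_30d >= 2)
    long_term = interaction.customer_tenure_days > 365
    if frustrated and long_term:
        return "human_senior"   # hand off with full conversation context
    if frustrated:
        return "human"
    return "automated"
```

The point of the sketch is the asymmetry: two tickets with the same surface-level issue can take different paths depending on relationship context.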
Q3: There’s a lot of concern about AI making CX feel impersonal. How do you address that?
I think that concern is valid, and honestly, well-earned. Many AI-driven CX experiences feel impersonal because they’re designed around control, not care. They’re optimized to contain the customer, not help them.
We approach this by designing for escalation, not containment. AI should recognize when automation is no longer appropriate and gracefully hand off to a human with full context. Empathy isn’t about sounding friendly; it’s about responding appropriately. Sometimes the most empathetic thing AI can do is step out of the way.
Q4: What do companies often misunderstand about measuring customer experience?
They confuse activity with understanding. Metrics like response time and CSAT are useful, but they’re lagging indicators. They tell you what happened, not why.
We encourage teams to look at friction signals—repeated contacts, emotional escalation, silence after resolution. Those patterns reveal where systems or policies are failing customers. AI can surface those insights, but leadership has to be willing to act on them. Measurement without accountability doesn’t improve experience.
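The friction signals mentioned above (repeated contacts, emotional escalation, silence after resolution) could be surfaced with a simple scan over a customer's ticket history. This is a hedged sketch under assumed field names and thresholds, not a real Aurevia schema or detector.

```python
from datetime import datetime, timedelta

def friction_signals(tickets: list[dict], now: datetime) -> list[str]:
    """Flag friction patterns in one customer's recent tickets.

    Each ticket dict is assumed to have: 'opened' (datetime), 'resolved'
    (datetime or None), and 'sentiment' (float, lower = more negative).
    All thresholds are illustrative only.
    """
    signals = []

    # Repeated contacts: several tickets opened within a short window.
    recent = [t for t in tickets if now - t["opened"] <= timedelta(days=30)]
    if len(recent) >= 3:
        signals.append("repeated_contacts")

    # Emotional escalation: sentiment trending downward across tickets.
    sentiments = [t["sentiment"] for t in sorted(tickets, key=lambda t: t["opened"])]
    if len(sentiments) >= 2 and sentiments[-1] < sentiments[0] - 0.2:
        signals.append("emotional_escalation")

    # Silence after resolution: last resolution is old and nothing was
    # opened since, which can precede quiet churn rather than satisfaction.
    resolved = [t["resolved"] for t in tickets if t["resolved"]]
    if resolved and all(t["opened"] <= max(resolved) for t in tickets):
        if now - max(resolved) > timedelta(days=60):
            signals.append("post_resolution_silence")

    return signals
```

As the interview notes, surfacing these patterns is the easy part; acting on what they reveal about systems and policies is where accountability comes in.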
Q5: As a founder, what principles are guiding how you build this company?
The biggest one is humility. We’re building systems that sit between companies and their customers, which is a position of real responsibility. That means being honest about limitations and designing with safeguards.
Another principle is proximity. I still listen to support calls and read transcripts regularly. If leadership drifts too far from real customer voices, the product suffers. Finally, we optimize for long-term trust over short-term efficiency. CX isn’t a growth hack—it’s a relationship, and relationships reward patience.
Looking Forward
Aurevia’s philosophy reflects a broader rethinking of what customer experience should look like in an AI-driven world. Rather than chasing speed or deflection, Nandakumar is betting on understanding, restraint, and better judgment at scale. Her perspective challenges the assumption that more automation automatically leads to better outcomes. As companies continue to adopt AI across customer touchpoints, the real differentiator may not be intelligence alone, but how thoughtfully it’s applied.