PSYKHE AI, Walmart & StubHub International Host Dinner in SF

October 17, 2025
Anabel Maldonado
Founder & CEO @ PSYKHE AI

Personalization has been the white whale of commerce tech for over a decade. Promised everywhere, delivered rarely. For most, it’s still shallow, siloed, and blind to actual human taste.

So during SF Tech Week, we gathered a small group at Ernest to ask the hard questions: What is taste, really? Why don’t current systems feel like they get us? And what will it take for AI to learn something as nuanced and cross-domain as preference?

Hosted by PSYKHE AI, with Yana Routshtein (Group Director of Personalization at Walmart) and Bob Kupbens (CEO of StubHub International, ex-Apple, Target, Delta, Neiman Marcus, eBay), the dinner brought together a mix of infrastructure founders, personalization leads, and AI operators working at the edge of identity, discovery, and design.

On Taste

We opened with a provocative definition:

Taste is a stable, cross-domain preference signature aligned to your latent axes.

In simpler terms: it’s how your preferences in fashion, travel, media, and even restaurants tend to cohere. Not random. Not idiosyncratic. Systematic. Your taste in films probably has something to say about your hotel choices. Your furniture choices? Psychologically predictive of your skincare. That’s not just aesthetics—it’s psychographic alignment.

We asked the group: Do your preferences across domains feel random or aligned? Unsurprisingly, most agreed: there’s a throughline. They just hadn’t seen a system that could read it.

Why Personalization Still Misses

We dove into what’s broken with current “personalization” stacks.

  • Behavioral data ≠ identity
  • Semantic relevance ≠ taste
  • “Similar items” ≠ what I’ll actually like

Everyone’s seen how brittle these systems are. Look at one linen shirt and you’re trapped in a linen vortex. They react, but they don’t understand. And LLMs, despite being able to parse prompts, struggle with preference prediction in domains like fashion or interiors. Why? Because the taste signal is latent, not lexical. It’s rarely in the query.

Designers and infra builders in the room agreed: this is less a model problem and more a grounding problem.

Can AI Learn Taste?

That led to the core provocation of the night:

Can AI actually learn taste? Can it predict and personalize not just based on what you did, but based on who you are?

Our answer: only with a new layer.

  • One that maps the latent axes of personality and aesthetics.
  • One that infers a psychographic embedding — a kind of “taste vector” — for people and products, across categories.
  • One that makes personalization persistent, proactive, and agentic.

What we call a Taste Brain — plugged in through MCP (Model Context Protocol), powering any agent or interface.
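To make the idea concrete, here is a minimal sketch of what a shared "taste vector" space could look like. Everything here is illustrative: the three axes, the vectors, and the catalog items are invented for the example, not PSYKHE AI's actual model. The point is structural — people and products live in one psychographic space, so a single user vector can rank items in any category.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical 3-axis psychographic space, e.g. (minimalism, boldness, warmth).
user_taste = (0.9, 0.1, 0.6)

# Products from different categories embedded in the SAME space.
catalog = {
    "fashion":   {"linen shirt": (0.8, 0.2, 0.5), "neon tracksuit": (0.1, 0.9, 0.3)},
    "interiors": {"oak side table": (0.85, 0.15, 0.7), "chrome lamp": (0.3, 0.8, 0.2)},
}

def best_match(taste, items):
    """Rank one category's items by similarity to the user's taste vector."""
    return max(items, key=lambda name: cosine(taste, items[name]))

# The same vector drives picks in every category — that is the cross-domain claim.
picks = {category: best_match(user_taste, items) for category, items in catalog.items()}
print(picks)
```

Note that nothing here depends on the category label: because the embedding is shared, the system never falls into a single-category "linen vortex" — the signal it follows is who you are, not what you just clicked.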

The Takeaway

As models get smarter, interfaces more fluid, and agents more autonomous, there’s a missing piece: grounding.

Taste is the missing grounding layer for agents.

And unless we give AI that grounding — a way to understand us at the level of identity — it won’t help us discover what we actually want.

That’s the infrastructure PSYKHE AI is building. And that’s the layer we believe needs to exist.

Thanks to everyone who joined us in San Francisco. Let’s keep building the systems that actually get people.