CES has a way of shoeboxing the future into a single booth. This year one of those boxes was Lovense’s display of Emily — a life‑size, silicone companion the company bills less as a novelty and more as a long‑term relationship partner.

What sets Emily apart from older sex‑tech curiosities isn’t just the posable skeleton or a slightly mobile mouth. It’s memory. Lovense combines a physical body that can sit, smile (sort of) and hold a pose with an AI designed to remember past conversations and gradually adapt its personality to a user. The company argues that continuity — the sense that a machine carries forward shared moments — moves interaction beyond one‑off scripts and into something that could feel like companionship.

A doll with an app and a memory

The hardware is straightforward: realistic silicone exterior, an internal, user‑posable frame, servos in the head to coax limited facial expressions and a battery Lovense says can last roughly eight hours. Bluetooth ties the doll to the Lovense app, letting owners message the AI when they’re away from home and even request AI‑generated selfies that reflect Emily’s real‑world appearance.

The experience Lovense is selling is an ecosystem — not just a body. That ecosystem echoes the broader shift in consumer tech toward agents and assistants that live in our pockets, cars and homes. It’s the same wave that’s pushing companies to bake conversational AI into personal devices and services, from phones to home gadgets, as seen in recent moves around assistant architecture and partnerships for smarter voice interfaces (/news/apple-google-gemini-siri).

Price is not a footnote: Lovense lists Emily at roughly $4,000 to $8,000 depending on customization, with shipments penciled in for 2027 and a $200 reservation fee to join the waitlist. That puts her firmly in the luxury, small‑market category rather than a mass consumer product. It's also precisely the niche where companies often pilot features that later trickle down.

Why some people find this promising — and why others don’t

Proponents say an embodied companion with memory could be a low‑risk place to practice social skills, try emotional honesty without fear of judgment, or counteract loneliness for people with limited options. At a time when loneliness is increasingly framed as a public‑health problem, some technologists see machines as a pragmatic stopgap: imperfect, but immediately deployable.

Critics push back on multiple fronts. There’s a cultural angle: machines that substitute for human relationships can let societies off the hook for fixing the social causes of isolation — the economic, spatial and interpersonal frictions that produce loneliness in the first place. Then there’s the ethical stew: consent, projection, objectification, and the psychological risks of forming attachments to devices that simulate rather than feel.

And then there’s data.

Lovense's history includes earlier privacy stumbles: a 2017 incident in which its app was found making unexpected audio recordings, and later security flaws that let attackers hijack accounts. Those incidents are a reminder that intimate devices carry intimate data, and that memory isn't just a UX feature; it's a liability. Recent debates over always‑listening wearables and AI features have shown how quickly convenience collides with privacy expectations (/news/meta-ray-ban-ecosystem-update).

If Emily is going to store personal details and emotional patterns, questions follow quickly: Where does that data live? Who can access it? How long is it retained? Could profile snippets be used for advertising, or exposed in a breach? Lovense says it has engineering and policy controls in place, but CES demos naturally foreground the marketing and leave those details in the background.

Not sci‑fi, yet still strange

Emily is not a movie‑grade android. The servos and expressions are modest, and Lovense's pitch leans harder on the AI's continuity than on replicating human nuance. That mismatch, a soft face paired with software that remembers, is what makes Emily uncanny to some and oddly comforting to others.

There are practical limits, too. The steep price will keep Emily out of most homes, at least initially. The loneliness crisis isn’t a single market segment; it’s broad and structural. Machines like Emily might help a sliver of people while raising new ethical and security questions for everyone.

Whether she’s called Emily, Samantha or something else, the doll is shorthand for a larger industry moment: AI stepping off screens and into physical life, asking us to decide how much intimacy we’re willing to delegate to code. For now Lovense invites curiosity — and caution — in equal measure. You can see more about the product and reserve a spot on the company waitlist at Lovense’s official site (https://www.lovense.com), but take the memory claim as both a feature and a prompt: do we want machines that remember us, and at what cost?

AI · Sex Tech · CES 2026 · Privacy · Companionship