If the idea of a camera peering into your toilet bowl made you uncomfortable, the marketing around Kohler’s Dekoda probably didn’t reassure you: the company billed the $599 gadget as a “privacy-first” health tracker with “end-to-end encryption.” Trouble is, that phrase has a pretty specific meaning — and Kohler isn’t using it the way most people expect.

Simon Fondrie-Teitler, a software engineer and former FTC technology advisor, dug into Kohler’s documentation and emailed the company. What he found (and publicized) is a simple mismatch between marketing and technical reality: Dekoda’s image and telemetry data are encrypted while traveling across the internet and at rest on devices and servers, but Kohler’s systems can decrypt and access that data on their end. In short: HTTPS/TLS, not the kind of end-to-end encryption messaging apps like Signal or WhatsApp use.

What Kohler actually told researchers

Kohler’s public statements and its privacy policy say data is encrypted in transit and at rest — and that’s true. But Kohler also confirmed that data is decrypted on its servers “to provide and improve our service.” The company says it may de-identify (their word) data and, with an optional consent checkbox in the app, use that de-identified information to train its AI models.

Translation: the transport is secure (your ISP or a passive eavesdropper can intercept packets but can’t read them), but Kohler, not you alone, can see the pictures and analysis on its backend if it chooses. After the scrutiny, Kohler removed some references to “end-to-end encryption” from its product page and updated the wording to say the system uses data encryption at rest and in transit.

Why the wording matters

End-to-end encryption (E2EE) carries a specific promise in the public mind: only the endpoints, the sender and the intended recipient, can read the content; the service operator cannot. That guarantee matters when you’re sharing intimate messages or health data. Calling encryption that terminates on Kohler’s servers “end-to-end” blunts that expectation.

Security and privacy experts interviewed in coverage of the story pointed out the difference. TLS protects data on the wire but doesn’t stop the service provider from accessing it once it reaches their systems. Marketing that blurs those lines can leave people with a false sense of protection, especially for sensitive health data like images of bodily waste.
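
To make the distinction concrete, here is a minimal sketch in Python (an illustration only, not Kohler’s code; the placeholder image bytes and the third-party cryptography package are assumptions). With transport encryption alone, the payload the server ultimately receives is the readable image; with end-to-end encryption, the device encrypts with a key the operator never holds.

    # Illustrative sketch, not Kohler's code. Assumes the third-party
    # `cryptography` package; the bytes stand in for a camera capture.
    from cryptography.fernet import Fernet

    image_bytes = b"raw image bytes from the device camera"

    # "Encrypted in transit" (HTTPS/TLS): TLS protects the connection,
    # but the payload handed to the server is the readable image itself.
    payload_over_tls = image_bytes              # the operator can view this

    # End-to-end encryption: the device encrypts with a key that never
    # leaves the user's hands, so the server only stores ciphertext.
    device_key = Fernet.generate_key()          # kept on the user's device
    payload_e2ee = Fernet(device_key).encrypt(image_bytes)

    print(payload_over_tls)   # recognizable image data
    print(payload_e2ee)       # opaque without device_key

In the first case the operator can always choose to look; in the second, it cannot read the data without a key it never receives.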

What Kohler might do with the images

Kohler’s privacy policy allows them to create “aggregated, de-identified and/or anonymized” datasets for business purposes, including training and improving machine-learning models. The company says the checkbox to allow that is optional and not pre-checked. Still, de-identification is tricky: re-identification risks persist depending on how the data is handled and what metadata is retained.

That concern isn’t unique to Kohler. As products that mine user-generated health signals proliferate, companies and regulators are scrambling to clarify when consent, anonymization, and data minimization are sufficient. There are also engineering alternatives: local, on-device analysis and client-side encryption for backups would keep raw images off company servers entirely — an approach Fondrie-Teitler and other critics say would better match consumer expectations.
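
As a rough sketch of what that alternative could look like (not Kohler’s design; it assumes the PyNaCl library and a hypothetical cloud backup), the phone would analyze the image locally and upload only a ciphertext sealed with a key that never leaves the device:

    # Rough sketch of client-side ("zero-knowledge") backup encryption.
    # Not Kohler's design; assumes the third-party PyNaCl package.
    from nacl.public import PrivateKey, SealedBox

    # The keypair is generated on the user's phone; the private key
    # never leaves the device, so a cloud backend can store the backup
    # but never read it.
    device_key = PrivateKey.generate()

    raw_image = b"raw image bytes analyzed on-device"
    encrypted_backup = SealedBox(device_key.public_key).encrypt(raw_image)
    # ...only encrypted_backup would be uploaded to company storage...

    # Only the device holding the private key can recover the image.
    restored = SealedBox(device_key).decrypt(encrypted_backup)
    assert restored == raw_image

Under that design, a server breach or an internal policy change could expose only ciphertext, which is closer to what most people hear when a product advertises end-to-end encryption.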

Practical takeaways for prospective buyers

  • Read the privacy settings in the Dekoda app carefully. Kohler says model-training consent is optional; make sure the box reflects your preferences.
  • Remember that “encrypted in transit” means the connection is secure, not that Kohler can’t access the data on its servers.
  • If you care about absolute control over health images, prefer devices or workflows that analyze data locally or offer true client-side encryption.

Products that promise AI-driven insights from intimate data — be it images, audio, or patterns of behavior — deserve extra scrutiny. If you’re interested in the larger debate about consent-first approaches to training machine learning models, see recent industry moves like Sony’s FHIBE benchmark for consent and bias audits in vision systems, which pushes the idea that consent and provenance matter during model training (/news/sony-fhibe-ethical-ai-benchmark). And for a broader conversation about how large research models are being allowed to tap into users’ content, consider parallels with debates around services that index private data for AI features (/news/gemini-deep-research-gmail-drive-integration).

Kohler’s Dekoda sits at an awkward intersection: a device that literally looks at the most private part of daily life, sold with a stack of modern-sounding security terms. That’s enough to make anyone squirm — and to remind us that precise language matters when companies sell privacy alongside AI.

Privacy · Encryption · Health Tech · AI