OpenAI quietly opened a new frontier in consumer health this week: a ChatGPT feature that can ingest your medical records and data from fitness apps to answer health-related questions with more personal context.

What OpenAI announced

In a blog post, OpenAI unveiled ChatGPT Health and said it will let early users in the US connect apps and services such as Apple Health, Peloton and MyFitnessPal, as well as their medical records, so the chatbot can tailor replies to a person’s history and measurements. The company emphasized that Health is "designed to support, not replace, medical care," that conversations in the Health experience are stored separately from other chats, and that the data won’t be used to train its models.

OpenAI also pointed to the scale of demand: millions of people already turn to ChatGPT for questions about health and wellbeing. The firm is rolling Health out to a small group of early users and has a waitlist for broader access.

Why this matters — and why people are nervous

Letting an AI read your chart is materially different from asking it a generic question. With access to a timeline of diagnoses, medications, labs and activity data, the chatbot can give answers that are more specific, and more consequential. That’s the upside: clarity and personalization without booking a doctor’s appointment.

But privacy advocates and experts are sounding alarms. Andrew Crawford of the Center for Democracy and Technology called for "airtight" safeguards around health information, noting that health data is among the most sensitive information people share. Critics worry about how companies might combine or monetize insights from this data down the line, especially as AI firms explore advertising and other revenue models.

Regulatory differences matter, too. OpenAI’s initial launch excludes the UK, Switzerland and the European Economic Area, where strict rules govern how sensitive data is processed. In the US, privacy rules vary by sector: laws such as HIPAA cover healthcare providers and insurers but generally not consumer tech firms, which raises uncertainty about long-term data use and liability.

The business angle: health as the next big market

For AI labs, health is a high-stakes growth opportunity. Observers point out that productizing personalized health assistance could reshape how patients find care and what they buy to address symptoms. Investors and rivals are watching closely: Alphabet’s Gemini and other competitors are likely to push into the same space, and industry chatter suggests this is where the major labs intend to battle for user attention and recurring revenue. (If you want to see how other AI products from OpenAI are expanding, note that OpenAI’s Sora recently landed on Android in the US and Canada.)

Bloomberg and others frame health as the next major market for AI companies — a crowded, lucrative arena where accuracy, trust and regulatory compliance will determine winners.

What ChatGPT Health can and can’t do

OpenAI is careful to say it isn’t offering diagnoses or treatment plans the way a clinician would. In practice, that distinction will be fuzzy for users who receive a detailed, personalized explanation about symptoms or medications. Generative models can sound authoritative even when they’re wrong, so plain-language disclaimers alone won’t prevent misinterpretation.

Max Sinclair, an industry founder, called the launch a "watershed moment," suggesting the tool could become a trusted medical adviser and even influence retail decisions tied to treatment or self-care.

If you’re considering signing up

A few pragmatic steps before you grant Health access to your records:

  • Read the permission screens carefully to see exactly which apps and types of records are shared.
  • Check OpenAI’s Health-specific privacy promises on the announcement page and look for retention and deletion options.
  • Treat any AI-provided medical guidance as informational — verify with a licensed clinician before changing treatments or medications.
  • Consider which devices and apps you connect. Consumer wearables and health apps (for example, an Apple Watch) can add fine-grained data; make sure you’re comfortable sharing that level of detail.

How regulators and rivals may respond

Expect a mix of regulatory scrutiny and competitive counterpunches. Companies that already handle protected health information under HIPAA or are integrated into healthcare systems will emphasize compliance; consumer-facing AI companies will stress consent and data minimization. Google and others are unlikely to sit still: recent product moves, such as Gemini’s Deep Research plugging into Gmail and Drive, suggest rivals are preparing similar integrations.

OpenAI’s health push also spotlights broader questions about who owns and controls medical data in an era of powerful consumer AI: patients, clinicians, platforms, or regulators.

There’s real promise in more accessible, personalized medical information. But realizing that promise without sacrificing privacy or safety will take careful engineering, transparent policy, and, critically, trust from people and the institutions that care for them. For now, ChatGPT Health is a US-only experiment that sets a clear marker: consumer AI is moving from general advice into people’s medical records, and the industry and regulators will have to catch up fast.
