Apple this week updated its App Review Guidelines to require apps to disclose — and obtain explicit user permission for — sharing personal data with third‑party AI systems, a move that tightens privacy controls just as the company prepares its own AI upgrades.

The core change

The most consequential edit appears in guideline 5.1.2(i). Apple added the sentence:

"You must clearly disclose where personal data will be shared with third parties, including with third‑party AI, and obtain explicit permission before doing so."

That wording expands what developers must tell users and makes AI a named category in this rule for the first time. Apps that fail to comply risk removal from the App Store under Apple's long‑running App Review regime.

Why the timing matters

The update arrives as Apple readies an upgraded, AI‑enhanced Siri for 2026. Reports have said parts of that voice assistant will lean on external models, including Google’s Gemini, raising questions about how Apple will square its privacy positioning with reliance on third‑party AI technology. By explicitly calling out "third‑party AI," Apple is signalling that AI‑related data flows are a distinct privacy concern that requires explicit consent.

What else changed

Apple published a broader set of edits beyond AI disclosure. Key additions and clarifications include:

  • 4.1(c) — Anti‑copycat rule: apps may not use another developer’s icon, brand or product name in their app’s icon or name without approval.
  • 1.2.1(a) — Creator apps must flag content that exceeds an app’s age rating and use age gating based on verified or declared age.
  • 3.2.2(ix) — Loan apps may not charge APRs above 36% (including fees) or require repayment in 60 days or less.
  • 5.1.1(ix) — Crypto exchanges added to the list of apps in highly regulated fields.
  • 4.7 and related subsections — Clarifications that HTML5/JavaScript mini apps and software not embedded in the binary remain in scope and must follow age‑restriction rules.
  • 2.5.10 — Apple removed an older line discouraging empty ad banners/test ads.

These edits respond to recurring App Store problems, from copycat apps flooding search results to regulatory scrutiny of finance and crypto services.

Industry reaction and competing perspectives

Privacy advocates and many users welcomed the move as an extension of Apple’s privacy posture, which has included measures like App Tracking Transparency. For consumers, the change promises clearer visibility into when apps are sending personal data into external AI systems — whether that’s for personalization, chatbots, image processing, or analytics.

Developers expressed mixed views. Some see the rule as a reasonable transparency requirement. Others warn it could introduce compliance complexity and UX friction, particularly for smaller teams that rely on third‑party AI APIs for features. The term "AI" remains broad — it can encompass everything from simple on‑device machine‑learning models to large language models (LLMs) hosted by third‑party providers — and that ambiguity raises questions about where Apple will draw the enforcement line.

There are also competitive implications. Limiting opaque third‑party data sharing could steer developers toward on‑device processing or toward Apple’s own AI services, a dynamic that critics say could advantage Apple as it rolls out platform‑level AI features.
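One concrete flavor of "on‑device processing" already ships in Apple's SDKs. The sketch below uses the NaturalLanguage framework's sentiment scoring as a stand‑in for a lightweight AI feature whose text never leaves the device; `onDeviceSentiment` is a hypothetical helper name, and a real cloud LLM feature would not always have such a simple local substitute.

```swift
import NaturalLanguage

/// Scores sentiment entirely on device using Apple's NaturalLanguage
/// framework, so no personal data is shared with a third-party AI system
/// and the new disclosure rule is not triggered for this path.
func onDeviceSentiment(for text: String) -> Double? {
    let tagger = NLTagger(tagSchemes: [.sentimentScore])
    tagger.string = text
    let (tag, _) = tagger.tag(at: text.startIndex,
                              unit: .paragraph,
                              scheme: .sentimentScore)
    // The score arrives as a string in the tag's rawValue, e.g. "-0.6".
    return tag.flatMap { Double($0.rawValue) }
}
```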

What developers will need to do

Practically speaking, apps that use any external AI provider to process personal data should take the following steps (a minimal consent‑gate sketch follows the list):

  • Update privacy disclosures to explicitly name when and where personal data is shared with third‑party AI;
  • Add explicit consent prompts that meet Apple’s disclosure and permission expectations;
  • Audit data flows to understand what qualifies as "personal data" under the new rule; and
  • Consider on‑device alternatives or contractual safeguards with AI vendors to limit exposure.
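
For the consent step, one workable pattern is a small gate that records whether the user has explicitly agreed before any personal data leaves the device. This is a minimal sketch, not Apple's prescribed mechanism: `AIConsentStore`, its UserDefaults key, and `presentDisclosure` are hypothetical names, since guideline 5.1.2(i) mandates disclosure and permission but no particular API.

```swift
import Foundation

/// Illustrative consent gate for third-party AI features. `AIConsentStore`
/// and its UserDefaults key are hypothetical names; the updated guideline
/// requires clear disclosure and explicit permission but mandates no API.
enum AIConsentStore {
    private static let key = "hasConsentedToThirdPartyAISharing"

    static var hasConsented: Bool {
        UserDefaults.standard.bool(forKey: key)
    }

    static func record(_ granted: Bool) {
        UserDefaults.standard.set(granted, forKey: key)
    }
}

/// Runs `onGranted` only once the user has explicitly agreed. The
/// `presentDisclosure` parameter stands in for whatever UI the app shows;
/// per the updated rule it should name the third party and the data shared.
func withThirdPartyAIConsent(
    presentDisclosure: (@escaping (Bool) -> Void) -> Void,
    onGranted: @escaping () -> Void,
    onDenied: @escaping () -> Void
) {
    if AIConsentStore.hasConsented {
        onGranted()
        return
    }
    presentDisclosure { granted in
        AIConsentStore.record(granted)
        if granted { onGranted() } else { onDenied() }
    }
}
```

An app would call `withThirdPartyAIConsent` immediately before, say, uploading a photo or prompt to a cloud model, and fall back to an on‑device path or disable the feature when the user declines.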

Developer resources and community posts are already circulating to help teams adopt the new disclosure language and consent flows, but smaller shops will face more work to put technical and legal safeguards in place.

Enforcement and outlook

Apple has historically enforced its guidelines strictly; noncompliant apps have been removed in past crackdowns. But enforcement here will be watched closely. How Apple defines "third‑party AI," how deeply it inspects developer data flows, and whether it provides a grace period will determine the rule’s real impact.

The update also underscores broader regulatory and market trends: platforms are under growing pressure to make AI use more transparent and to give users more control over personal data. Other stores and regulators are likely to watch Apple’s approach and may adopt similar expectations.

Bottom line

Apple’s guideline change makes explicit something many privacy experts have long called for: when apps hand personal information to external AI systems, users should know and agree. For users it’s a clarity win; for developers it adds compliance work and design considerations. And for Apple, the move both reinforces its privacy brand and shapes the competitive landscape as AI becomes core to mobile experiences.

As AI feature rollouts accelerate, the details of enforcement — and whether Apple will refine the rule to better separate on‑device ML from third‑party cloud AI — will determine how sharply the App Store ecosystem changes.
