Building Trust in the Age of Synthetic Media

Abstract

The panel examined how content‑provenance standards—most notably the Coalition for Content Provenance and Authenticity (C2PA) framework—can serve as foundational infrastructure for trust, transparency, and accountability in a world where synthetic media (AI‑generated images, video, audio, and text) is proliferating at scale. Moderated by Adobe’s Andy Parsons, the discussion blended policy perspectives from the Indian government, industry viewpoints from Google, Adobe, Microsoft and ITI, and technical insights into standards‑based solutions such as C2PA, SynthID and Content Credentials. Panelists debated the limits of these technologies, the urgency of upcoming Indian regulations, the need for principle‑based, interoperable legislation across jurisdictions, and practical pathways for public‑private collaboration to empower citizens while protecting privacy and free expression.

Detailed Summary

1. Opening Remarks – John Miller (ITI)

John Miller (ITI) opened the session, positioning trust as the linchpin of today’s digital ecosystem. He highlighted:

  • The rapid expansion of synthetic media and its implications for India’s emerging AI governance, digital public infrastructure, and newly enacted privacy law and IT rules.
  • The importance of open, interoperable technical standards—especially content‑provenance mechanisms such as C2PA—to provide verifiable metadata that underpins transparency, accountability and compliance.
  • That provenance is not a moderation tool but a “verifiable context” layer that can coexist with privacy safeguards and freedom of expression.

Miller stressed that the real challenge is operationalising trust: moving from standards on paper to deployed solutions across platforms, media, and public systems, with a strong emphasis on public‑private collaboration.

2. Moderator’s Introduction and Panelist Roll‑Call

Andy Parsons (Adobe – Moderator) introduced himself as the global head of content authenticity at Adobe and recapped the origins of C2PA (founded in 2021 by Adobe, Microsoft and other industry partners). He warned that C2PA is not a silver bullet but a foundational building block for cryptographic provenance (“nutrition‑label‑style” metadata).

He then invited each panelist to introduce themselves and situate their work in the Indian and global context.

2.1 Gail Kent (Google) – Google’s Policy & Product View

  • Role: Global Public Policy Director, responsible for Search and Gemini (Google’s generative‑AI model).
  • Key Points:
    • Google has long pursued digital‑literacy tools (reverse‑image search, “Lens on Pixel,” etc.) and now adds AI‑specific provenance via two mechanisms:
      1. SynthID – an imperceptible watermark embedded in AI‑generated content so it can later be identified as machine‑generated.
      2. Content Credentials (C2PA‑compatible) – embed creator, tool, timestamp, and model details directly into media files.
    • Emphasised that understanding the source of information is essential for trustworthy AI adoption.
    • Noted that Google’s Pixel 10 is the first consumer phone to automatically embed C2PA credentials in captured images.

2.2 Sameer Boray (ITI) – Industry‑Policy Perspective

  • Role: Senior Policy Manager, former lawyer from India.
  • Key Points:
    • ITI produced a policy guide on synthetic media that maps existing laws and highlights gaps needing new regulatory tools.
    • Described C2PA as one of several provenance techniques (others include watermarking and human‑in‑the‑loop verification).
    • Stressed that no single solution suffices; C2PA is a solid start but must be complemented by other measures.
    • Highlighted the importance of scalability—human verification is infeasible at Indian‑scale, but technology‑agnostic standards can be.

2.3 Deepak Goel (MeitY) – Government & Legislative Angle

  • Role: Deputy Secretary (Cyber‑Law) at the Ministry of Electronics & Information Technology.
  • Key Points:
    • The ministry is working on four major legislative tracks: the Information Technology Act, the Digital Personal Data Protection Act, the Aadhaar framework, and the Online Gaming Regulation Act.
    • Central to all is citizen empowerment – ensuring ease of living for individuals while facilitating ease of doing business for industry.
    • The government aims to stay technology‑agnostic and, where possible, purpose‑agnostic in its regulations, allowing the private sector to supply solutions (e.g., C2PA) that can be iterated without re‑legislation.
    • Emphasised that provenance is not content moderation; it is about verifiability, accountability, and citizen‑centric risk management.

3. Core Discussion Themes

3.1 What C2PA Actually Delivers (and Doesn’t)

  • Technical foundation: Cryptographically signed provenance metadata that is tamper‑evident, interoperable, and machine‑readable.
  • Limitations:
    • Does not prevent the creation of synthetic media; it only labels origin.
    • Relies on adoption by content‑creation tools and platforms; coverage gaps remain (e.g., many messaging and social apps such as WhatsApp and Instagram).
    • Trust is still a human judgment—metadata must be presented in a user‑friendly way to be effective.
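The tamper‑evidence property above can be shown with a toy sign/verify round trip. Real C2PA manifests use certificate‑based asymmetric signatures; this sketch substitutes an HMAC purely to demonstrate the mechanism that makes any post‑signing edit detectable:

```python
import hashlib
import hmac

def sign_manifest(manifest_bytes: bytes, key: bytes) -> bytes:
    """Sign the serialized manifest (HMAC stands in for a real asymmetric signature)."""
    return hmac.new(key, manifest_bytes, hashlib.sha256).digest()

def verify_manifest(manifest_bytes: bytes, signature: bytes, key: bytes) -> bool:
    """Return True only if the manifest bytes are exactly what was signed."""
    expected = hmac.new(key, manifest_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

key = b"demo-signing-key"
manifest = b'{"creator": "A. Photographer", "asset_sha256": "..."}'
sig = sign_manifest(manifest, key)

assert verify_manifest(manifest, sig, key)             # untouched manifest verifies
assert not verify_manifest(manifest + b" ", sig, key)  # any edit breaks verification
```

Note what this does and does not buy: verification proves the metadata has not been altered since signing, but (as the limitations above stress) it says nothing about whether the content itself is true or appropriate.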

3.2 Implementation Challenges in India’s Ecosystem

  • Timeline pressure: A recent Indian regulatory notice gave 10 days for certain compliance steps, raising concerns about feasibility.
  • Scale & Diversity: Over 1 billion internet users, multiple languages, a mobile‑first market, and a wide array of device types (smartphones, low‑end feature phones, IoT).
  • Proposed approach (Sameer Boray):
    • Phased rollout – start with high‑impact platforms, iterate, and widen coverage.
    • Multi‑stakeholder workshops – bring together government, standards bodies (C2PA), platform operators, and civil‑society groups to calibrate realistic milestones.
  • Technical feasibility (Andy Parsons): C2PA’s open‑source reference implementations make it possible to embed provenance across ecosystems, but device‑level constraints (processing power, storage) need attention.

3.3 Interoperability & Global Convergence of Laws

  • Andy Parsons argued that standardisation (open, royalty‑free, cryptographically secure) is C2PA’s biggest strength, enabling cross‑border adoption.
  • Gail Kent added that privacy‑preserving, security‑focused principles should guide any legislation; a principle‑based framework allows divergent technical implementations while preserving legal convergence.
  • Deepak Goel echoed this, stating the Indian ministry is intentionally technology‑agnostic so that any future provenance tool (C2PA, watermarking, etc.) can comply without needing legislative rewrites.

3.4 The Role of Media Literacy

  • John Miller (via the panel) emphasized that technology alone isn’t enough; citizens need media‑literacy education to interpret provenance signals.
  • Gail Kent highlighted Google’s investment in simple, multimodal reading tools that surface provenance information within search, YouTube, and other consumer products.
  • Sameer Boray warned that without inclusive literacy initiatives (covering low‑literacy audiences and regional languages), provenance metadata may remain invisible to the very people it aims to protect.

3.5 Policy‑Holistic View – Beyond Content Moderation

  • John Miller (in closing remarks) reminded the audience to treat synthetic‑media governance holistically: intersecting privacy law, cybersecurity rules, and AI‑governance guidelines.
  • Deepak Goel illustrated this by referencing the AI Governance Guidelines recently released by the Indian government, which already note provenance as a key technical control.

4. Audience Q&A (≈5 minutes)

  • Implementation timeline – 10‑day deadline vs. technological readiness (raised by the audience to Sameer): Sameer acknowledged the urgency but recommended a phased, multi‑round‑table approach to set realistic milestones; he stressed that expectations of a full rollout in 10 days are unrealistic.
  • Device heterogeneity and multi‑modal content (question to Andy): Andy highlighted that mobile‑first markets will need lightweight SDKs and that C2PA’s design accommodates different device capabilities; however, full adoption on legacy devices will take longer.
  • Compatibility of provenance standards across jurisdictions (question to the panel): Consensus – principle‑based, technology‑agnostic legislation coupled with open standards (like C2PA) offers the best path to global convergence. Specific implementations can vary, but the legal backbone should remain harmonised.

5. Closing Reflections

  • John Miller concluded by urging continued collaboration: “Content provenance isn’t a silver bullet, but it is a critical building block for an ecosystem of trust that can scale globally.”
  • The panel thanked the audience and reiterated that ongoing dialogue—across governments, industry, and civil society—is essential as synthetic‑media technology evolves.

Key Takeaways

  • C2PA provides cryptographically secure, interoperable provenance metadata that can be embedded in images, video, audio and documents, but it neither prevents the creation of synthetic media nor substitutes for content moderation.
  • Trust is multi‑dimensional: it requires technical standards, principle‑based regulation, and media‑literacy initiatives to help end‑users interpret provenance signals.
  • India’s regulatory push (including a 10‑day implementation window) showcases the urgency of the issue, yet realistic deployment will need a phased, collaborative approach involving government, standards bodies, platform owners and civil‑society groups.
  • Technology‑agnostic, principle‑based laws (as advocated by MeitY) enable global interoperability of provenance solutions and avoid fragmented, contradictory regulations.
  • Google’s and Adobe’s product roadmaps (e.g., SynthID, Pixel‑10 automatic C2PA embedding, Adobe’s open‑source tooling) illustrate how large tech firms are already building consumer‑friendly provenance capabilities.
  • Scalability concerns (India’s >1 billion internet users, linguistic diversity, mobile‑first usage) demand lightweight SDKs, multilingual UI, and inclusive media‑literacy programs.
  • Public‑private partnership is essential: governments set high‑level principles, industry supplies the technical mechanisms, and users receive clear, trustworthy signals.
  • Future regulatory convergence is feasible if laws focus on principles (security, privacy, accountability) rather than prescribing specific technical solutions.

Prepared from the verbatim transcript of the “Building Trust in the Age of Synthetic Media” panel at the India AI Impact Summit 2026.