Sovereign AI Infrastructure for Bharat and Global South

Abstract

The session explored what “sovereign AI” means for India and the broader Global South. Panelists debated the intent‑driven nature of sovereignty, the need for data localisation, Indian‑owned hardware, open‑source cloud stacks, and the emerging role of AI models as part of the infrastructure. Topics then shifted to practical bottlenecks (capital, power, talent, policy, networking, edge computing, 6G, and quantum computing) and to the necessity of a global responsible‑AI convention. The second half featured three startup founders who demonstrated concrete ways to democratise AI: a multilingual translation/ASR system, an AI co‑pilot for factory automation, and an open‑source multilingual foundation model (Project EKA). The discussion highlighted the intertwined technical, economic, and geopolitical levers needed to build a sovereign compute ecosystem for India and its peers.

Detailed Summary

1. Opening

  • Moderator (Amit) welcomed the audience, requested the startup founders to assemble for a group photo, and announced the start of the panel.
  • He outlined the format: a common question to all panelists, followed by individual questions, and finally an audience question if time allowed.
  • The core theme was “sovereignty” in AI, aligned with India’s Atmanirbhar Bharat and DPDP (Digital Personal Data Protection) objectives.

2. The Common Question – “What does sovereignty mean to you?”

2.1 Tarun Dua (E2E Networks)

  • Intent‑first definition – sovereignty is not a checklist (e.g., data localisation, Indian hardware) but the intent to control one’s own destiny.
  • Emphasised three pillars of intent:
    1. Data localisation – keeping data within Indian borders.
    2. Indian‑owned data‑centres – eliminating foreign influence over physical infrastructure.
    3. Indigenous cloud stack – using Indian cloud operators whose software stack is open‑source (OpenStack, Kubernetes) rather than proprietary foreign solutions.
  • Added a fourth pillar, sovereign AI models: the model itself must be built and owned domestically, not just the compute.
  • Called out that India has historically lagged behind East‑Asian nations (e.g., Japan) in fostering such intentionality, but recent global events have accelerated the focus.

2.2 Lakshmi (Panelist – identity not on the supplied list)

  • Broadened the definition to hardware heterogeneity and energy efficiency:
    • Developers should be free to choose the optimal hardware (GPUs, ASICs, neuromorphic chips) for a given workload.
    • Emphasised the need for energy‑efficient data‑centre design (cooling, power).
  • Stressed programmability: standardised APIs and programming models (e.g., PyTorch, MLIR) should hide hardware differences from developers.
  • Highlighted the importance of real‑time inferencing for multi‑agent workflows, which demand low latency and high throughput.

2.3 Neelakantan Venkataraman (Tata Communications)

  • Summarised sovereignty in one sentence: “No kill‑switch outside our control.”
  • Described the full stack that must stay inside Indian borders:
    1. Passive infrastructure – colocation space, power, cooling.
    2. Compute infrastructure – servers, accelerators.
    3. Cloud‑management platform – orchestration, provisioning, control‑plane.
    4. Metadata layer – data about data (catalogues, lineage) must also reside locally.
    5. Application & model layer – AI services and models themselves.
  • Noted that while raw data may be Indian, user‑generated metadata can inadvertently be stored abroad, creating a hidden vulnerability.
  • Cited progress: Indian cloud providers now rely on open‑source stacks (OpenStack, Kubernetes) reducing dependence on proprietary “kill‑switch” software.

3. Deep‑Dive Questions

3.1 AI Digital Public Infrastructure – Lakshmi’s View

  • Heterogeneous hardware is essential; a single‑type hardware ecosystem would re‑introduce vendor lock‑in.
  • Real‑time inferencing demands sub‑2‑billion‑parameter models that can run at the edge with low latency.
  • Standardised compilers and APIs (e.g., MLIR, PyTorch) are required for easy programmability.
  • Energy‑efficient hardware combined with robust cooling reduces operational costs and improves sustainability.

3.2 Bottlenecks to Scaling Domestic AI Cloud (Tarun Dua)

  • Capital availability: historically constrained, but now improving thanks to recent policy incentives and clearer data‑centre taxation frameworks.
  • Policy support: the Indian government has introduced “clear‑cut” tax incentives and data‑centre policies to attract global cloud players.
  • Overall assessment: Tarun believes India is no longer constrained by capital or policy; the primary obstacle is simply speed of execution to match global compute announcements.

3.3 Connectivity & Edge – Neelakantan Venkataraman

  • Described network as the “invisible glue” that links clouds, edge nodes, and end‑users.
  • Highlighted the latency challenge: even 5G’s sub‑10 ms latency is too costly and insufficiently robust for ultra‑critical use‑cases (e.g., robotic surgery, high‑speed gaming).
  • Introduced the concept of differential privacy‑enabled federated collaboration across Global South nations:
    • Local data stays within national borders; only model weights are shared.
    • This enables cross‑border AI cooperation without compromising sovereignty.
  • Edge compute at scale: distributed GPU pods at the edge can host “small‑language‑model” inference (< 2 B parameters) for localized applications (agri‑tech, healthcare, Industry 4.0).
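The federated pattern described above can be sketched in a few lines. This is a minimal, stdlib‑only illustration of the idea (function names and numbers are invented for the example; real deployments would use a proper federated‑learning framework and calibrated differential‑privacy noise):

```python
import random

def add_dp_noise(weights, sigma=0.001):
    """Add Gaussian noise to a weight vector before sharing it:
    a crude stand-in for differentially private release of updates."""
    return [w + random.gauss(0.0, sigma) for w in weights]

def federated_average(updates):
    """Average weight vectors contributed by participants.
    Raw data never leaves each country; only (noised) weights do."""
    n, dim = len(updates), len(updates[0])
    return [sum(u[i] for u in updates) / n for i in range(dim)]

# Each "nation" trains locally, then shares only a noised update.
local_updates = [[0.5, -0.2, 0.1], [0.4, -0.1, 0.2], [0.6, -0.3, 0.0]]
shared = [add_dp_noise(u) for u in local_updates]
global_weights = federated_average(shared)
print(global_weights)  # close to [0.5, -0.2, 0.1]
```

The key sovereignty property is visible in the data flow: `local_updates` stay on national infrastructure, and only the noised `shared` vectors cross borders.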

3.4 6G & Future Edge Connectivity

  • Current 5G offers sub‑10 ms latency but is expensive and not robust enough for ultra‑critical workloads.
  • 6G is projected to deliver ultra‑low latency, higher throughput, and lower cost, making edge‑centric AI inferencing viable for mission‑critical domains (e.g., remote surgery, immersive gaming).

3.5 Energy & Power Infrastructure

  • Tarun Dua argued the single most important decision is securing local, reliable power for data‑centres:
    • SMR (Small Modular Reactor) or BSR (Bharat Small Reactor) nuclear units could provide continuous, low‑cost power adjacent to data‑centre campuses.
    • Locally generated power could replace diesel generators, cutting OPEX and supporting 8–10 years of uninterrupted operation.

3.6 Ecosystem Readiness – Lakshmi’s Follow‑up

  • Beyond power, India needs an ecosystem capable of building, operating, and maintaining heterogeneous hardware at scale.
  • Such an ecosystem would deliver sovereign AI infrastructure to diverse user groups (start‑ups, academia, public sector).

3.7 Risk Capital & Scaling Start‑ups (Neelakantan Venkataraman)

  • While sustainable power is essential, access to risk capital remains a bottleneck for AI‑focused start‑ups.
  • The venture‑capital depth in India is still limited; without sufficient funding, promising AI innovations cannot scale to compete globally.

3.8 Quantum Computing – Perspectives

  • Tarun Dua – Anticipates quantum processors (QPUs) becoming part of the compute stack in 5–10 years; they will complement CPUs/GPUs for specialised workloads (e.g., cryptography).
  • Neelakantan Venkataraman – Recognises quantum’s potential but notes software‑stack maturity, security implications, and practical deployment are still far off.
  • Lakshmi – Echoes concerns about security (quantum could break existing cryptography) and the need for robust frameworks before widespread adoption.

3.9 Global AI Governance – Audience Question

  • Question: “How will disparate national AI regulations converge into a single, responsible AI convention?”
  • Panel consensus:
    • A multi‑disciplinary, multi‑country initiative is required, bringing together governments, industry, academia, and civil society.
    • Current practice: individual companies set their own guardrails; there is no universal standard yet.
    • The need for a global responsible‑AI convention is clear, especially as AI systems become critical national infrastructure.

4. Transition – Closing the Panel

  • The moderator thanked the panel, announced the upcoming segment two (startup showcase), and reiterated the request for a group photo.

5. Startup Showcase (Segment 2)

5.1 Saurav Bandyopadhyay – Shunya Labs

  • Product: 55‑language real‑time translation model (1.3 B parameters) plus in‑house TTS.
  • Compute footprint: Trained on four GPUs; demonstrates how small compute budgets can produce high‑impact multilingual AI.
  • Data acquisition: Partnered with Google’s OneEQ project and NASSCOM to collect previously unavailable language data.
  • Performance claim: “Pingala V1” achieves the lowest word‑error‑rate (WER) among all ASR systems for 204 languages (open‑weight model on HuggingFace).
  • Business model:
    1. Open‑source / open‑weight models for public good (no direct profit).
    2. Custom model services – building verticalised models for specific clients (average 40 GPU‑hours per model).
  • Goal: Democratise AI access across India’s linguistic diversity, avoiding “AI divides.”
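The word‑error‑rate metric behind the “lowest WER” claim above is standard and easy to state precisely. The sketch below is a minimal reference implementation (not Shunya Labs’ evaluation code): WER is the token‑level Levenshtein distance between reference and hypothesis transcripts, divided by the reference length.

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + deletions + insertions) / reference words,
    computed via edit distance over word tokens."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution/match
    return dp[-1][-1] / len(ref)

# One substituted word out of four reference words -> WER 0.25.
print(word_error_rate("namaste duniya kaise ho", "namaste duniya kaisa ho"))
```

A lower WER means fewer recognition errors per reference word, which is why it is the headline metric for comparing ASR systems across languages.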

5.1.1 Audience Interaction

  • Question: “If everything is open‑source, how do you monetise?”
  • Answer: The company operates two tiers: (a) large‑model providers (generic, horizontal models) – they act as “wrappers,” and (b) custom model development for specific use‑cases, which is a paid service.

5.2 Ravinder Kumar – TechNode 8

  • Problem statement: Indian manufacturers lack automation know‑how despite a pressing shortage of skilled labour.
  • Solution: An AI co‑pilot (agentic AI) that turns factory automation into a DIY process across three lifecycle phases:
    1. Concept design – AI agents generate robotics cell designs, deployment architectures, and visualisations based on high‑level requirements.
    2. Programming – agents produce PLC, CNC, and robot code (including 3‑D coordinate handling) that can be simulated and deployed directly onto machinery.
    3. Troubleshooting – agents act as a maintenance brain, diagnosing faults across seven hierarchical layers, from the overall system down to individual sub‑components.
  • Demonstrations: Live example with a Japanese firm; showcased robot programming that translates spatial coordinates into executable code.
  • Target markets: Tier‑1/Tier‑2 manufacturers, Indian Air Force (maintenance co‑pilot for high‑value defence equipment).
  • Strategic message: AI will augment, not replace, human workers; the goal is to make India globally competitive in manufacturing.

5.3 Rohitash Debsharma – Socket AI Labs

  • Mission alignment: Part of the India AI Mission – a national push to create sovereign AI capabilities.
  • Project EKA:
    • A 120 B‑parameter multilingual open‑source foundation model.
    • Emphasises ownership of the entire pipeline – data, training stack, evaluation, safety framework.
  • Phase 1 – “EKA Code”:
    • An in‑house math‑and‑code model for AI‑assisted coding, secure code generation, and large‑scale analytics.
  • Phase 2 – “EKA Defense” (future):
    • Secure inference, sensor reasoning, autonomous systems, and mission‑critical reliability for defence and critical infrastructure.
  • Global‑South Perspective:
    • Emerging economies face multilingual complexity, infrastructure constraints, and affordability pressures.
    • Rather than replicating Western models (GPT, Claude, Gemini), India needs regionally relevant models that address these constraints.
    • Success would shift partnerships from dependency to collaboration across Global South nations.
  • Call‑to‑action: Invites attendees to visit the Socket AI booth for live demos.

5.4 Closing of Showcase

  • Moderator thanked the founders, announced tree‑planting (10 trees per speaker) and distribution of digital certificates, and invited speakers back on stage for mementos.

6. Overall Session Conclusions

  • Sovereignty is a layered, intent‑driven concept that goes beyond data localisation to include hardware, software, cloud management, and AI model ownership.
  • Key technical levers identified: heterogeneous hardware, energy‑efficient data‑centres, open‑source cloud stacks, low‑latency edge compute, and future 6G / quantum technologies.
  • Policy & ecosystem gaps: capital (risk capital), reliable power (SMR/BSR), talent, and a unified global‑AI governance framework.
  • Startup showcase demonstrated concrete pathways for democratising AI: multilingual models, AI‑driven factory automation, and sovereign foundation models.

Key Takeaways

  1. Sovereign AI = Intentional Control – It is not a checklist; every decision across data, hardware, cloud, and model layers must be guided by the intent to keep control within India.
  2. Full‑stack Sovereignty – The stack includes passive infrastructure, compute, cloud‑management platforms, metadata, and applications/models—all must reside domestically.
  3. Hardware Heterogeneity & Energy Efficiency are essential for a resilient sovereign ecosystem; developers should freely choose the optimal accelerator without vendor lock‑in.
  4. Network & Edge are the “Invisible Glue” – Low‑latency, affordable connectivity (future 6G) and distributed edge pods are critical to bring AI services to tier‑2/3 cities and mission‑critical use‑cases.
  5. Local Power Supply (SMR/BSR) is the top infrastructure decision for scaling compute sustainably and avoiding reliance on diesel generators.
  6. Risk Capital remains a bottleneck for Indian AI start‑ups; without deeper VC/PE funding, scaling innovative solutions will be hampered.
  7. Quantum Computing is a future add‑on (5‑10 years). It will augment the existing CPU/GPU stack but raises new security challenges that must be addressed before deployment.
  8. A Global‑South AI Convention is needed – A multi‑stakeholder, multi‑country framework to set responsible‑AI guardrails, moving from fragmented national policies to a shared ethical standard.
  9. Start‑up Demonstrations proved feasibility:
    • Shunya Labs built a 1.3 B‑parameter 55‑language real‑time translation model on four GPUs and claims the lowest ASR WER across 204 languages.
    • TechNode 8 created an AI co‑pilot that can design, program, and troubleshoot factory automation end‑to‑end.
    • Socket AI Labs launched Project EKA, a 120 B‑parameter open‑source multilingual foundation model with a roadmap for secure, defence‑grade AI.
  10. Collective impact – By combining open‑source foundations, edge‑centric compute, and sovereign policy, India can position itself as a leadership hub for the Global South’s AI future, shifting from dependency to partnership.
