The Digital Public Infrastructure Map

Tracking national-scale digital infrastructure around the world

Session 3: Towards Robust Metrics for Trust in DPI

Summary

How do you measure success when the metrics themselves might be part of the problem? This question anchored the third Community of Practice session on DPI Measurement convened by the IIPP’s Digital team on June 16, 2025, as researchers from APTI Institute and Data Privacy Brazil presented findings that challenge conventional wisdom about digital infrastructure impact.

The session showcased research that addresses critical gaps in how we understand and measure the ongoing effectiveness of DPI projects. APTI Institute’s Asta Kapoor and Kunal Raj Banua demonstrated how India’s celebrated adoption numbers mask a more complex reality of limited sustained usage and trust deficits. Data Privacy Brazil’s Rafael Zanatta approached similar questions with evidence from Brazil’s gov.br platform, where record-breaking user statistics obscure systematic exclusions, from undocumented citizens to transgender individuals, while inadvertently enabling widespread fraud.

These presentations grounded a probing discussion of how certain measurement practices can produce an image of DPI deployment that unwittingly conceals crucial information about its real-world implementation. The conversations that ensued revealed that meaningful DPI measurement requires not just better metrics, but a willingness to confront uncomfortable truths about power, coercion, and the gap between policy intentions and lived experience.

The full recording of the session is available on the IIPP’s YouTube channel.

An Inclusion-First Approach

APTI Institute’s contribution to the session centered on a key challenge in India’s digital transformation, highlighting the gap between adoption metrics and meaningful usage. This insight led to developing an “ontology of trust”: a framework that deconstructs trust into measurable elements. The framework identifies three core components (agency, decision-making, and interaction) and three lifecycle phases (building, maintenance, and repair). Within this framework, each transaction, from app download to service completion, becomes an opportunity to build or erode trust.
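As an illustrative sketch only (not APTI’s actual tooling), the ontology’s two axes can be pictured as a matrix: each transaction is tagged with the trust component and lifecycle phase it touches, and its effect either builds or erodes trust. All names and the scoring scheme below are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

# The three core components named in the framework
class Component(Enum):
    AGENCY = "agency"
    DECISION_MAKING = "decision-making"
    INTERACTION = "interaction"

# The three lifecycle phases named in the framework
class Phase(Enum):
    BUILDING = "building"
    MAINTENANCE = "maintenance"
    REPAIR = "repair"

@dataclass
class TrustEvent:
    """One transaction (e.g. app download, service completion), scored
    against the framework. delta > 0 builds trust; delta < 0 erodes it."""
    component: Component
    phase: Phase
    delta: int

def trust_matrix(events):
    """Aggregate net trust per (component, phase) cell of the 3x3 grid."""
    matrix = Counter()
    for e in events:
        matrix[(e.component, e.phase)] += e.delta
    return matrix

# Hypothetical observations for a single user journey
events = [
    TrustEvent(Component.INTERACTION, Phase.BUILDING, +1),   # smooth onboarding
    TrustEvent(Component.AGENCY, Phase.MAINTENANCE, -1),     # forced re-enrolment
    TrustEvent(Component.INTERACTION, Phase.REPAIR, +1),     # grievance resolved
]
m = trust_matrix(events)
```

The point of the 3×3 decomposition is that it turns “trust” from a single vague score into nine distinct cells, each of which can accumulate evidence independently.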

Building on this foundation, Kunal Raj Banua presented APTI’s second major contribution: a practical toolkit for creating gender-inclusive digital identity systems. The toolkit addresses a critical distinction between adoption and inclusion. “Adoption would sort of mean people who have been registered, who have been onboarded onto a system,” Banua explained. “But inclusion should encompass a bit more of a larger understanding of access to such systems and their ability to use.”

The toolkit specifically targets the three stages where inclusion typically fails.

By identifying common pitfalls and offering evidence-based remedies, the toolkit aims to embed inclusion as “an actual intentional effort up front, as opposed to an afterthought.”

Banua emphasised that this approach not only addresses societal barriers but also creates systems that are inherently more scalable. The team has established their own Community of Practice to contextualise the toolkit for different countries and expand beyond gender to address multiple dimensions of exclusion.

A Data Justice Approach to DPI Measurement

Data Privacy Brazil’s Rafael Zanatta brought a complementary perspective from Latin America, presenting six years of research mapping Brazil’s national identity system, gov.br. The platform represents a remarkable achievement: 166 million users, making it the world’s most accessed government platform. Yet this success story, Zanatta revealed, masks critical vulnerabilities and exclusions that conventional metrics fail to capture.

Perhaps most striking was what Zanatta termed “the dilemma of success.” Gov.br’s massive reach has created unprecedented opportunities for criminal exploitation. “Criminals are using scams and building really similar platforms as gov.br using AI,” he reported, describing an explosion of digital fraud where people are tricked into taking false loans or opening credit lines. The government, focused on celebrating user numbers, appears unprepared to map or address these negative consequences of its own success.

Zanatta outlined how Data Privacy Brazil advocates for what they call a “data justice approach”: moving beyond quantitative metrics to capture lived experiences through qualitative methods. Zanatta proposed specific questions for meaningful measurement: “Do you trust that the government uses the biometric data only for the purpose?” “Can you log in and use gov.br without any kind of assistance?” This approach, Zanatta argued, represents the difference between nominal inclusion (counting users) and meaningful inclusion (understanding impact).
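The distinction between nominal and meaningful inclusion can be made concrete with a small sketch. The criteria below are drawn from Zanatta’s two proposed survey questions, but the field names and data are hypothetical, not an official gov.br metric.

```python
def nominal_inclusion(users):
    """Nominal inclusion: simply count registered accounts."""
    return sum(1 for u in users if u["registered"])

def meaningful_inclusion(users):
    """Meaningful inclusion (illustrative criteria): registered AND able
    to use the service without assistance AND trusting that biometric
    data is used only for its stated purpose."""
    return sum(
        1 for u in users
        if u["registered"] and u["unassisted_use"] and u["trusts_purpose"]
    )

# Hypothetical survey responses
survey = [
    {"registered": True, "unassisted_use": True,  "trusts_purpose": True},
    {"registered": True, "unassisted_use": False, "trusts_purpose": True},
    {"registered": True, "unassisted_use": True,  "trusts_purpose": False},
]
```

On this toy data, the headline figure counts all three users as included, while the qualitative criteria count only one: the gap between the two numbers is exactly what a data justice approach tries to surface.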

Key Insights from the Community

The presentations sparked vigorous discussion among session participants, with several interventions highlighting the political and practical challenges of meaningful DPI measurement. Vinita raised a particularly pointed critique of the depoliticised language often used in DPI discourse. “When we say adoption, adoption was especially for welfare entitlements, coercion,” she argued, noting that in India, enrolment was often a prerequisite for receiving vital benefits. “If we are measuring DPI from a perspective of public trust… what would then our criteria be?” Her intervention challenged the session to acknowledge that adoption through coercion fundamentally differs from voluntary engagement, a distinction current metrics fail to capture.
Several participants emphasised the critical but overlooked role of intermediaries: the community organisations and local actors who bridge the gap between DPI systems and end users. Kunal Banua observed that these entities often surface fragmented user perceptions that never reach system designers, yet they are rarely included in formal measurement frameworks. This invisibility, participants suggested, represents both a measurement gap and a missed opportunity for creating more responsive systems.

Towards More Honest Measurement

The session revealed a fundamental tension at the heart of DPI measurement: the metrics that make systems appear successful – adoption rates, user counts, transaction volumes – may be precisely those that obscure their failures. Moving forward requires not just technical innovation but institutional courage to measure what might reveal uncomfortable truths about digital infrastructure’s impact on human dignity.

Join us for the next Community of Practice session on July 16th, exploring Trust and Inclusion in DPI contexts.

Subscribe to our newsletter for updates and resources from the DPI measurement community.