Session 5 - Measuring “Just” DPI
Key Takeaways
- Measurement Shapes Reality: The metrics we choose don’t just describe DPI impacts; they actively shape funding priorities and design decisions, and ultimately determine who benefits from digital infrastructure.
- From Upward to Outward Accountability: Effective DPI measurement must shift from metrics designed for government reporting to frameworks that capture lived experiences and developmental outcomes for citizens.
- Justice as Measurement Framework: Adopting data justice principles such as representation, rights, and redress provides more meaningful evaluation criteria than technical indicators alone.
- Qualitative Evidence Matters: The supremacy of quantitative metrics in policy spaces obscures critical insights about exclusion that only emerge through ethnographic and experiential research methods.
- Measuring What’s Missing: The most important DPI impacts may be those we’re not measuring because they’re harder to fund, including dignity, agency, and the hidden labour of navigating digital systems.
Session Overview
Measurement shapes priorities. When we measure the wrong things, or measure the right things poorly, we risk perpetuating systems that serve some while failing others. But this reality contains its own solution: by fundamentally reimagining what and how we measure, we can transform DPI from a potential instrument of exclusion into a genuine tool for social justice.
The fifth session of our Community of Practice on DPI Measurement, held on August 20th, brought together three researchers who are doing precisely this reimagining. Rather than simply critiquing existing metrics, Sumedha Deshmukh (Cambridge University), Vanita Leah Falcao (King’s College London/ODI Global), and Tony Roberts (Institute of Development Studies) presented actionable measurement frameworks that centre human dignity, capture invisible labour, and hold systems accountable to those they claim to serve. Their work suggests that the very act of measurement, when done thoughtfully, can become a lever for justice.
Navigating the Promises and Realities of the DPI Agenda
Drawing on her ongoing research into the tensions between the developing narratives around DPI and the emerging realities of its design and implementation, Sumedha Deshmukh demonstrated how this gap persists because our measurement systems are designed to confirm promises rather than investigate realities.
The problem runs deeper than missing data. Deshmukh showed how existing frameworks actively obscure critical information. When metrics focus on aggregate cost savings, they hide the distribution of costs and benefits. When they celebrate total registrations, they erase the experience of rejection. When they report government efficiency gains, they ignore citizen time lost to system failures. Each measurement choice reflects a decision about whose perspective matters.
This creates what Deshmukh identified as cascading accountability failures. Without transparent methodologies, numbers can’t be verified. Without verification, claims can’t be challenged. Without challenge, systems can’t improve. The result is measurement infrastructure that validates itself through circular citation while remaining fundamentally disconnected from ground truth.
Deshmukh’s proposed intervention seeks to develop measurement practices that remain engaged with the hard realities of DPI implementation. Such practices could include increased transparency in data collection, independent verification of impact claims, and metrics that capture not just what systems deliver but what they demand from users.
Measurement and the Danger of “Digitalwashing”
Vanita Leah Falcao introduced “digitalwashing” as a lens for understanding how DPI can create an illusion of progress while entrenching existing inequalities. Through fieldwork on India’s PMMVY maternity benefit scheme, she demonstrated how digital systems often amplify rather than alleviate the bureaucratic burdens faced by vulnerable populations.
The pattern of exclusion that Falcao documented is clear: digital integration creates new forms of documentary burden that disproportionately affect those least equipped to navigate them. A woman seeking maternity benefits must ensure alignment across multiple databases: Aadhaar, banking, health records, civil registration. Each system has its own logic, its own requirements, its own failure modes.
“Digitalwashing” names this gap between representation and reality: systems report successful digitisation while women experience heightened exclusion. The measurement implications are profound: current metrics risk capturing only the system’s view of itself rather than the lived experience of its users.
Session participants immediately recognised these patterns. As Siobhan Green observed, “these issues won’t be unique to this use case, but rather common,” raising the critical question: “Is there a common customer service or incident tracking process across different govt services served by DPI?” The answer, overwhelmingly, is no: grievance mechanisms either don’t exist or demand the very digital literacy and access whose absence excluded people in the first place.
Falcao’s work demonstrates that meaningful measurement must bridge the gulf between system metrics and human metrics. This means tracking not just successful transactions but failed attempts, not just processing time but citizen time, not just digital efficiency but human dignity and wellbeing. Only by measuring what people experience rather than what systems report can we build DPI that serves justice rather than statistics.
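To make this concrete, the minimal sketch below illustrates how such human-centred indicators might be derived from the same interaction data that currently feeds system dashboards. All field names and values are hypothetical illustrations, not drawn from Falcao’s study or from any real DPI system.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record of one citizen's attempt to claim a benefit.
# Field names are illustrative and do not describe any real DPI system.
@dataclass
class ClaimAttempt:
    succeeded: bool
    minutes_spent_by_citizen: int           # travel, queueing, correcting documents
    visits_required: int                    # trips to enrolment centres, banks, offices
    rejection_reason: Optional[str] = None  # e.g. a database mismatch

def human_centred_indicators(attempts: list) -> dict:
    """Summarise the citizen-side cost of using the system
    alongside the usual success-rate style metric."""
    total = len(attempts)
    failures = [a for a in attempts if not a.succeeded]
    citizen_minutes = sorted(a.minutes_spent_by_citizen for a in attempts)
    return {
        # What systems typically report:
        "success_rate": (total - len(failures)) / total,
        # What a justice-oriented frame would add:
        "failed_attempts": len(failures),
        "median_citizen_minutes": citizen_minutes[total // 2],
        "mean_visits_required": sum(a.visits_required for a in attempts) / total,
        "rejection_reasons": sorted({a.rejection_reason for a in failures if a.rejection_reason}),
    }

# Example: two failed attempts before one success, at a real cost in citizen time.
attempts = [
    ClaimAttempt(False, 240, 2, "Aadhaar and bank records do not match"),
    ClaimAttempt(False, 180, 1, "name spelled differently across databases"),
    ClaimAttempt(True, 90, 1),
]
print(human_centred_indicators(attempts))
```

Even this toy example surfaces what aggregate success rates hide: repeat visits, hours of citizen time, and the specific record mismatches driving rejection.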
Measurement Beyond Quantification
Tony Roberts concluded the session with a provocative thought experiment: “The operation was successful, but the patient died.” This medical metaphor captures the fundamental disconnect between technical metrics that indicate successful DPI implementation and human outcomes in which people’s actual wellbeing remains unchanged or deteriorates. Roberts’ rights-based framework offers a path beyond this paradox toward measurement that serves justice rather than justification.
Roberts proposed Linnet Taylor’s Data Justice Framework as a corrective, introducing three dimensions typically absent from DPI assessment:
- Representation reveals the violence of making certain lives legible to systems while rendering others invisible.
- Rights examines whether DPI enhances or erodes fundamental freedoms, from privacy to participation.
- Redress investigates what happens when systems fail, exposing the cruel irony of digital-only grievance mechanisms that exclude the digitally excluded.
These principles anchor measurement within a rights-based framework that makes explicit the political and ethical dimensions of data collection. As Roberts emphasised, measurement isn’t neutral documentation but active construction: it shapes whose experiences count and whose questions get answered.
This recognition demands clarity about measurement’s purpose and beneficiaries. Without this clarity, we risk what Roberts warns against: measurement that serves institutional validation rather than human dignity, that answers to funders rather than citizens, that documents success while ignoring failure.
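As a thought aid only, and not as part of Taylor’s framework or Roberts’ presentation, the sketch below shows one way the three dimensions might be operationalised as an explicit assessment checklist, so that missing evidence (for instance, no data on redress at all) becomes visible rather than silently omitted. The questions are hypothetical.

```python
from typing import Dict, List, Optional

# Hypothetical checklist organised around the three data justice dimensions.
# The questions are illustrative only.
DIMENSIONS = {
    "representation": [
        "Are rejected and excluded users counted, not just enrolled ones?",
        "Were affected communities consulted on what gets measured?",
    ],
    "rights": [
        "Can people access the service without surrendering unrelated data?",
        "Is enrolment genuinely voluntary, with non-digital alternatives?",
    ],
    "redress": [
        "Does a grievance channel exist that does not require digital access?",
        "Are complaint volumes and resolution times published?",
    ],
}

def assess(answers: Dict[str, List[Optional[bool]]]) -> None:
    """For each dimension, report how many questions are satisfied
    and how many have no evidence at all (None)."""
    for dimension, questions in DIMENSIONS.items():
        responses = answers.get(dimension, [None] * len(questions))
        satisfied = sum(1 for r in responses if r is True)
        no_evidence = sum(1 for r in responses if r is None)
        print(f"{dimension:>15}: {satisfied}/{len(questions)} satisfied, "
              f"{no_evidence} with no evidence")

# Example: strong representation data, partial rights data, nothing on redress.
assess({
    "representation": [True, False],
    "rights": [True, None],
    "redress": [None, None],
})
```

The point is not the scoring itself but that an explicit structure forces gaps in evidence into view, rather than letting unmeasured dimensions quietly disappear from the assessment.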
Conclusion
The session demonstrated that better measurement alone won’t solve DPI’s exclusions, but without it, we can’t even see what needs solving. Every metric embodies assumptions about whose experience counts. The speakers showed how making different choices transforms what becomes visible and therefore addressable.
Measurement reform means changing whose questions get answered, whose experiences become evidence, whose realities count. This is beginning to happen, but requires institutional courage to implement what methodological innovation makes possible.
Join the Conversation
Get Involved: The Community of Practice continues to evolve based on member contributions and interests. To suggest topics for future sessions or share implementation experiences that could inform measurement framework development, please reach out through our feedback form.
Stay Connected: Subscribe to our DPI Map Newsletter to receive updates about upcoming sessions and research insights.
Upcoming Sessions: The topic of the upcoming session to be held mid-September will be announced in the coming weeks.
For questions about the Community of Practice or to discuss collaboration opportunities around DPI measurement, contact our community manager Mitchel Pass at m.pass@ucl.ac.uk.