India’s Proposed Data Protection Bill Breaks from Notice and Consent

India is considering a data protection bill that would mark a significant advance in rethinking how to protect digital consumer rights. It would move India away from reliance on the broken model of individual consent as the means of protecting personal data and instead place far more responsibility on providers to act in their customers’ best interests. Such a progressive approach could place India at the forefront of modern data protection regimes.

Most data protection regimes around the world rely heavily on the model of disclosing in a privacy notice how information will be used and assuming people will read it and provide informed consent. Customers in emerging markets, especially those who are underserved and poor or who are going digital for the first time, may have limited literacy or experience with technology, making them ill-equipped to protect their data. It is unrealistic even in developed countries for customers to read all the disclosure documents for all the apps on their smartphones.

Photo: Simone D. McCourtie, World Bank

Yet most emerging and developing countries continue to rely on the consumer consent approach when adopting new data protection regimes. Instead of looking to the future and building modern data protections for the kinds of economies they will have in 10 to 20 years, these countries have relied on an approach that no longer provides sufficient protections for the technological advances of the 21st century.

The prospects in India are different. It is taking an innovative approach that could set the standard for the future. The pending bill contains some novel provisions, such as the creation of “consent managers,” “data trust scores” and a regulatory sandbox that would allow start-ups to safely test new business models without fear of regulatory repercussions. One of the bill’s most significant, though somewhat unnoticed, advances is a move away from reliance on individual consent as the means to protect personal data.

In its recent work examining the future of data protection, CGAP concluded that the burden of data privacy should shift from individuals to providers. Providers should be responsible for using data only for legitimate purposes and in a manner that serves customers’ interests. A legitimate purposes test would limit use of data to what is compatible, consistent and beneficial to consumers, while allowing firms to use de-identified data to develop new and innovative products and services. A key feature of a legitimate purposes approach is that it cannot be overridden by obtaining individual consent. In other words, everyone benefits from legitimate purposes protections, regardless of which boxes they are required to check before accessing a website, downloading an app or using a digital service.

A legitimate purposes test enables providers to use an individual’s data to service accounts, fulfill orders, process payments, collect debts, control for quality, enforce security measures or conduct audits. Innovative uses of data would be permitted if they are consistent with the service for which the data were initially collected. Beyond such uses, data could be put to more wide-ranging purposes if it were robustly de-identified, reducing the risk that it could be used in ways harmful to the individuals who provided it.

An alternative approach that CGAP also supports, a fiduciary duty standard, would require data collection and processing firms to always act in the interests of, and not in ways detrimental to, the subjects of the data. This approach would limit the information asymmetry in many markets in which providers have a much greater knowledge than their customers do about how data may be used. The fiduciary duty approach also recognizes that poor people should not be required to give up their data protection rights to use digital services. Instead, legally obligating providers to act in the best interest of their customers can help establish trust and confidence among customers that their data are being used responsibly, making them more willing to use new products and services.

The Indian data protection bill follows both of CGAP’s proposed approaches. First, following the fiduciary duty standard, the bill requires that personal data be processed only “in a fair and reasonable manner” that “ensure[s] the privacy of the individual.” This element of fairness would ensure that the individual’s interests are preeminent. Second, the bill states that data may be used only for the purposes the individual “would reasonably expect that such personal data shall be used for, having regard to the purpose, and in the context and circumstances in which the personal data was collected.” In other words, the bill adopts a legitimate purposes standard.

We welcome India’s embrace of our key proposals, demonstrating how emerging markets can lead the way on individual privacy. Placing the burden on providers — and not individuals — to protect data privacy gives India the opportunity to leapfrog to a future-ready data protection environment by shifting the paradigm for data protection in a way that both reflects the technological realities of the 21st century and considers the needs of its poorest citizens. We hope it will provide a model for other developing economies looking for a better way to build an inclusive future.



