MAKING DATA WORK FOR THE POOR
New Approaches to Data Protection and Privacy

David Medine and Gayatri Murthy
January 2020

Consultative Group to Assist the Poor
1818 H Street NW, MSN F3K-306
Washington, DC 20433 USA
Internet: www.cgap.org
Email: cgap@worldbank.org
Telephone: +1 202 473 9594

Cover photo by Geoffrey Buta, CGAP Photo Contest 2018.
© CGAP/World Bank, 2020

RIGHTS AND PERMISSIONS
This work is available under the Creative Commons Attribution 4.0 International Public License (https://creativecommons.org/licenses/by/4.0/). Under the Creative Commons Attribution license, you are free to copy, distribute, transmit, and adapt this work, including for commercial purposes, under the terms of this license.

Attribution. Cite the work as follows: Medine, David, and Gayatri Murthy. 2020. "Making Data Work for the Poor: New Approaches to Data Protection and Privacy." Washington, D.C.: CGAP.

All queries on rights and licenses should be addressed to CGAP Publications, 1818 H Street, NW, MSN F3K-306, Washington, DC 20433 USA; e-mail: cgap@worldbank.org.

EXECUTIVE SUMMARY

As the commercial use of personal data grows exponentially, so do concerns over whether that data will be used in consumers' best interests. This is particularly true for financial services in emerging economies, where data expand the potential for reaching poor and underserved communities with suitable products but where customer protection risks are great. In many markets, ranging from Indonesia to India and Kenya, it is unfair to impose the burden of consent on individuals to protect their data when such a large proportion of the population is opening accounts or coming online for the first time, literacy rates are low, and individuals face potential language and technological barriers.

Many countries are considering comprehensive legislation to protect people's data and privacy across all sectors and services. The European Union's General Data Protection Regulation (GDPR) is the most well-known effort in this regard. Countries as diverse as Ghana, South Africa, India, Indonesia, Kenya, and even the United States are considering or have passed wide-ranging data protection legislation. Yet data protection regimes rely heavily on individual consumer consent. This places an unreasonable burden on low-income customers. For example, it is unrealistic even in developed countries for customers to read all the disclosure documents for all the apps on their smartphone.

CGAP has concluded that the consumer consent model is broken, that additional protections are necessary to protect consumers, and that protections can be introduced in ways that do not inhibit responsible innovation. Accordingly, we make three policy recommendations.

1. Shift the Onus of Protection onto Providers

The burden of data privacy should shift from individuals to providers. Providers should be responsible for using data only for legitimate purposes and in a manner that serves customers' interests. Two alternatives are addressed in this paper. One approach, a legitimate purposes test, limits use of data to what is compatible, consistent, and beneficial to consumers, while allowing firms to use de-identified data to develop new and innovative products and services. A key feature of a legitimate purposes approach is that it cannot be overridden by obtaining individual consent.
In other words, everyone benefits from legitimate purposes protections, regardless of which boxes they are required to check before accessing a website, downloading an app, or using a digital service. A legitimate purposes test enables providers to use an individual's data to service accounts, fulfill orders, process payments, collect debts, control for quality, enforce security measures, or conduct audits. Innovative uses of data would be permitted if they are consistent with the service for which the data were initially collected. Going beyond such uses, data could be used for more wide-ranging purposes if they were robustly de-identified to reduce the risk of their being used in ways that are harmful to the individuals who provided them.

CGAP's Three Recommendations
1. Shift Onus Onto Provider: Place new responsibilities onto data collectors and processors, rather than relying on consumer consent. Two options: a Legitimate Purposes Test (data may be used only in ways that benefit the customer) or a Fiduciary Duty (the provider must always act in the interests of the customer).
2. Digital Bill of Rights: Empower consumers to control their own data by allowing them to easily access, correct, and port data free of charge.
3. Privacy Representatives: Ensure fairness in processing of data through privacy representatives who can review consumers' data profiles and check algorithmic models for fairness, bias, and exclusion.

An alternative approach, a fiduciary duty requirement, requires data collection and processing firms to always act in the interests of, and not in ways detrimental to, the subjects of the data. Such legislation, which is currently being considered in the United States and India, would mean that providers could not use data in ways that benefit themselves over their customers, or sell or share customers' data with third parties that fail to put the customers' best interests first. This approach would limit the information asymmetry in many markets in which providers have much greater knowledge than their customers about how customers' data may be used. The fiduciary duty approach also recognizes that poor people should not be required to give up their data protection rights to use digital services. Instead, legally obligating providers to act in the best interest of their customers can help establish trust and confidence among customers that their data are being used responsibly, making them more willing to use new products and services.

2. Empower Users through Modern Digital Rights That Go Beyond Consent

Our second policy recommendation calls for adopting a set of six digital rights that empower consumers to access, review, and correct their data and to transfer their data to other providers. Most of these rights should be enforced not only at the beginning of a service or relationship, but also after customer data have been collected or processed.

3. Ensure Fairness in Processing Through Privacy Representatives

As artificial intelligence (AI) becomes increasingly complex and widespread, we need to counter the danger that algorithmic and machine learning models reinforce exclusionary biases. Otherwise AI could increase economic inequalities rather than counter them. Moreover, AI-driven decisions are beyond the individual's ability to monitor and evaluate. Consumers need expert assistance in assessing how automated decisions are made. Privacy representatives, whether persons or digital mechanisms, should be introduced to assess decision-making models for fairness, bias, and exclusion. This may not seem pertinent today in most emerging markets, but it will soon become a critical tool to prevent exclusion as providers introduce products that use AI and machine learning to assess who is eligible and on what terms.
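As a rough illustration of the kind of review such a representative might perform, the minimal sketch below compares approval rates across groups in a hypothetical log of automated loan decisions and flags large gaps. The data, group labels, and the four-fifths threshold are illustrative assumptions made for this paper's discussion, not a prescribed method.

```python
# Minimal sketch of a disparity check a privacy representative might run on a
# lender's automated decisions. All data and thresholds below are hypothetical.

from collections import defaultdict

# Hypothetical log of automated loan decisions: (group, approved?)
decisions = [
    ("women", True), ("women", False), ("women", False), ("women", True),
    ("men", True), ("men", True), ("men", False), ("men", True),
]

# Count applications and approvals per group.
totals, approvals = defaultdict(int), defaultdict(int)
for group, approved in decisions:
    totals[group] += 1
    approvals[group] += int(approved)

rates = {g: approvals[g] / totals[g] for g in totals}
print("Approval rates:", rates)

# Flag a potential disparity if any group's approval rate falls below 80% of
# the best-served group's rate (an illustrative threshold only).
best = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"Potential exclusionary bias: {group} approved at "
              f"{rate:.0%} vs. best rate {best:.0%}")
```

A fuller review would also examine the features a model relies on and its error rates by group, but even a simple disparity report of this kind gives consumers expert scrutiny they could not provide on their own.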
Protecting data is critical to developing trust and confidence among poor consumers and to building a truly inclusive digital economy. Data can be used to lift up and benefit the lives of poor people, but they must not be used to exacerbate exclusion and inequality. Outdated reliance on consumer consent cannot provide adequate protections. We need to begin work on new approaches and foundational rights that are future ready.

THE PROLIFERATION OF PERSONAL DIGITAL DATA

In many developing countries, the expansion of new ID systems, access to digital financial services, and the deeper penetration of mobile devices are pulling more and more people into the digital economy, rapidly expanding the size of the digital data trails they leave behind. Consider three fast-moving countries: Indonesia, India, and Kenya. As Figure 1 shows, mobile Internet penetration and financial account access are increasing side by side.

These developments are unlocking new opportunities to use data analytics to provide access to financial services for the underserved. Many poor people lack formal financial records or credit histories and hence often are excluded, but data analytics provide alternate means of assessing eligibility for financial products. This may also lead to lower prices, greater competition and choice, and more useful, customized services. Data can expand financial inclusion in many ways, including through alternative credit scoring, links to application programming interfaces (APIs), big data or public data analytics, and automated claims processing.

FIGURE 1. Increase in low-income populations joining the digital economy
Account holders (Findex): Indonesia 20% (2011) to 49% (2017); India 35% (2011) to 80% (2017); Kenya 42% (2011) to 82% (2017).
Mobile Internet penetration (GSMA): Indonesia 21% (2014) to 43% (2018); India 19% (2014) to 35% (2018); Kenya 16% (2014) to 24% (2018).
Aadhaar ID enrollment in India: launched 2009; 440 million by 2013; 1.2 billion by 2019.
161 countries have ID systems using digital technologies, reinforcing the need for robust privacy and data protection safeguards.
Sources: Findex (World Bank), GSMA, Identification for Development (World Bank).

• Alternative credit scoring. Digital credit products, whether for individuals or micro, small, and medium enterprises (MSMEs), collect data from alternative and online sources. These include electronic bank accounts, payments gateways, online accounting companies, and e-commerce marketplaces. Using these data, financial institutions can create alternative credit scores and offer faster and more customized loans. Branch and Tala are prominent examples of firms offering microloans in Africa and South Asia using various types of data from smartphones. See Box 1.

• Links to application programming interfaces (APIs). New fintech and technology companies can expand data sharing by using the APIs of banks or mobile money companies to offer products such as insurance, investments, or other services based on payment behavior.¹
Use of open APIs may result in fintech innovation that can give underserved people access to new products and services. Examples range from leased tractors to pay-as-you-go solar to financial health apps.

• Big data or public data analytics. Data can be valuable when they come in large quantities that can be observed over time. For example, insurance providers use satellite and yield data to reduce the cost of offering insurance to smallholder farmers.

• Automated claims processing. Though not yet widespread in financial services in emerging markets, algorithms and other digital processing will increasingly be used to assess, approve, and disburse claims for insurance.

But as the commercial use of personal data grows exponentially, so do concerns over whether that data will be used in consumers' best interests. Algorithms and artificial intelligence (AI) are being used to make decisions about customers, such as whether they will get a loan or whether their insurance claims should be paid, in a way that is consistent, accurate, and scalable. But there is also the risk that bias or unfairness in such models will entrench exclusion around socioeconomic status, gender, race, or caste at scale.

¹ An API allows one software program to "talk" to another. APIs enable a wide range of innovative products and services that millions of people use every day. For instance, ride-hailing apps use APIs to leverage other companies' mapping and payments systems. When a financial services provider "opens" its APIs, it makes them widely available to other companies. To learn more about the possibilities of open APIs, see "Open APIs in Digital Finance," CGAP, www.cgap.org/topics/collections/open-apis.

BOX 1. Examples of firms using alternative credit scoring
Tala uses a smartphone app to evaluate applicants' credit risk. It gathers various types of data, including where loan applicants spend their time, how many people they communicate with every day, how often they call their parents (by searching call logs for the word "mama"), and, less surprisingly, whether they pay their bills on time.
Branch makes lending decisions using information stored on smartphones, including call logs, SMS logs, Facebook friends, contact lists from other social media accounts, photos, videos, and other digital content.
Fintechs such as Yoco in South Africa and Aye Finance in India make noncollateralized loans to MSMEs based on data from the firms' digital payments and transactions.
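To make the idea behind Box 1 concrete, the sketch below shows, in deliberately simplified form, how smartphone and transaction signals of the kind described above might be turned into a score and an approval decision. The feature names, weights, threshold, and applicant record are hypothetical assumptions for illustration; they are not drawn from Tala, Branch, or any other firm's actual model.

```python
# Illustrative only: toy features and weights loosely inspired by the kinds of
# smartphone and transaction data described in Box 1. Not a real lending model.

def extract_features(applicant):
    """Derive simple numeric features from an applicant's (hypothetical) records."""
    bills = applicant["bill_payments"]  # list of {"amount": ..., "on_time": bool}
    on_time_rate = sum(b["on_time"] for b in bills) / len(bills) if bills else 0.0
    return {
        "on_time_rate": on_time_rate,                       # share of bills paid on time
        "daily_contacts": applicant["contacts_per_day"],    # breadth of social/business network
        "monthly_inflow": applicant["mobile_money_inflow"], # digital transaction volume
    }

# Hypothetical weights a lender might learn from past repayment histories.
WEIGHTS = {"on_time_rate": 0.6, "daily_contacts": 0.01, "monthly_inflow": 0.0005}

def credit_score(applicant, threshold=0.7):
    """Combine features into a score and compare against an approval threshold."""
    features = extract_features(applicant)
    score = sum(WEIGHTS[name] * value for name, value in features.items())
    return score, score >= threshold

applicant = {
    "bill_payments": [{"amount": 15, "on_time": True}, {"amount": 20, "on_time": True},
                      {"amount": 10, "on_time": False}],
    "contacts_per_day": 12,
    "mobile_money_inflow": 300,  # e.g., monthly inflows in local currency units
}
score, approved = credit_score(applicant)
print(f"score={score:.2f}, approved={approved}")
```

Real models are trained on repayment outcomes and draw on far richer data than this, which is one reason the data protection approaches discussed in this paper matter.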
Personal data can be leaked, stolen, or exposed through security breaches that can result in identity theft or embarrassment (Baur-Yazbeck 2018). Attempts to hack financial firms, including those that provide financial services to poor people, have the potential to inflict devastating losses on vulnerable populations, which can undermine trust and confidence. Customers in emerging markets, especially those who are underserved and poor and who are going digital for the first time, may have limited literacy or experience with technology, making them ill-equipped to protect their data. Recognizing this, many countries, whether developed, emerging, or developing, have been considering comprehensive legislation to protect people's data and privacy across all sectors and services (see Box 2). The European Union's General Data Protection Regulation (GDPR), which took effect in May 2018, is one of the most well-known efforts in this regard, but countries as diverse as Ghana, South Africa, India, Indonesia, Kenya, and even the United States are considering or have passed wide-ranging data protection legislation. There is no single approach: some data protection laws apply across the economy, while others take a sector-by-sector approach.

BOX 2. Recent approaches to data protection legislation
Countries have taken different approaches to data protection, including sectoral versus omnibus laws. Notable examples include the following:
• The European Union's GDPR applies broadly. Its features include a focus on consumer rights, reliance on consumer consent with a legitimate use constraint, and a duty to build in privacy as products are designed.
• The United States uses a sectoral approach, in which consent plays a significant role. It includes a health privacy law (the Health Insurance Portability and Accountability Act), laws focusing on data protection within credit bureaus, and privacy laws pertaining to children, video rentals, and drivers' records. Several approaches to comprehensive privacy legislation are now being considered, some of which would rely on consent.
• India has a draft personal data protection bill framed as an omnibus law that would create a separate data prote