Big Data, Financial Inclusion and Privacy for the Poor

August 22, 2017

Guest Post by Dr Katharine Kemp, Research Fellow, UNSW Digital Financial Services Regulation Project

Financial inclusion is not good in itself.

We value financial inclusion as a means to an end. We value financial inclusion because we believe it will increase the well-being, dignity and freedom of poor people and people living in remote areas, who have never had access to savings, insurance, credit and payment services.

It is therefore important to ensure that the way in which financial services are delivered to these people does not ultimately diminish their well-being, dignity and freedom. We already do this in a number of ways – for example, by ensuring providers do not make misrepresentations to consumers, or charge exploitative or hidden rates or fees. Consumers should also be protected from harms that result from the data practices tied to the provision of financial services.

Benefits of Big Data and Data-Driven Innovations for Financial Inclusion

“Big data” has become a fixture in any future-focused discussion. It refers to data captured in very large quantities, very rapidly, from numerous sources, where that data is of sufficient quality to be useful. The collected data is analysed, using increasingly sophisticated algorithms, in the hope of revealing new correlations and insights.

There is no doubt that big data analytics and other data-driven innovations can be a critical means of improving the health, prosperity and security of our societies. In financial services, new data practices have allowed providers to serve poor customers and those living in remote areas in new and better ways, including by permitting providers to:

  • extend credit to consumers who previously had to rely on expensive and sometimes exploitative informal credit, if any, because they had no formal credit history;
  • identify customers who lack formal identification documents;
  • design new products to fit the actual needs and realities of consumers, based on their behaviour and demographic information; and
  • enter new markets, increasing competition on price, quality and innovation.

But the collection, analysis and use of enormous pools of consumer data have also given rise to concerns for the protection of financial consumers’ data and privacy rights.

Potential Harms from Data-Driven Innovations

Providers now not only collect more information directly from customers, but may also track customers physically (using geo-location data from their mobile phones); track customers’ online browsing and purchases; and engage third parties to combine the provider’s detailed information on each customer with aggregated data from other sources about that customer, including their employment history, income, lifestyle, online and offline purchases, and social media activities.

Data-driven innovations create the risk of serious harms both for individuals and for society as a whole. At the individual level, these risks increase as more data is collected, linked, shared, and kept for longer periods, including the risk of:

  • inaccurate and discriminatory conclusions about a person’s creditworthiness based on insufficiently tested or inappropriate algorithms;
  • unanticipated aggregation of a person’s data from various sources to draw conclusions which may be used to manipulate that person’s behaviour, or adversely affect their prospects of obtaining employment or credit;
  • identity theft and other fraudulent use of biometric data and other personal information;
  • disclosure of personal and sensitive information to governments without transparent process and/or to governments which act without regard to the rule of law; and
  • harassment and public humiliation through the publication of loan defaults and other personal information.

Many of these harms are known to have occurred in various jurisdictions. The reality is that data practices can sometimes lead to the erosion of trust in new financial services and the exclusion of vulnerable consumers.

Even relatively well-meaning and law-abiding providers can cause harm. Firms may “segment” customers and “personalise” the prices or interest rates a particular consumer is charged, based on their location, movements, purchase history, friends and online habits. A person could, for example, be charged higher prices or rates based on the behaviour of their friends on social media.
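
To make this mechanism concrete, the sketch below shows what such a “personalised” pricing rule might look like. It is a minimal illustration only: the feature names, weights and thresholds are invented, and no actual provider’s model is being described.

```python
# Illustrative sketch only: a hypothetical "personalised" interest-rate rule of the
# kind described above. Feature names, weights and thresholds are invented;
# no actual provider's pricing model is implied.

BASE_ANNUAL_RATE = 0.18  # hypothetical base lending rate


def personalised_rate(customer: dict) -> float:
    """Return an annual interest rate adjusted by behavioural and social-graph signals."""
    rate = BASE_ANNUAL_RATE

    # Surcharge driven not by the customer's own repayment record but by the
    # inferred behaviour of their social-media contacts.
    if customer.get("share_of_friends_in_default", 0.0) > 0.10:
        rate += 0.04

    # Location-based segmentation: residents of areas labelled "high risk" pay more.
    if customer.get("home_area_risk_band") == "high":
        rate += 0.03

    # Browsing history used as a proxy for "financial stress".
    if customer.get("late_night_loan_searches", 0) > 5:
        rate += 0.02

    return rate


if __name__ == "__main__":
    applicant = {
        "share_of_friends_in_default": 0.15,  # penalised for friends' behaviour
        "home_area_risk_band": "high",
        "late_night_loan_searches": 2,
    }
    print(f"Quoted rate: {personalised_rate(applicant):.1%}")  # prints "Quoted rate: 25.0%"
```

In this toy example the applicant’s quoted rate rises from 18% to 25% because of where they live and how their social-media contacts behave, factors the applicant neither controls nor is ever told are being used.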

Data practices may also increase the risk of harm to society as a whole. Decisions may be made to the detriment of entire groups or segments of people based on inferences drawn from big data, without the knowledge or consent of these groups. Pervasive surveillance, or even the mere awareness of surveillance, is known to pose threats to freedom of thought, political activity and democracy itself, as individuals are denied the space to create, test and experiment unobserved.

These risks highlight the need for perspective and caution in the adoption of data-driven innovations, and the need for appropriate data protection regulation.

The Prevailing “Informed Consent” Approach to Data Privacy

Internationally, many data privacy standards and regulations are based, at least in part, on the “informed consent” – or “notice” and “choice” – approach to informational privacy. This approach can be seen in the Fair Information Practice Principles that originated in the US in the 1970s; the 1980 OECD Privacy Guidelines; the 1995 EU Data Protection Directive; and the Council of Europe Convention 108.

Each of these instruments recognises consumer consent as a justification for the collection, use, processing and sharing of personal data. The underlying rationale for this approach is based on principles of individual freedom and autonomy. Each individual should be free to decide how much or how little of their information they wish to share in exchange for a given “price” or benefit. The data collector gives notice of how an individual’s data will be treated and the individual chooses whether to consent to that treatment.

This approach has been increasingly criticised as artificial and ineffectual. The central criticisms are that, for consumers, there is no real notice and there is no real choice.

In today’s world of invisible and pervasive data collection and surveillance capabilities, data aggregation, complex data analytics and indefinite storage, consumers no longer know or understand when data is collected, what data is collected, by whom and for what purposes, let alone how it is then linked and shared. Consumers do not read the dense and opaque privacy notices that supposedly explain these matters, and could not read them, given the hundreds of hours this would take. Nor can they understand, compare, or negotiate on, these privacy terms.

These problems are exacerbated for poor consumers, who often have more limited literacy, even less experience with modern uses of data, and less ability to negotiate, object or seek redress. Yet we still rely on firms giving notice to consumers of their broad, and often open-ended, plans for the use of consumer data, and on the fact that consumers supposedly consented, either by ticking “I agree” or by proceeding with a certain product.

The premises of existing regulation are therefore doubtful. At the same time, some commentators question the relevance and priority of data privacy in developing countries and emerging markets.

Is data privacy regulation a “Western” concept that has less relevance in developing countries and emerging markets?

Some have argued that the individualistic philosophy inherent in concepts of privacy has less relevance in countries that favour a “communitarian” philosophy of life. For example, in a number of African countries, “ubuntu” is a guiding philosophy. According to ubuntu, “a person is a person through other persons”. This philosophy values openness, sharing, group identity and solidarity. Is privacy relevant in the context of such a worldview?

Privacy, and data privacy, serve values beyond individual autonomy and control. Data privacy serves values which are at the very heart of “communitarian” philosophies, including compassion, inclusion, face-saving, dignity, and the humane treatment of family and neighbours. The protection of financial consumers’ personal data is entirely consistent with, and frequently critical to, upholding values such as these, particularly in light of the alternative risks and harms.

Should consumer data protection be given a low priority in light of the more pressing need for financial inclusion?

Some have argued that, while consumer data protection is the ideal, this protection should not have priority over more pressing goals, such as financial inclusion. Providers should not be overburdened with data protection compliance costs that might dissuade them from introducing innovative products to unserved and under-served consumers.

Here it is important to remember how we began: financial inclusion is not an end in itself but a means to other ends, including permitting the poor and those living in remote areas to support their families, prosper, gain control over their financial destinies, and feel a sense of pride and belonging in their broader communities. The harms caused by unregulated data practices work against each of these goals.

If we are in fact permanently jeopardising these goals by permitting providers to collect personal data at will, financial inclusion is not serving its purpose.

Solutions

There will be no panacea, no simple answer to the question of how to regulate for data protection. A good starting place is recognising that consumers’ “informed consent” is most often fictional. Sensible solutions will need to draw on the full “toolkit” of privacy governance tools (Bennett and Raab, 2006), such as appropriate regulators, advocacy groups, self-regulation and regulation (including substantive rules and privacy by design). The solution in any given jurisdiction will require a combination of tools best suited to the context of that jurisdiction and the values at stake in that society.

Contrary to the approach advocated by some, it will not be sufficient to regulate only the use and sharing of data. Limitations on the collection of data must be a key focus, especially in light of new data storage capabilities, the likelihood that de-identified data will be re-identified, and the growing opportunities for harmful and unauthorised access as more data is collected and the longer it is kept.
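
The re-identification risk is straightforward to illustrate. The sketch below shows a simple “linkage attack”, in which a dataset stripped of names is joined to a separate, identified dataset on a few quasi-identifiers; all records and field names are invented for illustration.

```python
# Minimal sketch of a "linkage attack": records stripped of names can often be
# re-identified by joining on quasi-identifiers (here birth year, postcode and
# gender). All records and field names below are invented for illustration.

# "De-identified" loan records released or shared without names.
deidentified_loans = [
    {"birth_year": 1984, "postcode": "560034", "gender": "F", "loan_status": "default"},
    {"birth_year": 1991, "postcode": "110001", "gender": "M", "loan_status": "repaid"},
]

# A separate, identified dataset (e.g. a voter roll or marketing list).
public_registry = [
    {"name": "A. Kumar", "birth_year": 1991, "postcode": "110001", "gender": "M"},
    {"name": "S. Devi", "birth_year": 1984, "postcode": "560034", "gender": "F"},
]

QUASI_IDENTIFIERS = ("birth_year", "postcode", "gender")


def reidentify(loans, registry):
    """Join the two datasets on quasi-identifiers to recover identities."""
    matches = []
    for loan in loans:
        key = tuple(loan[k] for k in QUASI_IDENTIFIERS)
        for person in registry:
            if tuple(person[k] for k in QUASI_IDENTIFIERS) == key:
                matches.append((person["name"], loan["loan_status"]))
    return matches


if __name__ == "__main__":
    for name, status in reidentify(deidentified_loans, public_registry):
        print(f"{name}: {status}")  # e.g. "S. Devi: default"
```

Studies of real datasets have repeatedly found that a handful of attributes such as birth date, gender and location can single out a large share of individuals, which is why limits on collection and retention, and not only rules on use, are essential.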

Big data offers undoubted and important benefits in serving those who have never had access to financial services. But it is not a harmless curiosity to be mined and manipulated at the will of those who collect and share it. Personal information should be treated with restraint and respect, and protected, in keeping with the fundamental values of the relevant society.

 


References:

Colin J Bennett and Charles Raab, The Governance of Privacy (MIT Press, 2006)

Gordon Hull, “Successful Failure: What Foucault Can Teach Us About Privacy Self-Management in a World of Facebook and Big Data” (2015) 17 Ethics and Information Technology 89

Debbie VS Kasper, “Privacy as a Social Good” (2007) 28 Social Thought & Research 165

Katharine Kemp and Ross P Buckley, “Protecting Financial Consumer Data in Developing Countries: An Alternative to the Flawed Consent Model” (2017) Georgetown Journal of International Affairs (forthcoming)

Alex B Makulilo, “The Context of Data Privacy in Africa,” in Alex B Makulilo (ed), African Data Privacy Laws (Springer International Publishing, 2016)

David Medine, “Making the Case for Privacy for the Poor” (CGAP Blog, 15 November 2016)

Lokke Moerel and Corien Prins, “Privacy for the Homo Digitalis: Proposal for a New Regulatory Framework for Data Protection in the Light of Big Data and the Internet of Things” (25 May 2016)

Office of the Privacy Commissioner of Canada, Consent and Privacy: A Discussion Paper Exploring Potential Enhancements to Consent Under the Personal Information Protection and Electronic Documents Act (2016)

Omri Ben-Shahar and Carl E Schneider, More Than You Wanted to Know: The Failure of Mandated Disclosure (Princeton University Press, 2016)

Productivity Commission, Australian Government, “Data Availability and Use” (Productivity Commission Inquiry Report No 82, 31 March 2017)

Bruce Schneier, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (WW Norton & Co, 2015)

Daniel J Solove, “Introduction: Privacy Self-Management and the Consent Dilemma” (2013) 126 Harvard Law Review 1880

 

Tags: Big Data, Data Privacy, financial inclusion, Privacy
Comments

  1. parvinderster (August 23, 2017 at 11:05 am)

     “We value financial inclusion as a means to an end” is a great point, but just wondering: is access a stepping stone to equity?

  2. Nachiket Mor (August 23, 2017 at 11:08 pm)

     This is a very powerful post and brings an entirely fresh insight to the issue of privacy. Its conclusion, that we will need to place careful limits on what data are being collected and why, and to develop mandatory data-deletion requirements, is a very important one to consider carefully. After reading her post, I now see why guidelines on storage and usage alone may not be enough.

