A Framework for Responsible Digital Credit

In the era of digital credit, we need not just new laws, but also new mental models for responsible digital credit provision.

As the number of digital credit providers has grown rapidly over the past few years, and as digital products and models have proliferated, so too have concerns about consumer protection. In my recently published report, Responsible Digital Credit, I argue that ensuring that digital credit customers receive responsible treatment requires more than enhanced consumer protection laws and regulations. It also requires strong commitment from the digital credit industry. Finally, it needs consumers who are empowered to play a more proactive role in managing their digital credit responsibly.

Over the past year, I have interviewed digital credit industry associations, credit providers and their third-party data analytics providers, regulators, policy makers, and various consumer protection leaders. Many of these groups are working to develop responsible digital lending principles. With their input, my report proposes a guiding framework for responsible digital lending. The framework uses the Smart Campaign Client Protection Principles as a basis for grouping the issues. In addition, I propose an eighth principle: security and fraud protection.

Digital credit, and especially the use of mobile phones, significantly alters interactions between borrowers and lenders. Lenders that use the mobile channel to advertise, acquire customers and transact with them must pay particular attention to the limitations of mobile interaction and follow industry mobile best practices. While many of the standard consumer protection practices continue to apply, new concerns, or more accurately, new faces on older concerns, have emerged. In this post, I want to highlight several of the considerations I found most important and relevant for the digital lending industry.

Responsible Algorithms

Many digital credit models rely on algorithms to process data about clients and make credit decisions. Because most lenders consider their algorithms proprietary, it is difficult for observers to know on what basis credit decisions are actually made. This opaque situation raises the spectre of discrimination. Attributes that are considered inappropriate bases for differentiation, such as religion, language, gender, or ethnic origin, are areas of particular concern with regard to algorithm-based lending practices. As highlighted by a Smart Campaign report, while credit offers may differ based on risk analysis, such differentiation should be consistently applied, stated in advance, and made with the goal of benefitting clients. To prevent discriminatory digital credit practices, some recommendations include:

  • Pretest, test and retest for potential bias. Digital credit providers and third-party data analytics providers should continuously monitor the outcomes of their algorithms to identify potential problems. Such testing could involve running scenarios to surface unwanted outcomes and building controls that prevent them.
  • Document the rationale for algorithmic features. Digital credit providers should be able to provide visualization and a decision tree for the factors their algorithms analyze, as well as the justification for relying on these factors. The rationale should address how borrower debt capacity is assessed, given the importance of this concept in traditional underwriting.
  • Submit proposed data sets and/or algorithms to independent review.
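
The bias-testing step above can be sketched as a simple outcome-disparity check. This is an illustrative assumption of how such monitoring might look, not an industry-standard test; the group labels, decision format and "four-fifths" threshold are all hypothetical choices:

```python
def approval_rates(decisions, group_of):
    """Compute the approval rate per group from (applicant_id, approved) pairs."""
    totals, approved = {}, {}
    for applicant_id, ok in decisions:
        g = group_of[applicant_id]
        totals[g] = totals.get(g, 0) + 1
        approved[g] = approved.get(g, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}


def disparate_impact_flags(rates, threshold=0.8):
    """Flag groups whose approval rate falls below `threshold` times the
    best-off group's rate (the common 'four-fifths' rule of thumb)."""
    best = max(rates.values())
    return {g: (r / best) < threshold for g, r in rates.items()}
```

A provider could run a check like this on every model release, alongside scenario testing, to catch disparities before they reach customers.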

A self-regulatory approach could be used to develop best practices guidelines for data inputs and the development of nondiscriminatory artificial intelligence systems, following the model that has been successful for the Payment Card Industry Security Standards Council.

Indebtedness Monitoring

Because digital lending is often based on non-traditional underwriting criteria, monitoring outcomes at the portfolio level becomes especially important for preventing over-indebtedness among digital borrowers. Some of the ways to do this include:

  • Avoiding debt traps – Ensure that clients are not forced into automated repeat loans.
  • Responsible credit reporting – Better compliance with reporting strengthens markets and leads to better client protection practices.
  • Monitoring digital credit providers’ portfolio quality – Industry benchmarks for portfolio quality can help to advance the understanding of risks and avoid business models that rely for their success on high default rates.
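
As a concrete illustration of portfolio-level monitoring, a standard benchmark is portfolio-at-risk (PAR30): the share of outstanding principal on loans more than 30 days overdue. A minimal sketch, assuming each loan is represented by its outstanding balance and days overdue:

```python
def portfolio_at_risk(loans, days=30):
    """Share of outstanding principal on loans overdue by more than `days`.
    Each loan is a dict with an 'outstanding' balance and 'days_overdue'."""
    total = sum(loan["outstanding"] for loan in loans)
    if total == 0:
        return 0.0
    at_risk = sum(
        loan["outstanding"] for loan in loans if loan["days_overdue"] > days
    )
    return at_risk / total
```

Tracked over time and compared against industry benchmarks, a metric like this makes it visible when a lender's business model depends on high default rates.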

Digital Transparency

Digital credit platforms allow for 24/7 communications with clients that, if harnessed properly, can assist them with immediate technical support, remind them about upcoming payments and even identify counselors or resources for clients in financial stress. However, when communication is totally digital, there are heightened concerns about ensuring that clients really know what they are signing up for. Disclosure must lead to client understanding. Borrower disclosure standards should address:

  • Honest advertising – Set standards for honest advertising and penalties for misleading marketing of credit products. Providers should also name the financial regulator they report to in all advertising materials.
  • Readability and format – Disclosures need to be easy to comprehend, available in the customer’s local language, and provided in print and digital formats.
  • Comparative tools – Standard summary disclosure documents would allow borrowers to compare credit offers, both digital and traditional.
  • Adequate notice periods – Consumers should be notified in advance of changes to interest rates, fees or terms and conditions.
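
To make digital and traditional offers comparable, a summary disclosure can annualize the total cost of a flat-fee loan. The sketch below uses one common annualization convention (compounding the per-term cost over a 365-day year); it is illustrative, not a prescribed regulatory formula:

```python
def annualized_cost(principal, total_fees, term_days):
    """Effective annualized cost of a flat-fee loan: the per-term cost
    (fees / principal) compounded over a 365-day year."""
    per_term = total_fees / principal
    return (1 + per_term) ** (365 / term_days) - 1
```

For example, a 10% fee on a 30-day loan annualizes to well over 200%, which is exactly the kind of comparison a standard summary disclosure should surface for borrowers.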

A somewhat different but related and relevant concern, often overlooked in this context, is the vulnerability of individual investors who put their money into pure peer-to-peer (P2P) lending platforms. Protections need to be extended to them as well, with a focus on ensuring that they receive the information they need to evaluate risks and that they have recourse channels when something goes amiss.

Data Privacy and Usage Standards

While many regulators and most industry groups agree that it is important to protect the confidentiality and security of customers’ information, data use and the level of disclosure about its use are challenging topics. In the EU and in countries such as Australia, regulators are enacting rules mandating that individuals be notified about the use of their personal data and allowed to opt out. Opt-out rules should be considered as a potential requirement for all digital credit providers, especially in markets where mobile push-marketing has become problematic.

Over the next several years it will be important for laws or regulations to be put in place to protect consumer data privacy and guide information sharing. The rules need to:

  • Provide for informed consent, such that customers explicitly permit electronic data collection and use before it occurs, and retain the option to decline.
  • Require providers to have a clear data collection and handling policy which limits the amount of data collected as well as the length of time it is stored.
  • Inform clients about their data trails and credit histories, including the ability to ensure accuracy and correct for any errors or discrepancies.
  • Give regulators the power to discipline providers who repeatedly allow personal data to be misused.
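
The consent and data-handling rules above can be illustrated with a minimal consent registry that refuses collection without a current opt-in. All names here (ConsentRegistry, may_collect, the validity window) are hypothetical, shown only to make the requirement concrete:

```python
from datetime import datetime, timedelta


class ConsentRegistry:
    """Minimal sketch: record explicit opt-in per customer and data purpose,
    and refuse collection without a current, unexpired consent."""

    def __init__(self, validity_days=365):
        self.validity = timedelta(days=validity_days)
        self._grants = {}  # (customer_id, purpose) -> datetime granted

    def grant(self, customer_id, purpose, when=None):
        self._grants[(customer_id, purpose)] = when or datetime.utcnow()

    def revoke(self, customer_id, purpose):
        self._grants.pop((customer_id, purpose), None)

    def may_collect(self, customer_id, purpose, now=None):
        granted = self._grants.get((customer_id, purpose))
        if granted is None:
            return False
        now = now or datetime.utcnow()
        return now - granted <= self.validity
```

Gating every collection call on a check like `may_collect` operationalizes informed consent, and the expiry window reflects the principle that consent should not be open-ended.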

Fraud Prevention and Security

New security vulnerabilities arising from digital credit point strongly to fraud and cyber-security as consumer protection concerns. It is increasingly urgent for digital credit providers to adopt standard security compliance measures and then to test and update them regularly. Providers must also ensure that the third parties they use, including lead generators, brokers, agents, contractors, data analytics providers and collectors, have adequate security in place, and ultimately they remain responsible for those third parties’ actions.

Consumers should be made aware of the potential for fraud and encouraged to report suspected fraud cases or security breaches. Consumer awareness campaigns on the most common incidents of fraud (via the internet, SMS alerts, signage at agent kiosks, etc.) can help with this.

While the recommendations just listed do not apply in all markets, they do point to the urgent need for regulators, industry bodies, individual providers and consumers to work together to construct the ground rules for responsible digital credit that will ensure that consumers are adequately protected.

Editor’s Note: This research was made possible by support from Mastercard Foundation and other partners.

