Financial technology has the potential to help lift millions out of poverty. But are we adequately assessing its risks?
The potential for financial technology, or fintech, to help the financially excluded populations of the world is well documented. As two billion people are still without bank accounts, savings, loans, and access to payment services, fintech is indeed a welcome innovation. At Bamboo Capital Partners, we believe fintech can help low-income people reduce vulnerabilities, build assets, manage cash flow, and increase income, and we have invested as such: In the last couple of years, we have made four equity investments in fintech companies in Colombia, Mexico, Chile, and Tanzania, committing more than $16 million. Our investees are helping democratize access to finance through peer-to-peer lending platforms (KuboFinanciero), promoting access to insurance (ComparaOnline), enabling mobile payments and savings for low-income people through nano deposits (Movii), and providing a smart data platform for emerging market financial institutions (First Access).
Yet fintech doesn’t come without risk. Artificial intelligence (AI) failures, personal data mining, hacking, identity theft, and aggressive digital credit offers affect not only the rich and hyper-connected, but also low-income customers. A rural villager using a basic mobile phone is also exposed to fintech risks. So uncovering the potential negative impacts of impact investing is crucial for these customers.
Low-income customers produce less of a digital trail than do higher-end users, but factors such as low literacy and numeracy, low awareness of data protection rights, little representation, reduced options, and reduced assets and savings compound their vulnerability to abuse.
AI and the loss of high-touch customer service
“High touch” customer service is service with a high degree of human involvement, as is the case with traditional microfinance transactions. Anyone who accompanies a loan officer on client visits will witness that level of human involvement. Human transactions are based on trust: The borrower trusts that the lender will conduct a fair assessment of their creditworthiness and will respect the terms and conditions of the deal. The lender trusts the borrower to repay the principal and interest. Every day, billions of transactions are sealed with a handshake, a signature, and an eye-to-eye exchange. The human touch is particularly important for low-income customers, whose faith in the individual is greater than their faith in an institution.
So what happens when a customer obtains a loan through a faceless device instead? A digital-only transaction redefines the trust relationship and the commitment on both ends. Moreover, when a digital loan is granted through a mobile provider, there are no longer two but three parties involved, and none of them sees the others. Proponents of algorithm-based lending argue that this eliminates subjectivity from decision-making, replacing it with data-based decisions. But digital transactions with automated on-boarding may result in excessive standardization. The repayment capacity analysis may be lax or replaced entirely by AI-driven algorithms. Which delivers a better, fairer judgment: an algorithm or a loan officer? What we do know is that debt burden and repayment capacity must be adequately scrutinized. Otherwise, the result can be over-lending and customer over-indebtedness, or rejection of a loan on opaque grounds, including arbitrary profiling based on factors such as location. As frictionless financial services increasingly target those in the low-income bracket, it is paramount that we not overlook suitability principles such as “sell only what the clients can use and need.”
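To make the standardization concern concrete, here is a minimal, hypothetical sketch of the kind of one-size-fits-all affordability rule an automated lender might apply. The function name, fields, and 40 percent debt-service threshold are illustrative assumptions, not any real lender's policy.

```python
# Hypothetical sketch of an automated repayment-capacity check.
# All names and thresholds are illustrative assumptions.

def repayment_capacity_ok(monthly_income, existing_debt_payments,
                          proposed_payment, max_debt_ratio=0.4):
    """Approve only if total debt service stays within a fixed
    share of reported monthly income."""
    total_service = existing_debt_payments + proposed_payment
    return total_service <= max_debt_ratio * monthly_income

# A loan officer might weigh seasonal income, family context, or a
# long repayment history; the rule above applies one threshold to
# everyone, rejecting borderline cases it cannot see beyond.
print(repayment_capacity_ok(300, 50, 80))  # → False (130 > 120)
```

The point is not that such a rule is wrong, but that when it fully replaces human analysis, every nuance a loan officer would catch is flattened into a single threshold.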
The value of transparency in finance
Another important aspect of client protection, in both traditional finance and fintech, is transparency. Algorithms are valuable commercial property and are rarely disclosed; traditional lending policies are more transparent. And just as a T-shirt label doesn’t have enough space to list the CO2 emissions of its production, a mobile screen has limited space to disclose terms, use of personal data, default consequences, and grievance mechanisms. At times customers are not even aware that they have consented to a loan. AI can also lead to automatic blacklisting by credit bureaus, which is difficult, costly, and slow to repair. Some reports indicate that more than half a million people are blacklisted in Kenya for amounts as small as one US dollar; unfortunately, they will not obtain any loans until they are cleared, if and when that happens.
In digital lending, when the customer’s loan request is rejected by AI using “alternative data”—which may include geolocation, frequency of SMS use, phone charging, medical records, or, for the more Internet-savvy, browsing history, social media profiles, and online purchasing records—what recourse does the customer have? Who is behind this automation-based decision? Redress in case of AI errors may prove harder to obtain.
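The opacity problem can be illustrated with a small, entirely hypothetical scoring sketch. The features echo the article's examples (SMS use, phone charging, location), but the weights and threshold are invented for illustration; the key observation is that the applicant receives only the final decision, never the per-signal breakdown.

```python
# Hypothetical "alternative data" scorer. Features, weights, and the
# threshold are invented for this sketch, not any real model.

FEATURE_WEIGHTS = {
    "sms_per_day": 0.2,            # frequency of SMS use
    "night_charging_ratio": -0.3,  # phone-charging pattern
    "distance_from_city_km": -0.5, # geolocation proxy
}

def score(profile):
    """Weighted sum over whatever signals the lender collected."""
    return sum(FEATURE_WEIGHTS[k] * profile.get(k, 0.0)
               for k in FEATURE_WEIGHTS)

def decide(profile, threshold=0.0):
    # The applicant sees only this string, not which signal
    # (location vs. phone usage) drove the outcome.
    return "approved" if score(profile) >= threshold else "rejected"

applicant = {"sms_per_day": 2.0, "night_charging_ratio": 1.0,
             "distance_from_city_km": 1.0}
print(decide(applicant))  # → rejected
```

Here the rejection is driven mostly by the location proxy, yet nothing in the output tells the customer that, which is exactly why redress for AI errors is hard to obtain.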
There are other risks affecting client protection in fintech. These include abuse and breaches of personal data and security, including privacy or failure to obtain prior consent; technology and network risk; deficient customer identity authentication; misuse of passcodes; and weak regulatory frameworks and poor law enforcement and redress.
Ensuring ethical behavior from behind a screen
Ensuring ethical behavior from a salesperson who never sees their client in person is hard. What happens when the salesperson is under pressure to deliver aggressive targets? Suppose you receive two conflicting instructions: a) place a large amount of money this month, and b) do it carefully, with good judgment on customers’ creditworthiness. Aggressive targets force a trade-off between the two. Will fintech reduce or exacerbate aggressive sales targets? It may be too early to say. When in conflict, most of us behave ethically only when observed. Case in point: In a recent paper, Bibi Mehtab Rose-Palan cites the example of Wells Fargo, where at least two million deposit and savings accounts were opened in customers’ names without their consent. Rose-Palan concludes that the “morality diminishing” factor that led employees to commit this fraud was the aggressive sales targets set by the same company that called on them to behave ethically. This is a critical responsibility of investors: Set unrealistically high targets, and you will encourage staff to take behavioral shortcuts. If you don’t monitor staff behavior (did they treat the client well?) but do monitor staff performance (did they hit the sales target?), most employees will focus on the sales goal. If you instead promote ethical leadership across the board of directors, senior management, and the sales force, you will reduce this risk.
Focus on the end user
As investors in and providers of fintech, we have a responsibility to identify, acknowledge, and mitigate potential negative impacts on the very population we aim to serve; the microcredit industry has too often overlooked risks like over-indebtedness. This is not a call to halt innovation. Fintech will be the solution to the last mile of financial inclusion, but responsible fintech is what we want. Recent years have seen a number of fintech consumer protection initiatives surface; of particular relevance to investors are guidelines for responsible fintech. The ability to question our assumptions and to check where and how things might go wrong is a characteristic of responsible players. So is staying focused on the end user: taking account of the specific vulnerabilities of low-income customers, remaining accountable primarily to them, and exercising respect and good judgment. Remember, AI cannot substitute for human empathy and human judgment.