By Joe Robinson, Hummingbird CEO
Here’s a simple truth.
There is no way to understand the financial industry without also understanding our country’s long and difficult struggle against financial discrimination.
Discrimination and economic disenfranchisement go hand in hand, and some of the worst forms of racism and discrimination can be found where prejudice and bias meet the financial system. In fact, many of the most important pieces of legislation from the past half-century were written specifically to combat discriminatory lending practices. The Fair Housing Act (part of the Civil Rights Act of 1968), the Equal Credit Opportunity Act of 1974, and the Community Reinvestment Act of 1977 were all enacted as part of deliberate efforts to make banking more equitable. But the fight is far from over. Similar pieces of legislation (such as the Fair Access to Financial Services Act of 2020) are currently working their way through Congress. Just last month, the Consumer Financial Protection Bureau announced changes to its supervisory operations aimed at increasing governmental protections against illegal discrimination.
Discrimination and bias in compliance
Compliance and anti-money laundering (AML) efforts are focused on fighting financial crime. As an industry, we are united by the goal of helping stop the flow of illegal funds and track down the bad actors responsible. With such a virtuous mission, it can be tempting to think we are immune to concerns about discrimination and bias. This couldn’t be further from the truth. Investigators play a pivotal role in determining bank customers’ continuing access to financial services. Declaring a subject or activity “suspicious” has consequences that extend beyond the compliance function. Even when compliance investigations are approached without prejudice, there is still systemic, algorithmic, and unconscious bias to contend with.
At Hummingbird, we spend a lot of time thinking about these issues. In particular, we think about how our work can help reduce unconscious and systemic bias in compliance. Our goal is, above all, to build the best possible toolkit for compliance professionals, and designing products that help defend against discrimination is a crucial part of that.
Below are a few of the areas where our work to help combat discrimination and unconscious bias has borne fruit.
Increasing transparency and contextual awareness through the Hummingbird UI
The main way we communicate with the thousands of compliance professionals who use Hummingbird on a daily basis is through our product. We believe that tools matter – not just for improving efficiency and accuracy, but because they shape how you do your job. Compliance work is highly specialized, and we believe an investigator’s tools should be an extension of that specialized skill set. Hummingbird was built to empower investigators, giving them the confidence to make objective, evidence-based decisions.
To that end, the Hummingbird product interface is designed never to presume guilt. Many AML teams deal with transaction monitoring false positive rates that are well over 90% – meaning 9 out of 10 subjects investigated are doing nothing illegal. Knowing that it is the work of an investigator to sort through false positives (however many there are), it would be a mistake to make the subjects of these investigations look guilty by default.
Our main workspace – what we call the investigation canvas – is designed to present information in its most contextually relevant format. Because as far as an AML investigation is concerned, more context equals less conjecture. The investigation canvas goes beyond a simple list of recorded transactions to include dynamic displays of geography, location, and related entities. These help an investigator distinguish between what is suspicious and what is simply unusual or even normal. The ability to quickly and accurately spot the difference is essential to objective, non-discriminatory decision-making.
Why is it so important that our product interface never presume guilt? Simple. The decision to submit a Suspicious Activity Report to law enforcement is a weighty one, with serious consequences. Regardless of the eventual SAR-filing decision, the subject of an AML investigation may be assigned a higher risk rating by their financial institution. Too high a risk rating, and the institution may decide to sever ties with the customer – a terrible thing if the person has done nothing wrong. We remain vigilant of possible bias in order to make sure that our desire to catch criminals exploiting the financial system doesn’t come at the expense of normal, everyday people.
Creating a space where algorithms can receive scrutiny and oversight
Modern finance depends on algorithms. There’s simply no way for banks, the market, or modern money-movement to function without them. There are countless instances where algorithms help assess, review, and justify financial decision-making. Within the bank-customer relationship alone, algorithms can help determine whether or not to accept a customer looking to open a new account; then, once that account is given the green light, another algorithm may help the bank decide whether to approve, reject, or review transactions that customer makes. Additionally, algorithms play a role in customers’ access to financial products, helping determine things like creditworthiness, loan amounts, and risk ratings.
The fact that algorithms are such an integral part of the financial system makes reviewing and monitoring them essential for preventing discrimination. This is true both while an algorithm is in development and after launch. Any algorithm used to aid decision-making should be continuously monitored and refined as part of routine updates. Algorithms are improved through feedback loops, ideally overseen by a team of engineers with diverse backgrounds (a topic we’ll discuss in detail below).
Developing an unbiased algorithm depends on two things: 1) an in-depth understanding of how the algorithm is performing, and 2) a human expert to train, supervise, and approve how the algorithm draws its conclusions. Hummingbird was designed to be the place where the performance of algorithms (in our case, our clients’ risk rating or transaction monitoring systems) can be captured, tabulated, and analyzed. By delivering data back to teams in a format that facilitates clear-eyed assessment and insight, we help teams keep these algorithms on a schedule of frequent updates and regular health checks.
Building a diverse company
There is no industry where building a diverse company isn’t important. It’s as important to us as we design compliance tools as it is to our clients as they build compliance programs. The reasons for this are obvious. A company can screen for, refuse to hire, or dismiss individuals who engage in racist, sexist, or other discriminatory behaviors. But we are all susceptible to forms of unconscious or implicit bias, and it is in overcoming our own unintentional prejudices that a diverse staff is essential.
Teams made up of individuals from a variety of backgrounds (including different genders, races, ethnicities, ages, sexual orientations, abilities, economic classes, past work experiences, and more) benefit from different viewpoints and areas of expertise. The group as a whole gains more breadth and depth to draw on when making difficult decisions, resulting in actions that are more deliberate, sensitive, and compassionate. And not only do these teams generally find more success – they generally have more fun and are more creative as well.
As individuals, we are constrained by the limits of our learned and lived experiences. A team, however, has the opportunity to be more than the sum of its parts. A diverse team empowers us to think broadly and to look beyond the narrow, shortsighted views of monoculturalism.
At Hummingbird, we consider this to be a fundamental part of our mission, an integral part of everything we do.