Chapter 12: How Will Your Bank “Earn It” in the Future?

Up until this point we’ve talked about how your bank needs to “Earn It.” We’ve walked you through the importance of building relationships, the central role pricing plays, and how those relationships build your bank’s all-important brand.

But that’s all about where your bank needs to go now. When you get there, where will your bank go next?

Why Banks Should Really, Really Care About AI

The banking industry we all know today has been shaped by a handful of key transformations. There was the formation of the FDIC in the 1930s to restore public faith in banks. Then there was the phasing out of Regulation Q interest rate ceilings on deposit accounts in the 1980s. In 1994, the Riegle-Neal Interstate Banking and Branching Efficiency Act relaxed restrictions on interstate branching. And of course, there was the explosive growth of online and mobile banking over the last two decades, which has reshaped the way banks deliver their products and interact with their customers. Now we are on the brink of another generational transformation with machine learning and artificial intelligence.

(Note: While there are definitional differences between artificial intelligence and machine learning, for the purposes of simplicity, we’ll only use the term “AI” in this chapter.)

Many bankers will dismiss this notion as just the latest “trend of the day” among technology firms. After all, bankers have been exposed to many fear-mongering sales pitches, and they’ve learned to ignore the doomsday “your business is about to change!” kind of messaging. But this time it’s different, because AI is exploding across every industry and into countless different use cases.

A go-to example, and one you’re probably already familiar with, is the self-driving car. But AI is also at the root of many tools and products you use on a daily basis. Amazon, for example, uses AI to predict what products you will want to see, based on what you’re already viewing, what you’ve purchased in the past, and what others with similar context have purchased. And they are exceptionally good at it.

Similar approaches are used by Spotify to suggest what music you will like, by Apple to predict which apps you will buy, by Google to hone the perfect search results, and by Facebook to shape your personal news feed. There are other subtle examples, as well. Have you noticed airline confirmation emails now automatically generate calendar events for the flights? Or have you seen your iPhone or Facebook photo collections organized by the people in the pictures?

These things happen behind the scenes, and are rarely noticed or thought about by end users, but they have been game changers for each of the aforementioned companies. They are now taking that success and pushing AI front and center in our lives in the form of Siri, Alexa, Cortana, Google Assistant, and the like. These “bots” are powered by AI, and the more interaction we have with them, the better they “learn” how to respond to human voice commands, and accurately and efficiently answer queries.

As the possibilities for AI expand, the list of industries it can impact will grow. In fact, banking might be the perfect business for AI. To understand why, we need to go back a few years to a story that will be familiar to many experienced bankers.


Ed to the Rescue

Years ago, Dallas had the good fortune to work for a bank with pristine credit quality. This squeaky-clean portfolio was fiercely protected by Ed, who was one of those classic, old-school credit guys. Ed had minimal formal credit training, and the bank didn’t rely on any sophisticated modeling or algorithms for monitoring risk. Instead, they relied on Ed’s gut instincts.

Ed had a way of sniffing out bad deals, and several team members looked forward to the weekly loan committee meeting, during which Ed would tear into the latest poor sap that dared to bring a wobbly deal for approval. After one of the more contentious meetings, Ed was asked how he was able to quickly spot tiny flaws that the analysts had missed after hours of work, and why he was such a stickler about them.

“I learned this business in the ‘80s, and had to help clean up two banks before I was 40. The bad news was I missed way too much time with my family. The good news was that I saw first-hand every conceivable way a deal could bite you in the ass. And I’ve decided that I am NEVER doing another clean up.”

Ed couldn’t always put his finger on why a deal was bad, but he had learned to trust himself when something just felt “off.” The bank passed on a lot of deals based on those feelings, and their competitors gladly jumped on them. A whole lot of them ended up defaulting.

Obviously Ed wasn’t some kind of Nostradamus of banking. Instead, he was spotting patterns and correlations, even if he was doing it subconsciously. He knew he’d seen similar situations before, and they had ended badly. Most banks used to be run this way. It was one of those approaches that worked well … until it didn’t.

When Ed’s Not Enough

Why? Because some banks didn’t have quite as good a version of Ed. And some banks outgrew their Ed, and got big enough that they couldn’t give the personal smell test to every single deal. Much of the industry simply ran out of enough Eds who had cut their teeth in the bad times. A lot of banks were using an Ed who had never seen a true credit correction.

The real issue is that humans are actually pretty bad at spotting and acting on patterns. Our lizard brain – the part of the mind responsible for our instinctive reactions, like “fight or flight” – leads us astray far more often than we realize. That was true even at Ed’s bank; he may have kept the portfolio safe, but he did so at a huge opportunity cost. The growth the bank eked out was slow and painful, and being a stickler on quality meant passing on a lot of profitable business.

The Faulty Lizard Brain

So how could the very thing that made Ed so valuable – his ability to use limited and ambiguous data to make an instinctive prediction – also turn out to be a problem? Because when we lean heavily on gut instinct we open ourselves up to multiple cognitive biases that also make their homes in the lizard brain. A few are particularly prevalent in banking.

Confirmation Bias

This is the natural tendency of people to favor information that confirms their preexisting belief or hypothesis. In fact, we not only favor this confirming information, we actually seek it out at the exclusion of any contradictory information. Those perpetual market bears that always see a crash around the corner (and yes, every office has at least one)? They go seeking ugly data that confirms that expectation, and are much more likely to remember and believe that data than any positive economic data they might find. It’s become an easier trap to fall into in today’s world, with its unlimited – and easily accessible – supply of data and opinions.

Sunk Cost Bias

Instead of being ignored, sunk costs (those costs that have already been incurred and cannot be recovered) are often a major driving force in determining ongoing strategy at banks. You’ve probably heard one of these maddening arguments before:

• We spent X dollars on the software, so we are sticking with it even though it is clearly not the best solution.
• We took those advances to hedge those specific loans, so we cannot pay them off early.
• We started moving rates down last month, so we can’t move them up this month.
• We have invested too much time and money in that branch to close it.

You get the idea.

Incentives Bias

This is pretty straightforward: humans figure out the type of behavior that’s rewarded and then shape their behavior accordingly. If you put in a compensation system that rewards lenders for growth, then you’re going to get a bias toward closing deals. Sometimes that’s fine, but often that means the bank winds up with some questionable, high-risk loans on its books. Flip the script and reward lenders for minimizing risk and you’ll get high quality deals … but not too many of them. Finding the right balance with incentives is tricky; failure to do so leads to short-term behaviors that are detrimental in the long run.

Inertia Bias

Or as Carl calls it: “We’ve always done it this way” bias. It’s maddening but sadly, not uncommon.

Case in point: Carl’s had several discussions with banks that are having an issue with loans leaving the bank through natural amortization. When he brings up the idea of switching those customers to an interest-only loan he often gets this response:

“We never thought of it that way … but we can never do that because folks believe that you just have to amortize things.”

You may have just chuckled a bit, but if you give it some thought, chances are you’ve got an example of inertia bias in your bank as well.

Eds Everywhere

Despite all those potential traps set by the lizard brain, the story of Ed probably still rings true for a lot of bankers. The surprising thing isn’t that some banks still handle credit risk this way; it’s how many other kinds of decisions are handled in the exact same manner. Most banks have an Ed for credit, for pricing, for investments, for security, and for every other significant function they handle. And almost all of them are, when you get right down to it, flying by the seat of their pants.

Other businesses look like this, too. A doctor diagnoses based on both the latest medical tests and their own judgement, which has been honed over hundreds or thousands of similar cases. A lawyer suggests legal strategy based on precedent and their own case history.

But in each of these examples, the human is limited by two things. First, how many experiences do they have that fit the exact same criteria? Usually it numbers in the dozens or low hundreds, and it’s not enough to be statistically significant. Second, are they pulling off the Herculean task of avoiding all the cruel tricks our minds play on us? The lizard brain is a powerful foe to overcome.

The Beauty of Cold Calculation

That brings us back to AI. Humans may be limited in spotting patterns, but luckily, we have help in the form of computers. And they were literally created to do just this.

Your best, most experienced bankers might see a particular type of loan a few hundred times in their career. A computer with access to a decent data set can review a few hundred thousand in a split second. It will find patterns that we miss, and it won’t be subject to all of our little flaws. You don’t even need your experienced humans to teach the software how to spot a bad deal. Just like Siri and Cortana, it will learn that all on its own.
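To make that concrete, here is a minimal sketch of the idea, assuming a hypothetical extract of historical loans. The file name, the column names, and the “defaulted” label are all placeholders, not a prescribed data model; the point is that the pattern comes from past outcomes rather than hand-written rules.

```python
# Minimal sketch: learning to flag risky deals from historical loan outcomes.
# Field names below are hypothetical; substitute whatever your systems export.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

loans = pd.read_csv("historical_loans.csv")            # hypothetical extract
features = ["loan_to_value", "debt_service_coverage",
            "borrower_tenure_years", "utilization_at_origination"]
X, y = loans[features], loans["defaulted"]              # 1 = the loan went bad

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier()                    # learns the pattern from the data itself
model.fit(X_train, y_train)

# Score deals the model has never seen and check how well it separates good from bad.
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The particular model doesn’t matter; what matters is that nobody had to sit down and write out Ed’s rules of thumb.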

Why Now? (A Brief History of AI)

AI may seem like a futuristic technology that is just now taking shape, but it was actually born in the 1950s, when famed cryptologist and computer scientist Alan Turing proposed a test of whether a machine could carry on a conversation indistinguishable from a human’s. That soon led to AI becoming an official field of research at Dartmouth College in 1956. After massive funding cuts in the 1970s created what came to be called “the AI winter,” the field eventually reemerged to show progress in the mid-1980s, and again in the mid-1990s.

The technology started to become more mainstream in 1997, when IBM’s Deep Blue defeated Garry Kasparov in chess. A later IBM creation, Watson, learned how to play (and win) Jeopardy in 2011. And in 2016, Google DeepMind’s AlphaGo finally achieved what many thought was the final frontier in human gaming superiority, beating Lee Sedol at the game of Go. While these feats were newsworthy, they were also somewhat trivial. They were achieved by research labs, and had more of a parlor-trick feel than a real, purposeful application.

The technology wasn’t all that complicated; there were just two severe bottlenecks.

First, computing power was too scarce. Cranking through millions of computations in a short time required enormous computing power, and only a few places in the world had access to enough of that power to really make it work. Second, there weren’t very many data sets worthy of machine learning. Data storage was prohibitively expensive, so most data was abandoned to the ether. Both of those limitations are quickly eroding.

The proliferation of cloud computing has made growing amounts of processing power and data storage available for mere pennies. Today it’s cheaper to store all data than it is to take the time deciding which data is important and which can be thrown out. Because of this, there are now enormous untapped piles of data sitting all over the world. And, anyone with a valid credit card can access Microsoft Azure or Amazon’s AWS and spin up as many servers as needed to power any conceivable software.

With all that horsepower and raw material (data) available, computers are able to learn like never before.

Why Banks?

While this technology is being embedded everywhere, banking might be the industry in which AI can have the greatest impact.

1: The data is already there

We hear almost daily from bankers that their data “is messy” and isn’t organized well enough to be useful. But that simply isn’t true. Banking data is some of the best organized on the planet. All banks have been storing similar data for decades, in mostly consistent formats and in systems that all look alike. True, those systems are old and difficult to interact with (for example, using codes like “102” to represent “Wall Street Prime” to save a little memory). And the data often sits in disconnected silos. But it exists, and it’s in digital form. Think about the state of the data for most of your customers. What kind of data do your real estate developers, attorneys, or doctors have?

Banking is a treasure trove of useful data, just waiting for the right solutions to combine all of those separate sources. Once the data is combined, AI can start mining for trends and relationships between variables we never knew existed. When banks start taking data they’d previously used only for processing transactions and start using it to deliver intelligence back into their systems, a whole new world of possibilities opens up. They can go from simply looking at outcomes to looking at the choices that were made along the way, and why. Then they can use that information to determine what to do in the future in similar circumstances.
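As an illustration of that combining step, here is a minimal sketch, assuming two hypothetical core-system extracts and a made-up code table. The file names, the “customer_id” key, and the “102” mapping are placeholders; legacy codes get translated into readable labels, and the silos get joined into one customer-level view.

```python
# Minimal sketch of stitching siloed extracts together.
# File names, keys, and the code table are illustrative assumptions.
import pandas as pd

deposits = pd.read_csv("deposit_extract.csv")     # one silo
loans = pd.read_csv("loan_extract.csv")           # another silo

# Legacy systems store terse codes; translate them into something readable.
RATE_INDEX_CODES = {"102": "Wall Street Prime", "205": "1-Month SOFR"}
loans["rate_index"] = loans["rate_index_code"].astype(str).map(RATE_INDEX_CODES)

# Join the silos on a shared customer identifier so a model can see the whole relationship.
customer_view = loans.merge(deposits, on="customer_id", how="left")
print(customer_view.head())
```

Nothing here is exotic; the value comes from finally putting the pieces side by side so the patterns across the whole relationship become visible.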

2: In AI, context is king

Have you ever asked Siri to tell you a joke? Or tried to get specific information about a local restaurant? Sometimes the results are phenomenal, and you get exactly the answer you were looking for, no typing required. Other times? Not so much.

The reason is that this technology is really hard to execute. Apple has to start with some idea of what users might ask. Then, yes, Siri can learn from all of those subsequent interactions, but the list of possible questions and tasks is literally limitless. They have to be prepared for anything, because someone will eventually ask Siri for it.

But in banking, we have much better context. For starters, any bots or intelligent digital assistants we create won’t need to be able to tell a joke or choose a restaurant. They will be dealing with something pretty darned specific: a financial transaction. And they’ll have contextual information about the user, in the form of financial statements and their previous relationship with the bank. When you narrow the field down to answering questions about commercial banking, the level of AI difficulty drops considerably.
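A toy sketch of that narrowing, using a handful of hypothetical intents and trigger phrases (none drawn from a real product), shows why a constrained domain is so much easier to serve than an open-ended assistant:

```python
# Toy sketch of why a narrow domain helps: a banking assistant only has to
# recognize a handful of intents, not "anything a human might ask."
# The intents and phrases are illustrative assumptions.
BANKING_INTENTS = {
    "check_balance": ["balance", "how much do i have"],
    "payoff_quote": ["payoff", "pay off my loan"],
    "rate_inquiry": ["rate", "what would my rate be"],
}

def route_intent(utterance: str) -> str:
    text = utterance.lower()
    for intent, phrases in BANKING_INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "hand_off_to_banker"   # anything unrecognized goes to a human

print(route_intent("What's the payoff on my equipment loan?"))  # -> payoff_quote
```

Real assistants use far more sophisticated language models than keyword matching, but the economics are the same: a small, well-defined set of banking intents is a tractable problem, while “anything a human might ask” is not.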

3: Banking relies on judgement calls from experienced executives

When you boil it down to its core, banking is really the business of risk. We spend our days answering one fundamental question:

How much and what kinds of risks are involved in this transaction, and is the return sufficient to justify the allocation of capital?

Bankers have spent decades building ever more sophisticated tools for measuring and monitoring risk, but eventually, in every meaningful transaction, a human makes a decision to answer that specific question. No matter how fancy the algorithm, or how many tabs on the spreadsheet, a person is deciding based on a combination of those model outputs and their own personal experience. Again, like our friend Ed, how many deals like this have they seen, and what was the outcome?

Software doesn’t have to be limited to the dozens of deals meeting specific criteria that a banker might see in a career; it has access to many thousands of data points that can be sorted and analyzed ad nauseam. And it will be far less likely to have the kinds of biases that can afflict even the most self-aware of humans. In short, when we are making decisions based on experience, humans are not even close to being a match for software. It is impossible for us to replicate the sheer volumes computers were built to handle.

Does this mean we should leave banking to the machines? Absolutely not! If anything, the need to have top-notch lenders will become ever more acute. An article in the Harvard Business Review, “The Simple Economics of Machine Intelligence,” explains why.

“The first effect of machine intelligence will be to lower the costs of goods and services that rely on prediction.”

Translation: A bank’s overhead lowers as it becomes cheaper to crunch the numbers and calculate the risks.

“When the cost of any input falls so precipitously, there are two other well-established economic implications. First, we will start using prediction to perform tasks where we previously didn’t.”

Translation: Banks will be able to look at deals from all sorts of different angles that were previously not possible. For example, with AI you’ll be able to predict with much greater accuracy just how much a customer will use that line of credit they’re applying for.

“Second, the value of other things that complement prediction will rise.”

Translation: There are human nuances that machines don’t understand, and all of those same meaningful transactions will still need a person to make the final decision. When the predictive powers become table stakes, the decision-making of lenders will likely be the determining factor on many deals.

With AI, a person can make a far more informed decision. Well-designed software does much more than simply analyze reams of data; it actually boils that data down to useful and contextual insight that can be used to augment the human decision. It’s why we often talk about AI as “Augmented Intelligence” or “Amplified Intelligence.”
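Returning to the earlier example of predicting how heavily a customer will use a line of credit, here is a hedged sketch of what that kind of augmentation might look like, assuming a hypothetical utilization history (all file and column names are placeholders): the software produces an estimate, and a lender decides what to do with it.

```python
# Hedged sketch: estimating how heavily a customer will draw on a requested
# line of credit. Column names and values are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

history = pd.read_csv("line_of_credit_history.csv")
features = ["committed_amount", "annual_revenue", "industry_code", "deposit_balance_avg"]
X, y = history[features], history["peak_utilization_pct"]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# A new request comes in; the model offers one more well-grounded input.
new_request = pd.DataFrame([{
    "committed_amount": 500_000, "annual_revenue": 2_400_000,
    "industry_code": 2361, "deposit_balance_avg": 85_000,
}])
print("Predicted peak utilization (%):", model.predict(new_request)[0])
```

The output isn’t a decision; it’s context that makes the lender’s judgement sharper.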

Banking is a (Human) Relationship Business

AI is really just the next step in the improvement process for banks. Because at the end of the day, banking is still about humans connecting with each other.

That connection isn’t made on the golf course, by sending a holiday card, or during a power lunch. It’s created through pricing, when a borrower sits down with a banker and they work together to create a deal that benefits both sides.

The relationships forged through those connections are the foundation upon which your bank’s brand – and its future – rests. They’re the key to everything.

Put simply, they’re what will enable your bank to Earn It – now and in the years to come.
