Canada Focus: Open Banking has a broken incentive model, let’s fix it

OpenBankingExpo,
05 Feb 2020

Regardless of whether you are the CEO or the janitor, the buyer or the seller, the cop or the crook, we are all driven by incentives: negative incentives like the penalties for disregarding regulations, positive incentives like a good return on investment, and, most effective of all, the two structured to work together.

So far, jurisdictions have mostly kick-started Open Banking ecosystems with regulatory requirements: “Release the raw data or face penalties”. This is a very effective way to get things growing quickly at first, but much less so after that initial spike.

Regulatory requirements tend to incentivize bare-minimum compliance because they reward nothing beyond it. Furthermore, in most of today’s Open Banking jurisdictions, data contributors bear escalating costs as their data becomes more popular with requestors. The more value their data adds to the ecosystem, the more they are penalized for putting out a “good product”. Any rational organization (big or small) in that position is unlikely to invest in anything more than a second-rate, slow, minimum-effort offering.

Raw data is like raw iron ore: it has value, but it usually needs to be refined before it is useful and reliable for insight. Most data science projects spend 40-70% of their time just cleaning and reworking data into the right form before they can even start to build value from it. Raw payments data, for example, is often riddled with wrong store addresses, names, industry classifications and so on, as the sketch below illustrates. Drawing conclusions from it can be like trying to use a map where 10% of the roads and towns are in the wrong place.
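
To make that refinement burden concrete, here is a minimal sketch in Python (using pandas) of the kind of clean-up a requestor typically has to do on raw payments records before any analysis; the field names, sample values and correction rules are hypothetical illustrations, not a real Open Banking schema.

```python
import pandas as pd

# Hypothetical raw payments extract: merchant names, cities and industry
# codes arrive inconsistent, missing or simply wrong (fields are illustrative).
raw = pd.DataFrame({
    "merchant_name": ["ACME COFFEE #102", "Acme Coffee", "ACME-COFFEE 102 "],
    "merchant_city": ["Toronto", "Tornto", "Toronto"],   # typo in the source data
    "industry_code": ["5814", None, "5999"],             # missing / misclassified
})

# Step 1: normalise merchant names so the same store is recognised as one entity.
raw["merchant_name_clean"] = (
    raw["merchant_name"]
    .str.upper()
    .str.replace(r"[^A-Z ]", " ", regex=True)  # drop punctuation and store numbers
    .str.replace(r"\s+", " ", regex=True)
    .str.strip()
)

# Step 2: correct known address errors against a reference list that each
# requestor has to build and maintain on their own.
city_corrections = {"Tornto": "Toronto"}
raw["merchant_city"] = raw["merchant_city"].replace(city_corrections)

# Step 3: fill missing industry codes with the most common code observed for
# the same (cleaned) merchant.
mode_code = (
    raw.groupby("merchant_name_clean")["industry_code"]
    .transform(lambda s: s.dropna().mode().iloc[0] if not s.dropna().empty else None)
)
raw["industry_code"] = raw["industry_code"].fillna(mode_code)

print(raw[["merchant_name_clean", "merchant_city", "industry_code"]])
```

Every requestor consuming the same unrefined feed ends up writing some version of this, which is exactly the duplicated effort described below.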

If this unrefined output is what is shared, every requestor must rediscover the same errors and build their own mechanisms for fixing them. At a national scale, that is an immense amount of duplicated, wasteful work. Would it not be better for everyone if data contributors instead had a compelling reason to offer the best data they could?

Perhaps sharing raw data is only part of what we should be aiming for. What would we have to do differently to create the conditions for a self-scaling, self-improving, safe data exchange ecosystem for Canadians? To start, we would need to reward players who invest their resources in making the ecosystem better, while also making it easy to identify and penalize players who make it worse.

If we get things right, the ecosystem would organically draw in increasing numbers of voluntary data contributors. Those who are mandated to provide minimums would now have a compelling reason to contribute more refined, useful data, and entire new businesses would arise specializing in collecting and recombining data in new ways that add value to the ecosystem, all because there would be clear economic rewards for doing so.

The one caveat is that every data access and use case must have informed consumer consent. Fortunately, this requirement will naturally encourage a share of the value to be passed on to consumers in exchange for their “yes”.

One of the most powerful attributes of data is its non-consumable nature. The same data can be used by five, ten or 50 different parties for different purposes, stacking the value of each use. Because of this, there is likely plenty of untapped value in financial data to healthily incentivize all players, including consumers.

Every collaborative ecosystem that has ever successfully scaled organically has had in its DNA a set of incentives aligned with the behaviours that increase the ecosystem’s value. If that kind of healthy scaling is what we want to accomplish with Open Banking, we will need to fix its broken incentive model.


Marc Folch leads the innovation research, emerging solutions & Open Banking function at Interac Corp. Canada.