April 18, 2019
Our Weekly Newsletter
Across Instinctif Partners’ Financial Services team, we keep a close eye on the key developments taking place across the sector to evaluate their impact on the many businesses we work with. Here we share our picks of the week’s most interesting news, and our expert views.
Payments Patter Evolves
The rapid pace of change in the payments industry is also changing the language people use to talk about money, according to a new survey from Barclays’ money-sharing app Pingit. Popular new terms include “tapping” when talking about card payments, or “pinging over” money. “Notes” and “dosh” are the most popular informal terms for money, though the survey noted significant regional variations. As technology continues to evolve, this fast-moving area of language will continue to adapt with it. (From AOL, 13 April 2019).
IT Meltdowns Tarnish Credit Scores
Several of the UK’s biggest banks have experienced high-profile IT failures in recent months, and new research suggests nearly two in three victims have found errors on their credit scores as a result. Banks’ IT meltdowns can result in payments failing to enter or leave accounts as expected, causing missed payments and placing a black mark on some consumers’ credit scores. Many of these mistakes are still going under the radar, with only a quarter of those who heard about a major IT outage at their bank subsequently checking their credit report. (From Daily Mirror, 15 April 2019).
AIM Market Suffers Slow Start
Flotations on London’s Alternative Investment Market (AIM) fell to the lowest level for a decade in the first three months of 2019, with political and economic uncertainty cited as a major cause. Just one company offered up shares to the City’s junior stock market in the first quarter of 2019, compared to nine offerings in the same period last year. Experts suggest Brexit is leading to a “wait-and-see” approach among companies considering floats. (From City AM, 15 April 2019).
Ghost Brokers Spook Drivers with Insurance Scams
So-called “ghost brokers” are using Instagram to target young drivers – who typically have high motor insurance costs – with tantalisingly cheap insurance policies. The broker will offer to set up the policy on the driver’s behalf and present them with what appear to be legitimate documents. However, when the driver attempts to claim, they can find that the policy is invalid and are hit with huge bills. The number of investigations into ghost broking has risen 81% since 2015, as social media continues to expand its reach. (From Telegraph, 16 April 2019).
Fintech Pay Gap Furore
The latest bout of gender pay gap data reporting has revealed pay disparity in the fintech industry to be worse than in investment banking and asset management – both sectors which have faced criticism for diversity shortcomings. The handful of fintech companies large enough to file figures (those with 250 or more staff) reported an average gender pay gap of 31.5%. The figures – described by Zopa’s chief executive as “far from ideal” – jar with the sector’s progressive and disruptive image. (From Financial News, 15 April 2019).
Ensuring big data doesn’t lead to dystopia
This week the Institute of Chartered Accountants in England and Wales (ICAEW) warned that the financial services industry should be cautious when using ‘big data’ to make business decisions – or risk losing its true social purpose.
More than a third of financial services firms already use internal data and insight to drive “big decisions” within their businesses. This might come from customer emails, complaints, support enquiries or even statements on social media. Algorithms and statistical models then use that data without human direction, instead relying on patterns to make choices.
But, by handing business decisions over to machines, could financial services lose its way?
“Recommendation engines” are already commonplace for a number of web-based services. Amazon, for example, uses algorithms to offer customers tailored shopping recommendations based on many data points beyond simply their shopping history. In fact, a third of our Amazon purchases can be attributed to machine recommendation. Many banks are now using the same sort of technology, with some beginning to ‘nudge’ customers towards smarter savings solutions and new products and services.
Automating business decisions can bring efficiencies and opportunities to uncover untapped revenue streams. But critics note it also cedes a level of control to machines, which can have unintended consequences. One fear is that allowing machines to make lending decisions could create a feedback loop: flawed or incomplete historical data amplifies racial bias, which could then prevent people of a certain race or ethnicity from getting credit.
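To see how such a feedback loop can take hold, consider a minimal sketch (with invented numbers, not drawn from the ICAEW’s analysis): a lender estimates each group’s repayment rate from historical data and only lends to groups whose estimate clears a threshold. Because rejected applicants never generate repayment records, a flawed initial estimate for one group can never be corrected – even if that group repays at exactly the same true rate as everyone else.

```python
def simulate(estimates, true_rate=0.9, threshold=0.6, rounds=10, n=1000):
    """Repeatedly 'retrain' on outcomes that only approved applicants produce.

    estimates: each group's starting repayment-rate estimate, derived from
    (possibly flawed or incomplete) historical data.
    """
    data = {g: {"repaid": int(est * n), "seen": n} for g, est in estimates.items()}
    for _ in range(rounds):
        for group, stats in data.items():
            estimate = stats["repaid"] / stats["seen"]
            if estimate >= threshold:
                # Approved applicants repay at the true rate, adding fresh data.
                stats["repaid"] += int(true_rate * n)
                stats["seen"] += n
            # Rejected groups contribute no new data at all, so their
            # (possibly wrong) estimate is frozen in place.
    return {g: round(s["repaid"] / s["seen"], 2) for g, s in data.items()}

# Both groups genuinely repay at the same rate (0.9), but group_b's flawed
# starting estimate locks it out of credit indefinitely.
print(simulate({"group_a": 0.85, "group_b": 0.55}))
# → {'group_a': 0.9, 'group_b': 0.55}
```

The model’s estimate for the approved group drifts towards the true rate, while the rejected group’s estimate never moves – the machine simply never observes the applicants it turns away.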
This has already happened in the insurance sector – only last year, some insurers’ systems were accused of automatically offering people with “ethnic sounding” names less competitive rates.
Our financial services industry already has in-built biases – for example, we price insurance differently by sex and by age. But what if a machine carried insurance’s accepted biases into mortgage decisions?
The Government is investigating this issue with the creation of a Centre for Data Ethics and Innovation (CDEI) to assess how algorithms are being used in business and legal decision-making, and whether they are causing unintended bias or harm.
Of course, there is a school of thought that argues bias and a lack of transparency are human traits that can eventually be programmed out of an algorithm. Some also argue that using algorithms and big data to pick directors and board members can create better corporate governance at the top of firms.
There is little doubt that businesses have made their minds up, and machine learning is a key part of the future for financial services. The challenge for the industry will be to retain enough control to ensure customers are treated fairly, and to openly communicate the reasoning – whether human or machine – behind its decisions.