Big data and misunderstanding in the insurance sector


John J. Masters, Technology Director, writes on innovation and technology

The FCA’s decision to investigate the use of big data by insurance companies is not surprising in itself – insurers are under pressure as never before, and analytics offers a superb opportunity to protect margin and define risk. Any change of this magnitude is likely to attract the attention of the regulator. More intriguing is the use of the phrase itself, which is well on the way to being discredited by the majority of truly forward-looking technology professionals and business leaders.

The premise behind big data was that large datasets were inherently full of game-changing insights, just waiting to be found, given the right computing power and number-crunching techniques. Throw enough analysis at enough data, so the theory went, and a whole host of lucrative intelligence would come to light. However, this is not only the antithesis of good analytics, it is the antithesis of what most insurers actually do in practice.

At this year’s Gartner Symposium, of which Metapraxis was a sponsor, one of the most-repeated phrases was ‘data is inherently dumb’, and the truth of that is self-evident. Any good manager will tell you that, in order to get a useful answer, you have to first ask an intelligent question, and the original interpretation of ‘big data’ missed out that key part of the process.

In the vast majority of cases, what the FCA’s probe into big data will look at isn’t big data at all, or at the very least not what purveyors of standard ‘big data’ technology would recognise as such. That doesn’t mean the datasets are not large, but they can generally be managed within a hierarchy and structure, and insurers analyse them with a specific question in mind. For example: How much more likely is a driver to have a crash if they regularly drive after 10pm? Is that affected by their age? Or their experience behind the wheel? That sort of insight requires a lot of statistical skill and technical capability, but it is far from speculative, and is based upon a firm understanding of the importance of a key metric to the organisation – in this case, risk.
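To make the distinction concrete, a question like “how much more likely is a crash for late-night drivers?” can be answered with a simple, targeted calculation rather than speculative data mining. The sketch below computes an odds ratio from a hypothetical 2×2 table; all counts and the scenario are invented for illustration, not drawn from any insurer’s data.

```python
# Hypothetical illustration: how much more likely is a crash for drivers
# who regularly drive after 10pm? All counts below are invented.

def odds_ratio(crash_late, no_crash_late, crash_early, no_crash_early):
    """Odds of a crash for late-night drivers relative to other drivers."""
    odds_late = crash_late / no_crash_late      # odds of a crash, late-night group
    odds_early = crash_early / no_crash_early   # odds of a crash, everyone else
    return odds_late / odds_early

# Invented counts: of 1,000 late-night drivers, 60 had a crash;
# of 9,000 other drivers, 270 had a crash.
ratio = odds_ratio(60, 940, 270, 8730)
print(f"Late-night drivers' crash odds are {ratio:.2f}x the baseline")
```

In practice an actuary would stratify by age and driving experience (the follow-up questions in the article), typically via a regression model, but the point stands: the analysis starts from a precise question about a known risk metric, not from an unstructured trawl.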

However, mis-labelled as it may be, the FCA’s intervention is not without value. While insurers’ use of data is generally well-governed and ethically sound, it is important that it aligns with customer perceptions and priorities, and does not cause the sort of misunderstanding that would prompt regulatory action. Therefore, insurers should take this as an opportunity, not to pause in the ongoing improvement of how data is used in the sector, but to look at how that use of data aligns with the wider goals of the business.

That means understanding how the actual calculation of premiums interacts with the other drivers of value in the business – not just risk and margin. Business leaders in the sector must find a way of mapping the relationships between these and the other factors which influence business success, including brand capital, market share, regulatory risk and consumer trends.
