Implementing a Data-First Approach


Once you have nailed the “why” and the “what” explored in our previous blog, next comes the “how”. Kirstin Duffield, CEO of Morning Data, in collaboration with Hélène Stanway, delves into the next step: implementing a data-first approach through effective data management.

A Data-first World.

Data management can’t just be a case of sending the information as a flat file or via the apparently universal medium of spreadsheets. Whilst almost every organisation has and can use spreadsheets, they are not the answer in a data-first world.

To effectively move data from one place (database) to another, a data technologist will want proper databases and “proper” systems. That is not to say that the source and destination need to be the same; quite the opposite. The trick is to develop an executable service with the appropriate requirements to complete its intended purpose. For this, APIs (Application Programming Interfaces) are used: executable pieces of code. These executable pieces of code can:

  • be called to provide some information; for example, given an IMO number (a vessel’s unique identifier), the code can return the beam, the flag, the call sign, even its position
  • allow for data to be submitted and stored with a success message returned
  • and, and, and….
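As a sketch of the first kind of call, a lookup service keyed on IMO number might behave like this. The function name, field names, and sample record below are hypothetical, for illustration only; in practice this would be an HTTP call to a remote vessel-data service rather than an in-memory table.

```python
# Hypothetical sketch of an IMO-number lookup API.
# A small in-memory table stands in for the remote vessel database.

VESSEL_DB = {
    "9321483": {  # illustrative record, not real vessel data
        "beam_m": 32.2,
        "flag": "GB",
        "call_sign": "MABC5",
        "position": {"lat": 51.5, "lon": 0.7},
    },
}

def get_vessel_details(imo_number: str) -> dict:
    """Return vessel attributes for an IMO number, or raise if unknown."""
    try:
        return VESSEL_DB[imo_number]
    except KeyError:
        raise LookupError(f"No vessel found for IMO {imo_number}")
```

The caller supplies one identifier and gets back a structured record; the same contract holds whether the data lives next door or on the other side of the world.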

What else is needed?

Being data-first requires a set of common standards that can be sent, accepted, and invoked from a whole host of different systems and databases, making them universally suitable for the interchange of data.

If data is being sent, and the required standards have been met in terms of completeness and content, then appropriate credentials are needed to be able to process the return message, for example either a pass or a fail.

A common interpretation of the content is required so that data becomes infinitely useful. This may start with understanding that certain data points must be a number with a defined number of decimal places, as used for monetary amounts; others may accept alpha characters, such as a name, or alphanumeric values, like a UMR (Unique Market Reference). A field could be a date, could permit more than one format of that date, and convert them all into a common date format once ingested.
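A minimal sketch of that kind of field-level normalisation, assuming a monetary field fixed to two decimal places and a date field that accepts more than one input format before converting to a single common one:

```python
from datetime import datetime
from decimal import Decimal, ROUND_HALF_UP

def normalise_amount(raw: str) -> Decimal:
    """Monetary amount: must be numeric, stored to two decimal places."""
    return Decimal(raw).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Accept several common date formats; convert to one ISO format on ingestion.
DATE_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%d %b %Y")

def normalise_date(raw: str) -> str:
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date format: {raw!r}")
```

Whatever shape the date arrives in, every system downstream sees the same canonical form.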

Beyond the structure of numbers and letters, syntax, length and format, then come reference lists. Where there are easy-to-adopt global reference lists, like currencies and countries, these are commonly used. Within the insurance market, ACORD is the standards body that manages market-wide agreed sets of reference lists. Spoiler alert… these reference lists are a core component of the Core Data Record (CDR): the standards the London Market will need to adopt to enable data to be exchanged.

Once the payload of content has been agreed, this payload then needs a fixed message language, for example JSON (a language-independent data format derived from JavaScript): set out the data elements required, set a format for each field, and specify the reference lists for all those that can have one.
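Sketching what such a fixed-format JSON payload might look like; the field names and values below are illustrative, not the actual CDR schema:

```python
import json

# Illustrative payload: each field has a defined format, and fields such as
# currency draw their values from an agreed reference list (here, ISO 4217).
payload = {
    "umr": "B0001ABC1234567",        # alphanumeric Unique Market Reference
    "inception_date": "2024-01-01",  # one agreed date format (ISO 8601)
    "currency": "GBP",               # from the currency reference list
    "sum_insured": "1000000.00",     # number, two decimal places
}

message = json.dumps(payload)  # the fixed, language-independent wire format
```

Because the format is fixed and language-independent, any consumer in any technology stack can reconstruct exactly the same structure from the message.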

Even with the API structure and credentials, there is one remaining piece to consider: the business rules. Even where the format of the data payload has been complied with, the combination of values may still not be correct. It is often necessary to have some business rules, for example checking that the sum(s) insured are higher than the premium (assuming the same currency!). Depending on their purpose, these business rules may be light touch and simple or fairly complex.
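As an illustrative sketch, that sum-insured rule could be expressed as a small check over the payload (the field names here are assumptions, not from any real schema):

```python
def check_sum_insured_vs_premium(payload: dict) -> list:
    """Business rule: sum insured should exceed the premium,
    comparable only when both amounts share a currency."""
    errors = []
    if payload["sum_insured_ccy"] != payload["premium_ccy"]:
        errors.append("Cannot compare amounts in different currencies")
    elif payload["sum_insured"] <= payload["premium"]:
        errors.append("Sum insured must be higher than the premium")
    return errors
```

Format validation says the values are well-formed; rules like this say the values make sense together.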

If data quality and completeness are key goals, then these data, technology, and business rules combine to drive the benefits of a data-first organisation.

In the case of the CDR for the new digital gateway, there is a combination of rules and requirements, posed by multiple agencies, to support the three use cases of the CDR (accounting & settlement, tax & regulatory reporting, claims matching). For example, if the Policyholder is in Spain, and the type of contract is Insurance, then the Policyholder’s NIF code must be supplied. Extrapolating out all of the requirements for the CDR, there are hundreds of rules driven by combinations of data determined by the coverage and territory structure of the policy.
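The Spanish NIF example could be sketched as one such conditional rule. This is a deliberate simplification with assumed field names; the real CDR rule set runs to hundreds of such checks.

```python
def check_spanish_nif(payload: dict) -> list:
    """If the policyholder is in Spain and the contract type is insurance,
    the policyholder's NIF tax code must be supplied."""
    errors = []
    if (payload.get("policyholder_country") == "ES"
            and payload.get("contract_type") == "Insurance"
            and not payload.get("policyholder_nif")):
        errors.append("Policyholder NIF is required for Spanish insurance contracts")
    return errors
```

Each rule is small on its own; the complexity comes from the sheer number of combinations of coverage and territory they must cover.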

This is where a post-it note on the side of the broker’s screen is not going to cut it, and a RESTful JSON API service is the order of the day.

Gather data from whatever source or sources, package it up into a payload, and call the API: pass right first time and succeed, or get a rejection message back.

About the Author.

Kirstin Duffield is a data-first technologist in the London insurance market and the Managing Director/CEO of Morning Data, providing software and service solutions to the global insurance industry. Bonded by their mutual passion for helping organisations transform and innovate digitally, Kirstin and Hélène Stanway, r10’s Advisory Director, collaborate on a series of articles on the reasons, ways, and methods for companies in the insurance and reinsurance industries to bring a data-first approach to the centre of their operations.

Contact r10’s Advisory team for help navigating your digital transformation journey, including understanding, defining, and implementing your strategic priorities.

Get in Touch