How IAG is tapping data streaming
The insurance group is using Kafka’s data streaming capabilities to integrate disparate data sources and provide real-time data services to support its business
Australian insurance group IAG started using Apache Kafka around 2017 because the emerging technology of data streaming made sense for the company.
The first opportunity to use it was in support of a mobile application – the various stakeholders liked the idea, and the project was successful, said Burak Hoban, IAG executive manager of data platforms for data and risk.
The next step was to use Kafka to provide a common application programming interface (API) for all data warehouse ingestion. Hoban admitted it was probably early days for that, but there were at least two main reasons for going ahead.
First, a planned move into the cloud made data streaming look like an appropriate technology. Second, the company’s history of mergers and acquisitions had left it with an IT environment containing “one of everything”, so it needed a way to combine data from these disparate sources.
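In practice, a Kafka-based ingestion API means each source system publishes records to agreed topics, and the warehouse loaders consume from those topics rather than from each system directly. The following is a minimal sketch of such a producer; the broker address, topic and payload fields are illustrative placeholders, not IAG’s actual configuration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PolicyEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // acks=all waits for full replication -- appropriate for mission-critical events
        props.put("acks", "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by policy number keeps all events for one policy on the same partition,
            // so downstream consumers see them in order
            String key = "POL-12345";
            String value = "{\"event\":\"policy_updated\",\"policyId\":\"POL-12345\"}";
            producer.send(new ProducerRecord<>("policy-events", key, value));
        }
    }
}
```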
IAG subsequently moved from Apache Kafka to Confluent as it wanted an enterprise-grade partner that could provide a managed platform so that its own team of around 40 people could focus on higher-value work.
That migration was “pretty easy”, said Hoban, because at that time there were only a small number of use cases.
Working with Confluent, IAG is going through a multi-year transformation with individual projects such as consolidating 12 policy systems into one.
Kafka is being used for synchronous and asynchronous integrations, and as part of the migration from legacy systems. One of the goals is to provide policy and claims events as a real-time data service that can be consumed by analytics software. “We’re starting to get a lot of traction,” said Hoban.
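Consuming such a real-time data service amounts to subscribing to the relevant topics and processing events as they arrive. A minimal sketch of an analytics-side consumer is below, again with hypothetical topic and group names standing in for whatever IAG actually uses:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ClaimsAnalyticsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("group.id", "claims-analytics");        // each analytics app uses its own group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("policy-events", "claims-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand each event to the analytics pipeline as it arrives
                    System.out.printf("%s: %s%n", record.topic(), record.value());
                }
            }
        }
    }
}
```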
As for the future, he said IAG is going “all in on Confluent Cloud”. Kafka is already mission-critical – an outage affects the company’s ability to provide quotes – so Confluent Cloud’s 99.99% availability matters, and the managed service frees IAG staff from routine tasks such as software upgrades.
The company is continuing its data products journey (a data product is a reliable, trustworthy and governed package of data from one or more sources, curated to meet a business need). It will also be exploring the “very conceptual” world of streaming data meshes, a decentralised approach to data that is an alternative to data lakes and warehouses.
One of the goals is to provide a single view of policies, which would mean assembling data from 10 different systems.
Hoban stressed that the aim is to provide genuine real-time data, not merely more frequent batch updates.
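A single view built this way is typically a matter of merging each system’s event stream into one continuously updated topic, keyed by policy identifier, rather than periodically reloading batches. The Kafka Streams sketch below illustrates the idea with three made-up source topics standing in for the 10 systems; none of the names reflect IAG’s actual pipeline:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class SingleViewOfPolicy {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "policy-single-view"); // illustrative name
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Each legacy system publishes policy events to its own topic; merging them
        // yields one stream, keyed by policy ID, that updates the moment a source changes
        KStream<String, String> merged = builder.<String, String>stream("policies-system-a")
                .merge(builder.stream("policies-system-b"))
                .merge(builder.stream("policies-system-c"));
        merged.to("policies-unified");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the unified topic is fed by events rather than scheduled extracts, consumers see each policy change as it happens, which is the distinction Hoban draws between genuine real-time data and more frequent batch updates.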
Meanwhile, IAG is also investing heavily in artificial intelligence to support “next-best action” decisions, and real-time decisions require real-time data.
Hoban’s advice to those starting out with Kafka is to keep it simple and stick to the well-trodden path rather than trying to do something unique.