
Embedded analytics emerges to offer new level of business intelligence

Analytics suppliers are increasingly offering embedded analytics – built into applications, often running in real time and driven by APIs – as a way to reach the next level of business intelligence


Business analytics is an increasingly powerful tool for organisations, but one that is associated with steep learning curves and significant investments in infrastructure.

The idea of using data to drive better decision-making is well established. But the conventional approach – centred around reporting and analysis tools – relies on specialist applications and highly trained staff. Often, firms find they have to form teams of data scientists to gather the data, manage the tools and build queries.

This creates bottlenecks in the flow of information, as business units rely on specialist teams to interrogate the data and report back. Even though reporting tools have improved dramatically over the past decade, with a move from spreadsheets to visual dashboards, there is still too much distance between the data and the decision-makers.

Companies and organisations also face dealing with myriad data sources. A study from IDC found that close to four in five firms used more than 100 data sources and just under one-third had more than 1,000. Often, this data exists in silos.

As a result, suppliers have developed embedded analytics to bring users closer to the data and, hopefully, lead to faster and more accurate decision-making. Suppliers in the market include ThoughtSpot, Qlik and Tableau, but business intelligence (BI) and data stalwarts such as Informatica, SAS, IBM and Microsoft also have relevant capabilities.

Embedded analytics adds functionality to existing enterprise software and web applications. That way, users no longer need to swap into another application – typically a dashboard, or even a BI tool itself – to look at data. Instead, analytics suppliers provide application programming interfaces (APIs) to link their tools to the host application.
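In practice, the pattern typically involves the host application calling the analytics supplier's API and rendering the returned data in its own interface. The sketch below is a minimal illustration of that pattern in Python, assuming a hypothetical REST endpoint and bearer token rather than any specific supplier's API.

```python
import requests

# Hypothetical service URL, token and endpoint shape -- illustrative only,
# not any particular supplier's API.
ANALYTICS_BASE_URL = "https://analytics.example.com/api/v1"
API_TOKEN = "replace-with-a-real-token"


def fetch_embedded_chart(dashboard_id: str, chart_id: str) -> dict:
    """Fetch chart data so the host application can render it inline,
    rather than sending the user off to a separate BI tool."""
    response = requests.get(
        f"{ANALYTICS_BASE_URL}/dashboards/{dashboard_id}/charts/{chart_id}",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"format": "json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # An ERP or CRM screen would call this and render the payload in its own UI.
    print(fetch_embedded_chart("supply-chain-overview", "on-time-delivery"))
```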

Embedded analytics can be used to give mobile and remote workers access to decision support information, and even potentially data, on the move. This goes beyond simple alerting tools – systems with embedded analytics built in allow users to see visualisations and to drill down into live data.


The technology is even being used to provide context-aware information to consumers. Google, for example, uses analytics to present information about how busy a location or service will be, based on variables such as the time of day.

Indeed, some suppliers describe embedded analytics as a “Google for business” because it allows users to access data without technical know-how or an understanding of analytical queries.

“My definition generally is having analytics available in the system,” says Adam Mayer, technical product director at Qlik. “That’s not your dedicated kind of BI tool, but more to the point, I think it’s when you don’t realise that you’re analysing data. It’s just there.”

The trend towards embedding analytics into other applications or web services reflects the reality that there are many more people in enterprises who could benefit from the insights offered by BI than there are users of conventional BI systems.


Firms also want to improve their return on investment in data collection and storage by giving more areas of the business access to the information they hold. And with the growth of machine learning and artificial intelligence (AI), some of the heavy lifting associated with querying data is being automated.

“We are trying to give non-technical users the ability to engage with data,” says Damien Brophy, vice-president for Europe, the Middle East and Africa (EMEA) at ThoughtSpot. “We’re bringing that consumer-like, Google-like experience to enterprise data. It is giving thousands of people access to data, as opposed to five or 10 analysts in the business who then produce content for the rest of the business.”

At one level, embedded analytics stands to replace static reports and potentially dashboards too, without the need to switch applications. That way, an HR or supply chain specialist can view and – to a degree – query data from within their HR or enterprise resource planning (ERP) system, for example.

A field service engineer could use an embedded analysis module within a maintenance application to run basic “what if” queries, to check whether it is better to replace a part now or carry out a minor repair and do a full replacement later.
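Such a what-if check ultimately comes down to comparing expected costs under each option. The sketch below is a deliberately simplified illustration, with hypothetical figures and a made-up cost model rather than anything a real maintenance application would ship with.

```python
def cheaper_option(repair_cost: float,
                   replacement_cost: float,
                   failure_risk_after_repair: float,
                   unplanned_failure_cost: float) -> str:
    """Expected cost over the next maintenance window, assuming a repair
    defers the full replacement beyond that window but carries a risk of
    unplanned failure in the meantime. All figures are illustrative."""
    replace_now = replacement_cost
    repair_now = repair_cost + failure_risk_after_repair * unplanned_failure_cost
    return "replace now" if replace_now <= repair_now else "repair now, replace later"


# Hypothetical numbers: a cheap repair with a modest failure risk
print(cheaper_option(repair_cost=150.0,
                     replacement_cost=1200.0,
                     failure_risk_after_repair=0.1,
                     unplanned_failure_cost=4000.0))
# -> "repair now, replace later" (expected cost 550 vs 1,200)
```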

Embedded analytics to help decision-making

Customer service agents, meanwhile, are using embedded analytics to support decision-making and to tailor offers to customers.

Embedded analytics systems are designed to work with live data and even data streams, although users do not always need to drill down into the data themselves. Enterprises are likely to use the same data to drive multiple analysis tools: the analytics, business development or finance teams will use their own tools to carry out complex queries, while a field service or customer service agent might need little more than a red or green traffic light on their screen.

“The basic idea is that every time your traditional reporting process finds the root cause of a business problem, you train your software, either by formal if-then-else rules or via machine learning, to alert you the next time a similar situation is about to arise,” says Duncan Jones, vice-president and principal analyst at Forrester.

“For instance, suppose you need to investigate suppliers that are late delivering important items. In the old approach, you would create reports about supplier performance, with on-time-delivery KPI [key performance indicators] and trends, and you’d pore through them looking for poor performers.

“The new approach is to create that as a view within your home screen or dashboard, continually alerting you to the worst performers or rapidly deteriorating ones, and triggering a formal workflow for you to record the actions you’ve taken – such as to contact that supplier to find out what it is doing to fix its problems.”
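As a deliberately simplified flavour of the if-then-else rules Jones describes, the sketch below flags suppliers whose on-time-delivery KPI falls below a threshold, or drops sharply between periods. The thresholds, field names and data are all made up for illustration.

```python
from dataclasses import dataclass

# Illustrative thresholds -- in a real deployment these would come from the
# reporting process that identified the root cause.
OTD_FLOOR = 0.90   # alert if on-time delivery falls below 90%
TREND_DROP = 0.05  # ...or drops by more than 5 points versus the last period


@dataclass
class SupplierKPI:
    name: str
    on_time_delivery: float           # current period, 0-1
    previous_on_time_delivery: float  # prior period, 0-1


def alerts(kpis: list[SupplierKPI]) -> list[str]:
    """Flag poor or rapidly deteriorating suppliers so the host application
    can surface them and trigger a follow-up workflow."""
    flagged = []
    for kpi in kpis:
        if kpi.on_time_delivery < OTD_FLOOR:
            flagged.append(f"{kpi.name}: on-time delivery {kpi.on_time_delivery:.0%}")
        elif kpi.previous_on_time_delivery - kpi.on_time_delivery > TREND_DROP:
            flagged.append(f"{kpi.name}: deteriorating "
                           f"({kpi.previous_on_time_delivery:.0%} -> {kpi.on_time_delivery:.0%})")
    return flagged


# Made-up data for illustration
print(alerts([SupplierKPI("Acme Parts", 0.97, 0.96),
              SupplierKPI("Widget Co", 0.88, 0.95),
              SupplierKPI("Bolt Ltd", 0.92, 0.99)]))
```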

This type of alerting helps businesses because it speeds up the decision-making process by providing better access to data the organisation already holds.

“It’s partly businesses’ need to move faster, to react more quickly to issues,” says Jones. “It’s also an evolution of the technology to make embedded alert-up analytics easier to deliver.”

Case study: Data for insurance

Insurance companies need quick and reliable access to data if they are to provide the right policies to their customers at the right price. They also need an accurate and up-to-date picture of the risks they are covering.

The sector has vast volumes of data, but this is split across disparate systems and is often hard for staff to find.

CDL is an insurance technology (insurtech) company that provides a software platform, Strata, to retail insurers. Its customers need to monitor movements in the market and generate large volumes of quotes, often for price comparison sites.

CDL wanted to speed up access to data for Strata users. Previously, it could take 24 hours to download relevant data – an unacceptable timescale for consumer insurance. Instead, it wanted to offer “on-demand” data.

It also wanted to help customers move to a cloud environment, which removes some of the delays in sourcing data for analytics and provides customers with more flexibility for their IT platforms.

As a result, the company developed Kingfisher, a cloud-based business intelligence system running on Amazon Web Services. CDL needed to find a way to integrate data from multiple sources and provide it to multiple endpoints.

CDL uses Qlik’s Data Integration technology to bring real-time data into Kingfisher. Qlik Data Integration is a data replication tool that optimises data from multiple sources and can integrate with most of the main big data analytics tools.

The system also replicates Oracle data – widely used by insurers – to the cloud, so they have a single way to view their information.

Kingfisher now holds about 100 terabytes (TB) of data, and CDL is running more than 50 instances of the system. Cloud technology is also cutting costs for its customers.

Most of all, though, Kingfisher is giving insurance customers near-real-time access to product performance and pricing, but without the need to go into, or learn, a new set of tools.

This, in turn, allows insurers to offer policies at the best price, based on the latest business data.

Embedded analytics suppliers are also taking advantage of the trend for businesses to store more of their data in the cloud, making it easier to link to multiple applications via APIs. Some are going a step further and offering analytical services, too – a firm might no longer need expertise in BI, as the supplier can offer its own analytical capabilities.

Again, this could be via the cloud, but serving the results back to the users in their own application. And it could even go further by allowing different users to analyse data in their own workflow-native applications.

A “smart” medical device, such as an asthma inhaler, could provide an individual’s clinical data to their doctor, but anonymised and aggregated data to the manufacturer to allow them to plan drug manufacturing capacity better.
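The aggregation step in that scenario is conceptually simple: strip the identifiers and roll usage up to a coarser grain before the data leaves the clinical context. The sketch below illustrates the idea with made-up inhaler usage events; the field names and grouping keys are hypothetical.

```python
from collections import defaultdict

# Hypothetical usage events from a "smart" inhaler -- illustrative only.
events = [
    {"patient_id": "p-001", "region": "North", "week": "2022-W40", "doses": 5},
    {"patient_id": "p-002", "region": "North", "week": "2022-W40", "doses": 3},
    {"patient_id": "p-003", "region": "South", "week": "2022-W40", "doses": 7},
]


def aggregate_for_manufacturer(events: list[dict]) -> list[dict]:
    """Drop patient identifiers and aggregate doses by region and week,
    so the manufacturer sees demand signals rather than clinical records."""
    totals = defaultdict(int)
    for event in events:
        totals[(event["region"], event["week"])] += event["doses"]
    return [{"region": region, "week": week, "total_doses": doses}
            for (region, week), doses in totals.items()]


print(aggregate_for_manufacturer(events))
```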

“Data now is changing so quickly, you really need intraday reporting,” says Lee Howells, an analytics specialist at PA Consulting. “If we can put that in on a portal and allow people to see it as it happens, or interact with it, they are then able to drill down on it.

“It’s putting that data where employees can use it, and those employees can be anyone from the CEO to people on operations.”

But if the advantage of embedded analytics lies in its ability to tailor data to the users’ roles and day-to-day applications, it still relies on the fundamentals of robust BI systems.

Firms considering embedded analytics need to look at data quality, data protection and data governance.

They also need to pay attention to security and privacy. The central data warehouse or data lake might have robust security controls, but what about the applications that connect to it via APIs? Client software embedding the data should enforce an equivalent level of security.

Cleaner data is critical

And, although cleaning data is always important for effective analytics and business intelligence, it becomes all the more critical when the users are not data scientists. They need to know that they can trust the data, and if the data is imperfect or incomplete, this needs to be flagged.

A data scientist working on an analytics team will have an instinctive feel for data quality and reliability, and will understand that data need not be 100% complete to improve decision-making. But a user in the field, or a senior manager, might not.

“Embedded analytics continues the democratisation of data, bringing data and insight directly to the business user within their natural workflow,” says Greg Hanson, vice-president for EMEA at Informatica.

“This fosters a culture of data-driven decision-making and can speed time to value. However, for CDOs [chief data officers] and CIOs, the crucial question must be: ‘Is it accurate, is it trustworthy and can I rely on it?’ For embedded analytics programmes to be a success, organisations need confidence that the data fuelling them is from the right sources, is high quality and the lineage is understood.”

CDOs should also consider starting small and scaling up. The usefulness of real-time data will vary from workflow to workflow. Some suppliers’ APIs will integrate better with the host application than others. And users will need time to become comfortable making decisions based on the data they see, but also to develop a feel for when questions are better passed on to the analytics or data science team.

“Organisations, as part of their next step forward, have come to us with their cloud infrastructure or data lakes already in place, and they started to transform their data engineering into something that can be used,” says PA Consulting’s Howells. “Sometimes they put several small use cases in place as proof of concept and the proof of value. Some data isn’t as well used as it could be. I think that’s going to be a continually evolving capability.”

Case study: Analysing sentiment data

New York-based artificial intelligence (AI) company Accern helps organisations form a picture of their environmental, social and governance (ESG) risks, as well as exposure to cryptocurrency and sustainability issues. It caters primarily to firms in the finance and banking markets.

Founded in 2014, Accern counts businesses including Allianz and IBM among its customers.

Accern gives its customers access to a no-code AI environment, based on adaptive forecasting and adaptive natural language processing (NLP) models. Financial services firms can then use these tools to trawl through large volumes of unstructured data, from news articles to company financial statements.

By using pre-trained models for the financial services sector, Accern believed it could significantly cut the time it takes to train an AI tool. But the challenge Accern faced was helping customers visualise and interpret that data. A single dashboard was not enough.

The firm started by building a data model in Snowflake to allow for scaling up the service. It then added ThoughtSpot Everywhere to provide embedded interrogation and visualisation of the data. Accern used ThoughtSpot’s Modelling Language to script deployments, and Liveboard worksheets, a visualisation tool, to speed up customisation.

By combining this with its own no-code AI, Accern says it now has a fully featured self-service analytics platform.

Users can now drill down into the data, and personalise reports and recommendations, without needing any technical knowledge. Accern predicts that the easier-to-use analytics functions will increase uptake of the platform 20-fold by the end of 2022. The firm hopes these capabilities will also allow it to develop premium, subscription-based services.
