The future of data analytics
The rise of no-code and event-driven architectures is among the developments that will shape the future of data analytics, according to Tibco’s regional chief technology officer
A large bank in the Asia-Pacific region once wanted to hire many data scientists to form a team focused on driving data analytics initiatives.
Such a move would not have been scalable or effective, given that the insights about the business are best tapped by employees on the ground, according to Robert Merlicek, chief technology officer of Tibco Asia-Pacific and Japan.
Speaking to Computer Weekly on the future of data analytics, Merlicek said a more tenable approach would be to empower call centre, fraud, helpdesk and customer acquisition teams with the tools to glean analytics insights.
“Those people may not have the coding skills, and so we are starting to see the rise of visual analytics and no-code that will allow people to interact with data and build insights,” he said. “They can connect to and monitor data sources and drive their digital capability – that was one of the areas that has been accelerating throughout the region.”
Merlicek said data science will also evolve to a point where it will be embedded natively across the board. This will enable organisations to leverage artificial intelligence (AI) and machine learning capabilities extensively, including at edge locations.
The pervasive use of AI, however, will pose challenges for many enterprises, which still tend to approach AI deployments in silos, using tightly coupled components that limit reuse across different projects.
“The early days in machine learning are exactly the same,” said Merlicek. “If I’m building machine learning and vision recognition models to analyse video feeds, I want to make sure I can reuse my base level components. So, the very concepts we applied in software development around service-oriented architecture and competency centres are the things that are now being deployed in the analytics realm to make it easier, faster and simpler.”
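The reuse Merlicek describes maps directly to ordinary software composition. As a minimal sketch (with hypothetical function names, not any Tibco API), a single base-level preprocessing component can be shared by two downstream vision models:

```python
def normalize_frame(pixels):
    """Base-level component: scale raw 8-bit pixel values to [0, 1].
    Reused unchanged by every downstream vision model."""
    return [p / 255 for p in pixels]

def motion_score(pixels):
    # Hypothetical downstream model reusing the shared component.
    frame = normalize_frame(pixels)
    return max(frame)

def brightness_score(pixels):
    # A second model built on the same base component.
    frame = normalize_frame(pixels)
    return sum(frame) / len(frame)

print(motion_score([0, 128, 255]))      # -> 1.0
```

Because both models depend only on the shared component’s interface, the normalisation logic can be fixed or improved once and every model benefits, which is the service-oriented reuse principle carried into analytics.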
Event-driven architecture
Another growing trend in data analytics is the rise of event-driven architecture, a framework that orchestrates behaviour around the production, detection and consumption of events, along with the responses they evoke.
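The production-detection-consumption loop at the heart of event-driven architecture can be sketched with a minimal in-process event bus. This is an illustrative toy, not a production system or any specific vendor’s implementation; real deployments use distributed brokers:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process event bus: producers publish events,
    and every subscribed handler consumes and reacts to them."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]):
        # Register a consumer's response to a given type of event.
        self._handlers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict):
        # Production and detection: deliver the event to all consumers.
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
alerts = []
# A fraud team's handler reacting to flagged-payment events.
bus.subscribe("payment.flagged", lambda e: alerts.append(e["txn_id"]))
bus.publish("payment.flagged", {"txn_id": "T-1001", "score": 0.97})
print(alerts)  # -> ['T-1001']
```

The key architectural property is that the producer knows nothing about its consumers: new responses to an event can be added without touching the code that emits it.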
While event-driven architecture is often seen from the lens of IT management, it is making its way into data analytics in the form of the event-driven data fabric, which brings together different event, domain, integration and semantic data models, along with governance, to support data-driven operations.
Merlicek said such a fabric will pose new challenges for enterprises, which will need to make sense of large amounts of data in their data fabric and consume just enough information, with the right governance, to support their analytics needs.
“And so going forward, we’re going to see native streaming abilities in these analytics platforms rapidly accelerate,” he added, noting that the technology landscape has evolved to make streaming viable.
These include open-source technologies such as Apache Pulsar, a cloud-native, distributed messaging and event streaming platform, as well as similar capabilities from hyperscale cloud players such as Amazon Web Services, Microsoft and Google Cloud.
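As an illustration of the kind of computation a native streaming capability performs, a tumbling-window aggregation is one of the basic building blocks exposed by streaming platforms. The sketch below is a deliberate simplification in plain Python (real engines such as Pulsar Functions also handle distribution, out-of-order events and persistent state):

```python
def tumbling_window_counts(events, window_size):
    """Group a stream of (timestamp, value) events into fixed-size,
    non-overlapping (tumbling) windows and count events per window."""
    counts = {}
    for ts, _value in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_size)
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

# Four events arriving at timestamps 1, 3, 7 and 11, windowed in 5s buckets.
stream = [(1, "a"), (3, "b"), (7, "c"), (11, "d")]
print(tumbling_window_counts(stream, 5))  # -> {0: 2, 5: 1, 10: 1}
```

Aggregates like this, computed continuously as events arrive rather than in after-the-fact batch queries, are what “native streaming abilities” in analytics platforms amount to.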
Read more about data analytics in APAC
- Grab is using Apache Kafka in its fraud detection and prevention platform to ingest event streams from its mobile software development kits and client backends to pick up fraudulent activities.
- Australia’s Macquarie Bank has moved all its data and analytics to the cloud and is applying machine learning to detect fraud and improve customer experience.
- Healthcare providers are harnessing data analytics to improve clinical and operational outcomes even as they continue to face challenges in data aggregation and data protection.
- Informatica has consolidated its operations in four key Asia-Pacific markets in a move that will enable it to better meet the demand for cloud-based data management software.
Machine learning, event streaming and data fabrics are all aimed at enabling data to be acted upon to achieve a business outcome, increasingly through intelligent automation.
In September 2021, Vista Equity Partners, Tibco’s private equity owner, announced plans to acquire robotic process automation (RPA) supplier Blue Prism for $1.5bn.
Calling the move a perfect fit, Merlicek said RPA can benefit from Tibco’s strengths in data unification and governance, machine learning, and process capabilities.
In a blog post, Forrester analysts gave the impending acquisition a “thumbs-up”, noting that Tibco’s technical competence in integration, business intelligence, streaming analytics and data management, among other capabilities, can infuse AI smarts into the Blue Prism platform.
According to an IDC study commissioned by Alteryx, many organisations in Asia-Pacific have a data analytics strategy, but they lack the necessary skills and process capabilities to derive business value from data analytics.
They are also struggling with hard-to-use tools and the lack of timely access to data, as well as data lineage and integrity issues.
These challenges are exacerbated by growing demands for data analytics to be delivered at greater speed and scale, with the average enterprise currently facing internal requests to include 26 new data sources and 30 new data types per month.