The composable stack - InterSystems: The software consumption pendulum

This is a guest post for the Computer Weekly Developer Network written by Scott Gnau in his capacity as VP of data platforms at InterSystems – a company known for its software systems and technology for high-performance database management, rapid application development, integration and healthcare information systems.

The software consumption pendulum is currently swinging towards ephemeral composability. Gone are the days when developers would buy an integrated stack from a single enterprise player. The “there’s an app for that” mentality now drives application development at a cultural level.

The developers of today grew their skills in the cloud and have an ‘app store’ of APIs, tech stacks and open source code at their fingertips. The solution to the problem they want to solve is often a download away. As this trend steamrolls forward, at what point does managing the composable stack become unsustainable?

Real-time accessibility and visibility

Successful enterprises will decompose their integrated stacks to support composability. So how do architectures need to evolve to bring these services and applications together?

To get ahead of the next pendulum swing, organisations must adopt new approaches to data management that increase real-time accessibility and visibility across the entire stack.

The next generation of innovation and automation must be built on strong data foundations. Emerging technologies such as artificial intelligence and machine learning require large volumes of current, clean and accurate data drawn from across business silos to function. Yet gaining seamless access to the data held in those silos has proven extremely difficult.

While data lakes attempt to solve many data management challenges, in practice many have become data swamps: murky repositories of disorganised data that is difficult to access and leverage for actionable insights. Without a real-time, consistent and secure data layer to deliver the required information to the relevant stakeholders and applications at the right time, the composable stack becomes less effective than promised.

Native interoperability

Sunsetting legacy applications requires significant time and effort, but firms shouldn’t let those systems hold them back. Leading organisations are keeping legacy systems running while stitching together distributed data from across the enterprise to deliver analytical capabilities and real-time insights. Smart data fabrics offer one such approach, enabling organisations to process, transform, secure and orchestrate data from disparate sources.
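
To make that idea concrete, the illustrative Python sketch below shows the kind of harmonisation a data fabric performs: pulling records from two differently shaped sources and presenting them as one consistent view. The source names, field layouts and mapping functions are hypothetical examples, not any particular product’s API.

```python
from datetime import datetime

# Hypothetical extracts from two disparate systems: a legacy billing
# database and a newer SaaS CRM. Field names and formats differ.
legacy_billing_rows = [
    {"CUST_ID": "00042", "NAME": "ACME LTD", "LAST_INVOICE": "31/01/2024"},
]
crm_api_records = [
    {"customerId": 42, "displayName": "Acme Ltd", "lastTouch": "2024-02-10T09:30:00"},
]

def from_legacy(row):
    """Map a legacy billing row onto a common customer schema."""
    return {
        "customer_id": int(row["CUST_ID"]),
        "name": row["NAME"].title(),
        "last_activity": datetime.strptime(row["LAST_INVOICE"], "%d/%m/%Y"),
        "source": "billing",
    }

def from_crm(rec):
    """Map a CRM API record onto the same common schema."""
    return {
        "customer_id": rec["customerId"],
        "name": rec["displayName"],
        "last_activity": datetime.fromisoformat(rec["lastTouch"]),
        "source": "crm",
    }

# The "fabric" view: one consistent, queryable shape over both silos.
unified_view = [from_legacy(r) for r in legacy_billing_rows] + \
               [from_crm(r) for r in crm_api_records]

for record in unified_view:
    print(record["customer_id"], record["name"],
          record["last_activity"].date(), record["source"])
```

The point of the sketch is that neither source system changes: the mapping layer absorbs the format differences, so consumers only ever see the common schema.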

Implementing an effective data fabric requires exposing and integrating the data and systems that will deliver immediate and significant value to the organisation. This means data interoperability is essential. Because disparate systems format data differently, the lack of native interoperability adds friction, slows time-to-value for data stakeholders and introduces the need to harmonise, deduplicate and cleanse data.
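
As a rough illustration of that harmonise-deduplicate-cleanse step (a generic sketch, not any vendor’s pipeline), the snippet below collapses overlapping records from several sources down to one record per customer, keeping the most recently updated entry and discarding rows that fail a basic quality check. The record layout and cleansing rule are assumptions chosen for the example.

```python
from datetime import datetime

# Hypothetical records already mapped onto a common schema, but with
# the overlaps and quality issues typical of multi-silo data.
records = [
    {"customer_id": 42, "name": "Acme Ltd", "updated": datetime(2024, 2, 10)},
    {"customer_id": 42, "name": "ACME LTD", "updated": datetime(2024, 1, 31)},
    {"customer_id": 77, "name": "", "updated": datetime(2024, 2, 1)},  # fails cleansing
]

def is_clean(rec):
    """Minimal cleansing rule: require a non-empty name."""
    return bool(rec["name"].strip())

deduplicated = {}
for rec in records:
    if not is_clean(rec):
        continue  # drop records that fail the quality check
    key = rec["customer_id"]
    # Keep only the most recently updated record for each customer.
    if key not in deduplicated or rec["updated"] > deduplicated[key]["updated"]:
        deduplicated[key] = rec

for rec in deduplicated.values():
    print(rec["customer_id"], rec["name"].title(), rec["updated"].date())
```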

For these initiatives to succeed, organisations must overcome both technical and organisational challenges. They need to understand their data consumption and regulatory compliance requirements in order to make proper use of new architectural paradigms; a lack of that understanding creates complexity or, worse, points of failure. Appointing a Chief Data Officer (CDO) is one strategy to foster top-down data governance and provide the organisational support a cohesive data strategy needs.

The next generation of enterprise applications, those that bring composable services together, will be the cornerstone of modern software development.