Circular IT series - InterSystems: Peak performance via tight feedback loops
This is a guest post written for Computer Weekly’s series of features examining the ‘circular IT economy’, authored by Scott Gnau in his capacity as head of data platforms at InterSystems.
Gnau writes as follows…
Motorsport athletes and fans know that the engine is the catalyst for success. Drivers and their crews spend countless hours tweaking and tuning their engines to reach and maintain that peak performance sweet spot, in terms of speed and durability.
This ensures that the vehicle is equipped to achieve optimal performance every time it hits the track, without wasting effort, resources, or fuel.
Hitting that sweet spot is about prioritising performance and efficiency equally. If the engine revs hard but burns through fuel excessively while doing so, overall performance will ultimately suffer.
Circular IT engine parallels
As we examine the circular IT economy, the same rationale applies.
On-premises deployment models offer the opportunity to pull expenses forward, on the basis that this will work out cheaper over the course of, say, three years. But, over the years, most organisations have favoured the cloud because of the prevalence of on-premises over-provisioning: the more unused capacity one reserves to increase write speeds, the less capacity there is for storage. This creates a vicious cycle of favouring performance over efficiency – a trade-off that most are willing to make to maintain business and product continuity.
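To put rough numbers on that cycle, consider a back-of-the-envelope sketch. The figures below are illustrative assumptions, not benchmarks; they simply show how reserving headroom for speed eats into usable storage.

```python
# Illustrative capacity-planning sketch: every figure here is a
# hypothetical assumption chosen to show the shape of the trade-off.

def usable_storage(raw_tb: float, performance_headroom: float) -> float:
    """Capacity left after reserving a fraction of raw capacity as
    idle headroom to keep write speeds high."""
    return raw_tb * (1.0 - performance_headroom)

raw_tb = 100.0  # assumed raw on-premises capacity bought up front

for headroom in (0.10, 0.25, 0.40):
    left = usable_storage(raw_tb, headroom)
    print(f"Reserve {headroom:.0%} for write performance -> "
          f"{left:.0f} TB usable of {raw_tb:.0f} TB purchased")
```

Under these assumptions, reserving 40% headroom leaves only 60 TB of a 100 TB purchase doing useful storage work – capital sitting idle in the name of speed.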
However, in this new cloud-based microservices environment, organisations have an even greater opportunity to hit that peak performance sweet spot, humming with efficiency like a perfectly tuned engine. By tightening the software feedback loop, organisations can drive greater efficiency in the IT economy.
Keeping with the car theme, think about the engine as the driver of peak performance… and all the bells and whistles around it as drivers of customer satisfaction.
Let’s say an auto manufacturer (let’s call them SuperCar) released its new model and hoped for the best possible impact on the market without any customer insight. Sure, it would likely still sell a lot of cars on the SuperCar name alone, but what about the next model, or the model after that? Without proper field testing and feedback loops, SuperCar’s vehicles might not meet customer expectations for performance and the brand could gradually lose customers over time. Time is also a key factor – if the company took four years to develop its new model, that’s another cause of customer attrition.
That’s a risk that no business can afford.
Instead, my personal favourite auto manufacturer puts measures in place to continuously test and tune every feature in the vehicle to ensure satisfaction and intelligently inform the design of the next model. For example, it can continuously track which buttons people press most frequently in the car to inform the inclusion of features people want – and where to locate them – and to reveal which features appear to be merely an added luxury.
The same can be applied to IT – by continuously monitoring and enabling easy adjustments in production, IT teams can ensure that their system, or engine, is always operating at peak performance. However, actually doing this in practice isn’t as easy as it sounds in words on this page.
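As a purely illustrative sketch of what a tight feedback loop can look like in code – the metric source and tuning knob below are hypothetical stand-ins, not any particular product’s API – consider a loop that measures a latency target and adjusts a cache setting in response:

```python
# Minimal feedback-loop sketch: measure, adjust, repeat.
# fetch_p99_latency_ms() and set_cache_size_mb() are hypothetical
# stand-ins for a real monitoring query and a real production control.
import random
import time

TARGET_P99_MS = 50.0
cache_mb = 256

def fetch_p99_latency_ms() -> float:
    # Simulated reading so the sketch runs end to end; in reality
    # this would query a monitoring system.
    return random.uniform(20.0, 120.0)

def set_cache_size_mb(mb: int) -> None:
    # Placeholder for applying the new setting in production.
    print(f"cache resized to {mb} MB")

for _ in range(5):  # a real loop would run continuously
    p99 = fetch_p99_latency_ms()
    if p99 > TARGET_P99_MS and cache_mb < 2048:
        cache_mb *= 2        # below target: trade resources for speed
    elif p99 < TARGET_P99_MS / 2 and cache_mb > 64:
        cache_mb //= 2       # comfortably fast: reclaim idle capacity
    set_cache_size_mb(cache_mb)
    time.sleep(0.1)          # in production the interval might be minutes
```

The point isn’t the specific knob; it’s that the system both measures itself and can be adjusted cheaply, which is what keeps the engine at its sweet spot rather than permanently over-provisioned.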
Data democratisation & interoperability
Data democratisation is the process of providing everyone within an organisation access to the data they need to best do their job. The goal is to help increase individuals’ access to the right data, improve people’s ability to make data-driven decisions (regardless of their technical aptitude) and, in turn, improve customer experiences.
In practice, this means breaking down data silos to improve data access for everyone in the organisation. With proper data democratisation, organisations can then benefit from embedded analytics and embedded interoperability to drive more efficiency in the IT economy.
Take Agile Fractal Grid (AFG) as an example. AFG’s mission is to help rural and campus communities achieve energy security and meet renewable energy goals while also providing gigabit broadband access. In doing so, AFG analyses streaming data from thousands of devices – via embedded interoperability that integrates data and business processes between systems – data that requires millisecond decision and response. With these actionable insights, delivered through embedded analytics, AFG can address disruptions and constraints caused by geopolitical events and technical failures across the entire power grid, from source to consumer.
This added visibility and enhanced control of the power grid fuels AFG’s development of microgrids (a network of systems that behave like a utility), enabling better energy security and the ability to meet renewable energy goals at scale.
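To make that millisecond requirement concrete, here is a minimal, hypothetical sketch of the kind of streaming check involved. The frequency band, device readings and shed_load() action are assumptions for illustration, not AFG’s actual implementation:

```python
# Hypothetical sketch of millisecond-scale decisions on streaming grid
# telemetry; thresholds, readings and actions are illustrative only.
import time
from collections import deque

FREQ_BAND_HZ = (49.8, 50.2)  # assumed acceptable grid frequency band

def shed_load(device_id: str) -> None:
    # Placeholder corrective action on a constrained grid segment.
    print(f"load shed triggered for {device_id}")

readings = deque([  # stand-in for a live stream of device readings
    ("sensor-17", 50.01),
    ("sensor-04", 49.62),  # out of band: needs an immediate response
    ("sensor-23", 50.19),
])

while readings:
    device_id, freq_hz = readings.popleft()
    start = time.perf_counter()
    if not FREQ_BAND_HZ[0] <= freq_hz <= FREQ_BAND_HZ[1]:
        shed_load(device_id)  # decide and act within the latency budget
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{device_id}: decided in {elapsed_ms:.3f} ms")
```

Whatever the real pipeline looks like, the shape is the same as the tuned engine: continuous measurement feeding immediate, low-cost adjustment.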