Pulumi Copilot steers AI to cloud infrastructure management 

AI will be multifarious, multi-modal, multi-model and multi-tier. This is the promise from the enterprise software vendors now vying for share of voice above the cloud-based clamour. It’s a message characterised by the way generative artificial intelligence and “traditional” AI for decision reasoning will now be applied in tandem.

Because our forward-looking use of AI is multi-tier, it will be applied at every level of the software stack: from the user interface, through the meat in the software code sandwich, across the connectivity points exposed by Application Programming Interfaces (APIs) and down into the base infrastructure that underpins modern computing systems.

Seeking to put AI to work at an Infrastructure-as-Code (IaC) level is Pulumi with its eponymously named Pulumi Copilot technology. Described as an AI for general cloud infrastructure management, this technology uses Large Language Model (LLM)-based capabilities to accelerate IaC adoption and usage alongside broader cloud management tasks.
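For readers unfamiliar with the IaC model that Pulumi Copilot builds on, a Pulumi program declares cloud resources in a general-purpose language and lets the Pulumi engine reconcile them against real cloud state. A minimal sketch in TypeScript, using the real `@pulumi/aws` provider (the bucket name and tags here are purely illustrative), might look like this:

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

// Declare an S3 bucket as a desired-state resource; on deployment,
// Pulumi diffs this declaration against the live cloud environment
// and applies only the changes needed.
const bucket = new aws.s3.Bucket("copilot-demo-bucket", {
    tags: { environment: "dev" },
});

// Export the bucket's identifier so it can be inspected after deployment.
export const bucketName = bucket.id;
```

Deploying such a program with the standard `pulumi up` command is the kind of workflow the copilot is pitched at accelerating, whether by generating the declaration itself or by answering questions about the resources it creates.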

According to Joe Duffy, co-founder and CEO of Pulumi, this copilot service combines LLMs with a semantic understanding of the way cloud computing instances and individual services are architected, structured, deployed and executed inside continuous computing environments. In doing so, he insists that Pulumi Copilot is able to unlock useful methods to control cloud infrastructure management. 

Familiar GPT experience 

Making use of the now increasingly familiar GPT chat experience, Duffy and team say that platform engineers can use this service to find (and take action on) any resource in their cloud environments. If fully effective as described, Pulumi Copilot may become part of the AI accessibility and democratisation push that sees more users (including those with lower or rudimentary technical skills) now able to get involved with complex tasks – in this case, cloud infrastructure management tasks.

“We immediately saw the immense promise of large language models for all things cloud,” said CEO Duffy. “We got started nearly two years ago, first with powerful Infrastructure-as-Code scenarios, but ultimately also built our core Pulumi AI technology to be capable of powering value-add features. Pulumi Copilot is a huge swing for [user accessibility and functionality] and we believe this is how all cloud management at scale will be done in the future.”

Pulumi Copilot understands the entirety of over 160 clouds, including public clouds (AWS, Azure, Google Cloud), cloud-native technologies (Kubernetes, Helm), SaaS providers (Snowflake, Cloudflare, Datadog) and more. Pulumi Copilot directly interfaces with these cloud APIs and data models. This allows users to query, modify and manipulate resources on any cloud directly through the copilot interface. 

Pulumi Cloud features a new interactive copilot user interface that follows engineers wherever they go inside the cloud infrastructure layer and understands the context of the queries being made, including their organisation’s usage patterns and the chat history. Users can ask any question and the copilot (which is trained to use the underlying Pulumi data model and REST APIs) will, the company claims, provide relevant responses across the entire platform.

Pulumi Copilot enables automation across a range of scenarios, including writing code; creating and managing infrastructure; analysing and updating infrastructure for cost, compliance and security concerns; answering questions about cloud usage and team development velocity; troubleshooting problems; and detecting anomalies through observability information. New AI-driven capabilities in the Pulumi Command Line Interface (CLI) help with common Infrastructure-as-Code challenges, such as diagnosing cloud errors, comprehending cloud infrastructure resources and generating code.

These capabilities are powered by Pulumi Copilot and they deep link into the Pulumi Cloud service itself.

All things to all people?

There are key trends on show here, such as lowering the barrier to user access and adding AI chatbot functionality that broadens the technology’s use case. Yet, at the same time, there are elements of this technology aimed squarely at hardcore developers at the command line, only really part of the lunchtime conversation we might tune into if we walked past the cloud platform engineering team. While Pulumi isn’t being accused of trying to be all things to all people, exactly where its users land will be telling in whether this technology thrives. That may also help signal what the company does next with functionality extensions. Let’s stay democratic and watch the story unfold.