LLM series - Perforce: How to culture an AI mindset
This is a guest post for the Computer Weekly Developer Network written by Rod Cope in his capacity as chief technology officer at Perforce.
Cope writes in full as follows…
Generative AI is giving developers superpowers.
The use of LLMs alongside new AI models means that developers can write code faster, create more innovative features and focus on the work that adds value. We are not talking about just being able to achieve twice as much: this is an acceleration of results and the generation of massive code bases and project footprints on an unprecedented scale.
I call it 10x AI because that is the very least we will see: think about 100x AI.
Plan to plan, please
The potential is massive, but so are the implications for organisations, and that takes some planning.
The level of growth means that existing systems and processes must be evaluated. Even if tools and methods are keeping up with demand today, will that still be the case in six months? Then there is the reality that many organisations still rely on manual processes: teams filling in spreadsheets or tracking rapid changes over email simply cannot keep pace with 10x AI.
Existing working methods could create bottlenecks, and unless every part of a process moves at the same pace, the ROI and benefits of AI are undermined.
A mindset for AI
Of course, there is often resistance to change, particularly among organisations that have yet to adopt Agile or go through digital transformation. Engendering the right cultural mindset for AI is just as important as the technology and processes. Everyone needs to understand the advantages to the business and themselves personally, but also the implications. They also need to be ready for the pace of change: LLMs are just the current flavour of AI… and there will be others.
So, should some organisations hold off from adopting LLMs and first get their houses in order? My recommendation would be to paint a picture for senior management that shows how AI will help the organisation compete better and, at the same time, to present this as an opportunity to move away from the outdated processes that waste time and slow down progress.
Start by identifying the tedious, error-prone tasks, such as when someone types something in manually for another person to re-key elsewhere or export to a spreadsheet. Automate those tasks and give those people their time back to focus on higher-value work, and the organisation benefits from improved productivity. Second, demonstrate the potential: instead of just improving a process, how about being able to skip that process entirely?
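To make this concrete, here is a minimal sketch of the kind of re-keying task that can be automated. The file names and column layout are my own illustrative assumptions, not any specific workflow; it reads a hypothetical CSV export from one system and writes it into a spreadsheet (via the openpyxl library) that a downstream team would otherwise fill in by hand.

```python
# Minimal sketch: replace a manual re-keying step with a small script.
# Assumes a hypothetical 'orders_export.csv' from one system and writes
# the same records into 'orders_report.xlsx' for the downstream team.
import csv

from openpyxl import Workbook  # third-party: pip install openpyxl


def rekey_csv_to_spreadsheet(csv_path: str, xlsx_path: str) -> int:
    """Copy rows from a CSV export into a spreadsheet, returning the row count."""
    workbook = Workbook()
    sheet = workbook.active

    with open(csv_path, newline="", encoding="utf-8") as source:
        rows = list(csv.reader(source))

    for row in rows:
        sheet.append(row)  # one spreadsheet row per CSV row, header included

    workbook.save(xlsx_path)
    return len(rows)


if __name__ == "__main__":
    copied = rekey_csv_to_spreadsheet("orders_export.csv", "orders_report.xlsx")
    print(f"Copied {copied} rows without anyone re-typing them.")
```

Even a small script like this removes a transcription error source and frees the time the paragraph above describes.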
Data cleanup
There are a few other essential issues to consider, such as ensuring that data is ‘clean’. Data is the heart of AI: feed it bad data and it will output bad conclusions. There are also compliance issues here, such as not using personally identifiable information and other sensitive data. This is why there is currently a focus on ways to anonymise or mask data so that it can still be used without breaching privacy requirements.
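As a simple illustration of masking, here is a short Python sketch; the record structure, the list of sensitive fields and the salted-hash approach are my own assumptions, not a description of any particular product. It replaces personally identifiable fields with stable pseudonyms so that records remain linkable for analysis without exposing the raw values.

```python
# Minimal sketch: mask PII fields before data is handed to an AI pipeline.
# The record structure, field list and salt handling are illustrative only.
import hashlib
import hmac

PII_FIELDS = {"name", "email", "phone"}  # assumed sensitive columns
SECRET_SALT = b"replace-with-a-secret-from-your-vault"


def mask_value(value: str) -> str:
    """Return a stable pseudonym: same input, same token, but not reversible."""
    digest = hmac.new(SECRET_SALT, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]


def mask_record(record: dict) -> dict:
    """Copy a record, replacing any PII field with its masked token."""
    return {
        key: mask_value(value) if key in PII_FIELDS else value
        for key, value in record.items()
    }


if __name__ == "__main__":
    raw = {"name": "Ada Lovelace", "email": "ada@example.com", "country": "UK"}
    print(mask_record(raw))
```

Using a keyed hash rather than plain deletion keeps the masked data useful (the same customer maps to the same token) while the original values never leave the secure system.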
Then there is storage.
Projects that are ten times larger need ten times more storage, which can become expensive and difficult to manage. Virtualisation and subsetting are tactics to combat this, but none of it works overnight or out of the box: it has to be thought about and carefully planned.
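Subsetting can be as simple as carving a representative slice of production data for development and test environments. The sketch below is a hypothetical example using SQLite with an assumed table layout and a deterministic sample rate; real subsetting tools also preserve referential integrity across tables, which this does not attempt.

```python
# Minimal sketch: copy a deterministic ~10% subset of one table into a
# smaller database for development use. Table and column names are assumed.
import sqlite3

SAMPLE_EVERY = 10  # keep roughly one row in ten


def subset_table(source_db: str, target_db: str, table: str = "events") -> int:
    """Copy every Nth row of `table` from source_db into target_db."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(target_db)

    rows = src.execute(f"SELECT id, payload FROM {table}").fetchall()
    sample = [row for row in rows if row[0] % SAMPLE_EVERY == 0]  # slice by id

    dst.execute(
        f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    dst.executemany(
        f"INSERT OR REPLACE INTO {table} (id, payload) VALUES (?, ?)", sample
    )
    dst.commit()

    src.close()
    dst.close()
    return len(sample)


if __name__ == "__main__":
    kept = subset_table("production_copy.db", "dev_subset.db")
    print(f"Kept {kept} rows in the development subset.")
```

A slice like this keeps development environments fast and cheap while the full data set stays where the storage has been planned for it.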
Energy consumption is also a challenge because the additional data and network traffic that AI creates all need more power. So the datacentre industry and governments are getting creative. For instance, there are conversations in Europe about requiring data centres to recycle the heat they generate and make it available elsewhere. Imagine rerouting that otherwise wasted energy, even to help consumers heat their homes, and being a step closer to carbon neutrality.
Chip manufacturers are also developing more energy-efficient chips, which helps, though in the short term that has to be balanced against the limited supply of chips. Analysts have said that AI could be constrained during 2024 by these chip and energy challenges, as well as by the availability of skills, but that will change.
With that in mind, 2024 is the year to get all the elements in place so that development teams and their organisations are in good shape to harness the power of LLMs and AI in general without being held back by bottlenecks or manual processes. If AI is set to turn development teams into superheroes, then they need the right supporting environment to be their organisations’ and customers’ heroes.