Build 2020: Avoiding AI problems
It is early days for responsible artificial intelligence, but Microsoft aims to help companies avoid problems and improve the performance and quality of AI applications
I have been asked many times during the past month whether the heightened pressure that enterprises are now facing as a result of the Covid-19 pandemic will cause them to short-cut aspects such as responsible machine learning in order to get pilots into production more quickly.
This is certainly a possibility, but in my opinion, people will remember the actions enterprises take now far longer than they will many of the better-planned projects that came before the pandemic or are yet to start. More organisations will therefore aim to get artificial intelligence (AI) right during the crisis as well.
As practitioners get going in this area, here are a few things to consider.
One global bank I spoke to recently has put in place a policy that no AI model can move into production without interpretability and bias controls built into the application's lifecycle.
This is a fantastic approach. Embedding governance into the entire lifecycle of machine learning helps to reduce problems later on and, above all, engenders confidence and trust in the AI that gets built. This ultimately leads to faster deployments, wider adoption and more responsible innovation.
Kjersten Moody, chief data and analytics officer at insurer State Farm, perhaps captures this best when she says: “As we introduce AI into our processes, we have to hold ourselves to the highest standard and we have to hold AI to that high or higher standard that we would hold our people to.”
Although still in their infancy, tools to counter potential unfairness in data and to improve the explainability of models are getting better, and they are a good place to start in responsible AI. They will help to minimise any negative effects, not only on customers but also on business processes, employees and the surrounding technologies that support AI.
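To make this concrete, the sketch below shows one way such controls might look in practice, using the open-source InterpretML and Fairlearn libraries that Microsoft backs. The dataset, the sensitive attribute and the specific checks are illustrative assumptions rather than a prescribed workflow.

```python
# A minimal, illustrative sketch (not a prescribed Microsoft workflow): train a
# glassbox model with InterpretML, then check per-group performance with Fairlearn.
# The synthetic data and the sensitive attribute are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from interpret.glassbox import ExplainableBoostingClassifier
from fairlearn.metrics import MetricFrame

# Synthetic stand-in for a real credit or claims dataset
X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
sensitive = np.random.RandomState(0).choice(["group_a", "group_b"], size=len(y))
X_train, X_test, y_train, y_test, s_train, s_test = train_test_split(
    X, y, sensitive, test_size=0.3, random_state=0
)

# Interpretability: an Explainable Boosting Machine is inspectable by design
ebm = ExplainableBoostingClassifier(random_state=0)
ebm.fit(X_train, y_train)
global_explanation = ebm.explain_global()  # per-feature contribution curves

# Fairness check: compare accuracy across the sensitive groups
y_pred = ebm.predict(X_test)
frame = MetricFrame(metrics=accuracy_score, y_true=y_test, y_pred=y_pred,
                    sensitive_features=s_test)
print(frame.by_group)      # accuracy per group
print(frame.difference())  # largest gap between groups
```

Checks like these, run before every release, are the kind of lifecycle controls the bank above now mandates.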
Read more news from Build 2020
- Microsoft has outlined its toolset for making AI models explainable with the InterpretML and AboutML tools.
- Microsoft Build is always a vital event for developers across the IT industry. This year, Microsoft has expanded its AI emphasis.
Customers I speak to care little about the conceit of algorithmic perfection from a supplier. Rather, they want to know that they are on solid foundations with a responsible provider as they advance their AI strategies.
Look to suppliers that prioritise this area, provide access to talent and best practices, and are transparent in their AI services. This includes ensuring that your responsible AI framework extends to models you consume from a provider but do not own.
Build 2020 did a good job of showcasing the investments Microsoft is making in responsible machine learning. It will be fascinating to watch enterprises progress in this key field in the months ahead.
Nicholas McQuire is a senior vice-president and head of enterprise and AI research at CCS Insight.