
Conservative peer urges government not to limit open source AI

The chair of the Lords Communications and Digital Select Committee calls for greater support for SMEs, competition and economic dynamism as artificial intelligence policies are developed

In a House of Lords debate looking at large language models and generative artificial intelligence (GenAI), Tina Stowell, chair of the Lords Communications and Digital Select Committee, discussed the need for a UK AI strategy that focuses on supporting commercial opportunities, academic research and spin-outs.

“As the government considers AI legislation, it must avoid policies that limit open source AI development or exclude innovative smaller players,” she told peers.

In September, the committee launched an inquiry into the UK’s scaleup potential covering technology in the creative industries and AI. It is focusing on the specific barriers to small and medium-sized enterprises (SMEs) scaling in those areas.

Stowell believes that while it is difficult to predict the future of AI development, a few trends are becoming clearer. “The shift towards consolidation at the top of the market suggests a growing concentration of power among a few large tech firms. At the same time, we’re likely to see a huge expansion of applications built on top of those – which is perhaps where the UK can thrive,” she said.

Stowell believes open source is also likely to have a role. “I think open source AI development is key to ensuring we can support and safeguard competition and economic dynamism. We need to ensure the UK’s approach to AI is balanced and plays to our strengths. While we need to mitigate risks, of course, we need to ensure that, in doing so, we do not accidentally and unnecessarily stifle innovation,” she added.

Looking deeper into the role the UK can play in AI regulation, Stowell urged policymakers to ensure the UK is able to forge its own path on AI and regulation. “We can learn from, but must not copy wholesale, the EU, US and also China. That means focusing on talent, compute, enabling standards that give businesses confidence to innovate, championing responsible practices, mitigating major risks and ensuring we can benefit from a healthy mix of approaches to developing AI,” she said.

Stowell also stressed the importance of the UK being “cautious and nuanced about AI regulation and where the correct focus is”, adding: “The EU example shows that we mustn’t rush to regulate – particularly given complexities around liability and around anti-competitive practices.”

She also said it was important to ensure smaller organisations feed into policy discussions. “I would encourage everyone to engage with the work of Parliamentary committees and to respond to industry groups and government consultations. This is a technology that will affect us all, so having more people feeding in will create better and more informed outcomes for all of us,” she said.

Stowell, alongside University of Cambridge professor Neil Lawrence and Stability AI, has been shortlisted for the OpenUK award recognising work in artificial intelligence.

Read more from the Communications and Digital Select Committee

House of Lords launches an investigation into generative AI: The government wants the UK to lead AI regulation, but technology is developing at a breakneck pace.

House of Lords focuses on copyright and transparency: Large language models promise to boost UK productivity, but the tech needs greater scrutiny and questions remain over intellectual property.
