UK needs to upskill to achieve quantum advantage
Quantum computing bridges the IT and scientific community, and although it may be years away, skills developed during pilot projects are key
New research from EY has found that in the UK, nearly all (97%) of the 501 executives surveyed expect quantum computing to disrupt their sectors. The research also found that nearly half (48%) believe quantum computing will reach sufficient maturity to play a significant role in the activities of most companies in their respective sectors by 2025.
However, according to EY, only one-third (33%) of organisations have already begun strategic planning to prepare for the technology’s commercialisation and less than a quarter (24%) have set up pilot teams to explore its potential or are currently working to do so.
For instance, last week, the Ministry of Defence (MoD) announced a year-long programme working with UK quantum specialist ORCA Computing to develop software for ORCA’s PT-1 quantum computer, which has been designed to operate at room temperature.
Discussing the pilot, Stephen Till, fellow at the Defence Science and Technology Laboratory (Dstl), an executive agency of the MoD, said: “This work with ORCA Computing is a milestone moment for the MoD. Accessing our own quantum computing hardware will not only accelerate our understanding of quantum computing, but the computer’s room-temperature operation will also give us the flexibility to use it in different locations for different requirements.
“We expect the ORCA system to provide significantly improved latency – the speed at which we can read and write to the quantum computer. This is important for hybrid algorithms, which require multiple handovers between quantum and classical systems.”
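To illustrate why that latency matters, the sketch below shows a hybrid quantum-classical loop of the kind Till describes: a classical optimiser repeatedly hands parameters to a quantum evaluation step and reads a result back. The quantum call here is a hypothetical Python stand-in (evaluate_on_quantum_hardware is not ORCA's or Dstl's software, and the cost function is a toy), but it makes the structural point that total runtime is dominated by the number of quantum-classical handovers multiplied by the per-call read/write latency.

```python
import random
import time

# Hypothetical stand-in for submitting a parameterised circuit to a quantum
# processor and reading back a measured expectation value. In a real hybrid
# workload, this round trip is where read/write latency is paid every time.
def evaluate_on_quantum_hardware(params, latency_s=0.05):
    time.sleep(latency_s)  # models the quantum read/write round trip
    # Toy cost surface standing in for a measured expectation value
    return sum((p - 0.5) ** 2 for p in params) + random.gauss(0, 0.01)

def hybrid_optimisation_loop(n_params=4, iterations=100, step=0.1):
    """Classical optimiser repeatedly hands parameters to the quantum side.

    Each iteration is one quantum-classical handover, so wall-clock time
    scales with (iterations x latency) rather than with the classical maths.
    """
    params = [random.random() for _ in range(n_params)]
    best = evaluate_on_quantum_hardware(params)
    for _ in range(iterations):
        candidate = [p + random.uniform(-step, step) for p in params]
        cost = evaluate_on_quantum_hardware(candidate)
        if cost < best:
            params, best = candidate, cost
    return params, best

if __name__ == "__main__":
    start = time.time()
    params, cost = hybrid_optimisation_loop()
    print(f"best cost {cost:.4f} after {time.time() - start:.1f}s wall time")
```

Run as written, roughly five seconds of the wall time comes from the simulated handovers alone, which is why cutting per-call latency, as the MoD hopes the ORCA system will, directly speeds up hybrid algorithms.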
Piers Clinton-Tarestad, a partner in EY’s technology risk practice, said there is a general consensus that quantum computing will start becoming a reality in 2030. But pilot projects, such as the one being conducted at the MoD, and proof-of-concept applications can help business leaders to understand where quantum technology can be applied. “Quantum computing represents a fundamentally different way to think about data,” he said. “It is truly transformational and is not just about doing [computations] faster.”
The EY study also found that there is a scarcity of quantum talent. It said bridging the skills gap will require companies to pursue a multi-pronged strategy, developing skills in-house and recruiting from established physics and related degree and postgraduate courses, while simultaneously helping education providers to build future talent pools.
Addressing the skills gap is among the focus areas of the UK Digital Strategy policy paper from the Department for Digital, Culture, Media and Sport (DCMS). The DCMS said it would work with the Department for Education and other government departments to identify specific topics that are critical to supporting future technologies, such as quantum computing and advanced semiconductors.
“We must also actively seek to grow the UK’s existing expertise in the foundational deep technologies of the future, such as artificial intelligence, next-generation semiconductors, digital twins, autonomous systems and quantum computing,” said the DCMS.
Although 2030 is widely regarded as the start date for commercial quantum computing, and the industry has ambitions to make the technology mainstream by 2040, some doubt it will ever reach that point.
But even if these milestones are not achieved, Clinton-Tarestad believes that organisations building up quantum skills today will be in a stronger position going forward. “If quantum computing never happens, we end up with a bunch of very bright STEM people who understand it and want to change the world,” he said.