
ITER workshop highlights the role of computing technology in nuclear fusion

One of the themes from the private sector workshop run in France this May is that information technology remains a key enabler for nuclear fusion

The idea behind ITER started with Gorbachev and Reagan in the 1980s, at the height of the Cold War. The two leaders agreed to join forces to develop nuclear fusion, a clean energy technology that could one day power the world. The project was launched one year later, when the European Union, Japan, the Soviet Union and the US began to design a large international fusion facility.  

In the early 2000s, China, Korea and India joined the other four members to help build and run an experimental reactor that scientists and engineers from around the world can use to learn enough about fusion to design prototype commercial reactors sometime in the next few decades. The reactor is still under construction in Cadarache, France, just outside Aix-en-Provence. 

The seven members make both financial and in-kind contributions in the form of components and services that help countries develop local know-how in specific areas. The in-kind contributions help fulfil one of the secondary goals of the project, which is to foster a worldwide industry so that when fusion becomes a commercial reality, many regions of the world will know enough about it to run their own power plants. 

Recently, the ITER council agreed to begin allocating resources to support the growing private fusion industry and ran a large workshop to kick off this new line of work. The inaugural Private Sector Workshop was held from 27 to 29 May in Cadarache, France, at the headquarters of the ITER Organisation.  

According to Laban Coblentz, head of communication at ITER, people from all around the world, representing many of the different roles in the growing ecosystem, were present. This included investors, policymakers, fusion companies and a range of supply chain partners. 

The event was attended by around 300 people from different organisations, many of whom compete on some level and may not agree on the best approach to fusion. It offered an opportunity to share ideas and find some commonality in their vision for the future.

The different approaches to fusion might be lumped into two broad categories. The first is magnetic confinement fusion (MCF), which uses magnetic fields to confine the plasma and requires a big machine, such as the ITER Tokamak, which is still being built, or the Wendelstein 7-X stellarator, which began operation in 2015. The second broad category is inertial confinement fusion (ICF), which applies a powerful shock to compress the plasma enough to cause fusion reactions to occur. 

According to Andrew Holland, CEO of the Fusion Industry Association (FIA), since around 2021 there has been a sharp increase in the number of private sector startups that aim to build a reactor based on their unique approach, and about $4bn has been invested during that same period. While about 80% of the $6bn invested in nuclear fusion around the world is funding American companies, geographical diversity is increasing.

“We do see new companies, especially here in Europe and in Japan, becoming more ambitious, driving more investment into them,” Holland said in his opening presentation at the workshop. 

Computer technology supports growing fusion industry  

Regardless of the approach, fusion requires computer technology in at least five key areas of any given project. The first is to run simulations to validate the underlying physics well before the machine is designed. The second is to run one or more control systems that coordinate the different components in the machine. The third is to support diagnostic systems that observe different aspects of the operations so that immediate action can be taken and/or so that the behaviour of the machine can be studied. The fourth is to power the robotics needed to make repairs and perform maintenance in the harsh environment of a reactor. And the fifth is to collect and store data from the diagnostic systems to refine the models in ways that will help design better machines in the future. 

Several kinds of computer technology are needed, including high-performance computing to run the complex simulations and analyse diagnostic data; radiation-hardened integrated circuits to support diagnostic equipment and control the robots used for repair and maintenance; high-speed storage networks to collect and store data during operation; control software to synchronise components; and specialised software for simulations. 

Ignition Computing, a Dutch company present at the event, epitomises a very specific niche in the growing ecosystem that involves creating and integrating simulation models. “Existing codes [modelling software] typically focus on a specific physics phenomena – for example, magnetic equilibrium,” Daan van Vugt, Ignition Computing’s founder and CEO, explained to Computer Weekly. “You might have other codes that model transport, the motion of the particles themselves.” 

Regardless of the approach to nuclear fusion, the first step is to “discretise” the data. The world is analogue, but computers require digital information, so both measurements and the time of each measurement must be digitised before being used in simulations. 

“You have to get the data into a discrete number of points so that you can treat them on a computer,” said van Vugt. “And then you convert your physics equations, through this discretisation, into a system of equations you’re trying to solve.” 
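The process van Vugt describes can be sketched with a standard textbook example. The code below is an illustration, not Ignition Computing's software: it discretises the 1D steady-state heat equation -d²T/dx² = f on a grid of points using finite differences, turning the continuous physics equation into a linear system of equations that a computer can solve. The grid size and source term are arbitrary choices for the example.

```python
import numpy as np

# Illustrative discretisation: -d^2 T/dx^2 = f on [0, 1], T(0) = T(1) = 0.
n = 50                        # number of interior grid points (arbitrary)
h = 1.0 / (n + 1)             # grid spacing
x = np.linspace(h, 1 - h, n)  # the discrete points van Vugt mentions
f = np.sin(np.pi * x)         # an arbitrary source term for the example

# Finite differences turn the continuous equation into a tridiagonal
# linear system A @ T = b -- the "system of equations you're trying to solve".
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / h**2
b = f

T = np.linalg.solve(A, b)

# For this source term the exact solution is sin(pi x) / pi^2, so the
# discrete answer converges to it as the grid is refined.
exact = np.sin(np.pi * x) / np.pi**2
print(np.max(np.abs(T - exact)))
```

Refining the grid (increasing `n`) shrinks the error, which is the basic trade-off in any discretisation: more points mean higher fidelity but a larger system to solve.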

Most of the time, more than one phenomenon must be studied simultaneously. This is called multiphysics simulation – and according to van Vugt, there are two different approaches to it. One is to solve for each of the phenomena one at a time. For example, you might solve for density and then for temperature, iterating between them. The other approach is to solve them together, putting them into one combined equation and solving it all in one go. 
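The two approaches van Vugt contrasts can be shown on a deliberately tiny toy problem. The sketch below is hypothetical: it stands in "density" and "temperature" as two coupled unknowns linked by a pair of linear equations, then solves them first by iterating between the fields (the one-at-a-time approach) and then by assembling one combined system (the all-in-one-go approach). Real plasma physics is nonlinear and vastly larger, but the structure of the two strategies is the same.

```python
import numpy as np

# Toy coupled problem standing in for two physics fields:
#   2n + T = 4      ("density" equation)
#   n + 3T = 7      ("temperature" equation)
# The exact solution is n = 1, T = 2.

# Approach 1: solve one phenomenon at a time, iterating between them.
n, T = 0.0, 0.0
for _ in range(50):
    n = (4 - T) / 2   # solve the density equation with temperature frozen
    T = (7 - n) / 3   # solve the temperature equation with density frozen

# Approach 2: put both equations into one combined system and
# solve it all in one go.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([4.0, 7.0])
n_c, T_c = np.linalg.solve(A, b)

print(n, T)      # the split iteration converges to (1.0, 2.0)
print(n_c, T_c)  # the coupled solve gives (1.0, 2.0) directly
```

The trade-off in practice: the split approach reuses existing single-physics codes and is easier to assemble, but may converge slowly or not at all when the coupling is strong; the monolithic approach is more robust for tightly coupled physics but requires building and solving one much larger system.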

The codes developed to model one aspect of the physics are sometimes made available as open source. Depending on the approach to fusion, different existing codes may be reused from one project to another, saving a significant amount of time and effort. Hopefully, the lessons learned from the Inaugural Private Sector Workshop will lead to a big increase in sharing for the benefit of humanity.
