OUR SERVICES

DiRAC’s objective is to maximize the scientific research and innovation our users can deliver by providing them with cutting-edge, high-performance computing (HPC) resources tailored to their research needs. We work closely with our research communities to “co-design” our services, translating peer-reviewed science programmes into workflow requirements and then into technical specifications. As the pages on the individual HPC services describe, DiRAC systems are delivering science breakthroughs on scales comparable to those achieved on much larger supercomputers worldwide, enabling DiRAC researchers to be international leaders in their fields. Additionally, by delivering more science using less hardware, DiRAC users often achieve lower carbon footprints for their research than their competitors.

OUR RESOURCES

DATA INTENSIVE
CAMBRIDGE (DIAC)

Many DiRAC projects explore high-dimensional parameter spaces using statistical techniques that generate large numbers of computationally intensive models, which then need effective data management. The use of GPU acceleration in simulations is also on the rise, either to support post-processing of the simulation data or to make use of AI-driven models. The Cambridge DI service effectively supports these research workflows by offering a combination of CPU and GPU nodes that operate on a unified parallel file system, ensuring seamless movement between both architectures.

DiRAC AT CAMBRIDGE

CPU CORES
30,412

GPU CORES
746,496

MEMORY
157TB

DATA
8PB FILE SPACE

INTERCONNECT
200Gb/s HDR 3:1-BLOCKING

DATA INTENSIVE
LEICESTER (DIAL)

The Data Intensive service provides a set of hardware options for projects from across all DiRAC research domains, driving scientific discovery by delivering a step-change in the capability to handle very large datasets. Leicester’s DI service offers high-memory nodes with seamless access to a shared, tightly-coupled parallel file system, enabling workflows to flexibly utilise both regular and high-memory resources. Leicester also plays a pioneering role within DiRAC by spearheading the exploration of alternative CPU processor technologies that diverge from the traditional x86-based architecture.

DiRAC AT LEICESTER

CPU CORES
40,288

CPU CHIPS
1,216

MEMORY
177TB

DATA
7PB FILE SPACE

INTERCONNECT
200Gb/s HDR 3:1-BLOCKING

EXTREME SCALING
EDINBURGH (ES)

Tursa is an optimised computing system tailored for particle physics, specifically for studying how strongly-interacting particles called hadrons depend on their constituent quarks and gluons. It combines powerful GPUs with low-latency node communication to simulate these processes efficiently. Tursa’s design also aligns very well with the requirements for training large language models, making it the UK’s first system capable of supporting both quantum chromodynamics simulations and cutting-edge AI calculations. This dual capability positions Tursa as a versatile resource for researchers working at the intersection of particle physics and artificial intelligence, enabling groundbreaking scientific exploration.

DiRAC AT EDINBURGH

CPU CORES
5,040

GPU CORES
4,921,344

MEMORY
180TB

DATA
4PB FILE SPACE

INTERCONNECT
200Gb/s HDR NON-BLOCKING

MEMORY INTENSIVE
DURHAM (MI)

COSMA 7 and 8, which collectively form the Memory Intensive service, are cutting-edge HPC systems specifically designed for large-scale cosmological simulations. These simulations aim to unravel the intricate dynamics of galaxy formation and evolution, and involve diverse physical processes across vast length- and time-scales. The complex calculations require a significant memory footprint, highly effective inter-node communication, and sufficient storage to hold the large amounts of output data. These characteristics make the tailored architecture of COSMA 7 and 8 a robust platform for cosmological research, enabling researchers to explore the Universe in great detail.

DiRAC AT DURHAM

CPU CORES
80,240

CPU CHIPS
1,960

MEMORY
731TB

DATA
17.1PB FILE SPACE

INTERCONNECT
200Gb/s HDR NON-BLOCKING

CURRENT PROJECT ALLOCATIONS

Supernova explosion in the interstellar medium.  Nina Sartorio, Ilse De Looze, Florian Kirchschlager, Mike Barlow, Franziska Schmidt

DATA INTENSIVE CAMBRIDGE

Statistical fluctuations of energy level spacings for a supersymmetric Yang-Mills-like model. Pavel Buividovich

DATA INTENSIVE LEICESTER

EXTREME SCALING EDINBURGH

Massless radiation from an axion string. Amelia Drew, Carson Brownlee, Paul Shellard

MEMORY INTENSIVE DURHAM
