DiRAC is recognised as the primary provider of HPC resources to the STFC particle physics, astroparticle physics, astrophysics, cosmology, solar system and planetary science, and nuclear physics (PPAN: STFC Frontier Science) theory community. It provides the modelling, simulation, data analysis and data storage capability that underpins the STFC Science Challenges and our researchers’ world-leading science outcomes.

On this page you can find information on:

DiRAC Projects:

Accessing DiRAC:

  • The STFC Resource Allocation Committee, which allocates compute time and storage to our open projects.
  • The annual STFC Call for Proposals to request access to DiRAC resources.
  • The Director’s Discretionary Time, which allows flexible allocation to four classes of project.
  • Seedcorn Time, for researchers who would like to try the DiRAC resources, get a feel for HPC, test codes, run benchmarks or see what DiRAC can do for them before making a full application for resources.

Acknowledge DiRAC:

Each year we publish a selection of Science Highlights and a full list of publications from all our projects.

If you can’t find what you need on these pages, please email the Project Office.

DiRAC Projects

DiRAC serves over 35 projects, with more than 400 active users. Our community is diverse, encompassing particle physics, astrophysics, cosmology and nuclear physics. Together, their research addresses all the STFC Science Challenges.

Project Allocations

After each Call has concluded, we publish the compute time allocated to each project, in CPU-Mhours on the DiRAC service on which its calculations are to be run, as well as the storage space (in TB) allocated. DiRAC operates three compute services and, because each service has a different architecture, the CPU-Mhours awarded to projects may not be directly comparable across services.

User Accounts

Projects and accounts on DiRAC resources are administered through the DiRAC SAFE. SAFE stands for Service Administration From EPCC. It is a large web-based application provided by EPCC for DiRAC. The same software is used by the ARCHER UK national supercomputing service and by other facilities at EPCC.

Every DiRAC user has an account on SAFE. You can use your SAFE account to check your current and past use of CPU time and disk space, to apply to join other projects and create service machine accounts, to change passwords, to keep your personal details up to date, to check the progress of the helpdesk queries you have submitted, and so on. PIs and project managers can do many other things, such as viewing usage by each member of their project team.

The DiRAC team also uses SAFE to administer the system and to generate reports. The helpdesk software is also part of SAFE. If you are a PI or project manager you can manage your project via SAFE.

You can find all the information you need on SAFE here. For any other information please email DiRAC Support.

Accessing DiRAC

STFC Resource Allocation Committee (RAC)

The Resource Allocation Committee (RAC) is responsible for overseeing the allocation process for all DiRAC resources, including compute time, storage and Research Software Engineering effort.

Call for Proposals

The RAC issues an annual Call for Proposals for time on our Resources.

Notification about any Call will be posted on our News page, and on Twitter. 

Any Call Announcement, Guidance Notes and Application Forms are available on our Call for Proposals page.

Acknowledge DiRAC

If you have used the DiRAC Resources available since 1st May 2018 for your publication, the new acknowledgement statements for each machine can be found here.

If you have used the DiRAC Resources prior to 30th April 2018 for your publication, please use the statements found here.


What makes DiRAC special…

DiRAC was established to provide distributed High Performance Computing (HPC) services to the STFC theory community. HPC-based modelling is an essential tool for the exploitation and interpretation of observational and experimental data generated by astronomy and particle physics facilities supported by STFC, as this technology allows scientists to test their theories and run simulations using the data gathered in experiments. The UK has an extremely strong HPC community, and these powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, from simulating the entire evolution of the universe, from the big bang to the present, to modelling the fundamental structure of matter. DiRAC is both an academic-led and an academic-supervised facility, and our systems are specifically designed to meet the different high performance computational needs within our scientific community.

DiRAC provides a variety of compute resources, matching machine architecture to the differing algorithm designs and requirements of the research problems to be solved. There are sound scientific reasons for designing the DiRAC services in this way, and the methodology was adopted following a number of in-depth reviews involving the STFC research community. The bespoke demands of the different research domains supported by STFC are such that a distributed installation was the most cost-effective way to satisfy the varied scientific requirements.

As a single, federated Facility, DiRAC allows more effective and efficient use of computing resources, supporting the delivery of the science programmes across the STFC research communities, addressing all the STFC Science Challenges. It provides a common training and consultation framework and, crucially, provides critical mass and a coordinating structure for both small and large scale cross-discipline science projects, the technical support needed to run and develop a distributed HPC service, and a pool of expertise to support knowledge transfer and industrial partnership projects. The on-going development and sharing of best-practice for the delivery of productive, national HPC services within DiRAC enables STFC researchers to deliver world-leading science across the entire STFC theory programme in particle physics, astrophysics and cosmology, solar system physics, particle astrophysics and nuclear physics.

As was originally envisaged, DiRAC has become a vibrant research space, both in terms of science and in terms of technical development. These two aspects of our activities are intimately linked, each feeding back into the other and driving research excellence in theoretical simulation and modelling alongside world-leading technical innovation. DiRAC’s technical achievements are as important as our scientific achievements; they are key to our scientific impact and to our impact on the UK economy as a whole.


August 2023

The University of Leicester have two job opportunities within Digital Services for Linux Systems Administrators

Linux Systems Administrator (Research Computing – Fixed Term Contract)

There is an exciting opportunity for a Linux Systems Administrator to join the Research Computing Services team on an initial fixed term contract through to March 2026, where you will get to play a key role in supporting the DiRAC High Performance Computing capability.


Linux Systems Administrator

There is an exciting opportunity for a Linux Systems Administrator to join the established Infrastructure and Operations team, where you will take a lead on Linux user administration, proactive and reactive security patching, monitoring, Azure Cloud tools/Defender for servers antivirus, and the day-to-day administration of Linux servers on physical and VMware hosts.


The closing date for both posts is September 3rd.

Project Office Co-ordinator and Facility Digital Content & Publicity Manager Positions available at DiRAC

DiRAC are currently looking to recruit a Project Office Co-ordinator and Facility Digital Content & Publicity Manager to work within the Project Office.

For further information, and to apply, please follow these links:

Simulations reveal unprecedented details of a star’s evolutionary phase

An international team of researchers have used DiRAC to study a “nuclear burning phase” of a star’s evolution in unprecedented levels of detail and realism.

Most of our understanding of stars and their life cycles comes from one-dimensional models, which are severely limited in the amount of detail they can provide. The complex processes inside stars also introduce many uncertainties, which can make these simulations unreliable.

The computational power of DiRAC’s COSMA8 machine at Durham has for the first time enabled a 3D simulation of an entire phase in the life of a 20 solar mass star. Working in three dimensions faithfully captures processes such as convection, and the energy transfer from the star’s convective regions to the surrounding regions where the transfer is radiative. The simulation follows a portion of the star from the early development of the neon-burning phase, which fuels the star’s energy production via fusion into nuclei such as oxygen, silicon and magnesium, through to its complete exhaustion after a matter of hours or days. The findings, published in Monthly Notices of the Royal Astronomical Society, provide crucial answers to long-debated questions in stellar physics.
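The idea of following a burning phase from ignition through to its complete exhaustion can be caricatured with a zero-dimensional fuel-depletion model. The sketch below is purely illustrative, with invented numbers and none of the paper’s 3D hydrodynamics: the burning rate is taken to rise with the square of the remaining fuel fraction, so the phase is intense at first and then dies away as the fuel runs out.

```python
# Toy zero-dimensional caricature of a nuclear burning phase (all numbers
# invented; the published work solves full 3D hydrodynamics): fuel is
# consumed at a rate proportional to the square of the remaining fuel
# fraction, so burning is vigorous early on and fades as the fuel runs out.

def burn(fuel=1.0, rate=0.5, dt=0.01, threshold=0.05):
    """Integrate dX/dt = -rate * X**2 with forward Euler until the fuel
    fraction X drops below `threshold`; return the number of steps taken
    (a proxy for the duration of the burning phase)."""
    steps = 0
    while fuel > threshold:
        fuel -= rate * fuel ** 2 * dt
        steps += 1
    return steps

# A faster reaction rate exhausts the fuel sooner, ending the phase earlier.
print(burn(rate=0.5), burn(rate=1.0))
```

The qualitative point matches the authors’ description: once the reactions consume the fuel, the energy source shuts off and the phase dies.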

Lead author Federico Rizzuti, a PhD student from Keele University, said:

“For this new publication, we have run 3D simulations of stellar interiors for long enough to see the evolution of one entire ‘nuclear burning phase’, allowing us to study in detail how a nuclear burning phase develops and eventually dies, particularly the complex interaction between nuclear reactions and turbulence in the stellar layers, with a new degree of precision and realism.

“We have found that the nuclear reactions are really efficient during this phase, and soon they consume all the fuel, halting also the movement of elements across the star’s different layers. We were also able to study which chemical elements were consumed and produced during this phase.
“This will give us new information on how stars live and die, and whether they produce supernova explosions, neutron stars and black holes when they die. Our work demonstrates it is finally possible to simulate long portions of a star’s life with 3D models, and we are sure that soon we will see more: this is why we call it the ‘dawn of 3D stellar evolution’.”

The research was supported by the Science & Technology Facilities Council (STFC), ChETEC COST Action (CA16117) and the EU Horizon 2020 programme.

Link to full research

July 2023

New supercomputer simulation to test model behind Universe’s formation

An international team of astrophysicists has simulated galaxy formation and large-scale cosmic structure with unprecedented statistical detail to investigate how the Universe formed. 

The team, including Dr Sownak Bose, Professor Carlos Frenk and other researchers at Durham University, say their MillenniumTNG supercomputer simulations will allow scientists to carry out precision tests of the standard cosmological model. 

Known as Lambda-CDM, the standard cosmological model is used by physicists to explain the formation of the Universe following the Big Bang. 

The researchers say their simulations are essential for interpreting existing and new observational studies – such as the surveys being carried out by the James Webb Space Telescope and the recently launched Euclid satellite – allowing scientists to investigate the nature of dark energy and dark matter by comparing the actual Universe to virtual universes created in a supercomputer. 

Dark energy is thought to be behind the accelerating expansion of the Universe, while dark matter is the structural backbone — not visible through telescopes — upon which galaxies eventually form. 

Both make up the majority of the Universe’s total content (with the remaining five per cent being stars, planets and galaxies) but scientists do not know what they are made of. 

The first results of the MillenniumTNG project will be published in a series of ten articles in the journal Monthly Notices of the Royal Astronomical Society. 

Dr Sownak Bose, Assistant Professor (Research), in Durham University’s Institute for Computational Cosmology, said: “The nexus between high precision observational data and ambitious, state-of-the-art cosmological simulations like MillenniumTNG is critical in advancing our understanding of how galaxies form and evolve over cosmic history.  

“This is an important step in the pathway to realising cosmologists’ ultimate ambition: to use the observed galaxy population to test the standard model of cosmology, and decode the mysterious entities of dark matter and dark energy.”  

MillenniumTNG is led by researchers at the Max Planck Institute for Astrophysics, Germany, Harvard University, USA and Durham University. It also includes researchers at York University, Canada, and the Donostia International Physics Center, Spain. 

Building upon previous successes with the Millennium and IllustrisTNG projects, they developed a new suite of simulation models – named MillenniumTNG (Millennium, the next generation)– which trace the physics of cosmic structure formation with considerably higher statistical accuracy than previously possible. 

They employed the code AREPO to follow the processes of galaxy formation directly, throughout volumes still so large that they could be considered representative of the Universe as a whole. 

Comparing simulations with and without galaxies gives a precise assessment of the impact of “normal” matter related to supernova explosions and supermassive black holes on the total matter distribution.  

This is important when interpreting upcoming observations correctly, such as so-called weak gravitational lensing effects – where light is warped by the mass of another object – which respond to matter whether it is dark or normal.  

The researchers used two extremely powerful supercomputers, the SuperMUC-NG machine at the Leibniz Supercomputing Centre in Garching, Germany, and the Cosma 8 machine hosted by Durham University on behalf of the UK’s DiRAC High-Performance Computing facility.  

MillenniumTNG is tracking the formation of about one hundred million galaxies in a region of the Universe around 2,400 million light-years across. This calculation is about 15 times bigger than the previous best in this category, the TNG300 model of the IllustrisTNG project.  

Using COSMA 8, the team also computed an even bigger volume of the Universe — covering a region nearly 10 billion light-years across — filled with more than a trillion particles to represent dark matter and more than 10 billion particles to track massive neutrinos. To tackle this enormous computational challenge, the team used the advanced cosmological code, GADGET-4, which was custom-built for this purpose. 

Neutrinos are subatomic particles that rarely interact with normal matter and previous simulations had usually omitted them for simplicity, because they make up at most one to two per cent of dark matter’s mass and do not clump together. 

However, cosmological surveys such as Euclid and the Dark Energy Spectroscopic Instrument (DESI) survey, in both of which Durham plays a key role, will be precise enough to detect the percent-level effects. This raises the prospect of measuring the neutrino mass itself, a profound open question in particle physics. 

Professor Volker Springel, of the Max Planck Institute for Astrophysics, said: “MillenniumTNG combines recent advances in simulating galaxy formation with the field of cosmic large-scale structure, allowing an improved theoretical modelling of the connection of galaxies to the dark matter backbone of the Universe.  

“This may well prove instrumental for progress on key questions in cosmology, such as how the mass of neutrinos can be best constrained with large-scale structure data.” 

Dr Bose’s role in the research was funded through a UK Research and Innovation Future Leaders Fellowship grant. Professor Frenk is the recipient of a European Research Council Advanced Investigator Grant. COSMA/DiRAC is funded by the UK’s Science and Technology Facilities Council. 

Figure 1: Projections of gas (top left), dark matter (top right), and stellar light (bottom centre) for a slice in the largest hydrodynamical simulation of MillenniumTNG at the present time. The slice is about 35 million light-years thick. The projections show the vast range of physical scales in the simulation, from the full volume, about 2,400 million light-years across, to an individual spiral galaxy (final round inset) with a radius of approximately 150,000 light-years. The underlying calculation is presently the largest high-resolution hydrodynamical simulation of galaxy formation, containing more than 160 billion resolution elements. Credit: MPA  

Figure 1(a) – 100mpc.jpg: Top section of Figure 1 minus annotation. Image shows projections of gas (top left), dark matter (top right), and stellar light (bottom centre) from the MillenniumTNG simulation at 100 megaparsecs. Credit: MPA. 

Figure 1 (b) – 10mpc.jpg: Bottom left section of Figure 1 minus annotation. Image shows projections of gas (top left), dark matter (top right), and stellar light (bottom centre) from the MillenniumTNG simulation at ten megaparsecs. Credit: MPA. 

Figure 1 (c) – 1mpc.jpg: Bottom right section of Figure 1 minus annotation. Image shows projections of gas (top left), dark matter (top right), and stellar light (bottom centre) from the MillenniumTNG simulation at one megaparsec. Credit: MPA. 

Figure 2: Comparison of the neutrino (top) and dark matter (bottom) distributions on the past backwards lightcone of a fiducial observer positioned at the centre of the two horizontal stripes. As cosmic expansion slows down the neutrinos at late times (small redshift/distance), they start to weakly cluster around the biggest concentrations of dark matter as shown by a comparison of the zoomed insets. This slightly increases the mass and further growth rate of these largest structures. Credit: MPA 

Figure 3: Galaxy distribution on the past backwards lightcone in MillenniumTNG, where the galaxies are predicted with a sophisticated semi-analytic model on top of the dark matter backbone. Credit: MPA

Full press release available here.

Source Information 

  • The MillenniumTNG Project: High-precision predictions for matter clustering and halo statistics 
    C. Hernández-Aguayo, V. Springel, R. Pakmor, M. Barrera, F. Ferlito, S. D. M. White, L. Hernquist, B. Hadzhiyska, A. M. Delgado, R. Kannan, S. Bose, C. Frenk 
    MNRAS, July 2023 
  • The MillenniumTNG Project: The hydrodynamical full physics simulation and a first look at its galaxy clusters 
    R. Pakmor, V. Springel, J. P. Coles, T. Guillet, C. Pfrommer, S. Bose, M. Barrera, A. M. Delgado, F. Ferlito, C. Frenk, B. Hadzhiyska, C. Hernández-Aguayo, L. Hernquist, R. Kannan, S. D. M. White 
    MNRAS, July 2023 
  • The MillenniumTNG Project: Semi-analytic galaxy formation models on the past lightcone 
    M. Barrera, V. Springel, S. White, C. Hernández-Aguayo, L. Hernquist, C. Frenk, R. Pakmor, F. Ferlito, B. Hadzhiyska, A. M. Delgado, R. Kannan, S. Bose 
    MNRAS, submitted 
  • The MillenniumTNG Project: The galaxy population at z ≥ 8 
    R. Kannan, V. Springel, L. Hernquist, R. Pakmor, A. M. Delgado, B. Hadzhiyska, C. Hernández-Aguayo, M. Barrera, F. Ferlito, S. Bose, S. D. M. White, C. Frenk, A. Smith, E. Garaldi 
    MNRAS, July 2023 
  • The MillenniumTNG Project: Refining the one-halo model of red and blue galaxies at different redshifts 
    B. Hadzhiyska, L. Hernquist, D. Eisenstein, A. M. Delgado, S. Bose, R. Kannan, R. Pakmor, V. Springel, S. Contreras, M. Barrera, F. Ferlito, C. Hernández-Aguayo, S. D. M. White, C. Frenk  
    MNRAS, July 2023 
  • The MillenniumTNG Project: An improved two-halo model for the galaxy-halo connection of red and blue galaxies 
    B. Hadzhiyska, D. Eisenstein, L. Hernquist, R. Pakmor, S. Bose, A. M. Delgado, S. Contreras, R. Kannan, S. D. M. White, V. Springel, C. Frenk, C. Hernández-Aguayo, F. Ferlito, M. Barrera 
    MNRAS, July 2023 
  • The MillenniumTNG Project: The large-scale clustering of galaxies 
    S. Bose, B. Hadzhiyska, M. Barrera, A. M. Delgado, F. Ferlito, C. Frenk, C. Hernández-Aguayo, L. Hernquist, R. Kannan, R. Pakmor, V. Springel, S. D. M. White 
    MNRAS, July 2023 
  • The MillenniumTNG Project: Inferring cosmology from galaxy clustering with accelerated N-body scaling and subhalo abundance matching 
    S. Contreras, R. E. Angulo, V. Springel, S. D. M. White, B. Hadzhiyska, L. Hernquist, R. Pakmor, R. Kannan, C. Hernández-Aguayo, M. Barrera, F. Ferlito, A. M. Delgado, S. Bose, C. Frenk 
    MNRAS, July 2023 
  • The MillenniumTNG Project: Intrinsic alignments of galaxies and halos 
    A. M. Delgado, B. Hadzhiyska, S. Bose, V. Springel, L. Hernquist, M. Barrera, R. Pakmor, F. Ferlito, R. Kannan, C. Hernández-Aguayo, S. D. M. White, C. Frenk 
    MNRAS, July 2023 
  • The MillenniumTNG Project: The impact of baryons and massive neutrinos on high-resolution weak gravitational lensing convergence maps 
    F. Ferlito, V. Springel, C. T. Davies, C. Hernández-Aguayo, R. Pakmor, M. Barrera, S. D. M. White, A. M. Delgado, B. Hadzhiyska, L. Hernquist, R. Kannan, S. Bose, C. Frenk 
    MNRAS, submitted

May 2023

Call for applications to join the DiRAC Technical Directorate

The DiRAC Project Board invites applications for membership of the DiRAC Technical Directorate from June 2023. There is currently one vacant position.

Full details, including job description, person specifications, and application process can be found here.

April 2023

The STFC Membership Call has just been launched and there are a number of vacancies on the DiRAC Resource Allocation Committee for both the Astronomy and Cosmology Sub-Panel and the Particle Physics and Nuclear Theory Sub-Panel.  

All of the information can be found at this link:  Call for applications to STFC advisory bodies and peer review panels – UKRI

March 2023

Numerical ray tracing reveals one of the biggest black holes ever found

A team of astronomers led by Dr James Nightingale from the Centre for Extragalactic Astronomy at Durham University has discovered one of the biggest black holes ever found, exploiting a phenomenon called gravitational lensing, whereby the gravitational field of a foreground galaxy both bends and magnifies the light from a more distant luminous object, altering the far object’s appearance as viewed from Earth. The team used the DiRAC facility to model how light is bent by the distribution of matter inside the galaxy Abell 1201, roughly 2 billion light years distant, and found at its heart an ultramassive black hole (BH) over 30 billion times the mass of our Sun, equivalent to roughly 30% of the total mass of stars in the Milky Way – a scale rarely seen by astronomers.

Profiting from the computational power available at both the DiRAC Data Intensive Service (CSD3) at Cambridge and the Memory Intensive Service (COSMA8) at Durham, the team used numerical ray-tracing techniques to reconstruct the appearance of the distant object after lensing by the gravitational field of the foreground galaxy. The calculation was repeated hundreds of thousands of times in a nested sampling process, each time including a different-mass BH to augment the visible and dark matter distribution, until the optimum match to actual images captured by the Hubble Space Telescope was achieved (see accompanying video). This is the first measurement of a BH mass using this technique, and the findings are published in the journal Monthly Notices of the Royal Astronomical Society.
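The repeated model-comparison loop described above can be caricatured in a few lines. The sketch below is purely illustrative and is not the team’s pipeline (which performs full 2D ray tracing and nested sampling over many parameters): it fits the mass of a toy point-mass lens by scanning candidate masses and keeping the one whose predicted image positions best match mock observations. All names, units and numbers here are invented for the example.

```python
import math

# Illustrative only: a point-mass lens maps a source offset beta to two
# image positions theta satisfying beta = theta - theta_E**2 / theta,
# where the Einstein radius theta_E grows as sqrt(mass).

def theta_e_from_mass(mass):
    """Einstein radius in arbitrary units (theta_E proportional to sqrt(M))."""
    return math.sqrt(mass)

def image_positions(beta, theta_e):
    """Solve theta**2 - beta*theta - theta_e**2 = 0 for the two images."""
    disc = math.sqrt(beta ** 2 + 4.0 * theta_e ** 2)
    return (0.5 * (beta + disc), 0.5 * (beta - disc))

def fit_mass(observed, beta, candidate_masses):
    """Return the candidate mass whose predicted image positions minimise
    the squared misfit against the observed positions."""
    def misfit(mass):
        model = image_positions(beta, theta_e_from_mass(mass))
        return sum((o - m) ** 2 for o, m in zip(observed, model))
    return min(candidate_masses, key=misfit)

# Mock "observations" generated from a true mass of 30 (arbitrary units).
true_mass, beta = 30.0, 0.4
observed = image_positions(beta, theta_e_from_mass(true_mass))

# Coarse scan over candidate masses: a stand-in for the hundreds of
# thousands of nested-sampling evaluations run on CSD3 and COSMA8.
candidates = [m / 10.0 for m in range(100, 501)]
best = fit_mass(observed, beta, candidates)
print(best)  # → 30.0
```

In the real analysis each trial model requires a full ray-traced image reconstruction compared against Hubble data, which is why hundreds of thousands of evaluations demanded DiRAC-scale compute.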

According to Dr Nightingale, this BH is one of the biggest ever detected and close to the upper limit of how large they can grow in realistic cosmological scenarios. Most of the largest BHs that we know of are in an active state, where matter pulled in close to the black hole heats up and releases energy in the form of light, X-rays, and other radiation. The gravitational lensing technique makes it possible to study inactive BHs, something not currently possible in distant galaxies. The study opens up the tantalising possibility that astronomers can discover far more inactive and ultramassive BHs than previously thought, thereby offering a clue as to how they grew so large.

The research was supported by the UK Space Agency, the Royal Society, the Science and Technology Facilities Council (STFC) and the European Research Council.

Press releases pertaining to the research can be found below:

James Nightingale was also announced as one of this year’s Ernest Rutherford Fellowship winners, associated with Newcastle University. Information here.

February 2023

Job Opportunity

Technical Manager for High Performance Computing
Department of Physics
Grade 7: £35,333 – £37,474 per annum (pro rata)
Fixed Term – Full Time

Job Opportunity

Research Software Engineer
Department of Physics
Grade 7: £35,333 – £42,155 per annum
Fixed Term – Full Time
Contract Duration: 36 months
Contracted Hours per Week: 35

July 2022

The Government has launched an independent review of the future of compute. This review will consider what the UK’s advanced compute needs will be in the next decade across the entire economy, and how government should meet them.

The review will be grounded in evidence and so the Secretariat has launched a call for evidence. This provides the opportunity to engage with a wide range of experts and interested parties, within the short time available, and ensure the best quality input is received. We are keen for you and your network to contribute.

The deadline for contributions is 5th August, and we would be hugely grateful if you are able both to respond and to share the call for evidence with your network where relevant. 

Any queries or evidence should be sent to the dedicated email address. 

Pre-announcement: DiRAC Resource Allocation Committee 15th Call for Proposals 

The DiRAC Resource Allocation Committee 15th Call for Proposals will be opening soon.  The UK theory and modelling communities in Astronomy and Cosmology, Astrophysics, Particle Physics and Nuclear Physics will be invited to apply for computational resources on the STFC DiRAC High Performance Computing Facility.    

The deadline for proposal submissions to the 15th Call will be Tuesday 4th October 2022, 16:00 UK time. The following documents should be sent directly to STFC by email by the deadline: 

  • Full scientific proposal – scientific application form and case for support 
  • Technical application form   
  • RSE Support proposals (if RSE support is requested)

Please note that all applicants must submit a technical case otherwise a full proposal submission will not be accepted.  The technical case does not need to be submitted in advance and should be sent direct to STFC along with the full scientific proposal, by the closing date.  

Successful awards will be scheduled to begin on 1st April 2023. 

All proposal types will be accepted at this call, including Research Software Engineering Support.  

Seedcorn proposals can be submitted at any time.

The application forms and guidance notes, plus descriptions of the DiRAC services, will be available soon. 


Enquiries should be directed as follows: 

We are partnering with the UCL Centre for Space Exoplanet Data and the ESA Ariel Space Mission to bring you one of the Data Challenges for the Neural Information Processing Systems Conference. The challenge is open to all participants worldwide and offers prizes of $2000, $1000 or $500 for the most successful solutions. In an effort to make this year’s competition even more inclusive, DiRAC is offering high performance computing GPU time for participants around the world who may not have access to HPC resources. The competition runs from 30th June until early October. 

This year’s challenge focuses on the problem of modelling exoplanets’ atmospheres. This is a difficult task, with current MCMC approaches struggling to handle the data from the roughly 5,000 known exoplanets. Participants aim to produce a method for characterising exoplanets from a dataset of synthetic spectra, using any algorithm or model in any language or environment.

Find out more about the objective of the challenge here:

And enter the competition at:

May 2022

Professor Christine Davies has been awarded the RSE/Lord Kelvin Medal for her outstanding contribution to theoretical particle physics. Through her research, Professor Davies has developed techniques for accurate calculations in strong interaction physics that enable stringent tests of the Standard Model. Professor Davies has also extensively championed diversity, inclusion and public engagement.

The Royal Society of Edinburgh, Scotland’s National Academy, announced the winners of its highly prestigious medals here.

Congratulations Professor Davies!

Dr Bipasha Chakraborty has been awarded the first Joseph Fourier Prize, presented by Atos and Hartree Centre for her research project ‘Quantum Computation of Quantum Field Theories’.

We would like to warmly congratulate Dr Chakraborty on being awarded the prize.

More information about her award winning research and the Joseph Fourier Prize can be found here.

April 2022

Mysteries of gas giants known as ‘hot Jupiters’ unravelled

One of the largest ever surveys of exoplanet atmospheres, using high performance computers at DiRAC, has analysed the atmospheres of 25 hot Jupiters using data from about 1,000 hours of telescope observations.

The study involved an international team of researchers from UCL, Queen Mary University of London, University of Exeter, CEA, LISA, the University of Zurich, the University of Florence, the Flatiron Institute in the US, and the National Astronomical Observatory of Japan (NAOJ) and The Graduate University for Advanced Studies, SOKENDAI, Japan.

UCL press-release:

ESA press-release:

Paper link:

February 2022

DiRAC’s Memory Intensive Service at Durham helps scientists produce the largest and most accurate virtual representation of the universe to date.

The SIBELIUS Project ran their constrained realisation simulation, “SIBELIUS-DARK”, over several weeks and produced over one Petabyte of data. The simulation recreates the formation of our current Universe out to 200 Mpc, starting from the Big Bang. It is a dark-matter-only simulation combined with the semi-analytic galaxy formation model ‘Galform’. Individual massive galaxy clusters that we see in the sky around us can be identified in the simulation. The study found that our Local Universe has a slightly lower density than the average over cosmic scales.

Stuart McAlpine, John C Helly, Matthieu Schaller, Till Sawala, Guilhem Lavaux, Jens Jasche, Carlos S Frenk, Adrian Jenkins, John R Lucey, Peter H Johansson, SIBELIUS-DARK: a galaxy catalogue of the Local Volume from a constrained realisation simulation, Monthly Notices of the Royal Astronomical Society, 2022, stac295.

The full findings were published in the Monthly Notices of the Royal Astronomical Society.

November 2021

Anushka Sharma, Senior Technical Programme Coordinator at DiRAC, in top 100 list of Tech Women

Anushka Sharma has been named in WeAreTechWomen’s 2021 TechWomen100 Awards.

The award celebrates remarkable women within the technology and STEM sectors.

Congratulations Anushka!

October 2021

DiRAC HPC-AI Advisory Council UK Conference: 13/14th October – registration now open!

The 3rd annual HPC-AI Advisory Council UK conference is taking place virtually next Wednesday and Thursday, 13/14th October, from 2-5:30pm.

Co-hosted by DiRAC, we have a very interesting set of talks over the two days:

Registration is free:

We hope that many of you will attend this event, which is a timely opportunity to highlight the importance of large-scale computing for the UK.

Update: DiRAC Resource Allocation Committee 14th Call for Proposals 

We would like to advise the Community that there is no longer any additional time available in RAC14 on CSD3 Skylake. Apologies to any groups who were intending to apply for resources on CSD3 Skylake. In this case, you may wish to consider applying for resources on the CSD3 Cascade Lake or Ice Lake services, as these are both Intel x86 systems. If you need further information, please see the DiRAC website or contact

DiRAC has recorded two webinars to provide assistance with the completion of the RAC14 Technical application and the RAC14 RSE Support application. These webinars can be found on the DiRAC website.

Please be reminded that the deadline for proposal submissions to the 14th Call is Tuesday 5th October 2021 at 16:00 UK time. 

Full information including the Call application forms and guidance notes can be found on the DiRAC website


Enquiries should be directed as follows: 

• RAC process and remit: STFC Swindon Office: 

• Technical questions: RSE Team: 

• Direct allocations or discretionary requests: DiRAC Director, Prof Mark Wilkinson ( 

August 2021

STFC-DiRAC Federation Project: Senior Project Manager Role

DiRAC is currently recruiting a Senior Project Manager to oversee the delivery of an exciting £1.9m project, focusing on preparations towards Federation of the facility with other UK Research communities.

The post will be for 12 months, working remotely.

Secondments of experienced people to take up this role would be considered.

Job Description/Person Specification

Please contact Dr Clare Jenner ( with any questions, or with CV and covering letter to apply.

Deadline 31st August 2021.

HPC Vacancies at Durham

We are recruiting for a couple of HPC roles within the national DiRAC HPC facility at Durham University (COSMA).

1. A Technical Manager for HPC
2. A Lead Database Designer and Engineer

Both positions are open for full or part time working. Deadline 6th Sept.

The charming strangeness of the W boson

Theorists in the HPQCD collaboration have pinned down Vcs, a key parameter of the Standard Model, using STFC’s DiRAC Data Intensive supercomputer at Cambridge.

Vcs is determined by combining the theoretical calculation with results from particle physics experiments around the world for the proportion of D mesons that decay to a K meson in a process akin to nuclear beta decay.

This rate depends on the coupling strength Vcs between the W boson of the weak interaction and the charm-strange quark pair, but also on the strong interaction physics, encoded by ‘form factors’, that binds the quarks inside the mesons while this process happens.

The numerical techniques of lattice QCD allow the form factors to be calculated, but in the past their uncertainty has limited the precision of Vcs.

Using improved methods for handling quarks, developed by HPQCD, physicists in Cambridge and Glasgow have now obtained a value for Vcs of 0.9663(80), three times more accurate than previous work. This allows Vcs to be distinguished from 1 for the first time, giving tighter constraints on the possibilities for new physics beyond the Standard Model.
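As a quick back-of-the-envelope illustration of what that precision means (not a calculation from the paper itself), the compact notation 0.9663(80) denotes a central value of 0.9663 with an uncertainty of 0.0080 on the final digits, from which one can estimate how many standard deviations separate the result from exactly 1:

```python
# Illustrative check (not from the HPQCD paper): unpack the compact
# uncertainty notation 0.9663(80), i.e. central value 0.9663 with a
# standard error of 0.0080 on the last two digits.
vcs = 0.9663
err = 0.0080

# Number of standard deviations by which Vcs differs from exactly 1.
n_sigma = (1.0 - vcs) / err
print(f"Vcs sits about {n_sigma:.1f} standard deviations below 1")
```

A separation of roughly four standard deviations is what allows Vcs to be statistically distinguished from 1.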

Full details can be found here.

June 2021

DiRAC Director discusses Catalyst UK on This Week in HPC Podcast

DiRAC Director, Professor Mark Wilkinson was part of a panel discussing what the Catalyst UK Project has achieved so far on the latest This Week in HPC Podcast with Addison Snell.

You can listen to it here:

Dark matter is slowing the spin of the Milky Way’s galactic bar

The spin of the Milky Way’s galactic bar, made up of billions of clustered stars, has slowed by about a quarter since its formation, according to a new study by UCL and University of Oxford scientists.

The research, which made use of DiRAC’s CSD3 petascale HPC platform, analysed Gaia space telescope observations of a large group of stars.

Read more via UCL and New Scientist.

For the full publication, please visit the Royal Astronomical Society.

Atos supercomputer to help unlock secrets of the Universe

Atos today announces it has been awarded a contract by the University of Edinburgh to deliver its supercomputer, the BullSequana XH2000, the most energy-efficient supercomputing system on the market. This is the largest system dedicated to GPU computing deployed at a customer site in the UK. 

The new system will constitute the Extreme Scaling Service of the UK’s DiRAC HPC Facility. The state-of-the-art platform will allow scientists across the STFC theory community to drive forward world-leading research in particle physics, among other areas, using NVIDIA Graphics Processing Units (GPUs) and AMD processors.

You can read all about it here.

May 2021

Pre-announcement: DiRAC Resource Allocation Committee Call 13.5: Special Call for Proposals for DiRAC-3 Resources 

STFC will shortly open a Special Call of the DiRAC Resource Allocation Committee (RAC Call 13.5) for computing resources on the new DiRAC-3 systems. The UK theory and modelling communities in Astronomy and Cosmology, Astrophysics, Particle Physics and Nuclear Physics will be invited to apply for access to these computational resources. 

The Special Call will cover computing resources during a 6-month period from 1st October 2021 to 31st March 2022. A summary of the new hardware that will be available in this call is provided in the table below – full details of available core-hours will be provided with the call documents. 

Two types of application will be accepted: 

1. Uplift of an existing allocation: Projects with existing RAC allocations may request an uplift of up to 100% of their current allocation between October 2021 and March 2022. For example, in the case of a 100% uplift, the existing allocation for this 6 month period would then become twice the original allocation. 

2. New science allocations: Applicants may apply for up to 80% of the new resources based on a new science case. 

For both application types, a clear management plan must be submitted to show that sufficient staff effort is available within the duration of the award to make full use of the increased allocation. Any existing projects which are under-using their allocations are unlikely to be awarded an increased allocation unless there is clear evidence of a material change in anticipated usage (for example, a PDRA taking up a new post from October). 

An Expression of Interest must be submitted for both proposal types. Further information about what to include in the Expression of Interest will follow soon. 

The expected closing dates for proposal submissions to this Call will be as follows: 

Expression of Interest: Early June 2021 

Full proposal (Scientific and Technical Cases): End June 2021 

The Call documents, confirmed closing dates and further information on the peer review process will be available soon. 

This Call is in addition to the 14th RAC Call which will open later in the year for allocations starting from 1st April 2022. 


Enquiries should be directed as follows: 

  • RAC process and remit: STFC Swindon Office ( 
  • Technical questions: Technical Working Group ( 
  • Direct allocations or discretionary requests: DiRAC Director, Prof Mark Wilkinson ( 

Summary of new DiRAC-3 hardware

February 2021

Call for applications for DiRAC Senior Technical Programme Coordinator

The DiRAC Facility Management Team invites applications for the new position of Senior Technical Programme Coordinator.

The 0.6 FTE post is available immediately, and will initially run until 31st March 2023.

The closing date for applications is midnight on 14th March 2021.

Full details of the position and the application process can be found here.

January 2021

Durham University and DiRAC will deepen understanding of the universe with 2nd Gen AMD EPYC™ CPUs

Read the case study about how Durham University used AMD EPYC processors to enable much larger simulation data sets with faster execution to speed up the discovery process during cosmological investigations into the origins of the universe and the Big Bang.

January 2021

Call for applications for DiRAC Community Development Director

The DiRAC Facility Management Team invites applications for the new position of Community Development Director.

The post is available from 1st April 2021, and will initially run until 31st March 2023.

Applications should be submitted by e-mail to the DiRAC Director by 5pm on Monday, 8th February 2021.

Full details of the position and the application process can be found here.

January 2021

Fame Lab – Free Training in Public Engagement

FameLab is an ideal way to boost confidence, build networks and share research with new audiences. FameLab UK is open to anyone over 21 currently working in STEM, and the winners will participate in the FameLab UK final at Cheltenham Science Festival 2021!

This is a fantastic opportunity for anyone interested in:

  • Improving their communication skills
  • Sharing their research with a public audience
  • Joining a global network of researchers who enjoy talking about science

Training and heats are taking place in 9 regions around the UK, find your nearest one here.

Participants can take part in FameLab as many times as they like and keep improving each year until they become a UK finalist! 

January 2021

DiRAC placement opportunity: AI in multi-physics/multi-scale cosmological simulations

DiRAC will award one Innovation Placement in 2021 to explore the application of the HPE “SmartSim” AI library to cosmological simulations as a means to replace parameterised sub-grid physics models. The placement will be with Hewlett Packard Enterprise in collaboration with the University of Cambridge.

You must be working on research that falls within the STFC remit in order to qualify for the placement; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility, please contact Mark Wilkinson.

You must get your supervisor’s or PI’s permission before applying for this placement. Participation in the placement scheme is allowed under UKRI’s rules, but only with your supervisor/PI’s consent.

We will do our best to be flexible; part time working can be arranged as long as the placement does not exceed 9 months.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

The deadline for applications is 5pm on Monday 8th February 2021.

For more information, see the application form.

January 2021


Learn how to apply AI tools, techniques and algorithms to real-life problems. You will study the core concepts of Deep Neural Networks, how to build Deep Learning models as well as how to measure and improve the accuracy of your models. You will also learn essential data pre-processing techniques to ensure a robust Machine Learning pipeline. 
The Bootcamp is a hands-on learning experience where you will be guided through step-by-step instructions with teaching assistants on hand to help throughout.

This event will be taught by NVIDIA experts in an instructor-led programme using DiRAC’s Cambridge HPC GPU system.

For More information & registration:

January 2021

£20m funding boost for science supercomputer will “drive science simulation and UK-wide innovation”

Please find the full UKRI press release here.

The UK government has announced a £20m funding boost to upgrade the capabilities of the DiRAC High Performance Computing facility.

The upgrade will enhance the UK’s scientific leadership and productivity, driving ground-breaking discoveries in scientific research, with opportunities spread across the UK. It will support the training of the next generation of UK researchers and attract the world’s top computational researchers to the UK.

It will also support nationwide innovation with industry to develop solutions for exascale computing and Artificial Intelligence research with broad applications in personalised healthcare, clean energy, government decision-making and solar weather forecasting.

The new systems will be between three and five times more powerful than the existing DiRAC machines. This will provide crucial computing capacity that can be used to address immediate and emerging issues, like the COVID-19 pandemic.


The DiRAC facility was established in 2009 to provide high performance computing systems optimised for the specialist needs of scientists working at the cutting edge of theoretical astrophysics, particle physics, cosmology and nuclear physics. The DiRAC research community also exploits and interprets observational and experimental data generated by astronomy and particle physics facilities such as the Large Hadron Collider and the LIGO experiment.

DiRAC is a distributed facility, with computing resources hosted by the Universities of Cambridge, Durham, Edinburgh and Leicester. This is overseen by the Project Office at University College London. These powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, including simulations of the entire evolution of the universe, from the Big Bang to the present, and models of the fundamental structure of matter.

DiRAC has now been awarded £20 million from the World Class Laboratories opportunity to deploy DiRAC-3 – a major upgrade in the computing power at all four DiRAC sites. Crucially, the new systems will also be up to ten times more energy efficient than previous generations, an important step towards delivering sustainable computing resources for the UK.

A welcome announcement

Describing the announcement as “very welcome good news for UK science”, the DiRAC Director, Professor Mark Wilkinson from the University of Leicester, explained the importance of this investment for the UK:

“Today, high performance computing (HPC) underpins discoveries in almost all areas of science and innovation. Numerous studies have demonstrated the significant economic benefits of investment in high performance computing and confirmed that ‘to out-compute is to out-compete’.”

“The DiRAC HPC facility is an outstanding example of HPC-driven innovation in action. While it was originally established to support the UK’s world-leading research in particle physics, astrophysics, cosmology and nuclear physics, DiRAC has delivered technological innovations with global impact and developed techniques now being applied in fields as diverse as personalised medicine, government planning and solar weather forecasting.”

DiRAC Project Scientist and Deputy Director, Dr Clare Jenner from University College London, noted that the science areas impacted will “range from the subatomic to the intergalactic. Theoretical research nowadays relies on supercomputers – we can’t do the calculations in any other way. So, the DiRAC computers are vital to the future success of the UK in these fields.”

The UK Science and Technology Facilities Council (STFC) Executive Chair, Professor Mark Thomson, said:

“STFC are delighted at the announcement of new funding for the DiRAC HPC facility, to ensure that it can continue to support research in fields where the UK is world-leading.”

DiRAC Director of Innovation and Technology, Dr Jeremy Yates of University College London also emphasised that the impact of DiRAC extends much further than the scientific breakthroughs it delivers:

“We are also contributing to the delivery of the UK’s innovation agenda. We work with our industry partners to develop novel hardware and software solutions which can be used in many other applications.”

The new computers will be deployed over the coming months, with first scientific results expected to be presented in September at DiRAC Day 2021, the annual community event.

You can find further coverage of this item on:

December 2020

DiRAC Health Data Science and AI Placement Opportunity.

DiRAC will award one Innovation Placement in 2021 in the area of Health Data Science and the application of AI. The nominal length is 6 months and the placement has to be completed by 30 September 2021. In this scheme a final-year PhD student or an early-career researcher can have a funded placement (up to £25k) with the Getting It Right First Time (GIRFT) programme. GIRFT is funded by the UK Department of Health and Social Care and is a collaboration between NHS England & NHS Improvement and the Royal National Orthopaedic Hospital NHS Trust. GIRFT uses comprehensive benchmarking data analysis to identify unwarranted variation in healthcare provision and outcomes in National Health Service (NHS) hospitals in England, and combines this with deep-dive visits to the hospitals by clinicians, with follow-up on agreed actions by an improvement team. The programme covers the majority of healthcare specialities.

To qualify for the placement you must be working on research that falls within the STFC remit; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility please contact Jeremy Yates and Maria Marcha.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

The deadline for applications is 10am on Monday 11th January 2021.

Further information can be found in this document.

December 2020:

Unlocking the mystery of the Moon’s formation

Astronomers have taken a step towards understanding how the Moon might have formed out of a giant collision between the early Earth and another massive object 4.5 billion years ago.

Scientists led by Durham University, UK, used a DiRAC supercomputer to simulate a Mars-sized planet – called Theia – crashing into the early Earth.

Lead author Sergio Ruiz-Bonilla, a PhD researcher in Durham University’s Institute for Computational Cosmology, said: “By adding different amounts of spin to Theia in simulations, or by having no spin at all, it gives you a whole range of different outcomes for what might have happened.”

November 2020:

2021 Code Performance Series: From analysis to insight

Starting in January, Durham University is hosting a series of seven monthly workshops based around performance analysis for Exascale software. The series could be of interest to anyone working on HPC codes, and aims to upskill researchers in this key area.

To register, please visit this link:

This workshop series is run by the Durham University’s Department of Computer Science in collaboration with the N8 and DiRAC, in close collaboration with the VI-HPS, and made possible by support from the UK’s ExCALIBUR programme.

October 2020:

Professor Carlos S Frenk
Institute of Computational Cosmology, Durham University, & DiRAC

We would like to congratulate Carlos on being awarded the Institute of Physics 2020 Paul Dirac Medal and Prize for theoretical (including mathematical and computational) physics.  

For outstanding contributions to establishing the current standard model for the formation of all cosmic structure, and for leading computational cosmology within the UK for more than three decades.

The full citation is at

Modelling temperature variation on distant stars

A team led by Dr Andrei Igoshev at the University of Leeds is helping to explain one of the big questions that has perplexed astrophysicists for the past 30 years – what causes the changing brightness of distant stars called magnetars.

The team developed a mathematical model that simulates how the magnetic field disrupts the conventional uniform distribution of heat, resulting in hotter and cooler regions whose temperatures may differ by one million degrees Celsius.

The team used the STFC-funded DiRAC supercomputing facilities at the University of Leicester. 

Read more about it here.

September 2020:

The Earth could have lost anywhere between 10 and 60% of its atmosphere in the collision that is thought to have formed the Moon!

New research led by astronomers at Durham University shows how the extent of atmospheric loss depends upon the type of giant impact with Earth.

They ran more than 300 supercomputer simulations to study the consequences of different huge collisions on rocky planets with thin atmospheres.

Read all about it here and here.

September 2020:

DiRAC contributes to a new Calculation that Refines Comparison of Matter with Antimatter

A new calculation performed using the world’s fastest supercomputers allows scientists to more accurately predict the likelihood of two kaon decay pathways, and compare those predictions with experimental measurements. The comparison tests for tiny differences between matter and antimatter that could, with even more computing power and other refinements, point to physics phenomena not explained by the Standard Model.

Read all about it in their press release.

September 2020:

DiRAC Day – Poster Prize Winners: 

It was a very enjoyable day, hearing about all the first-class research DiRAC has supported over the past year and the exciting plans we have for the years to come. The day ended with this year’s poster prize winners, sponsored by Intel.

  • Fionntan Callan from Queen’s University Belfast
  • Rosie Talbot from Cambridge University
  • Runner-up: Josh Borrow from Durham University

Well done, everyone; the standard was extremely high this year.

Zooming in on dark matter

Our cosmologists have zoomed in on the smallest clumps of dark matter in a virtual universe – which could help us find the real thing in space.

Using a supercomputer simulation of the universe they achieved a zoom equivalent to being able to see a flea on the surface of the Moon.

This meant they could make detailed pictures and analyses of hundreds of virtual dark matter haloes from the very largest (galaxy clusters) to the tiniest (about the same as Earth’s mass).

Read all about it on their website.

August 2020:

Intel has agreed to sponsor the pre-DiRAC Day hackathon. The event will focus on optimisation with the latest Intel tool set, and will look at a new programming model, oneAPI. oneAPI will deliver the tools needed to deploy applications and solutions across different architectures, including CPUs, GPUs, FPGAs, and other accelerators.

The application deadline has been extended to Tuesday 25th August.

For information see post

Swiftsimio, a Python library for reading SWIFT data, developed with the support of DiRAC Research Software Engineering time, has been published in the Journal of Open Source Software.

Read the article here.

July 2020:

Charming physics in a beautiful context.

The HPQCD collaboration have recently completed a study using DiRAC that has appeared as an Editor’s suggestion in Physical Review D (see Phys. Rev. D 102, 014513). They calculated how the charm quark undergoes a weak interaction when paired with a beauty quark inside a Bc meson and subject to strong interaction physics. The LHCb experiment at CERN could soon see this process and the combination of theory and experiment will then shed new light on the quark weak transitions.

June 2020:

Spectra publishes a case study on their long-term storage solution for the DiRAC Memory Intensive Services at Durham.

Read all about it on their website.

May 2020:

New simulations from Imperial College London have revealed the asteroid that doomed the dinosaurs struck Earth at the ‘deadliest possible’ angle.

The simulations show that the asteroid hit Earth at an angle of about 60 degrees, which maximised the amount of climate-changing gases thrust into the upper atmosphere.

Such a strike likely unleashed billions of tonnes of sulphur, blocking the sun and triggering the nuclear winter that killed the dinosaurs and 75 per cent of life on Earth 66 million years ago.

Read all about it on their website, learn more about DiRAC’s contribution in the STFC article, and see the impact on YouTube.

See also the BBC online, Daily Mail, and New Scientist articles.

May 2020:

13th Call for Proposals Pre-announcement

The DiRAC Resource Allocation Committee 13th Call for Proposals will be opening shortly. Find all information and important dates here.

April 2020:

Webinar: Porting and Performance of DiRAC benchmarks on Oracle Bare Metal Cloud

On Wednesday April 29th, from 11:00 to 12:00 (BST), Andy Turner will give a webinar on Porting and Performance of DiRAC benchmarks on Oracle bare metal cloud.

Find more information here.

February 2020:

Government announces new supercomputer for N8 universities

Based at the University of Durham, the new £3.15m Northern Intensive Computing Environment (NICE) will provide a shared facility for academic and industry researchers across all of the N8 universities, shared on an equal basis with each paying towards its operation, while also allowing access to the EPSRC-supported UK-wide community. NICE is one of seven HPC centres to be supported by a £27 million investment from EPSRC.

Find more information here.

Advance Announcement: September 2020:

DiRAC Day 2020 @ Durham University

This year, the annual DiRAC Science Day event will be held at Durham University on the 10th of September. The day provides an opportunity to meet researchers from across the DiRAC community and learn about their recent science achievements. In addition, our industry partners will be there to talk about new hardware and software advances which may benefit DiRAC research.

Full details regarding registration, accommodation, etc. will be available via the DiRAC website shortly.

We also expect to host a hackathon over the three days leading up to DiRAC Day; details will be announced soon and posted on our Training page.

January 2020:

CodeCamp is back in March!

Interested in knowing whether your research would benefit from the power of GPUs? Haven’t done any GPU programming, or don’t know what a GPU is? Then CodeCamp is for you. Come along to Durham on 17th March.

Go to our web page for details. All are welcome, but spaces are limited.

The application deadline has been extended to the 2nd of March.

UCLan astronomers find a way to form ‘fast and furious’ planets around tiny stars!

Using DiRAC resources, researchers from the University of Central Lancashire (UCLan) found giant planets could form around small stars much faster than previously thought.

As published in the Astronomy and Astrophysics journal, Dr Anthony Mercer and Dr Dimitris Stamatellos’ new research challenges our understanding of planet formation.

Computer simulation of planets forming in a protoplanetary disc around a red dwarf star.

Find out more at the UCLan website, or the STFC website.

December 2019:

CodeCamp is starting in December!

Our launch event will feature a technology that is prominent within the HPC community and will be with us into the future: GPUs. Interested in knowing whether your research would benefit from the power of GPUs? Haven’t done any GPU programming, or don’t know what a GPU is? Then CodeCamp is for you. Come along to Durham on 11th–12th December. Go to our web page for details; all are welcome, but spaces are limited.

The application deadline has been extended to the 20th of November.

October 2019:

Stormy cluster weather could unleash black hole power and explain lack of cosmic cooling

“Weather” in clusters of galaxies may explain a longstanding puzzle, according to a team of researchers at the University of Cambridge. The scientists used sophisticated simulations performed on the DiRAC infrastructure to show how powerful jets from supermassive black holes are disrupted by the motion of hot gas and galaxies, preventing gas from cooling, which could otherwise form stars. The team publish their work in the journal Monthly Notices of the Royal Astronomical Society.

For more information see their website.

Figure 2. An artist’s impression of the jet launched by a supermassive black hole, which inflates lobes of very hot gas that are distorted by the cluster weather. Image credit: Institute of Astronomy, University of Cambridge.

A copy of the paper is available from:

September 2019:

Retirement event for Lydia Heck

Lydia Heck, DiRAC’s former Technical Manager, retired this month. Lydia has been with DiRAC for more than 9 years and will be greatly missed. Her career was celebrated with friends and colleagues at a retirement event in Durham. Thank you and good luck, Lydia!

September 2019:

DiRAC ARM Mellanox Hackathon – Pre-event training.

Prior to DiRAC day, ARM and Mellanox are sponsoring a hackathon. This hackathon is to investigate the suitability of the ARM processor, and the Mellanox Bluefield chip for use within the DiRAC HPC community.

To enable all our participants to get the most out of this event, there will be an online pre-event training session on Monday 2nd September at 11:00am GMT.

All are welcome.

With Conference ID: 19336783

August 2019:

STFC Innovation Placements Opportunity.

This Opportunity has now closed.

DiRAC has been awarded 8 STFC Innovation Fellowships, each of 6 months’ duration, which have to be completed by 31 March 2020. In this scheme a final-year PhD student or an early-career researcher can have a funded placement (up to £21k) with a third-party organisation.

To qualify for the placement you must be working on research that falls within the STFC remit; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility please contact Mark Wilkinson (miw6 AT and Clare Jenner (c.jenner AT We will do our best to be flexible.

However, the placement can’t be on your research problem, but rather on the offered innovation problem.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

We are pleased to offer the following DIRAC STFC Innovation Placements:

The deadline for applications is 10am on Monday 9th September 2019.

July 2019:

DiRAC ARM Mellanox Hackathon prior to DiRAC DAY

Anyone interested in attending the DiRAC ARM Mellanox Hackathon on the 9th to 11th of September should submit an application form as soon as possible, as spaces are limited. Details can be found here.

June 2019:

“Beautiful” DiRAC research features in a Plus Magazine article.

A new particle that has recently been discovered at CERN confirms predictions made by theoretical physicists over six years ago. The result, delivered with a little help from the Darwin supercomputer, confirms existing particle theory, but also opens the door to new physics.

Read the whole article here.

May 2019:

DiRAC deploys Atempo Miria for Archiving.

Recently, DiRAC’s Memory Intensive facility in Durham called on the services of Atempo, the Data Protection and Movement specialists, together with their UK partner, OCF, to implement a multi-petabyte archiving project for their Lustre and Spectrum Scale (GPFS) data.

You can read all about it on the Atempo blog.

May 2019:

Free webinar, Wednesday 22nd May 2019, 15:00 BST: Open Source HPC Benchmarking. Presented by Andy Turner, EPCC.

There is a large and continuing investment in HPC services around the UK, Europe and beyond and this, along with new processor technologies appearing on the HPC market, has led to a wider range of advanced computing architectures available to researchers.

We have undertaken a comparative benchmarking exercise across a range of architectures to help improve our understanding of the performance characteristics of these platforms and help researchers choose the best services for different stages of their research workflows.

We will present results comparing the performance of different architectures for traditional HPC applications (e.g. CFD, periodic electronic structure) and synthetic benchmarks (for assessing I/O and interconnect performance limits). We will also describe how we have used an open research model where all the results and analysis methodologies are publicly available at all times. We will comment on differences between architectures and demonstrate the benefits of working in an open way.  

Full details and join link can be found here.

April 2019:

DiRAC’s Technical Manager gives Headline Talk at local BCS networking event

24th April 2019: DiRAC’s Technical Manager Lydia Heck is giving the Headline Talk at the local British Computer Society (Newcastle and District Branch) networking event this evening.  She will be discussing DiRAC@Durham’s Memory Intensive machine and explaining how this powerful resource is helping to unlock crucial insights into our Universe.

More information on her talk can be found here.

April 2019:

HPC-AI Advisory Council 2019, Swiss Conference & HPCXXL User Group

DiRAC’s Director Dr Mark Wilkinson’s talk from the  HPC-AI Advisory Council 2019 Swiss Conference, entitled: “40 Powers of 10 – Simulating the Universe with the DiRAC HPC Facility“,  is now available on YouTube and also features on the Inside HPC Website.

April 2019:

Theory predictions come up trumps

A particle that is an ‘excited’ bound state of a bottom quark and a charm antiquark has been discovered at the Large Hadron Collider and its mass is in agreement with a prediction made by the HPQCD collaboration back in 2012 using STFC’s DiRAC facility. HPQCD used a numerical technique known as lattice QCD to solve the theory of the strong force, Quantum Chromodynamics. This enabled them to calculate the masses of several bound states of bottom and anticharm, each with the quarks in a different configuration, collectively known as the Bc mesons. The CMS and LHCb collaborations have both now reported, in 2019, the first clear evidence for the member of this set called the Bc’ meson.

The lightest Bc meson, known simply as the Bc, has the bottom and anticharm quarks spinning in opposite directions so that its spin is zero. This is the lowest energy configuration for bottom-anticharm and simplest to calculate in lattice QCD. In 2005 HPQCD (with the Fermilab lattice collaboration) successfully predicted the mass of the Bc meson, ahead of its discovery by the CDF experiment at the Fermilab Tevatron collider. The large mass of this meson, 6.27 GeV/c2(where the proton mass is 0.94 GeV/c2), along with its quark-antiquark content meant that a proton collider was needed to produce it and made it hard to find experimentally.  

In 2012, armed with the computing power of DiRAC and the much-improved QCD calculations that this allowed, HPQCD were able to revisit the topic and calculate the masses of many more states. They predicted the mass of the Bc* meson, a particle with spin because the bottom and anti-charm quarks are spinning in the same direction inside it. They also predicted the masses of excited states of the Bc and Bc*, known as the Bc’ and Bc*’. These are the analogues of the electronic radial excitations of the hydrogen atom. The mass difference between the Bc’ and the Bc is then a consequence of the way in which the bottom and anti-charm quark are bound together through strong force interactions. To predict this mass difference from QCD requires the numerical techniques of lattice QCD because QCD has such complicated non-linear interactions. In arXiv:1207.5149 HPQCD found the mass difference between the Bc’ and Bc to be 0.616(19) GeV/c2; the CMS result for this mass difference in arXiv:1902.00571 (and LHCb in arXiv:1904.00081) is 0.5961(14) GeV/c2, in good agreement.

Figure 1: HPQCD’s predictions for the masses of the lightest states in the Bc family of mesons (JP gives their spin and parity quantum numbers), calculated on DiRAC (blue crosses). The experimental results for the two states that have been seen are shown as red lines (the experimental uncertainties are around 0.001 GeV/c2).

Figure 1 shows the HPQCD predictions for Bc meson masses along with the current experimental values. Mesons containing b quarks are the Achilles heel of the Standard Model since their rare decay processes are sensitive to the existence of new particles. The Bc meson family provides a new chapter in this search that theory and experiment are now beginning to exploit. The HPQCD collaboration remains at the forefront of this work and is pushing ahead with more precise calculations of Bc masses and differential decay rates on DiRAC-2.5. 

April 2019:

Dr Debora Sijacki wins the PRACE Ada Lovelace Award for HPC 2019

Huge Congratulations to DiRAC Researcher Dr Debora Sijacki who has won the PRACE Ada Lovelace Award for HPC 2019.  This prestigious prize is awarded annually to a young female European scientist in recognition of their outstanding impact on HPC research and computational science at a global level and for being a role model for young women beginning their careers in HPC.  Well done Debora!

Debora is based at the Institute of Astronomy, University of Cambridge (personal webpage) and more information on the Partnership for Advanced Computing in Europe (PRACE) and the Ada Lovelace Award for HPC 2019 can be found here.

Advance Announcement: September 2019:

DiRAC Day 2019 @ University of Leicester

This year, the Annual DiRAC Science Day event will be held at the University of Leicester on the 12th of September. The day provides an opportunity to meet researchers from across the DiRAC community and learn about their recent science achievements. In addition, our industry partners will be there to talk about new hardware and software advances which may benefit DiRAC research.

Full details regarding registration, accommodation etc will be available via the DiRAC website shortly.

We also expect to host a hackathon over the three days leading up to DiRAC Day – details will be announced soon and will be posted on our Training page.

November 2018:

DiRAC researchers on this year’s Clarivate Analytics Highly Cited Researchers List

Three DiRAC@Durham researchers, Professors Carlos Frenk,  Tom Theuns and Adrian Jenkins, appear on this year’s Clarivate Analytics Highly Cited Researchers List. Highly Cited researchers rank in the top 1% by citations for their field and are making a huge impact in solving the world’s biggest challenges.

We are extremely proud of Carlos, Tom and Adrian as their inclusion in this list is a particularly noteworthy achievement and is a demonstration of their global influence.

For more information see:

June 2018:

RAC 11th Call for Proposals Opens

The RAC makes an annual Call for Proposals for requesting time on our Resources. The 11th Call opened on the 9th July 2018 and will close on the 1st October 2018. The Call Announcement, the Guidance Notes and Application Forms are available on our Call for Proposals page.

Advance Announcement: September 2018:

DiRAC Day 2018 @ Swansea University.

We are looking forward to our 8th Annual DiRAC Science Day event, this year being held at Swansea University on the 12th of September. The day provides an opportunity to meet others from the DiRAC community and learn about the recent research achievements of our different consortia.

Swansea University are also running a number of other co-located training/networking events in the week commencing 9th September and details can be found on our Training page.

February 2018:

New models give insight into the heart of the Rosette Nebula.

Through computer simulations run in part on DiRAC resources, astronomers at Leeds and Keele Universities have found that the Nebula likely formed in a thin, sheet-like molecular cloud rather than in a spherical or thick disc-like shape, as some photographs may suggest. A thin disc-like structure of the cloud, focusing the stellar winds away from the cloud’s centre, would account for the comparatively small size of the central cavity.

More information can be found on the STFC press release published here and on our 2017 Science Highlights page.


November 2017:

DiRAC @ Supercomputing 2017.

Members of the DiRAC Project Management Team travelled this year to Denver Colorado to attend the SuperComputing 2017 industry conference.  More information on what went on can be found here.


August 2017:

The 7th Annual DiRAC Day event.

Our 2017 Dirac Day event was held at Exeter University on the 30th August. Find out more at the dedicated web page.

April 2017:

DiRAC HPC Manager talks to Scientific Computing World

Dr Lydia Heck, Senior Computer Manager in the Department of Physics at Durham University, talks to Robert Roe of Scientific Computing World in this article looking at managing HPC performance and exploring the options available to optimise the use of resources. Discussing DiRAC’s series of COSMA machines, Lydia talks about the hurdles her team has overcome whilst implementing a new workload management system, SLURM, and using a Lustre file system for the latest DiRAC iteration: COSMA6.

March 2017:

DiRAC partners in Peta-5

Six Tier 2 High Performance Computing (HPC) centres were officially launched on Thursday 30 March at the Thinktank science museum in Birmingham. Funded by £20 million from the Engineering and Physical Sciences Research Council (EPSRC) the centres will give academics and industry access to powerful computers to support research in engineering and the physical sciences.

DiRAC will partner in The Petascale Intensive Computation and Analytics facility at the University of Cambridge which will provide the large-scale data simulation and high performance data analytics designed to enable advances in material science, computational chemistry, computational engineering and health informatics.

September 2016:

6th Annual DiRAC Science Day.

On September 8th, the University of Edinburgh hosted the sixth annual DiRAC Science Day. This gave our researchers in the DiRAC HPC Community the opportunity to meet each other and the technical teams from each site, learn about what is being done by all the different projects running on the DiRAC facility and discuss future plans. The Day was generously sponsored by Bull, Atos, Dell, Hewlett Packard Enterprise, Intel, Cray, DDN, Lenovo, Mellanox, OCF and Seagate.

Dr. Jeremy Yates opened the meeting with an update on facility developments and then Prof. Christine Davies led a community discussion on several issues including the training needs of young researchers. The Science presentations then began with a talk on Simulating Realistic Galaxy Clusters, followed by a review of lattice QCD calculations and an exciting presentation from Prof. Mark Hannam on the recent detection of Gravitational Waves and the key role DiRAC played in converting information from the gravitational-wave signal into results for the properties of the colliding black holes.

During lunch 23 posters show-cased some of the other research done on the facility and then the day split into parallel Science and Technical Sessions. In the Science session, presentations were made on: The hadronic vacuum polarisation contribution to the Anomalous Magnetic Moment of the Muon; The Robustness of Inflation to Inhomogeneous Initial Conditions; A Critical View of Interstellar Medium Modelling in Cosmological Simulations and finally, Magnetic Fields in Galaxies. The Technical session presented talks on: Emerging Technologies; Grid; A Next Generation Data Parallel C++ Library; An Overview of the DiRAC-3 Benchmark Suite and a lecture on SWIFT – Scaling on Next Generation Architectures.

Figure 1. Dr Andrew Lytle and his poster.

During tea the poster prizes were announced and congratulations go to Dr Andrew Lytle (U. of Glasgow) for his poster on Semileptonic B_c Decays from Full Lattice QCD and to Dr Bernhard Mueller (Queen’s U. Belfast) for his poster on Core-Collapse Supernova Explosion Models from 3D Progenitors. They each won a £500 Amazon voucher from our kind sponsor DDN. Dr Lytle and his winning poster can be seen in the figure on the right.

Further Science session talks after tea were: Growing Black Holes at High Redshift; Planet Formation and Disc Evolution and finally, Modelling the Birth of a Star. The Technical session included a talk on the Co-design of Cray Software Components and ended with an interesting review of AAAI, Cloud and Data Management: DiRAC in the National E-Infrastructure, given by Dr. Yates. The Day concluded with a Drinks Reception outside the lecture theatres that was well attended and much enjoyed by all.

February 2016:

DiRAC simulations play a key role in gravitational-wave discovery.

LIGO Detections
Figure 1. The top plot shows the signal of gravitational waves detected by the LIGO observatory located in Hanford, USA, whilst the middle plot shows the waveforms predicted by general relativity. The X-axis plots time and the Y-axis plots the strain, which is the fractional amount by which distances are distorted by the passing gravitational wave. The bottom plot shows that the LIGO data matches the predictions very closely. (Adapted from Fig. 1 in Physical Review Letters 116, 061102 (2016))

On February 11 2016, the LIGO collaboration announced the first direct detection of gravitational waves and the first observation of binary black holes. Accurate theoretical models of the signal were needed to find it and, more importantly, to decode the signal to work out what the source was. These models rely on large numbers of numerical solutions of Einstein’s equations for the last orbits and merger of two black holes, for a variety of binary configurations. The DiRAC Data Centric system, COSMA5, was used by researchers at Cardiff University to perform these simulations. With these results, along with international collaborators, they constructed the generic-binary model that was used to measure the masses of the two black holes that were detected, the mass of the final black hole, and to glean some basic information about how fast the black holes were spinning. Their model was crucial in measuring the properties of the gravitational-wave signal, and the DiRAC Data Centric system COSMA5 was crucial in producing that model.

More information on the detection of gravitational waves can be found at the LIGO collaboration website.


November 2015:

HPCwire Readers’ Choice Award


STFC DiRAC has been recognized in the annual HPCwire Readers’ and Editors’ Choice Awards, presented at the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis (SC15), in Austin, Texas. The list of winners was revealed by HPCwire both at the event, and on the HPCwire website. STFC DiRAC was recognized with the following honor:

Readers’ Choice – Best Use of High Performance Data Analytics – The Stephen Hawking Centre for Theoretical Cosmology, Cambridge University, and the STFC DiRAC HPC Facility used the first Intel Xeon Phi-enabled SGI UV2000, with its co-designed ‘MG Blade’ Phi housing, to probe the Cosmic Background Radiation, achieving a 100X speed-up by porting and optimising the MODAL code for the Intel Xeon Phi coprocessor.

The coveted annual HPCwire Readers and Editors’ Choice Awards are determined through a nomination and voting process with the global HPCwire community, as well as selections from the HPCwire editors. The awards are an annual feature of the publication and constitute prestigious recognition from the HPC community. These awards are revealed each year to kick off the annual supercomputing conference, which showcases high performance computing, networking, storage, and data analysis.

We are thrilled that DiRAC and the Cambridge Stephen Hawking Centre for Theoretical Cosmology and our work through the COSMOS Intel Parallel Computing Centre have received this prestigious award in high performance computing.

In particular we congratulate Paul Shellard, Juha Jaykka and James Brigg from Cambridge for their sterling efforts. It is their ingenuity, skill and innovation that has been recognised by this award.

The award is also recognition of the unique synergy that we have developed between world-leading researchers in theoretical physics from the STFC DiRAC HPC Facility and industry-leading vendors like Intel and SGI, which aims to get maximum impact from new many-core technologies in our data analytic pipelines. This involved new parallel programming paradigms, as well as architectural co-design, which yielded impressive speed-ups for our Planck satellite analysis of the cosmic microwave sky, opening new windows on our Universe.

We have built an innovative and working data analytics system based on heterogeneous CPU architectures. This has meant we had to develop and test new forms of parallel code and test the hardware and operational environment. We can now make the best use of CPUs and lower-cost, more powerful, but harder-to-program, many-core Xeon Phi chips. This ability to offload detailed analysis functions to faster processors as and when needed greatly decreases the time to produce results. This means we can perform more complex analysis to extract more meaning from the data and to make connections (or correlations) that would have been too time consuming before.

We now have the hardware and software blueprint to build similar systems for the detailed analysis of any kind of dataset. It is truly generic and can be applied just as well to medical imaging, social and economic database analysis as to astronomical image analysis.

For enquiries, please contact Dr Mark Wilkinson, DiRAC Project Director

March 2015:

HPQCD: Weighing up Quarks

A new publication by particle physics theorists working on DiRAC has been highlighted as the “Editor’s Suggestion” in a top particle physics journal because it is “particularly important, interesting and well written”. The calculation gives a new, more accurate determination of the masses of quarks using the most realistic simulations of the subatomic world to date. This is an important ingredient in understanding how a deeper theory than our current Standard Model could give rise to these different masses for fundamental particles.

Quark masses are difficult to determine because quarks are never seen as free particles. The strong force interactions between them keep them bound into composite particles, known as hadrons, that are seen in particle detectors. This is in contrast to electrons, which can be studied directly and their mass measured in experiments. Quark masses instead must be inferred by matching experimental results for the masses of hadrons to those obtained from theoretical calculations using the theory of the strong force, Quantum Chromodynamics (QCD). Progress by the HPQCD collaboration using a numerically intensive technique known as lattice QCD means that this can now be done to better than 1% precision. The publication determines the charm quark mass to high accuracy (shown in the figure) and then uses the ratio of the charm quark mass to other quark masses to determine those masses too.

The research was done by researchers at Cambridge, Glasgow and Plymouth working with collaborators at Cornell University (USA) and Regensburg (Germany) as part of the High Precision QCD (HPQCD) Collaboration. The paper is published in the latest issue of Physical Review D and can be accessed here. The calculations were carried out on the Darwin supercomputer at the University of Cambridge, part of the STFC High Performance Computing Facility known as DiRAC. The speed and flexibility of this computer was critical to completing the large set of numerical calculations that had to be done for this project.


DiRAC Services support a significant portion of STFC’s science programme, providing simulation and data modelling resources for the UK Frontier Science theory community in Particle Physics, astroparticle physics, Astrophysics, cosmology, solar system & planetary science and Nuclear physics (PPAN; collectively STFC Frontier Science). DiRAC services are optimised for these research communities and operate as a single distributed facility which provides the range of architectures needed to deliver our world-leading science outcomes.

Based at four university sites (Cambridge, Leicester, Durham and Edinburgh), we host three Services: Data Intensive (at Cambridge and Leicester), Memory Intensive and Extreme Scaling.

Information on how to apply for time on our Services can be found here, and how our Services map onto our Science agenda can be found here. The DiRAC Data Management Plan is available for download here.

For general enquires please email DiRAC Support or the Project Office.

Each of our DiRAC-3 sites has a user guide hosted online, providing all the information about the service, how to log on, and updates on downtime or upcoming server room and data centre maintenance.

We have created a user-friendly GitHub Wiki page to share the site and service information in one place, as well as Tips and Tricks compiled by our Research Software Engineers.

Data Intensive Service

The Data Intensive Service is jointly hosted by the Universities of Cambridge and Leicester.

Data Intensive@Cambridge

DiRAC has a part share of the CSD3 petascale HPC platform (Cumulus) hosted at the University of Cambridge.


The Cumulus system currently consists of several components:

  • 544 Ice Lake CPU nodes, each with 2 x Intel Xeon Platinum 8368Q processors (2.60GHz, 38-core; 76 cores per node):
      • 296 nodes with 256 GiB memory
      • 116 nodes with 512 GiB memory
      • DiRAC has a share of 267 nodes (20,292 cores)
  • 672 Cascade Lake CPU nodes, each with 2 x Intel Xeon Platinum 8276 processors (2.6GHz, 28-core; 56 cores per node):
      • 616 nodes with 192 GiB memory
      • 56 nodes with 384 GiB memory
      • DiRAC has a share of 119 nodes (6,664 cores)
  • 80 Ampere GPU nodes (Wilkes-3), each with 4 x NVIDIA A100-SXM-80GB GPUs, 2 x AMD EPYC 7763 processors (1.8GHz, 64-core; 128 cores per node) and 1 TiB memory

The HPC interconnects are:

  • Intel OmniPath, 2:1 blocking (Skylake)
  • Mellanox HDR InfiniBand, 3:1 blocking (Cascade Lake, Ice Lake and Ampere GPU nodes)

Storage consists of 23PiB of disk configured as multiple Lustre parallel filesystems.

The operating system is based on Red Hat Enterprise Linux, and resource management is performed by Slurm.
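As an illustration of how jobs are run under Slurm, a minimal batch script for an MPI job on the Ice Lake nodes might look like the sketch below. The partition, account and module names are placeholders and will differ on the real system; consult the site-specific user guide for the actual values.

```shell
#!/bin/bash
#SBATCH --job-name=my-sim         # name shown in the queue
#SBATCH --nodes=2                 # request two Ice Lake nodes
#SBATCH --ntasks-per-node=76      # one MPI rank per core (76 cores/node)
#SBATCH --time=01:00:00           # wall-clock limit
#SBATCH --partition=icelake       # hypothetical partition name
#SBATCH --account=dpXXX           # hypothetical DiRAC project account code

module load intel-mpi             # module names vary by site
mpirun ./my_simulation input.dat  # launch the MPI executable
```

The script is submitted with `sbatch script.sh`, and job status can then be checked with `squeue`.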

For more information see the site specific user guide.

Data Intensive@Leicester

Data Intensive 3

The DIaL system has:

  • 25,600 AMD cores running at 2.25/3.4GHz
  • 102TB of system memory
  • 200Gbps HDR IB 3:1 blocking interconnect
  • 4TB file space

Each of the 200 nodes has:

  • 2 x AMD EPYC Rome 7742 CPUs, each with 64 cores, giving 128 cores per node running at 2.25/3.4GHz
  • 512GB of system memory, giving 3.9GB per CPU core
  • 200Gbps HDR IB interconnect
  • Runs CentOS 7

Data Intensive 2.5x

The DI system has two login nodes, a Mellanox EDR interconnect in a 2:1 blocking setup, and Lustre storage.

  • 408 dual-socket nodes with Intel Xeon Skylake 6140 processors (two-FMA AVX-512, 2.3GHz, 36 cores per node) and 192 GB RAM, giving 14,688 cores and 3.5PB of storage in total
  • 1 x 6TB server with 144 Xeon 6154 cores @ 3.0GHz base
  • 10 x 1.5TB servers with 36 Xeon 6240 cores @ 2.3GHz base

The DI System at Leicester is designed to offer fast, responsive I/O.

Further information is available on the web page or by emailing Leicester support.

A site specific user guide is available here.

Memory Intensive Service (COSMA)

The Memory Intensive Service is hosted by the University of Durham at the Institute for Computational Cosmology (ICC). The COSMA support web pages are available here.

Memory Intensive 3 (COSMA8)

The DiRAC-3 Memory Intensive service (COSMA8) was installed in 2021 and became operational in October of that year. It comprises:

  • 360 compute nodes, each with 128 cores (2x AMD 7H12 processors), 1TB RAM and a non-blocking HDR200 InfiniBand network
  • 2x 2TB login nodes with 64 cores (dual AMD Rome 7542 processors)
  • Two fat nodes with 4TB RAM and 128 cores
  • GPU nodes with NVIDIA A100, V100 and AMD MI50 and MI100 GPUs
  • 5PB bulk Lustre storage
  • 1.2PB fast scratch storage (~350GBytes/s)

Memory Intensive 2.5x (COSMA7)

DiRAC’s Memory Intensive Resource

The DiRAC-2.5x Memory Intensive service (COSMA7) was installed in 2018. It comprises:

  • 2x 1.5TB and 1x 768GB login nodes with Intel Xeon 5120 Skylake processors (one-FMA AVX-512, 2.2GHz, 28 cores)

  • 452 compute nodes, each with 512 GB of RAM and 2 x Xeon 5120 2.2GHz processors per node, offering a total of 12,656 cores

  • The system is connected via Mellanox EDR in a 2:1 blocking configuration, with 512TB of fast I/O scratch space and 3.1PB of data space on Lustre

Memory Intensive 2 (Formerly “Data Centric”, now COSMA6)

  • About 9,000 cores in the COSMA6 cluster. Approximately 570 nodes each offer 128GB of memory and are connected via a Mellanox FDR10 2:1 blocking InfiniBand fabric. Storage capacity on COSMA6 is 2.6PB.

  • The IB fabric connects COSMA6 to the Lustre filesystem, with I/O performance of 10-11GB/s write and 5-6GB/s read

More information on the Memory Intensive (COSMA) system can be found here, and further enquiries on the Memory Intensive Service can be emailed to COSMA support. A site specific user guide can be found here.

Extreme Scaling Service

DiRAC Extreme Scaling ‘Tursa’

Based in Edinburgh and locally named ‘Tursa’, this system is used primarily by the GRID team. The service is designed for computationally intensive codes with a relatively small data footprint per core but high data-transfer requirements. Tursa comprises two clusters: a large GPU-based cluster of 112 nodes and a small AMD CPU-based cluster.

The ES system has:

  • 14592 AMD CPU cores running at 2.6/3.3GHz
  • 114TB of system memory
  • 448 * A100 Nvidia GPUs
  • 200Gbps HDR IB non-blocking interconnect
  • 8PB Tape backup

To support the required workloads each of the 112 GPU nodes has:

  • 2 x AMD EPYC Rome 7H12 CPUs, each with 64 cores, giving 128 cores per node running at 2.6/3.3GHz
  • 1TB of system memory, giving 7.8GB per CPU core
  • 4 x NVIDIA A100 GPU cards, each with 6,912 FP32 CUDA cores, 40GB of on-board memory and 432 tensor cores running at 765/1410MHz, giving 27,648 CUDA cores and 160GB of GPU memory per node
  • The GPU cards are connected via NVLink, giving a memory transfer speed between cards of 4800Gbps
Further information on the Extreme Scaling Service is available by emailing DiRAC Support. A site specific user guide can be found here. 


Our Services Supporting our Science

DiRAC operates within a framework of well-established science cases which have been fully peer reviewed to deliver a transformative research programme aimed at creating novel and improved computing techniques and facilities. We tailor our Services’ architectures towards solving these science problems and by doing so help underpin research covering the full remit of STFC’s astronomy, particle, nuclear and accelerator physics Science Challenges. Some brief illustrations of how our Services map onto our Science Agenda can be found below; for more information please email the Project Office.

The Data Intensive Service addresses the problems associated with driving scientific discovery through the analysis of large data sets using a combination of modelling and simulation, e.g. the large-volume data sets from flagship astronomical satellites such as Planck and Gaia, and ground-based facilities such as the Square Kilometre Array (SKA).  One project using the Data Intensive Service is looking at breaking resonances between migrating planets.

The Memory Intensive Service supports detailed and complex simulations related to Computational Fluid Dynamic problems, for example cosmological simulations of galaxy formation and evolution, which require access to very large amounts of memory (more than 300 terabytes) to enable codes to ‘follow’ structures as they form.   The innovative design of this Service supports physically detailed simulations which can use an entire DiRAC machine for weeks or months at a time. More on the Virgo project, which uses the Memory Intensive Service can be found here.

The Extreme Scaling Service supports codes that make full use of multi-petaflop HPC systems. DiRAC works with industry on the design of systems using Lattice QCD in theoretical particle physics as a driver.   This field of physics provides theoretical input on the properties of hadrons to assist with the interpretation of data from experiments such as the Large Hadron Collider. To find out more about one of the Lattice QCD projects using the Extreme Scaling Service see the 2017 Science Highlights page.

The DiRAC Data Management Plan can be found here.