DiRAC is recognised as the primary provider of HPC resources to the STFC particle physics, astroparticle physics, astrophysics, cosmology, solar system and planetary science, and nuclear physics (PPAN: STFC Frontier Science) theory community. It provides the modelling, simulation, data analysis and data storage capability that underpins the STFC Science Challenges and our researchers’ world-leading science outcomes.

On this page you can find information on:

  • DiRAC Projects
  • Accessing DiRAC
  • Acknowledge DiRAC

Each year we publish a selection of Science Highlights and a full list of publications from all our projects.

If you can’t find what you need on these pages, please email the Project Office.

DiRAC Projects

DiRAC serves over 35 projects, with more than 400 active users. Our community is diverse, encompassing particle physics, astrophysics, cosmology and nuclear physics. Together their research addresses all the STFC Science Challenges.

Project Allocations

After each Call has concluded, we publish the compute time allocated to each project, in CPU-Mhours on the DiRAC service on which their calculations are to be run, as well as the storage space allocated (in TB). DiRAC operates three compute services and, because each service has a different architecture, the CPU-Mhours awarded to projects may not be directly comparable across services.
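For illustration only (the function and figures below are hypothetical, not a real DiRAC allocation), a CPU-Mhours figure is simply core-hours expressed in millions, so an award can be translated into machine time for a given system:

```python
def cpu_mhours(nodes, cores_per_node, wallclock_hours):
    """Total CPU time in millions of core-hours (CPU-Mhours)."""
    return nodes * cores_per_node * wallclock_hours / 1e6

# e.g. 100 nodes of 32 cores each, running for 5000 wallclock hours:
print(cpu_mhours(100, 32, 5000))  # 16.0 CPU-Mhours
```

Because the same number of CPU-Mhours buys different amounts of science on different architectures, comparisons between services should be made with care.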

User Accounts

Projects and accounts on DiRAC resources are administered through the DiRAC SAFE. SAFE stands for Service Administration From EPCC. It is a large web-based application provided by EPCC for DiRAC. The same software is used by the ARCHER UK national supercomputing service and by other facilities at EPCC.

Every DiRAC user has an account on SAFE. You can use your SAFE account to check your current and past use of CPU time and disk space, to apply to join other projects and create service machine accounts, to change passwords, to keep your personal details up to date, to check the progress of the helpdesk queries you have submitted, and so on. PIs and project managers can do many other things, such as viewing usage by each member of their project team.

The DiRAC team also uses SAFE to administer the system and to generate reports. The helpdesk software is also part of SAFE. If you are a PI or project manager you can manage your project via SAFE.

You can find all the information you need on SAFE here. For any other information please email DiRAC Support.

Accessing DiRAC

STFC Resource Allocation Committee (RAC)

The Resource Allocation Committee (RAC) is responsible for overseeing the allocation process for all DiRAC Resources, including compute time, storage and Research Engineering effort.

Call for Proposals

The RAC makes an annual Call for Proposals for requesting time on our Resources. 

Notification about any Call will be posted on our News page, and on Twitter. 

Any Call Announcement, Guidance Notes and Application Forms are available on our Call for Proposals page.

Acknowledge DiRAC

If you have used the new DiRAC Resources, available since 1st May 2018, for your publication, the new acknowledgement statements for each machine can be found here.

If you have used the DiRAC Resources prior to 30th April 2018 for your publication, please use the statements found here.


What makes DiRAC special…

DiRAC was established to provide distributed High Performance Computing (HPC) services to the STFC theory community. HPC-based modelling is an essential tool for the exploitation and interpretation of observational and experimental data generated by astronomy and particle physics facilities supported by STFC, as this technology allows scientists to test their theories and run simulations from the data gathered in experiments. The UK has an extremely strong HPC community, and these powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, from simulating the entire evolution of the universe, from the big bang to the present, to modelling the fundamental structure of matter. DiRAC is both an academic-led and an academic-supervised facility, and our systems are specifically designed to meet the different high performance computational needs within our scientific community.

DiRAC provides a variety of compute Resources that match machine architecture to the different algorithm design and requirements of the research problems to be solved. There are sound scientific reasons for designing the DiRAC services in this way and the methodology was adopted following a number of in-depth reviews involving the STFC research community. The bespoke demands of the different research domains supported by STFC are such that a distributed installation was the most cost effective way to satisfy the varied scientific requirements.

As a single, federated Facility, DiRAC allows more effective and efficient use of computing resources, supporting the delivery of the science programmes across the STFC research communities, addressing all the STFC Science Challenges. It provides a common training and consultation framework and, crucially, provides critical mass and a coordinating structure for both small and large scale cross-discipline science projects, the technical support needed to run and develop a distributed HPC service, and a pool of expertise to support knowledge transfer and industrial partnership projects. The on-going development and sharing of best-practice for the delivery of productive, national HPC services within DiRAC enables STFC researchers to deliver world-leading science across the entire STFC theory programme in particle physics, astrophysics and cosmology, solar system physics, particle astrophysics and nuclear physics.

As was originally envisaged, DiRAC has become a vibrant research space, both in terms of science and in terms of technical development. These two aspects of our activities are intimately linked, each feeding back into the other and driving research excellence in theoretical simulation and modelling alongside world-leading technical innovation. DiRAC’s technical achievements are as important as our scientific achievements; they are key to our scientific impact and key to our impact on the UK economy as a whole.


October 2021

DiRAC HPC-AI Advisory Council UK Conference: 13/14th October – registration now open!

The 3rd annual HPC-AI Advisory Council UK conference is taking place virtually next Wednesday and Thursday, 13/14th October, from 2-5:30pm.

Co-hosted by DiRAC, we have a very interesting set of talks over the two days:

Registration is free:

We hope that many of you will attend this event, which is a timely opportunity to highlight the importance of large-scale computing for the UK.

Update: DiRAC Resource Allocation Committee 14th Call for Proposals 

We would like to advise the Community that there is no longer any additional time available in RAC14 on CSD3 Skylake. Apologies to any groups who were intending to apply for resources on CSD3 Skylake. In this case, you may wish to consider applying for resources on the CSD3 Cascade Lake or Ice Lake services, as these are both Intel x86 systems. If you need further information, please see the DiRAC website or contact

DiRAC has recorded two webinars to provide assistance with the completion of the RAC14 Technical application and the RAC14 RSE Support application. These webinars can be found on the DiRAC website.

Please be reminded that the deadline for proposal submissions to the 14th Call is Tuesday 5th October 2021 at 16:00 UK time. 

Full information, including the Call application forms and guidance notes, can be found on the DiRAC website.


Enquiries should be directed as follows: 

• RAC process and remit: STFC Swindon Office: 

• Technical questions: RSE Team: 

• Direct allocations or discretionary requests: DiRAC Director, Prof Mark Wilkinson ( 

August 2021

STFC-DiRAC Federation Project: Senior Project Manager Role

DiRAC is currently recruiting a Senior Project Manager to oversee the delivery of an exciting £1.9m project, focusing on preparations towards Federation of the facility with other UK Research communities.

The post will be for 12 months, working remotely.

Secondments of experienced people to take up this role would be considered.

Job Description/Person Specification

Please contact Dr Clare Jenner ( with any questions, or with CV and covering letter to apply.

Deadline 31st August 2021.

HPC Vacancies at Durham

We are recruiting for a couple of HPC roles within the national DiRAC HPC facility at Durham University (COSMA).

1. A Technical Manager for HPC
2. A Lead database Designer and Engineer

Both positions are open for full or part time working. Deadline 6th Sept.

The charming strangeness of the W boson

Theorists in the HPQCD collaboration have pinned down Vcs, a key parameter of the Standard Model, using STFC’s DiRAC Data Intensive supercomputer at Cambridge.

Vcs is determined from combining the theoretical calculation with results from particle physics experiments around the world for the proportion of D mesons that decay to a K meson in a process akin to nuclear beta decay.

This rate depends on the coupling strength Vcs between the W boson of the weak interaction and the charm-strange quark pair, but also on the strong interaction physics, encoded by ‘form factors’, that binds the quarks inside the mesons while this process happens.

The numerical techniques of lattice QCD allow the form factors to be calculated, but in the past their uncertainty has limited the precision of Vcs.

Using improved methods for handling quarks, developed by HPQCD, physicists in Cambridge and Glasgow have now obtained a value for Vcs of 0.9663(80), three times more accurate than previous work. This allows Vcs to be distinguished from 1 for the first time, giving tighter constraints on the possibilities for new physics beyond the Standard Model.
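As a quick sanity check of that claim (our own arithmetic, using only the value and uncertainty quoted above), the result sits more than four standard deviations away from 1:

```python
# Vcs and its one-sigma uncertainty as quoted: 0.9663(80) means 0.9663 +/- 0.0080.
vcs = 0.9663
sigma = 0.0080

# Number of standard deviations separating Vcs from 1.
n_sigma = (1 - vcs) / sigma
print(round(n_sigma, 2))  # 4.21
```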

Full details can be found here.

June 2021

DiRAC Director discusses Catalyst UK on This Week in HPC Podcast

DiRAC Director, Professor Mark Wilkinson was part of a panel discussing what the Catalyst UK Project has achieved so far on the latest This Week in HPC Podcast with Addison Snell.

You can listen to it here:

Dark matter is slowing the spin of the Milky Way’s galactic bar

The spin of the Milky Way’s galactic bar, made up of billions of clustered stars, has slowed by about a quarter since its formation, according to a new study by UCL and University of Oxford scientists.

The research, which made use of DiRAC’s CSD3 petascale HPC platform, analysed Gaia space telescope observations of a large group of stars.

Read more via UCL and New Scientist.

For the full publication, please visit the Royal Astronomical Society.

Atos supercomputer to help unlock secrets of the Universe

Atos today announces it has been awarded a contract by the University of Edinburgh to deliver its supercomputer, the BullSequana XH2000, the most energy-efficient supercomputing system on the market. This is the largest system dedicated to GPU computing deployed at a customer site in the UK. 

The new system will constitute the Extreme Scaling Service of the UK’s DiRAC HPC Facility. The state-of-the-art platform will allow scientists across the STFC theory community to drive forward world-leading research in particle physics, among other areas, using NVIDIA Graphics Processing Units (GPUs) and AMD processors.

You can read all about it here.

May 2021

Pre-announcement: DiRAC Resource Allocation Committee Call 13.5: Special Call for Proposals for DiRAC-3 Resources 

STFC will shortly open a Special Call of the DiRAC Resource Allocation Committee (RAC Call 13.5) for computing resources on the new DiRAC-3 systems. The UK theory and modelling communities in Astronomy and Cosmology, Astrophysics, Particle Physics and Nuclear Physics will be invited to apply for access to these computational resources. 

The Special Call will cover computing resources during a 6-month period from 1st October 2021 to 31st March 2022. A summary of the new hardware which will be available in this call is provided in the table below – full details of available core-hours will be provided with the call documents. 

Two types of application will be accepted: 

1. Uplift of an existing allocation: Projects with existing RAC allocations may request an uplift of up to 100% of their current allocation between October 2021 and March 2022. For example, in the case of a 100% uplift, the existing allocation for this 6 month period would then become twice the original allocation. 

2. New science allocations: Applicants may apply for up to 80% of the new resources based on a new science case. 

For both application types, a clear management plan must be submitted to show that sufficient staff effort is available within the duration of the award to make full use of the increased allocation. Any existing projects which are under-using their allocations are unlikely to be awarded an increased allocation unless there is clear evidence of a material change in anticipated usage (for example, a PDRA taking up a new post from October). 

An Expression of Interest must be submitted for both proposal types. Further information about what to include in the Expression of Interest will follow soon. 

The expected closing dates for proposal submissions to this Call will be as follows: 

Expression of Interest: Early June 2021 

Full proposal (Scientific and Technical Cases): End June 2021 

The Call documents, confirmed closing dates and further information on the peer review process will be available soon. 

This Call is in addition to the 14th RAC Call which will open later in the year for allocations starting from 1st April 2022. 


Enquiries should be directed as follows: 

  • RAC process and remit: STFC Swindon Office ( 
  • Technical questions: Technical Working Group ( 
  • Direct allocations or discretionary requests: DiRAC Director, Prof Mark Wilkinson ( 

Summary of new DiRAC-3 hardware

February 2021

Call for applications for DiRAC Senior Technical Programme Coordinator

The DiRAC Facility Management Team invites applications for the new position of Senior Technical Programme Coordinator.

The 0.6 FTE post is available immediately, and will initially run until 31st March 2023.

The closing date for applications is midnight on 14th March 2021.

Full details of the position and the application process can be found here.

January 2021

Durham University and DiRAC will deepen understanding of the universe with 2nd Gen AMD EPYC™ CPUs

Read the case study about how Durham University used AMD EPYC processors to enable much larger simulation data sets with faster execution to speed up the discovery process during cosmological investigations into the origins of the universe and the Big Bang.

January 2021

Call for applications for DiRAC Community Development Director

The DiRAC Facility Management Team invites applications for the new position of Community Development Director.

The post is available from 1st April 2021, and will initially run until 31st March 2023.

Applications should be submitted by e-mail to the DiRAC Director by 5pm on Monday, 8th February 2021.

Full details of the position and the application process can be found here.

January 2021

Fame Lab – Free Training in Public Engagement

FameLab is an ideal way to boost confidence, build networks and share research with new audiences. FameLab UK is open to anyone over 21 currently working in STEM, and the winners will participate in the FameLab UK final at Cheltenham Science Festival 2021!

This is a fantastic opportunity for anyone interested in:

  • Improving their communication skills
  • Sharing their research with a public audience
  • Joining a global network of researchers who enjoy talking about science

Training and heats are taking place in 9 regions around the UK; find your nearest one here.

Participants can take part in FameLab as many times as they like and keep improving each year until they become a UK finalist! 

January 2021

DiRAC placement opportunity: AI in multi-physics/multi-scale cosmological simulations

DiRAC will award one Innovation Placement in 2021 to explore the application of the HPE “SmartSim” AI library to cosmological simulations as a means to replace parameterised sub-grid physics models. The placement will be with Hewlett Packard Enterprise in collaboration with the University of Cambridge.

You have to be working on research that falls within the STFC remit in order to qualify for the placement; however, you can be funded by other organisations besides STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility, please contact Mark Wilkinson.

You must get your supervisor’s or PI’s permission before applying for this placement. Participation in the placement scheme is allowed under UKRI’s rules, but only with your supervisor/PI’s consent.

We will do our best to be flexible; part time working can be arranged as long as the placement does not exceed 9 months.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

The deadline for applications is 5pm on Monday 8th February 2021.

For more information, see the application form.

January 2021


Learn how to apply AI tools, techniques and algorithms to real-life problems. You will study the core concepts of Deep Neural Networks, how to build Deep Learning models as well as how to measure and improve the accuracy of your models. You will also learn essential data pre-processing techniques to ensure a robust Machine Learning pipeline. 
The Bootcamp is a hands-on learning experience where you will be guided through step-by-step instructions with teaching assistants on hand to help throughout.

This event will be delivered by NVIDIA experts in an instructor-led programme using DiRAC’s Cambridge HPC GPU system.

For More information & registration:

January 2021

£20m funding boost for science supercomputer will “drive science simulation and UK-wide innovation”

Please find the full UKRI press release here.

The UK government has announced a £20m funding boost to upgrade the capabilities of the DiRAC High Performance Computing facility.

The upgrade will enhance the UK’s scientific leadership and productivity, driving ground-breaking discoveries in scientific research, with opportunities spread across the UK. It will support the training of the next generation of UK researchers and attract the world’s top computational researchers to the UK.

It will also support nationwide innovation with industry to develop solutions for exascale computing and Artificial Intelligence research with broad applications in personalised healthcare, clean energy, government decision-making and solar weather forecasting.

The new systems will be between three and five times more powerful than the existing DiRAC machines. This will provide crucial computing capacity that can be used to address immediate and emerging issues, like the COVID-19 pandemic.


The DiRAC facility was established in 2009 to provide high performance computing systems optimised for the specialist needs of scientists working at the cutting edge of theoretical astrophysics, particle physics, cosmology and nuclear physics. The DiRAC research community also exploits and interprets observational and experimental data generated by astronomy and particle physics facilities such as the Large Hadron Collider and the LIGO experiment.

DiRAC is a distributed facility, with computing resources hosted by the Universities of Cambridge, Durham, Edinburgh and Leicester. This is overseen by the Project Office at University College London. These powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, including simulations of the entire evolution of the universe, from the Big Bang to the present, and models of the fundamental structure of matter.

DiRAC has now been awarded £20 million from the World Class Laboratories opportunity to deploy DiRAC-3 – a major upgrade in the computing power at all four DiRAC sites. Crucially, the new systems will also be up to ten times more energy-efficient than previous generations, an important step towards delivering sustainable computing resources for the UK.

A welcome announcement

Describing the announcement as “very welcome good news for UK science”, the DiRAC Director, Professor Mark Wilkinson from the University of Leicester, explained the importance of this investment for the UK:

“Today, high performance computing (HPC) underpins discoveries in almost all areas of science and innovation. Numerous studies have demonstrated the significant economic benefits of investment in high performance computing and confirmed that ‘to out-compute is to out-compete’.”

“The DiRAC HPC facility is an outstanding example of HPC-driven innovation in action. While it was originally established to support the UK’s world-leading research in particle physics, astrophysics, cosmology and nuclear physics, DiRAC has delivered technological innovations with global impact and developed techniques now being applied in fields as diverse as personalised medicine, government planning and solar weather forecasting.”

DiRAC Project Scientist and Deputy Director, Dr Clare Jenner from University College London, noted that the science areas impacted will “range from the subatomic to the intergalactic. Theoretical research nowadays relies on supercomputers – we can’t do the calculations in any other way. So, the DiRAC computers are vital to the future success of the UK in these fields.”

The UK Science and Technology Facilities Council (STFC) Executive Chair, Professor Mark Thomson, said:

“STFC are delighted at the announcement of new funding for the DiRAC HPC facility, to ensure that it can continue to support research in fields where the UK is world-leading.”

DiRAC Director of Innovation and Technology, Dr Jeremy Yates of University College London also emphasised that the impact of DiRAC extends much further than the scientific breakthroughs it delivers:

“We are also contributing to the delivery of the UK’s innovation agenda. We work with our industry partners to develop novel hardware and software solutions which can be used in many other applications.”

The new computers will be deployed over the coming months, with first scientific results expected to be presented in September at DiRAC Day 2021, the annual community event.

You can find further coverage of this item on:

December 2020

DiRAC Health Data Science and AI Placement Opportunity.

DiRAC will award one Innovation Placement in 2021 in the area of Health Data Science and the application of AI. The nominal length is 6 months, and the placement must be completed by 30 September 2021. In this scheme a final-year PhD student or an early career researcher can have a funded placement (up to £25k) with the Getting It Right First Time (GIRFT) programme. GIRFT is funded by the UK Department of Health and Social Care and is a collaboration between NHS England & NHS Improvement and the Royal National Orthopaedic Hospital NHS Trust. GIRFT uses comprehensive benchmarking data analysis to identify unwarranted variation in healthcare provision and outcomes in National Health Service (NHS) hospitals in England, and combines this with deep-dive visits to hospitals by clinicians, with follow-up on agreed actions by an improvement team. The programme covers the majority of healthcare specialities.

To qualify for the placement, you have to be working on research that falls within the STFC remit; however, you can be funded by other organisations besides STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility please contact Jeremy Yates and Maria Marcha.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

The deadline for applications is 10am on Monday 11th January 2021.

Further information can be found in this document.

December 2020:

Unlocking the mystery of the Moon’s formation

Astronomers have taken a step towards understanding how the Moon might have formed out of a giant collision between the early Earth and another massive object 4.5 billion years ago.

Scientists led by Durham University, UK, used a DiRAC supercomputer to simulate a Mars-sized planet – called Theia – crashing into the early Earth.

Lead author Sergio Ruiz-Bonilla, a PhD researcher in Durham University’s Institute for Computational Cosmology, said: “By adding different amounts of spin to Theia in simulations, or by having no spin at all, it gives you a whole range of different outcomes for what might have happened.”

November 2020:

2021 Code Performance Series: From analysis to insight

Starting in January, Durham University is hosting a series of seven monthly workshops on performance analysis for exascale software. The series should interest anyone working on HPC codes, and aims to upskill researchers in this key area.

To register, please visit this link:

This workshop series is run by Durham University’s Department of Computer Science in collaboration with the N8 and DiRAC, in close collaboration with VI-HPS, and is made possible by support from the UK’s ExCALIBUR programme.

October 2020:

Professor Carlos S Frenk
Institute of Computational Cosmology, Durham University, & DiRAC

We would like to congratulate Carlos on being awarded the Institute of Physics 2020 Paul Dirac Medal and Prize for theoretical (including mathematical and computational) physics.  

For outstanding contributions to establishing the current standard model for the formation of all cosmic structure, and for leading computational cosmology within the UK for more than three decades.

The full citation is at

Modelling temperature variation on distant stars

A team led by Dr Andrei Igoshev at the University of Leeds is helping to explain one of the big questions that has perplexed astrophysicists for the past 30 years – what causes the changing brightness of distant stars called magnetars.

The team developed a mathematical model that simulates how the magnetic field disrupts the conventionally uniform distribution of heat, producing hotter and cooler regions that may differ in temperature by a million degrees Celsius.

The team used the STFC-funded DiRAC supercomputing facilities at the University of Leicester. 

Read more about it here.

September 2020:

The Earth could have lost anywhere between 10% and 60% of its atmosphere in the collision that is thought to have formed the Moon!

New research led by astronomers at Durham University shows how the extent of atmospheric loss depends upon the type of giant impact with Earth.

They ran more than 300 supercomputer simulations to study the consequences of different huge collisions on rocky planets with thin atmospheres.

Read all about it here and here.

September 2020:

DiRAC contributes to a new Calculation that Refines Comparison of Matter with Antimatter

A new calculation performed using the world’s fastest supercomputers allows scientists to more accurately predict the likelihood of two kaon decay pathways, and compare those predictions with experimental measurements. The comparison tests for tiny differences between matter and antimatter that could, with even more computing power and other refinements, point to physics phenomena not explained by the Standard Model.

Read all about it in their press release.

September 2020:

DiRAC Day – Poster Prize Winners: 

We had a very enjoyable day hearing about all the first-class research DiRAC has supported over the past year, and the exciting plans we have for the years to come. The day ended with the announcement of this year’s poster prize winners, sponsored by Intel:

  • Fionntan Callan from Queen’s University Belfast
  • Rosie Talbot from Cambridge University
  • Runner-up: Josh Borrow from Durham University

Well done, everyone; the standard was extremely high this year.

Zooming in on dark matter

Our cosmologists have zoomed in on the smallest clumps of dark matter in a virtual universe – which could help us find the real thing in space.

Using a supercomputer simulation of the universe they achieved a zoom equivalent to being able to see a flea on the surface of the Moon.

This meant they could make detailed pictures and analyses of hundreds of virtual dark matter haloes from the very largest (galaxy clusters) to the tiniest (about the same as Earth’s mass).

Read all about it on their website.

August 2020:

Intel has agreed to sponsor the pre-DiRAC Day hackathon. The event will focus on optimisation with the latest Intel tool set, and will look at a new programming model, oneAPI. OneAPI will deliver the tools needed to deploy applications and solutions across different architectures, including CPUs, GPUs, FPGAs and other accelerators.

Application deadline has been extended to Tuesday the 25th August.

For information see post

Swiftsimio, a Python library for reading SWIFT data, developed with the support of DiRAC Research Software Engineering time, has been published in the Journal of Open Source Software.

Read the article here.

July 2020:

Charming physics in a beautiful context.

The HPQCD collaboration have recently completed a study using DiRAC that has appeared as an Editor’s suggestion in Physical Review D (see Phys. Rev. D 102, 014513). They calculated how the charm quark undergoes a weak interaction when paired with a beauty quark inside a Bc meson and subject to strong interaction physics. The LHCb experiment at CERN could soon see this process and the combination of theory and experiment will then shed new light on the quark weak transitions.

June 2020:

Spectra publishes a case study on their long-term storage solution for the DiRAC Memory Intensive Services at Durham.

Read all about it on their website.

May 2020:

New simulations from Imperial College London have revealed the asteroid that doomed the dinosaurs struck Earth at the ‘deadliest possible’ angle.

The simulations show that the asteroid hit Earth at an angle of about 60 degrees, which maximised the amount of climate-changing gases thrust into the upper atmosphere.

Such a strike likely unleashed billions of tonnes of sulphur, blocking the sun and triggering the nuclear winter that killed the dinosaurs and 75 per cent of life on Earth 66 million years ago.

Read all about it on their website, learn more about DiRAC’s contribution in the STFC article, and see the impact on YouTube.

See also the BBC online, Daily Mail, and New Scientist articles.

May 2020:

13th Call for Proposals Pre-announcement

The DiRAC Resource Allocation Committee 13th Call for Proposals will be opening shortly. Find all information and important dates here.

April 2020:

Webinar: Porting and Performance of DiRAC benchmarks on Oracle Bare Metal Cloud

On Wednesday 29th April, from 11:00 to 12:00 (BST), Andy Turner will give a webinar on the porting and performance of DiRAC benchmarks on Oracle bare metal cloud.

Find more information here.

February 2020:

Government announces new supercomputer for N8 universities

Based at Durham University, the new £3.15m Northern Intensive Computing Environment (NICE) will provide a shared facility for academic and industry researchers across the N8 universities, with each paying towards its operation on an equal basis, while also allowing access to the EPSRC-supported UK-wide community. It is one of seven HPC centres to be supported by a £27 million investment from EPSRC.

Find more information here.

Advance Announcement: September 2020:

DiRAC Day 2020 @ Durham University

This year, the Annual DiRAC Science Day event will be held at Durham University on the 10th of September. The day provides an opportunity to meet researchers from across the DiRAC community and learn about their recent science achievements. In addition, our industry partners will be there to talk about new hardware and software advances which may benefit DiRAC research.

Full details regarding registration, accommodation etc will be available via the DiRAC website shortly.

We also expect to host a hackathon over the three days leading up to DiRAC Day – details will be announced soon and will be posted on our Training page.

January 2020:

CodeCamp is back in March!

Interested in knowing whether your research would benefit from the power of GPUs? Haven’t done any GPU programming, or don’t know what a GPU is? Then CodeCamp is for you. Come along to Durham on 17th March.

Go to our web page for details. All are welcome, but spaces are limited.

The application deadline has been extended to the 2nd of March.

UCLan astronomers find a way to form ‘fast and furious’ planets around tiny stars!

Using DiRAC resources, researchers from the University of Central Lancashire (UCLan) found giant planets could form around small stars much faster than previously thought.

As published in the journal Astronomy & Astrophysics, Dr Anthony Mercer and Dr Dimitris Stamatellos’ new research challenges our understanding of planet formation.

Computer simulation of planets forming in a protoplanetary disc around a red dwarf star.

Find out more at the UCLan website, or the STFC website.

December 2019:

CodeCamp is starting in December!

Our launch event will feature a technology that is prominent within the HPC community and will be with us into the future: GPUs. Interested in knowing whether your research would benefit from the power of GPUs? Haven’t done any GPU programming, or don’t know what a GPU is? Then CodeCamp is for you. Come along to Durham on 11th–12th December. Go to our web page for details. All are welcome, but spaces are limited.

The application deadline has been extended to the 20th of November.

October 2019:

Stormy cluster weather could unleash black hole power and explain lack of cosmic cooling

“Weather” in clusters of galaxies may explain a longstanding puzzle, according to a team of researchers at the University of Cambridge. The scientists used sophisticated simulations performed on the DiRAC infrastructure to show how powerful jets from supermassive black holes are disrupted by the motion of hot gas and galaxies, preventing gas from cooling, which could otherwise form stars. The team publish their work in the journal Monthly Notices of the Royal Astronomical Society.

For more information see their website.

Figure 2. An artist’s impression of the jet launched by a supermassive black hole, which inflates lobes of very hot gas that are distorted by the cluster weather. Image credit: Institute of Astronomy, University of Cambridge.

A copy of the paper is available from:

September 2019:

Retirement event for Lydia Heck

Lydia Heck, DiRAC’s former Technical Manager, retired this month. Lydia was with DiRAC for more than nine years and will be greatly missed. Her career was celebrated with friends and colleagues at a retirement event in Durham. Thank you and good luck, Lydia!

September 2019:

DiRAC ARM Mellanox Hackathon – Pre-event training.

Prior to DiRAC day, ARM and Mellanox are sponsoring a hackathon. This hackathon is to investigate the suitability of the ARM processor, and the Mellanox Bluefield chip for use within the DiRAC HPC community.

To enable all our participants to get the most out of this event, there will be an online pre-event training session on Monday 2nd of September at 11:00am GMT.

All are welcome.

Conference ID: 19336783

August 2019:

STFC Innovation Placements Opportunity.

This Opportunity has now closed.

DiRAC has been awarded 8 STFC Innovation Fellowships, each of six months’ duration, which must be completed by 31 March 2020. In this scheme a final-year PhD student or an early career researcher can have a funded placement (up to £21k) with a third-party organisation.

To qualify for a placement you must be working on research that falls within the STFC remit; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.

To check your eligibility please contact Mark Wilkinson (miw6 AT and Clare Jenner (c.jenner AT We will do our best to be flexible.

Note, however, that the placement cannot be on your own research problem; it must address the offered innovation problem.

This should be looked on as an opportunity to learn new skills and contribute outside of your research area.

We are pleased to offer the following DiRAC STFC Innovation Placements:

The deadline for applications is 10am on Monday 9th September 2019.

July 2019:

DiRAC ARM Mellanox Hackathon prior to DiRAC DAY

Anyone interested in attending the DiRAC ARM Mellanox Hackathon on the 9th to 11th of September should submit an application form as soon as possible, as spaces are limited. Details can be found here.

June 2019:

“Beautiful” DiRAC research features in a Plus Magazine article.

A new particle that has recently been discovered at CERN confirms predictions made by theoretical physicists over six years ago. The result, delivered with a little help from the Darwin supercomputer, confirms existing particle theory, but also opens the door to new physics.

Read the whole article here.

May 2019:

DiRAC deploys Atempo Miria for Archiving.

Recently, DiRAC’s Memory Intensive facility in Durham called on the services of Atempo, the Data Protection and Movement specialists, together with their UK partner, OCF, to implement a multi-petabyte archiving project for their Lustre and Spectrum Scale (GPFS) data.

You can read all about it on the Atempo blog.

May 2019:

Free webinar, Wednesday 22nd May 2019, 15:00 BST: Open Source HPC Benchmarking. Presented by Andy Turner, EPCC.

There is a large and continuing investment in HPC services around the UK, Europe and beyond and this, along with new processor technologies appearing on the HPC market, has led to a wider range of advanced computing architectures available to researchers.

We have undertaken a comparative benchmarking exercise across a range of architectures to help improve our understanding of the performance characteristics of these platforms and help researchers choose the best services for different stages of their research workflows.

We will present results comparing the performance of different architectures for traditional HPC applications (e.g. CFD, periodic electronic structure) and synthetic benchmarks (for assessing I/O and interconnect performance limits). We will also describe how we have used an open research model where all the results and analysis methodologies are publicly available at all times. We will comment on differences between architectures and demonstrate the benefits of working in an open way.  

Full details and join link can be found here.

April 2019:

DiRAC’s Technical Manager gives Headline Talk at local BCS networking event

24th April 2019: DiRAC’s Technical Manager Lydia Heck is giving the Headline Talk at the local British Computer Society (Newcastle and District Branch) networking event this evening.  She will be discussing DiRAC@Durham’s Memory Intensive machine and explaining how this powerful resource is helping to unlock crucial insights into our Universe.

More information on her talk can be found here.

April 2019:

HPC-AI Advisory Council 2019, Swiss Conference & HPCXXL User Group

DiRAC’s Director Dr Mark Wilkinson’s talk from the  HPC-AI Advisory Council 2019 Swiss Conference, entitled: “40 Powers of 10 – Simulating the Universe with the DiRAC HPC Facility“,  is now available on YouTube and also features on the Inside HPC Website.

April 2019:

Theory predictions come up trumps

A particle that is an ‘excited’ bound state of a bottom quark and a charm antiquark has been discovered at the Large Hadron Collider and its mass is in agreement with a prediction made by the HPQCD collaboration back in 2012 using STFC’s DiRAC facility. HPQCD used a numerical technique known as lattice QCD to solve the theory of the strong force, Quantum Chromodynamics. This enabled them to calculate the masses of several bound states of bottom and anticharm, each with the quarks in a different configuration, collectively known as the Bc mesons. In 2019, the CMS and LHCb collaborations both reported the first clear evidence for the member of this set known as the Bc’ meson.

The lightest Bc meson, known simply as the Bc, has the bottom and anticharm quarks spinning in opposite directions so that its spin is zero. This is the lowest energy configuration for bottom-anticharm and the simplest to calculate in lattice QCD. In 2005 HPQCD (with the Fermilab Lattice collaboration) successfully predicted the mass of the Bc meson, ahead of its discovery by the CDF experiment at the Fermilab Tevatron collider. The large mass of this meson, 6.27 GeV/c2 (where the proton mass is 0.94 GeV/c2), along with its quark-antiquark content, meant that a proton collider was needed to produce it and made it hard to find experimentally.

In 2012, armed with the computing power of DiRAC and the much-improved QCD calculations that this allowed, HPQCD were able to revisit the topic and calculate the masses of many more states. They predicted the mass of the Bc* meson, a particle with spin because the bottom and anti-charm quarks are spinning in the same direction inside it. They also predicted the masses of excited states of the Bc and Bc*, known as the Bc’ and Bc*’. These are the analogues of the electronic radial excitations of the hydrogen atom. The mass difference between the Bc’ and the Bc is then a consequence of the way in which the bottom and anti-charm quark are bound together through strong force interactions. Predicting this mass difference from QCD requires the numerical techniques of lattice QCD because QCD has such complicated non-linear interactions. In arXiv:1207.5149 HPQCD found the mass difference between the Bc’ and the Bc to be 0.616(19) GeV/c2; the CMS result for this mass difference in arXiv:1902.00571 (and the LHCb result in arXiv:1904.00081) is 0.5961(14) GeV/c2, in good agreement.
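As a quick back-of-envelope check (ours, not part of the HPQCD analysis), the quoted prediction and measurement can be compared directly:

```latex
\Delta M_{\mathrm{pred}} - \Delta M_{\mathrm{exp}}
  = 0.616(19) - 0.5961(14)
  \approx 0.020~\mathrm{GeV}/c^2,
\qquad
\sigma_{\mathrm{tot}} = \sqrt{0.019^2 + 0.0014^2}
  \approx 0.019~\mathrm{GeV}/c^2,
```

so prediction and experiment differ by roughly one combined standard deviation, which is what “good agreement” means here.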

Figure 1: HPQCD’s predictions for the masses of the lightest states in the Bc family of mesons (blue crosses), calculated on DiRAC; JP gives their spin and parity quantum numbers. The experimental results for the two states that have been seen are shown as red lines (the experimental uncertainties are around 0.001 GeV/c2).

Figure 1 shows the HPQCD predictions for Bc meson masses along with the current experimental values. Mesons containing b quarks are the Achilles heel of the Standard Model since their rare decay processes are sensitive to the existence of new particles. The Bc meson family provides a new chapter in this search that theory and experiment are now beginning to exploit. The HPQCD collaboration remains at the forefront of this work and is pushing ahead with more precise calculations of Bc masses and differential decay rates on DiRAC-2.5. 

April 2019:

Dr Debora Sijacki wins the PRACE Ada Lovelace Award for HPC 2019

Huge Congratulations to DiRAC Researcher Dr Debora Sijacki who has won the PRACE Ada Lovelace Award for HPC 2019.  This prestigious prize is awarded annually to a young female European scientist in recognition of their outstanding impact on HPC research and computational science at a global level and for being a role model for young women beginning their careers in HPC.  Well done Debora!

Debora is based at the Institute of Astronomy, University of Cambridge (personal webpage) and more information on the Partnership for Advanced Computing in Europe (PRACE) and the Ada Lovelace Award for HPC 2019 can be found here.

Advance Announcement: September 2019:

DiRAC Day 2019 @ University of Leicester

This year, the Annual DiRAC Science Day event will be held at the University of Leicester on the 12th of September. The day provides an opportunity to meet researchers from across the DiRAC community and learn about their recent science achievements. In addition, our industry partners will be there to talk about new hardware and software advances which may benefit DiRAC research.

Full details regarding registration, accommodation etc will be available via the DiRAC website shortly.

We also expect to host a hackathon over the three days leading up to DiRAC Day – details will be announced soon and will be posted on our Training page.

November 2018:

DiRAC researchers on this year’s Clarivate Analytics Highly Cited Researchers List

Three DiRAC@Durham researchers, Professors Carlos Frenk,  Tom Theuns and Adrian Jenkins, appear on this year’s Clarivate Analytics Highly Cited Researchers List. Highly Cited researchers rank in the top 1% by citations for their field and are making a huge impact in solving the world’s biggest challenges.

We are extremely proud of Carlos, Tom and Adrian as their inclusion in this list is a particularly noteworthy achievement and is a demonstration of their global influence.

For more information see:

June 2018:

RAC 11th Call for Proposals Opens

The RAC makes an annual Call for Proposals for requesting time on our Resources. The 11th Call opened on the 9th July 2018 and will close on the 1st October 2018. The Call Announcement, the Guidance Notes and Application Forms are available on our Call for Proposals page.

Advance Announcement: September 2018:

DiRAC Day 2018 @ Swansea University.

We are looking forward to our 8th Annual DiRAC Science Day event, this year being held at Swansea University on the 12th of September. The day provides an opportunity to meet others from the DiRAC community and learn about the recent research achievements of our different consortia.

Swansea University are also running a number of other co-located training/networking events in the week commencing 9th September and details can be found on our Training page.

February 2018:

New models give insight into the heart of the Rosette Nebula.

Through computer simulations run in part on DiRAC Resources, astronomers at Leeds and at Keele University have found the formation of the Nebula is likely to be in a thin sheet-like molecular cloud rather than in a spherical or thick disc-like shape, as some photographs may suggest. A thin disc-like structure of the cloud focusing the stellar winds away from the cloud’s centre would account for the comparatively small size of the central cavity.

More information can be found on the STFC press release published here and on our 2017 Science Highlights page.


November 2017:

DiRAC @ Supercomputing 2017.

Members of the DiRAC Project Management Team travelled this year to Denver, Colorado to attend the SuperComputing 2017 industry conference. More information on what went on can be found here.


August 2017:

The 7th Annual DiRAC Day event.

Our 2017 DiRAC Day event was held at Exeter University on the 30th August. Find out more at the dedicated web page.

April 2017:

DiRAC HPC Manager talks to Scientific Computing World

Dr Lydia Heck, Senior Computer Manager in the Department of Physics at Durham University, talks to Robert Roe of Scientific Computing World in this article looking at managing HPC performance and exploring the options available to optimise the use of resources. Discussing DiRAC’s series of COSMA machines, Lydia talks about the hurdles her team overcame whilst implementing a new workload management system, SLURM, and using a Lustre file system for the latest DiRAC iteration: COSMA 6.

March 2017:

DiRAC partners in Peta-5

Six Tier 2 High Performance Computing (HPC) centres were officially launched on Thursday 30 March at the Thinktank science museum in Birmingham. Funded by £20 million from the Engineering and Physical Sciences Research Council (EPSRC) the centres will give academics and industry access to powerful computers to support research in engineering and the physical sciences.

DiRAC will partner in The Petascale Intensive Computation and Analytics facility at the University of Cambridge which will provide the large-scale data simulation and high performance data analytics designed to enable advances in material science, computational chemistry, computational engineering and health informatics.

September 2016:

6th Annual DiRAC Science Day.

On September 8th, the University of Edinburgh hosted the sixth annual DiRAC Science Day. This gave our researchers in the DiRAC HPC Community the opportunity to meet each other and the technical teams from each site, learn about what is being done by all the different projects running on the DiRAC facility and discuss future plans. The Day was generously sponsored by Bull, Atos, Dell, Hewlett Packard Enterprise, Intel, Cray, DDN, Lenovo, Mellanox, OCF and Seagate.

Dr. Jeremy Yates opened the meeting with an update on facility developments and then Prof. Christine Davies led a community discussion on several issues including the training needs of young researchers. The Science presentations then began with a talk on Simulating Realistic Galaxy Clusters, followed by a review of lattice QCD calculations and an exciting presentation from Prof. Mark Hannam on the recent detection of Gravitational Waves and the key role DiRAC played in converting information from the gravitational-wave signal into results for the properties of the colliding black holes.

During lunch, 23 posters showcased some of the other research done on the facility, and then the day split into parallel Science and Technical Sessions. In the Science session, presentations were made on: The Hadronic Vacuum Polarisation Contribution to the Anomalous Magnetic Moment of the Muon; The Robustness of Inflation to Inhomogeneous Initial Conditions; A Critical View of Interstellar Medium Modelling in Cosmological Simulations; and finally, Magnetic Fields in Galaxies. The Technical session presented talks on: Emerging Technologies; Grid; A Next Generation Data Parallel C++ Library; An Overview of the DiRAC-3 Benchmark Suite; and a lecture on SWIFT – Scaling on Next Generation Architectures.

Figure 1. Dr Andrew Lytle and his poster.

During tea the poster prizes were announced, and congratulations go to Dr Andrew Lytle (U. of Glasgow) for his poster on Semileptonic B_c Decays from Full Lattice QCD and to Dr Bernhard Mueller (Queen’s U. Belfast) for his poster on Core-Collapse Supernova Explosion Models from 3D Progenitors. They each won a £500 Amazon voucher from our kind sponsor DDN. Dr Lytle and his winning poster can be seen in the figure on the right.

Further Science session talks after tea were: Growing Black Holes at High Redshift; Planet Formation and Disc Evolution and finally, Modelling the Birth of a Star. The Technical session included a talk on the Co-design of Cray Software Components and ended with an interesting review of AAAI, Cloud and Data Management: DiRAC in the National E-Infrastructure, given by Dr. Yates. The Day concluded with a Drinks Reception outside the lecture theatres that was well attended and much enjoyed by all.

February 2016:

DiRAC simulations play a key role in gravitational-wave discovery.

Figure 1. The top plot shows the signal of gravitational waves detected by the LIGO observatory located in Hanford, USA, whilst the middle plot shows the waveforms predicted by general relativity. The X-axis plots time and the Y-axis plots the strain, which is the fractional amount by which distances are distorted by the passing gravitational wave. The bottom plot shows that the LIGO data match the predictions very closely. (Adapted from Fig. 1 in Physical Review Letters 116, 061102 (2016))

On February 11 2016, the LIGO collaboration announced the first direct detection of gravitational waves and the first observation of binary black holes. Accurate theoretical models of the signal were needed to find it and, more importantly, to decode the signal to work out what the source was. These models rely on large numbers of numerical solutions of Einstein’s equations for the last orbits and merger of two black holes, for a variety of binary configurations. The DiRAC Data Centric system, COSMA5, was used by researchers at Cardiff University to perform these simulations. With these results, and working with international collaborators, they constructed the generic-binary model that was used to measure the masses of the two black holes that were detected and the mass of the final black hole, and to glean some basic information about how fast the black holes were spinning. Their model was crucial in measuring the properties of the gravitational-wave signal, and the DiRAC Data Centric system COSMA5 was crucial in producing that model.
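The strain referred to in the caption has a compact definition (standard gravitational-wave notation, not taken from the paper): it is the fractional change in a length $L$ as the wave passes,

```latex
h = \frac{\Delta L}{L} \sim 10^{-21},
```

so over LIGO’s 4 km arms the detected signal corresponds to a length change of only a few times $10^{-18}$ m, far smaller than a proton.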

More information on the detection of gravitational waves can be found at the LIGO collaboration website.


November 2015:

HPCwire Readers’ Choice Award


STFC DiRAC has been recognized in the annual HPCwire Readers’ and Editors’ Choice Awards, presented at the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis (SC15), in Austin, Texas. The list of winners was revealed by HPCwire both at the event and on the HPCwire website. STFC DiRAC was recognized with the following honor:

Readers’ Choice – Best Use of High Performance Data Analytics: the Stephen Hawking Centre for Theoretical Cosmology, Cambridge University, and the STFC DiRAC HPC Facility used the first Intel Xeon Phi-enabled SGI UV2000, with its co-designed ‘MG Blade’ Phi housing, and achieved a 100x speed-up of the MODAL code used to probe the Cosmic Background Radiation, through optimisations made in porting MODAL to the Intel Xeon Phi coprocessor.

The coveted annual HPCwire Readers and Editors’ Choice Awards are determined through a nomination and voting process with the global HPCwire community, as well as selections from the HPCwire editors. The awards are an annual feature of the publication and constitute prestigious recognition from the HPC community. These awards are revealed each year to kick off the annual supercomputing conference, which showcases high performance computing, networking, storage, and data analysis.

We are thrilled that DiRAC and the Cambridge Stephen Hawking Centre for Theoretical Cosmology and our work through the COSMOS Intel Parallel Computing Centre have received this prestigious award in high performance computing.

In particular we congratulate Paul Shellard, Juha Jaykka and James Briggs from Cambridge for their sterling efforts. It is their ingenuity, skill and innovation that have been recognised by this award.

The award is also recognition of the unique synergy that we have developed between world-leading researchers in theoretical physics from the STFC DiRAC HPC Facility and industry-leading vendors like Intel and SGI, which aims to get maximum impact from new many-core technologies in our data analytic pipelines. This involved new parallel programming paradigms, as well as architectural co-design, which yielded impressive speed-ups for our Planck satellite analysis of the cosmic microwave sky, opening new windows on our Universe.

We have built an innovative and working data analytics system based on heterogeneous CPU architectures. This has meant we had to develop and test new forms of parallel code and to test the hardware and operational environment. We can now make the best use of CPUs alongside lower-cost, more powerful, but harder-to-program, many-core Xeon Phi chips. This ability to offload detailed analysis functions to faster processors as and when needed greatly decreases the time to produce results. This means we can perform more complex analysis to extract more meaning from the data and to make connections (or correlations) that would have been too time-consuming before.

We now have the hardware and software blueprint to build similar systems for the detailed analysis of any kind of dataset. It is truly generic and can be applied just as well to medical imaging, social and economic database analysis as to astronomical image analysis.

For enquiries, please contact Dr Mark Wilkinson, DiRAC Project Director

March 2015:

HPQCD: Weighing up Quarks

A new publication by particle physics theorists working on DiRAC has been highlighted as the “Editor’s Suggestion” in a top particle physics journal because it is “particularly important, interesting and well written”. The calculation gives a new, more accurate determination of the masses of quarks using the most realistic simulations of the subatomic world to date. This is an important ingredient in understanding how a deeper theory than our current Standard Model could give rise to these different masses for fundamental particles.

Quark masses are difficult to determine because quarks are never seen as free particles: the strong force interactions between them keep them bound into the composite particles, known as hadrons, that are seen in particle detectors. This is in contrast to electrons, which can be studied directly and their mass measured in experiment. Quark masses must instead be inferred by matching experimental results for the masses of hadrons to those obtained from theoretical calculations using the theory of the strong force, Quantum Chromodynamics (QCD). Progress by the HPQCD collaboration using a numerically intensive technique known as lattice QCD means that this can now be done to better than 1% precision. The publication determines the charm quark mass to high accuracy (shown in the figure) and then uses ratios of the charm quark mass to other quark masses to determine them.
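Schematically (our notation, not HPQCD’s), the ratio method works because dimensionless ratios of quark masses can be computed on the lattice with many systematic uncertainties cancelling; another quark mass then follows from the accurately determined charm mass, for example

```latex
m_b \;=\; \left(\frac{m_b}{m_c}\right)_{\!\text{lattice}} \times \; m_c .
```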

The research was done by researchers at Cambridge, Glasgow and Plymouth, working with collaborators at Cornell University (USA) and Regensburg (Germany), as part of the High Precision QCD (HPQCD) Collaboration. The paper is published in the latest issue of Physical Review D and can be accessed here. The calculations were carried out on the Darwin supercomputer at the University of Cambridge, part of the STFC High Performance Computing Facility known as DiRAC. The speed and flexibility of this computer were critical to completing the large set of numerical calculations that had to be done for this project.


DiRAC Services support a significant portion of STFC’s science programme, providing simulation and data modelling resources for the UK Frontier Science theory community in Particle Physics, astroparticle physics, Astrophysics, cosmology, solar system & planetary science and Nuclear physics (PPAN; collectively STFC Frontier Science). DiRAC services are optimised for these research communities and operate as a single distributed facility which provides the range of architectures needed to deliver our world-leading science outcomes.

Based at four university sites (Cambridge, Leicester, Durham and Edinburgh), we host three Services: Data Intensive (jointly at Cambridge and Leicester), Memory Intensive and Extreme Scaling.

Information on how to apply for time on our Services can be found here, and how our Services map onto our Science agenda can be found here. The DiRAC Data Management Plan is available for download here.

For general enquires please email DiRAC Support or the Project Office.

Data Intensive Service

The Data Intensive Service is jointly hosted by the Universities of Cambridge and Leicester.

Data Intensive@Cambridge

DiRAC has a part share of the CSD3 petascale HPC platform (Cumulus & Wilkes2)  hosted at Cambridge University.


The Cumulus system provides a total of 3.92 Petaflops of compute capability, consisting of:

  • 1152 Skylake nodes, each with 2 x Intel Xeon Gold 6142 processors, 2.6GHz, 16-core (32 cores per node)
    • 768 nodes with 192 GB memory
    • 384 nodes with 384 GB memory
  • 672 Cascade Lake nodes, each with 2 x Intel Xeon Platinum 8276 processors, 2.6GHz, 28-core (56 cores per node)
    • 616 nodes with 192 GB memory
    • 56 nodes with 384 GB memory
  • A 342-node Intel KNL cluster (Dell C6320p, Intel Xeon Phi 7210 @ 1.30GHz) with 96GB of RAM per node
  • The HPC interconnect is:
    • Intel OmniPath, 2:1 blocking (Skylake and KNL)
    • Mellanox HDR Infiniband, 3:1 blocking (Cascade Lake)
  • The DiRAC share of the Skylake partition is 22146 CPUs; the DiRAC share of KNL is 44 nodes

With 2.2714 PFlops, the Cumulus CPU/KNL cluster is at position 87 in the November 2018 Top500 list of the 500 most powerful commercially available computer systems.
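For readers who want to see where figures like these come from, here is a rough sketch (ours, not an official DiRAC calculation) of how a theoretical peak is estimated from the node counts and clock speeds quoted above. The flops-per-cycle value assumes AVX-512 with two FMA units (8 doubles x 2 ops x 2 units = 32) and uses the base clock, so it is an upper bound: sustained AVX-512 clocks are lower, and Top500 figures are measured (Rmax) rather than theoretical (Rpeak), which is why these estimates exceed the quoted totals.

```python
# Hedged sketch: theoretical double-precision peak from node count,
# cores per node, base clock and an assumed flops-per-cycle figure.
def peak_tflops(nodes: int, cores_per_node: int, ghz: float,
                flops_per_cycle: int = 32) -> float:
    """Theoretical peak in TFlops: nodes * cores * GHz * flops/cycle."""
    return nodes * cores_per_node * ghz * flops_per_cycle / 1000.0

# Figures taken from the Cumulus spec above (base clocks, not AVX-512 clocks).
skylake = peak_tflops(1152, 32, 2.6)   # Skylake partition
cascade = peak_tflops(672, 56, 2.6)    # Cascade Lake partition
print(f"Skylake:      {skylake:8.0f} TFlops (base-clock upper bound)")
print(f"Cascade Lake: {cascade:8.0f} TFlops (base-clock upper bound)")
```

The same arithmetic, with per-SKU corrections for AVX-512 frequency throttling, underlies vendor Rpeak figures.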


The Wilkes2 system provides 1.19 petaflops of compute capability:

  • A 360-GPU NVIDIA cluster: 90 Dell EMC server nodes, each with four NVIDIA Tesla P100 GPUs and 96GB memory, connected by Mellanox EDR Infiniband.
  • The DiRAC share of Wilkes2 is 46 GPUs.
  • Storage available to DiRAC consists of 3.94PB offering a Lustre parallel filesystem, plus 650GB of tape.

For more information email Cambridge Support.

Data Intensive@Leicester

Data Intensive 2.5x

The DI system has two login nodes, Mellanox EDR interconnect in a 2:1 blocking setup, and 3PB of Lustre storage.

Main Cluster

• 408 dual-socket nodes with Intel Xeon Skylake 6140 processors (two-FMA AVX-512, 2.3GHz; 36 cores and 192 GB RAM per node), giving 14,688 cores and 3.5PB of storage in total.

• 1 x 6TB server with 144 cores (Xeon 6154 @ 3.0GHz base)
• 10 x 1.5TB servers with 36 cores each (Xeon 6240 @ 2.3GHz base)

The DI System at Leicester is designed to offer fast, responsive I/O.

Further information is available on the web page or by emailing Leicester support.

Memory Intensive Service

The Memory Intensive Service is hosted by the University of Durham at the Institute for Computational Cosmology (ICC).

Memory Intensive 2.5x

DiRAC’s Memory Intensive Resource:

• 2 x 1.5TB and 1 x 768GB login nodes with Intel Xeon 5120 Skylake processors (one-FMA AVX-512, 2.2GHz, 28 cores per node)

• 452 compute nodes, each with 512 GB of RAM and 2 x Xeon 5120 processors @ 2.2GHz, offering a total of 12,656 cores.

• The system is connected via Mellanox EDR in a 2:1 blocking configuration, with 512TB of fast I/O scratch space and 3.1PB of data space on Lustre.

Memory Intensive 2 (formerly “Data Centric”)

• About 9,000 cores in the COSMA6 cluster. Approximately 570 nodes offer 128GB of memory per node and are connected via a Mellanox FDR10 2:1 blocking Infiniband fabric. Storage capacity on COSMA6 is 2.6PB.

• The IB fabric connects COSMA6 to the Lustre filesystem; I/O performance is 10–11GB/s for writes and 5–6GB/s for reads.

More information on the Memory Intensive 2 system can be found here, and further enquiries on the Memory Intensive Service can be emailed to ICC Support.

      Extreme Scaling Service

      The Extreme Scaling Service is hosted by the University of Edinburgh. DiRAC Extreme Scaling (also know as Tesseract) is available to industry, commerce and academic researchers. General information on Tesseract, as well as the User Guide, is available here.

      The Tesseract compute service is based around an HPE SGI 8600 system with 1476 compute nodes.

      There are 1468 standard compute nodes, each with two 2.1 GHz, 12-core Intel Xeon (Skylake) Silver 4116 processors and 96 GB of memory. In addition, there are 8 GPU compute nodes each with two 2.1 GHz, 12-core Intel Xeon (Skylake) Silver 4116 processors; 96 GB of memory; and 4 NVidia V100 (Volta) GPU accelerators connected over NVlink.

      All compute nodes are connected together by a single Intel Omni-Path fabric and all nodes access the 3 PB Lustre file system.
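The Tesseract figures above imply the following system-wide totals; a short sketch confirms that the 1468 standard nodes plus 8 GPU nodes account for the 1476-node total:

```python
# Aggregate Tesseract figures implied by the node counts above.
standard_nodes = 1468
gpu_nodes = 8
cores_per_node = 2 * 12      # two 12-core Xeon Silver 4116 processors
mem_per_node_gb = 96
gpus_per_node = 4            # V100 accelerators on the GPU nodes only

total_nodes = standard_nodes + gpu_nodes
total_cores = total_nodes * cores_per_node
total_gpus = gpu_nodes * gpus_per_node
total_mem_tb = total_nodes * mem_per_node_gb / 1000

print(total_nodes)           # 1476, matching the figure quoted above
print(total_cores)           # 35424 cores system-wide
print(total_gpus)            # 32 V100 GPUs
print(round(total_mem_tb))   # ~142 TB of aggregate memory
```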

As well as the fast, parallel Lustre storage, Tesseract also provides a tiered storage solution combining zero-watt disk and tape storage, built on the HPE DMF solution.

      Further information on the Extreme Scaling Service is available by emailing DiRAC Support.


      Our Services Supporting our Science

DiRAC operates within a framework of well-established science cases which have been fully peer reviewed to deliver a transformative research programme aimed at creating novel and improved computing techniques and facilities. We tailor our Services’ architectures towards solving these science problems, and by doing so help underpin research covering the full remit of STFC’s astronomy, particle, nuclear and accelerator physics Science Challenges. Some brief illustrations of how our Services map onto our Science Agenda can be found below; for more information, please email the Project Office.

      The Data Intensive Service addresses the problems associated with driving scientific discovery through the analysis of large data sets using a combination of modelling and simulation, e.g. the large-volume data sets from flagship astronomical satellites such as Planck and Gaia, and ground-based facilities such as the Square Kilometre Array (SKA).  One project using the Data Intensive Service is looking at breaking resonances between migrating planets.

The Memory Intensive Service supports detailed and complex simulations of computational fluid dynamics problems, for example cosmological simulations of galaxy formation and evolution, which require access to very large amounts of memory (more than 300 terabytes) so that codes can ‘follow’ structures as they form. The innovative design of this Service supports physically detailed simulations which can use an entire DiRAC machine for weeks or months at a time. More on the Virgo project, which uses the Memory Intensive Service, can be found here.

The Extreme Scaling Service supports codes that make full use of multi-petaflop HPC systems. DiRAC works with industry on the design of systems using Lattice QCD in theoretical particle physics as a driver. This field of physics provides theoretical input on the properties of hadrons to assist with the interpretation of data from experiments such as the Large Hadron Collider. To find out more about one of the Lattice QCD projects using the Extreme Scaling Service see the 2017 Science Highlights page.

      The DiRAC Data Management Plan can be found here.