DiRAC was established to provide distributed High Performance Computing (HPC) services to the STFC theory community. HPC-based modelling is an essential tool for the exploitation and interpretation of observational and experimental data generated by the astronomy and particle physics facilities supported by STFC, as it allows scientists to test their theories against the data gathered in experiments. The UK has an extremely strong HPC community and these powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, from simulating the entire evolution of the universe, from the big bang to the present, to modelling the fundamental structure of matter. DiRAC is both an academic-led and an academic-supervised facility, and our systems are specifically designed to meet the different high performance computational needs within our scientific community.
DiRAC provides a variety of compute Resources, matching machine architecture to the algorithm designs and requirements of the research problems to be solved. There are sound scientific reasons for designing the DiRAC services in this way, and the methodology was adopted following a number of in-depth reviews involving the STFC research community. The bespoke demands of the different research domains supported by STFC are such that a distributed installation was the most cost-effective way to satisfy the varied scientific requirements.
As a single, federated Facility, DiRAC allows more effective and efficient use of computing resources, supporting the delivery of the science programmes across the STFC research communities and addressing all the STFC Science Challenges. It provides a common training and consultation framework and, crucially, provides critical mass and a coordinating structure for both small- and large-scale cross-discipline science projects, the technical support needed to run and develop a distributed HPC service, and a pool of expertise to support knowledge transfer and industrial partnership projects. The ongoing development and sharing of best practice for the delivery of productive, national HPC services within DiRAC enables STFC researchers to deliver world-leading science across the entire STFC theory programme in particle physics, astrophysics and cosmology, solar system physics, particle astrophysics and nuclear physics.
As was originally envisaged, DiRAC has become a vibrant research space, both in terms of science and in terms of technical development. These two aspects of our activities are intimately linked, each feeding back into the other and driving research excellence in theoretical simulation and modelling alongside world-leading technical innovation. DiRAC’s technical achievements are as important as our scientific achievements; they are key to our scientific impact and key to our impact on the UK economy as a whole.
A new particle that has recently been discovered at CERN confirms predictions made by theoretical physicists over six years ago. The result, delivered with a little help from the Darwin supercomputer, confirms existing particle theory, but also opens the door to new physics.
DiRAC has been awarded 8 STFC Innovation Fellowships, each lasting 6 months and to be completed by 31 March 2020. Under this scheme a final-year PhD student or an early career researcher can have a funded placement (up to £21k) with a third-party organisation.
To qualify, you have to be working on research that falls within the STFC remit; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics.
The deadline for applications is 10am on Monday 8th July 2019.
DiRAC deploys Atempo Miria for Archiving.
Recently, DiRAC’s Memory Intensive facility in Durham called on the services of Atempo, the Data Protection and Movement specialists, together with their UK partner, OCF, to implement a multi-petabyte archiving project for their Lustre and Spectrum Scale (GPFS) data.
Free webinar, Wednesday 22nd May 2019, 15:00 BST: Open Source HPC Benchmarking. Presented by Andy Turner, EPCC.
There is a large and continuing investment in HPC services around the UK, Europe and beyond and this, along with new processor technologies appearing on the HPC market, has led to a wider range of advanced computing architectures available to researchers.
We have undertaken a comparative benchmarking exercise across a range of architectures to help improve our understanding of the performance characteristics of these platforms and help researchers choose the best services for different stages of their research workflows.
We will present results comparing the performance of different architectures for traditional HPC applications (e.g. CFD, periodic electronic structure) and synthetic benchmarks (for assessing I/O and interconnect performance limits). We will also describe how we have used an open research model where all the results and analysis methodologies are publicly available at all times. We will comment on differences between architectures and demonstrate the benefits of working in an open way.
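To make this concrete, the sketch below shows the flavour of a synthetic benchmark: a minimal STREAM-style memory-bandwidth measurement in Python. It is purely illustrative; the array size, trial count and use of NumPy are our own choices and are not taken from the webinar or its benchmark suite.

```python
# Minimal STREAM-style "scale" benchmark: how fast can the machine stream
# one large array through memory into another?
import time
import numpy as np

N = 50_000_000                      # ~400 MB per float64 array; large enough to defeat caches
a = np.empty(N)
b = np.random.rand(N)

best = float("inf")
for _ in range(5):                  # several trials; keep the best (least-disturbed) time
    t0 = time.perf_counter()
    np.multiply(b, 2.0, out=a)      # "scale" kernel: one read + one write per element
    best = min(best, time.perf_counter() - t0)

bytes_moved = 2 * N * 8             # read b, write a; 8 bytes per float64
print(f"Effective memory bandwidth: {bytes_moved / best / 1e9:.1f} GB/s")
```

Real benchmarking exercises of this kind also sweep thread and node counts, and probe I/O and interconnect limits in the same spirit.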
DiRAC’s Technical Manager gives Headline Talk at local BCS networking event
24th April 2019: DiRAC’s Technical Manager Lydia Heck is giving the Headline Talk at the local British Computer Society (Newcastle and District Branch) networking event this evening. She will be discussing DiRAC@Durham’s Memory Intensive machine and explaining how this powerful resource is helping to unlock crucial insights into our Universe.
HPC-AI Advisory Council 2019, Swiss Conference & HPCXXL User Group
DiRAC’s Director Dr Mark Wilkinson’s talk from the HPC-AI Advisory Council 2019 Swiss Conference, entitled: “40 Powers of 10 – Simulating the Universe with the DiRAC HPC Facility“, is now available on YouTube and also features on the Inside HPC Website.
Theory predictions come up trumps
A particle that is an ‘excited’ bound state of a bottom quark and a charm antiquark has been discovered at the Large Hadron Collider, and its mass is in agreement with a prediction made by the HPQCD collaboration back in 2012 using STFC’s DiRAC facility. HPQCD used a numerical technique known as lattice QCD to solve the theory of the strong force, Quantum Chromodynamics. This enabled them to calculate the masses of several bound states of bottom and anticharm, each with the quarks in a different configuration, collectively known as the Bc mesons. In 2019, the CMS and LHCb collaborations both reported the first clear evidence for the member of this set called the Bc’ meson.
The lightest Bc meson, known simply as the Bc, has the bottom and anticharm quarks spinning in opposite directions so that its spin is zero. This is the lowest energy configuration for bottom-anticharm and the simplest to calculate in lattice QCD. In 2005 HPQCD (with the Fermilab lattice collaboration) successfully predicted the mass of the Bc meson, ahead of its discovery by the CDF experiment at the Fermilab Tevatron collider. The large mass of this meson, 6.27 GeV/c2 (where the proton mass is 0.94 GeV/c2), along with its quark-antiquark content, meant that a proton collider was needed to produce it and made it hard to find experimentally.
In 2012, armed with the computing power of DiRAC and the much-improved QCD calculations it made possible, HPQCD were able to revisit the topic and calculate the masses of many more states. They predicted the mass of the Bc* meson, a particle with spin because the bottom and anti-charm quarks are spinning in the same direction inside it. They also predicted the masses of excited states of the Bc and Bc*, known as the Bc’ and Bc*’. These are the analogues of the electronic radial excitations of the hydrogen atom. The mass difference between the Bc’ and the Bc is then a consequence of the way in which the bottom and anti-charm quark are bound together through strong force interactions. Predicting this mass difference from QCD requires the numerical techniques of lattice QCD because QCD has such complicated non-linear interactions. In arXiv:1207.5149 HPQCD found the mass difference between the Bc’ and the Bc to be 0.616(19) GeV/c2; the CMS result for this mass difference in arXiv:1902.00571 (and LHCb’s in arXiv:1904.00081) is 0.5961(14) GeV/c2, in good agreement.
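Using only the numbers quoted above, a quick consistency check (a back-of-the-envelope sketch, not part of the HPQCD or CMS analyses) confirms that prediction and measurement agree to within about one combined standard deviation:

```python
# Compare the HPQCD prediction for M(Bc') - M(Bc) with the CMS measurement,
# using the values quoted in the text.
prediction, sigma_pred = 0.616, 0.019      # GeV/c^2, HPQCD, arXiv:1207.5149
measurement, sigma_meas = 0.5961, 0.0014   # GeV/c^2, CMS, arXiv:1902.00571

diff = prediction - measurement
sigma = (sigma_pred**2 + sigma_meas**2) ** 0.5   # uncertainties added in quadrature
print(f"Difference: {diff:.4f} GeV/c^2 ({diff / sigma:.1f} sigma)")   # ~1.0 sigma
```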
Figure 1 shows the HPQCD predictions for Bc meson masses along with the current experimental values. Mesons containing b quarks are the Achilles heel of the Standard Model since their rare decay processes are sensitive to the existence of new particles. The Bc meson family provides a new chapter in this search that theory and experiment are now beginning to exploit. The HPQCD collaboration remains at the forefront of this work and is pushing ahead with more precise calculations of Bc masses and differential decay rates on DiRAC-2.5.
Dr Debora Sijacki wins the PRACE Ada Lovelace Award for HPC 2019
Huge Congratulations to DiRAC Researcher Dr Debora Sijacki who has won the PRACE Ada Lovelace Award for HPC 2019. This prestigious prize is awarded annually to a young female European scientist in recognition of their outstanding impact on HPC research and computational science at a global level and for being a role model for young women beginning their careers in HPC. Well done Debora!
Debora is based at the Institute of Astronomy, University of Cambridge (personal webpage) and more information on the Partnership for Advanced Computing in Europe (PRACE) and the Ada Lovelace Award for HPC 2019 can be found here.
Advance Announcement: September 2019:
DiRAC Day 2019 @ University of Leicester
This year, the annual DiRAC Science Day event will be held at the University of Leicester on the 12th of September. The day provides an opportunity to meet researchers from across the DiRAC community and learn about their recent science achievements. In addition, our industry partners will be there to talk about new hardware and software advances which may benefit DiRAC research.
Full details regarding registration, accommodation etc. will be available via the DiRAC website shortly.
We also expect to host a hackathon over the three days leading up to DiRAC Day – details will be announced soon and will be posted on our Training page.
DiRAC researchers on this year’s Clarivate Analytics Highly Cited Researchers List
Three DiRAC@Durham researchers, Professors Carlos Frenk, Tom Theuns and Adrian Jenkins, appear on this year’s Clarivate Analytics Highly Cited Researchers List. Highly Cited researchers rank in the top 1% by citations for their field and are making a huge impact in solving the world’s biggest challenges.
We are extremely proud of Carlos, Tom and Adrian as their inclusion in this list is a particularly noteworthy achievement and is a demonstration of their global influence.
The RAC makes an annual Call for Proposals for requesting time on our Resources. The 11th Call opened on the 9th July 2018 and will close on the 1st October 2018. The Call Announcement, the Guidance Notes and Application Forms are available on our Call for Proposals page.
Advance Announcement: September 2018:
DiRAC Day 2018 @ Swansea University.
We are looking forward to our 8th Annual DiRAC Science Day event, this year being held at Swansea University on the 12th of September. The day provides an opportunity to meet others from the DiRAC community and learn about the recent research achievements of our different consortia.
Swansea University are also running a number of other co-located training/networking events in the week commencing 9th September and details can be found on our Training page.
New models give insight into the heart of the Rosette Nebula.
Through computer simulations run in part on DiRAC Resources, astronomers at the University of Leeds and Keele University have found that the Nebula most likely formed in a thin, sheet-like molecular cloud rather than in a spherical or thick disc-like one, as some photographs may suggest. A thin, disc-like structure of the cloud focusing the stellar winds away from the cloud’s centre would account for the comparatively small size of the central cavity.
Members of the DiRAC Project Management Team travelled this year to Denver Colorado to attend the SuperComputing 2017 industry conference. More information on what went on can be found here.
The 7th Annual DiRAC Day event.
Our 2017 DiRAC Day event was held at Exeter University on the 30th August. Find out more at the dedicated web page.
DiRAC HPC Manager talks to Scientific Computing World
Dr Lydia Heck, Senior Computer Manager in the Department of Physics at Durham University, talks to Robert Roe of Scientific Computing World in this article looking at managing HPC performance and exploring the options available to optimise the use of resources. Discussing DiRAC’s series of COSMA machines, Lydia talks about the hurdles her team has overcome whilst implementing a new workload management system, SLURM, and using a Lustre file system for the latest DiRAC iteration: COSMA 6.
DiRAC partners in Peta-5
Six Tier 2 High Performance Computing (HPC) centres were officially launched on Thursday 30 March at the Thinktank science museum in Birmingham. Funded by £20 million from the Engineering and Physical Sciences Research Council (EPSRC), the centres will give academics and industry access to powerful computers to support research in engineering and the physical sciences.
DiRAC will partner in the Petascale Intensive Computation and Analytics (Peta-5) facility at the University of Cambridge, which will provide the large-scale data simulation and high performance data analytics designed to enable advances in material science, computational chemistry, computational engineering and health informatics.
6th Annual DiRAC Science Day.
On September 8th, the University of Edinburgh hosted the sixth annual DiRAC Science Day. This gave our researchers in the DiRAC HPC Community the opportunity to meet each other and the technical teams from each site, learn about what is being done by all the different projects running on the DiRAC facility and discuss future plans. The Day was generously sponsored by Bull, Atos, Dell, Hewlett Packard Enterprise, Intel, Cray, DDN, Lenovo, Mellanox, OCF and Seagate.
Dr. Jeremy Yates opened the meeting with an update on facility developments and then Prof. Christine Davies led a community discussion on several issues including the training needs of young researchers. The Science presentations then began with a talk on Simulating Realistic Galaxy Clusters, followed by a review of lattice QCD calculations and an exciting presentation from Prof. Mark Hannam on the recent detection of Gravitational Waves and the key role DiRAC played in converting information from the gravitational-wave signal into results for the properties of the colliding black holes.
During lunch, 23 posters showcased some of the other research done on the facility, and then the day split into parallel Science and Technical Sessions. In the Science session, presentations were made on: The hadronic vacuum polarisation contribution to the Anomalous Magnetic Moment of the Muon; The Robustness of Inflation to Inhomogeneous Initial Conditions; A Critical View of Interstellar Medium Modelling in Cosmological Simulations and finally, Magnetic Fields in Galaxies. The Technical session presented talks on: Emerging Technologies; Grid: A Next Generation Data Parallel C++ Library; An Overview of the DiRAC-3 Benchmark Suite and a lecture on SWIFT – Scaling on Next Generation Architectures.
Figure 1. Dr Andrew Lytle and his poster.
During tea the poster prizes were announced and congratulations go to Dr Andrew Lytle (U. of Glasgow) for his poster on Semileptonic B_c Decays from Full Lattice QCD and to Dr Bernhard Mueller (Queen’s U. Belfast) for his poster on Core-Collapse Supernova Explosion Models from 3D Progenitors. They each won a £500 Amazon voucher from our kind sponsor DDN. Dr Lytle and his winning poster can be seen in the figure on the right.
Further Science session talks after tea were: Growing Black Holes at High Redshift; Planet Formation and Disc Evolution and finally, Modelling the Birth of a Star. The Technical session included a talk on the Co-design of Cray Software Components and ended with an interesting review of AAAI, Cloud and Data Management: DiRAC in the National E-Infrastructure, given by Dr. Yates. The Day concluded with a Drinks Reception outside the lecture theatres that was well attended and much enjoyed by all.
DiRAC simulations play a key role in gravitational-wave discovery.
On February 11 2016, the LIGO collaboration announced the first direct detection of gravitational waves and the first observation of binary black holes. Accurate theoretical models of the signal were needed to find it and, more importantly, to decode the signal to work out what the source was. These models rely on large numbers of numerical solutions of Einstein’s equations for the last orbits and merger of two black holes, for a variety of binary configurations. The DiRAC Data Centric system, COSMA5, was used by researchers at Cardiff University to perform these simulations. Using these results, they and their international collaborators constructed the generic-binary model that was used to measure the masses of the two black holes that were detected and the mass of the final black hole, and to glean some basic information about how fast the black holes were spinning. Their model was crucial in measuring the properties of the gravitational-wave signal, and the DiRAC Data Centric system COSMA5 was crucial in producing that model.
In the figure above, the top plot shows the signal of gravitational waves detected by the LIGO observatory located in Hanford, USA, whilst the middle plot shows the waveforms predicted by general relativity. The X-axis plots time and the Y-axis plots the strain, which is the fractional amount by which distances are distorted by the passing gravitational wave. The bottom plot shows that the LIGO data matches the predictions very closely. (Adapted from Fig. 1 in Physical Review Letters 116, 061102 (2016)) Read further…
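For reference (a standard definition, not specific to the paper above): the strain is h = ΔL/L, the fractional change in the length L of a detector arm. With peak strains of order 10⁻²¹ and LIGO’s 4 km arms, the induced length change ΔL = hL is of order 4×10⁻¹⁸ m, far smaller than the diameter of a proton.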
HPCwire Readers’ Choice Award
STFC DiRAC has been recognized in the annual HPCwire Readers’ and Editors’ Choice Awards, presented at the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis (SC15), in Austin, Texas. The list of winners was revealed by HPCwire both at the event and on the HPCwire website. STFC DiRAC was recognized with the following honor:
Readers’ Choice – Best Use of High Performance Data Analytics: the Stephen Hawking Centre for Theoretical Cosmology, Cambridge University, and the STFC DiRAC HPC Facility used the first Intel Xeon Phi-enabled SGI UV2000, with its co-designed ‘MG Blade’ Phi housing, to achieve a 100X speed-up of the MODAL code used to probe the Cosmic Background Radiation, through optimizations made in porting MODAL to the Intel Xeon Phi coprocessor.
The coveted annual HPCwire Readers’ and Editors’ Choice Awards are determined through a nomination and voting process with the global HPCwire community, as well as selections from the HPCwire editors. The awards are an annual feature of the publication and constitute prestigious recognition from the HPC community. They are revealed each year to kick off the annual supercomputing conference, which showcases high performance computing, networking, storage, and data analysis.
We are thrilled that DiRAC, the Cambridge Stephen Hawking Centre for Theoretical Cosmology and our work through the COSMOS Intel Parallel Computing Centre have received this prestigious award in high performance computing.
In particular we congratulate Paul Shellard, Juha Jaykka and James Briggs from Cambridge for their sterling efforts. It is their ingenuity, skill and innovation that has been recognised by this award.
The award is also recognition of the unique synergy that we have developed between world-leading researchers in theoretical physics from the STFC DiRAC HPC Facility and industry-leading vendors like Intel and SGI, which aims to get maximum impact from new many-core technologies in our data analytic pipelines. This involved new parallel programming paradigms, as well as architectural co-design, which yielded impressive speed-ups for our Planck satellite analysis of the cosmic microwave sky, opening new windows on our Universe.
We have built an innovative, working data analytics system based on heterogeneous CPU architectures. This meant we had to develop and test new forms of parallel code and test the hardware and operational environment. We can now make the best use of standard CPUs alongside the lower-cost, more powerful, but harder-to-program many-core Xeon Phi chips. This ability to offload detailed analysis functions to faster processors as and when needed greatly decreases the time to produce results, meaning we can perform more complex analysis to extract more meaning from the data and make connections (or correlations) that would previously have been too time-consuming.
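The offload pattern itself is simple. The sketch below illustrates the general idea in Python (illustrative only: the COSMOS pipeline offloads natively to Xeon Phi coprocessors, and none of the function names below come from it):

```python
# Schematic offload pattern: keep orchestration on the host process and hand
# the expensive analysis kernels to faster/parallel compute resources.
from concurrent.futures import ProcessPoolExecutor

def heavy_analysis(chunk):
    # Stand-in for an expensive, independent analysis kernel.
    return sum(x * x for x in chunk)

def analyse(dataset, chunk_size=10_000):
    # Split the data and "offload" the chunks to worker processes.
    chunks = [dataset[i:i + chunk_size] for i in range(0, len(dataset), chunk_size)]
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(heavy_analysis, chunks))

if __name__ == "__main__":
    print(analyse(list(range(100_000))))
```

The payoff described above comes when the offload target (a Xeon Phi, say) runs each kernel much faster than the host could.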
We now have the hardware and software blueprint to build similar systems for the detailed analysis of any kind of dataset. It is truly generic and can be applied just as well to medical imaging, social and economic database analysis as to astronomical image analysis.
A new publication by particle physics theorists working on DiRAC has been highlighted as the “Editor’s Suggestion” in a top particle physics journal because it is “particularly important, interesting and well written”. The calculation gives a new, more accurate determination of the masses of quarks using the most realistic simulations of the subatomic world to date. This is an important ingredient in understanding how a deeper theory than our current Standard Model could give rise to these different masses for fundamental particles.
Quark masses are difficult to determine because quarks are never seen as free particles: the strong force interactions between them keep them bound into composite particles known as hadrons, which are what is seen in particle detectors. This is in contrast to electrons, which can be studied directly and their mass measured in experiments. Quark masses instead must be inferred by matching experimental results for the masses of hadrons to those obtained from theoretical calculations using the theory of the strong force, Quantum Chromodynamics (QCD). Progress by the HPQCD collaboration using a numerically intensive technique known as lattice QCD means that this can now be done to better than 1% precision. The publication determines the charm quark mass to high accuracy (shown in the figure) and then uses ratios of the other quark masses to the charm quark mass to determine those masses too.
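Schematically, the ratio method works as follows (a deliberately simplified sketch; both input values are round placeholders, not HPQCD’s published numbers):

```python
# Ratio method, schematically: a precisely determined charm mass plus a
# lattice-QCD mass ratio fixes the bottom quark mass, since many systematic
# errors cancel in the ratio. Placeholder values only.
m_c = 1.0         # GeV: charm quark mass, known to ~1% as described above (placeholder)
r_bc = 4.5        # m_b / m_c from a lattice ratio calculation (placeholder)

m_b = r_bc * m_c  # the ratio transfers the charm-mass precision to the bottom mass
print(f"m_b = {m_b:.2f} GeV")
```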
The research was done by researchers at Cambridge, Glasgow and Plymouth working with collaborators at Cornell University (USA) and Regensburg (Germany) as part of the High Precision QCD (HPQCD) Collaboration. The paper is published in the latest issue of Physical Review D and can be accessed here. The calculations were carried out on the Darwin supercomputer at the University of Cambridge, part of the STFC High Performance Computing Facility known as DiRAC. The speed and flexibility of this computer was critical to completing the large set of numerical calculations that had to be done for this project.
DiRAC Services support a significant portion of STFC’s science programme, providing simulation and data modelling resources for the UK Frontier Science theory community in particle physics, astroparticle physics, astrophysics, cosmology, solar system & planetary science and nuclear physics (PPAN; collectively STFC Frontier Science). DiRAC services are optimised for these research communities and operate as a single distributed facility which provides the range of architectures needed to deliver our world-leading science outcomes.
Information on how to apply for time on our Services can be found here, and how our Services map onto our Science agenda can be found here. The DiRAC Data Management Plan is available for download here.
2 x 1.5TB login nodes, each with two 14-core Intel Xeon Skylake 5120 processors (1 FMA AVX512, 2.2GHz; 28 cores per node)
452 compute nodes, each with 512 GB of RAM and 2 x Xeon 5120 2.2GHz processors, offering a total of 12,656 cores.
The system is connected via Mellanox EDR InfiniBand in a 2:1 blocking configuration, with 333TB of fast I/O scratch space and 1.6PB of data space on Lustre.
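As a quick arithmetic check of the figures above (assuming two 14-core Xeon 5120 sockets per node, consistent with the 28 cores quoted for the login nodes):

```python
# Node/core bookkeeping for the system described above.
nodes, sockets_per_node, cores_per_socket = 452, 2, 14
print(nodes * sockets_per_node * cores_per_socket)   # 12656 = the stated 12,656 cores
```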
Memory Intensive 2 (Formerly “Data Centric”)
The COSMA6 cluster provides 9184 cores across 574 nodes, each with 128GB of memory, connected via a Mellanox FDR10 InfiniBand fabric in a 2:1 blocking configuration. Storage capacity on COSMA6 is 2.5PB.
The InfiniBand fabric connects COSMA6 to its Lustre filesystem, with I/O performance of 10-11GB/s for writes and 5-6GB/s for reads.
More information on the Memory Intensive 2 system can be found here and further enquiries on the Memory Intensive Service can be emailed to ICC Support.
Extreme Scaling Service
The Extreme Scaling Service is hosted by the University of Edinburgh. DiRAC Extreme Scaling (also known as Tesseract) is available to industry, commerce and academic researchers. General information on Tesseract, as well as the User Guide, is available here.
1468 nodes, each with two 12-core Intel Xeon Skylake 4116 processors (FMA AVX512, 2.2GHz base, 3.0GHz turbo) and 96GB of RAM, giving 35,232 cores in total
8 GPU compute nodes, each with two 2.1GHz 12-core Intel Xeon (Skylake) Silver 4116 processors, 96 GB of memory, and 4 Nvidia V100 (Volta) GPU accelerators connected over PCIe
3PB of Lustre storage and a hypercube-topology Intel Omni-Path (OPA) interconnect.
This system is configured for codes with good-to-excellent strong scaling and vectorisation, and provides high performance I/O and interconnect.
Further information on the Extreme Scaling Service is available by emailing DiRAC Support.
Our Services Supporting our Science
DiRAC operates within a framework of well-established science cases which have been fully peer reviewed, delivering a transformative research programme aimed at creating novel and improved computing techniques and facilities. We tailor our Services’ architectures towards solving these science problems and, by doing so, help underpin research covering the full remit of STFC’s astronomy, particle, nuclear and accelerator physics Science Challenges. Some brief illustrations of how our Services map onto our Science agenda can be found below; for more information please email the Project Office.
The Data Intensive Service addresses the problems associated with driving scientific discovery through the analysis of large data sets using a combination of modelling and simulation, e.g. the large-volume data sets from flagship astronomical satellites such as Planck and Gaia, and ground-based facilities such as the Square Kilometre Array (SKA). One project using the Data Intensive Service is looking at breaking resonances between migrating planets.
The Memory Intensive Service supports detailed and complex simulations related to computational fluid dynamics problems, for example cosmological simulations of galaxy formation and evolution, which require access to very large amounts of memory (more than 300 terabytes) to enable codes to ‘follow’ structures as they form. The innovative design of this Service supports physically detailed simulations which can use an entire DiRAC machine for weeks or months at a time. More on the Virgo project, which uses the Memory Intensive Service, can be found here.
The Extreme Scaling Service supports codes that make full use of multi-petaflop HPC systems. DiRAC works with industry on the design of such systems, using Lattice QCD in theoretical particle physics as a driver. This field of physics provides theoretical input on the properties of hadrons to assist with the interpretation of data from experiments such as the Large Hadron Collider. To find out more about one of the Lattice QCD projects using the Extreme Scaling Service, see the 2017 Science Highlights page.