Call for Proposals

Access to DiRAC is co-ordinated by the STFC’s DiRAC Resource Allocation Committee (RAC), which issues an annual Call for Full Proposals for requests for time and operates a Seedcorn Time programme for small requests.


Seedcorn Time

More information on our Seedcorn Time programme can be found here.

Full Call for Proposals

For each Full Call for Proposals the RAC produces a set of documentation that includes a number of forms that every applicant must complete, together with a comprehensive set of guidance notes.


11th Call for Applications (current call)

The 11th Call opened on the 9th July 2018. The Call will close on the 1st October 2018 and allocations will begin on 1st April 2019. Key dates for the STFC’s internal processing of applications will be posted here later in the summer.

The RAC publishes a new set of documentation for each Call and the Announcement, Forms and Guidance Notes for the 11th Call are below.

Please note that the submission address for the Technical Assessment Form has changed since the forms were first posted on the 9th of July. The submission address is now:
dirac-support@epcc.ed.ac.uk.

The submission address for all other documentation, including the Call Application Form & RSE Request Form, has not changed and remains: DiRACRAC@stfc.ac.uk

The Key Changes implemented in the 11th Call include:

    • Applicants must have submitted and discussed a technical assessment (one per proposal) with the DiRAC TWG;

        • Applicants will only be considered if this has been initiated a month before the closing date of 1st October 2018 (i.e. the technical assessment must be submitted by 1st September 2018).

    • No single application can request more than 80% of the availability of a DiRAC machine within a given year;

    • Existing Thematic projects can submit proposals under the following options:

        • Applications with scientific themes distinct from the existing award can be submitted as a separate proposal.

        • Applications building on the same scientific theme as an existing award should apply as a new project; this new award would then replace any existing compute award. Applications in this category are only expected where significant new resources become available which would greatly affect a project, or where significant scientific developments warrant a project update. PIs requesting a revised or updated long project must justify the request fully; the RAC will take into account all currently active projects with a comparable science case. This route cannot be used to top up an existing project award if there is no change in the science case.

    • The application form and scoring matrix have been updated to place a strong emphasis on Project Management and Data Management Plans;

    • Research Software Engineering Request Forms should be submitted with the Call Application Form, on or before the Call closing date of 1st October.


10th Call Application Documentation (closed)

The 10th Call for Proposals closed on 14th November 2017 and the allocations started on the 1st May 2018. The awards from the 10th Call are available here.

The forms and guidance from the 10th Call for Proposals (closed) are below. The RAC publishes a new set of documentation for each call.


For help with application queries please contact the DiRAC Director: Dr Mark Wilkinson and/or the DiRAC Technical Director: Professor Peter Boyle.

For all other enquiries please contact our Project Office.



Science

DiRAC caters for a significant portion of STFC’s science, providing simulation and data modelling resources for the UK Frontier Science theory communities in particle physics, astroparticle physics, astrophysics, cosmology, solar system science and nuclear physics (PPAN; collectively, STFC Frontier Science). Each year we publish a selection of our science highlights and these can be found below.

For information on how our Science maps onto our Services, check out our Resources page.


2016 Highlights

In February the LIGO collaboration announced the first direct detection of gravitational waves and the first observation of binary black holes, and the DiRAC Data Centric System COSMA5 was used Read more…

The HPQCD Group also continued their research into Lattice QCD, with the team using DiRAC to develop a new method for measuring the hadronic vacuum polarisation (HVP). They were able to determine Read more…


2015 Highlights

Our HPQCD group members continue the search for new physics in the magnetic moment of the muon. They used DiRAC simulations to develop a new method of determining  Read more…

Colleagues from the HORIZON UK-Consortium furthered their quest to improve the interpretation of future imaging surveys from the Euclid satellite and the Large Synoptic Survey Telescope. These surveys aim to Read more…


2014 Highlights

Our UKMHD Consortium members have been looking at the Origins of Magnetism in the Quiet Sun and used DiRAC to run computationally challenging massively parallel simulations of convective  Read more…

The Virgo Consortium continued with its flagship EAGLE simulation project, which is opening a window on the role and physics of baryons in the universe by creating high-fidelity hydrodynamic simulations Read more…


2013 Highlights

Our HOT QCD members have been investigating the Quark-Gluon Plasma phase that is created when quarks become free. Looking specifically at how the plasma expands and flows as a bulk material Read more…

The ECOGAL users have performed large-scale numerical simulations that can resolve the dense regions where stars form, and hence directly study the physics that drives star formation. These complex simulations Read more…



About

What makes DiRAC special…

DiRAC was established to provide distributed High Performance Computing (HPC) services to the STFC theory community. HPC-based modelling is an essential tool for the exploitation and interpretation of observational and experimental data generated by astronomy and particle physics facilities supported by STFC, as this technology allows scientists to test their theories and run simulations from the data gathered in experiments. The UK has an extremely strong HPC community and these powerful computing facilities allow the UK science community to pursue cutting-edge research on a broad range of topics, from simulating the entire evolution of the universe, from the big bang to the present, to modelling the fundamental structure of matter. DiRAC is both an academically led and academically supervised facility and our systems are specifically designed to meet the different high performance computational needs within our scientific community.

DiRAC provides a variety of compute Resources that match machine architecture to the different algorithm designs and requirements of the research problems to be solved. There are sound scientific reasons for designing the DiRAC services in this way and the methodology was adopted following a number of in-depth reviews involving the STFC research community. The bespoke demands of the different research domains supported by STFC are such that a distributed installation was the most cost-effective way to satisfy the varied scientific requirements.

As a single, federated Facility, DiRAC allows more effective and efficient use of computing resources, supporting the delivery of the science programmes across the STFC research communities, addressing all the STFC Science Challenges. It provides a common training and consultation framework and, crucially, provides critical mass and a coordinating structure for both small and large scale cross-discipline science projects, the technical support needed to run and develop a distributed HPC service, and a pool of expertise to support knowledge transfer and industrial partnership projects. The on-going development and sharing of best-practice for the delivery of productive, national HPC services within DiRAC enables STFC researchers to deliver world-leading science across the entire STFC theory programme in particle physics, astrophysics and cosmology, solar system physics, particle astrophysics and nuclear physics.

As was originally envisaged, DiRAC has become a vibrant research space, both in terms of science and in terms of technical development. These two aspects of our activities are intimately linked, with each feeding back into the other and driving research excellence in theoretical simulation and modelling alongside world-leading technical innovation. DiRAC’s technical achievements are as important as our scientific achievements; they are key to our scientific impact and key to our impact on the UK economy as a whole.

Home

News

November 2018:

DiRAC researchers on this year’s Clarivate Analytics Highly Cited Researchers List

Three DiRAC@Durham researchers, Professors Carlos Frenk,  Tom Theuns and Adrian Jenkins, appear on this year’s Clarivate Analytics Highly Cited Researchers List. Highly Cited researchers rank in the top 1% by citations for their field and are making a huge impact in solving the world’s biggest challenges.

We are extremely proud of Carlos, Tom and Adrian as their inclusion in this list is a particularly noteworthy achievement and is a demonstration of their global influence.

For more information see: https://hcr.clarivate.com


June 2018:

RAC 11th Call for Proposals Opens

The RAC issues an annual Call for Proposals for time on our Resources. The 11th Call opened on the 9th July 2018 and will close on the 1st October 2018. The Call Announcement, the Guidance Notes and Application Forms are available on our Call for Proposals page.


Advance Announcement: September 2018:

DiRAC Day 2018 @ Swansea University.

We are looking forward to our 8th Annual DiRAC Science Day event, this year being held at Swansea University on the 12th of September. The day provides an opportunity to meet others from the DiRAC community and learn about the recent research achievements of our different consortia.

Swansea University are also running a number of other co-located training/networking events in the week commencing 9th September and details can be found on our Training page.


February 2018:

New models give insight into the heart of the Rosette Nebula.

Through computer simulations run in part on DiRAC Resources, astronomers at Leeds and at Keele University have found that the Nebula is likely to have formed in a thin, sheet-like molecular cloud rather than in a spherical or thick disc-like shape, as some photographs may suggest. A thin disc-like structure of the cloud, focusing the stellar winds away from the cloud’s centre, would account for the comparatively small size of the central cavity.

More information can be found on the STFC press release published here and on our 2017 Science Highlights page.



November 2017:

DiRAC @ Supercomputing 2017.

Members of the DiRAC Project Management Team travelled this year to Denver, Colorado to attend the SuperComputing 2017 industry conference. More information on what went on can be found here.



August 2017:

The 7th Annual DiRAC Day event.

Our 2017 DiRAC Day event was held at Exeter University on the 30th August. Find out more on the dedicated web page.


April 2017:

DiRAC HPC Manager talks to Scientific Computing World

Dr Lydia Heck, Senior Computer Manager in the Department of Physics at Durham University, talks to Robert Roe of Scientific Computing World in this article looking at managing HPC performance and exploring the options available to optimise the use of resources. Discussing DiRAC’s series of COSMA machines, Lydia talks about the hurdles her team has overcome whilst implementing a new workload management system, SLURM, and using a Lustre file system for the latest DiRAC iteration, COSMA6.


March 2017:

DiRAC partners in Peta-5

Six Tier 2 High Performance Computing (HPC) centres were officially launched on Thursday 30 March at the Thinktank science museum in Birmingham. Funded by £20 million from the Engineering and Physical Sciences Research Council (EPSRC), the centres will give academics and industry access to powerful computers to support research in engineering and the physical sciences.

DiRAC will partner in the Petascale Intensive Computation and Analytics facility at the University of Cambridge, which will provide the large-scale data simulation and high-performance data analytics designed to enable advances in materials science, computational chemistry, computational engineering and health informatics.


September 2016:

6th Annual DiRAC Science Day.

On September 8th, the University of Edinburgh hosted the sixth annual DiRAC Science Day. This gave our researchers in the DiRAC HPC Community the opportunity to meet each other and the technical teams from each site, learn about what is being done by all the different projects running on the DiRAC facility and discuss future plans. The Day was generously sponsored by Bull, Atos, Dell, Hewlett Packard Enterprise, Intel, Cray, DDN, Lenovo, Mellanox, OCF and Seagate.

Dr. Jeremy Yates opened the meeting with an update on facility developments and then Prof. Christine Davies led a community discussion on several issues including the training needs of young researchers. The Science presentations then began with a talk on Simulating Realistic Galaxy Clusters, followed by a review of lattice QCD calculations and an exciting presentation from Prof. Mark Hannam on the recent detection of Gravitational Waves and the key role DiRAC played in converting information from the gravitational-wave signal into results for the properties of the colliding black holes.

During lunch, 23 posters showcased some of the other research done on the facility and then the day split into parallel Science and Technical Sessions. In the Science session, presentations were made on: The hadronic vacuum polarisation contribution to the Anomalous Magnetic Moment of the Muon; The Robustness of Inflation to Inhomogeneous Inflation; A Critical View of Interstellar Medium Modelling in Cosmological Simulations and finally, Magnetic Fields in Galaxies. The Technical session presented talks on: Emerging Technologies; Grid; A Next Generation Data Parallel C++ Library; An Overview of the DiRAC-3 Benchmark Suite and a lecture on SWIFT – Scaling on Next Generation Architectures.

Figure 1. Dr Andrew Lytle and his poster.

During tea the poster prizes were announced and congratulations go to Dr Andrew Lytle (University of Glasgow) for his poster on Semileptonic B_c Decays from Full Lattice QCD and to Dr Bernhard Mueller (Queen’s University Belfast) for his poster on Core-Collapse Supernova Explosion Models from 3D Progenitors. They each won a £500 Amazon voucher from our kind sponsor DDN. Dr Lytle and his winning poster can be seen in Figure 1.

Further Science session talks after tea were: Growing Black Holes at High Redshift; Planet Formation and Disc Evolution and finally, Modelling the Birth of a Star. The Technical session included a talk on the Co-design of Cray Software Components and ended with an interesting review of AAAI, Cloud and Data Management: DiRAC in the National E-Infrastructure, given by Dr. Yates. The Day concluded with a Drinks Reception outside the lecture theatres that was well attended and much enjoyed by all.


February 2016:

DiRAC simulations play a key role in gravitational-wave discovery.


Figure 1. The top plot shows the signal of gravitational waves detected by the LIGO observatory located in Hanford, USA, whilst the middle plot shows the waveforms predicted by general relativity. The X-axis plots time and the Y-axis plots the strain, which is the fractional amount by which distances are distorted by the passing gravitational wave. The bottom plot shows that the LIGO data matches the predictions very closely. (Adapted from Fig. 1 in Physical Review Letters 116, 061102 (2016))
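As a brief aside on the quantity plotted on the Y-axis (an illustrative note, not taken from the figure itself): the strain h is the fractional change in a measured length L produced by a passing gravitational wave,

\[
  h = \frac{\Delta L}{L},
\]

so a strain of order 10^{-21}, the scale of the detected signal, corresponds to a length change of only a few times 10^{-18} m over LIGO’s 4 km arms.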

On February 11 2016, the LIGO collaboration announced the first direct detection of gravitational waves and the first observation of binary black holes. Accurate theoretical models of the signal were needed to find it and, more importantly, to decode the signal to work out what the source was. These models rely on large numbers of numerical solutions of Einstein’s equations for the last orbits and merger of two black holes, for a variety of binary configurations. The DiRAC Data Centric system, COSMA5, was used by researchers at Cardiff University to perform these simulations. Using these results, and working with international collaborators, they constructed the generic-binary model that was used to measure the masses of the two black holes that were detected and the mass of the final black hole, and to glean some basic information about how fast the black holes were spinning. Their model was crucial in measuring the properties of the gravitational-wave signal, and the DiRAC Data Centric system COSMA5 was crucial in producing that model.

More information on the detection of gravitational waves can be found at the LIGO collaboration website.



November 2015:

HPCwire Readers’ Choice Award


STFC DiRAC has been recognised in the annual HPCwire Readers’ and Editors’ Choice Awards, presented at the 2015 International Conference for High Performance Computing, Networking, Storage and Analysis (SC15), in Austin, Texas. The list of winners was revealed by HPCwire both at the event and on the HPCwire website. STFC DiRAC was recognised with the following honour:

Readers’ Choice – Best Use of High Performance Data Analytics – the Stephen Hawking Centre for Theoretical Cosmology at Cambridge University and the STFC DiRAC HPC Facility used the first Intel Xeon Phi-enabled SGI UV2000, with its co-designed ‘MG Blade’ Phi housing, achieving a 100X speed-up of the MODAL code used to probe the Cosmic Background Radiation, through optimizations made in porting MODAL to the Intel Xeon Phi coprocessor.

The coveted annual HPCwire Readers’ and Editors’ Choice Awards are determined through a nomination and voting process with the global HPCwire community, as well as selections from the HPCwire editors. The awards are an annual feature of the publication and constitute prestigious recognition from the HPC community. They are revealed each year to kick off the annual supercomputing conference, which showcases high performance computing, networking, storage, and data analysis.

We are thrilled that DiRAC, the Cambridge Stephen Hawking Centre for Theoretical Cosmology and our work through the COSMOS Intel Parallel Computing Centre have received this prestigious award in high performance computing.

In particular we congratulate Paul Shellard, Juha Jaykka and James Brigg from Cambridge for their sterling efforts. It is their ingenuity, skill and innovation that has been recognised by this award.

The award is also recognition of the unique synergy that we have developed between world-leading researchers in theoretical physics from the STFC DiRAC HPC Facility and industry-leading vendors like Intel and SGI, which aims to get maximum impact from new many-core technologies in our data analytic pipelines. This involved new parallel programming paradigms, as well as architectural co-design, which yielded impressive speed-ups for our Planck satellite analysis of the cosmic microwave sky, opening new windows on our Universe.

We have built an innovative and working data analytics system based on heterogeneous CPU architectures. This has meant we had to develop and test new forms of parallel code and test the hardware and operational environment. We can now make the best use of CPUs alongside lower-cost, more powerful, but harder-to-program, many-core Xeon Phi chips. This ability to offload detailed analysis functions to faster processors as and when needed greatly decreases the time to produce results. This means we can perform more complex analysis to extract more meaning from the data and to make connections (or correlations) that would have been too time consuming before.
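As a purely illustrative sketch of the offload pattern described above (generic Python, not the COSMOS/MODAL code; the function and parameter names here are hypothetical), independent analysis chunks can be handed to a pool of workers and gathered as they complete:

from concurrent.futures import ProcessPoolExecutor

def analyse_chunk(chunk):
    # Hypothetical stand-in for a compute-heavy analysis kernel that a real
    # pipeline would offload to the faster processors.
    return sum(x * x for x in chunk)

def run_pipeline(data, n_workers=4, chunk_size=1000):
    # Split the data set into independent chunks...
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # ...then offload them to a pool of workers and collect the results as they finish.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(analyse_chunk, chunks))

if __name__ == "__main__":
    print(sum(run_pipeline(list(range(100_000)))))

The same shape of code applies whatever the workers happen to be, whether ordinary CPU cores or, with a suitable runtime, accelerator devices such as the Xeon Phi coprocessors mentioned above.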

We now have the hardware and software blueprint to build similar systems for the detailed analysis of any kind of dataset. It is truly generic and can be applied just as well to medical imaging, social and economic database analysis as to astronomical image analysis.

For enquiries, please contact Dr Mark Wilkinson, DiRAC Project Director


March 2015:

HPQCD: Weighing up Quarks

A new publication by particle physics theorists working on DiRAC has been highlighted as the “Editor’s Suggestion” in a top particle physics journal because it is “particularly important, interesting and well written”. The calculation gives a new, more accurate determination of the masses of quarks using the most realistic simulations of the subatomic world to date. This is an important ingredient in understanding how a deeper theory than our current Standard Model could give rise to these different masses for fundamental particles.

Quark masses are difficult to determine because quarks are never seen as free particles. The strong force interactions between them keep them bound into composite particles, known as hadrons, which are what is seen in particle detectors. This is in contrast to electrons, which can be studied directly and their mass measured in experiments. Quark masses instead must be inferred by matching experimental results for the masses of hadrons to those obtained from theoretical calculations using the theory of the strong force, Quantum Chromodynamics (QCD). Progress by the HPQCD collaboration using a numerically intensive technique known as lattice QCD means that this can now be done to better than 1% precision. The publication determines the charm quark mass to high accuracy (shown in the figure) and then uses a ratio of the charm quark mass to the other quark masses to determine them.

The research was done by researchers at Cambridge, Glasgow and Plymouth working with collaborators at Cornell University (USA) and Regensburg (Germany) as part of the High Precision QCD (HPQCD) Collaboration. The paper is published in the latest issue of Physical Review D and can be accessed here. The calculations were carried out on the Darwin supercomputer at the University of Cambridge, part of the STFC High Performance Computing Facility known as DiRAC. The speed and flexibility of this computer was critical to completing the large set of numerical calculations that had to be done for this project.


Home

Resources

DiRAC Services support a significant portion of STFC’s science programme, providing simulation and data modelling resources for the UK Frontier Science theory community in particle physics, astroparticle physics, astrophysics, cosmology, solar system & planetary science and nuclear physics (PPAN; collectively, STFC Frontier Science). DiRAC services are optimised for these research communities and operate as a single distributed facility which provides the range of architectures needed to deliver our world-leading science outcomes.

We have three Services: Data Intensive, Memory Intensive and Extreme Scaling, with our machines hosted at four university sites across the UK: Cambridge, Durham, Edinburgh and Leicester.

Information on how to apply for time on our Services can be found here, and how our Services map onto our Science agenda can be found here. The DiRAC Data Management Plan is available for download here.

For general enquiries please email DiRAC Support or the Project Office.


Data Intensive Service

The Data Intensive Service is jointly hosted by the Universities of Cambridge and Leicester.

Data Intensive@Cambridge

DiRAC has a 13% share of the CSD3 petascale HPC platform (Peta4 & Wilkes2), hosted at Cambridge University.

Peta4
The Peta4 system provides 1.5 petaflops of compute capability:

  • A 342-node Intel KNL cluster (C6320p nodes, Intel Xeon Phi CPU 7210 @ 1.30GHz) with 96GB of RAM per node.
  • 768 Skylake nodes, each with 2 x Intel Xeon Skylake 6142 processors (2.6GHz, 16-core; 32 cores per node):
    • 384 nodes with 192 GB memory
    • 384 nodes with 384 GB memory
  • The HPC interconnect is Intel Omni-Path in a 2:1 blocking configuration.
  • The storage consists of 750 TB of disk offering a Lustre parallel filesystem, plus 750 GB of tape.


With 1.697 PFlops, the new CSD3 Peta4 CPU/KNL cluster sits at position 75 in the November 2017 Top500 list of the most powerful commercially available computer systems.

Wilkes2
The Wilkes2 system provides 1.19 petaflops of compute capability:

  • A 360-GPU NVIDIA cluster: 90 Dell EMC server nodes, each with four NVIDIA Tesla P100 GPUs and 96GB memory, connected by Mellanox EDR InfiniBand, providing 1.19 petaflops of computational performance.

For more information email Cambridge Support


Data Intensive@Leicester


Data Intensive 2.5x

The DI system has two login nodes, Mellanox EDR interconnect in a 2:1 blocking setup and 3PB Lustre storage.

Main Cluster

  • 136 dual-socket nodes with Intel Xeon Skylake 6140 processors (two FMA AVX-512 units, 2.3GHz), 36 cores and 192 GB RAM per node; 4896 cores in total.

Large-Memory

  • 1 x 6TB server with 144 cores (Xeon 6154 @ 3.0GHz base)
  • 3 x 1.5TB servers with 36 cores (Xeon 6140 @ 2.3GHz base)

The DI System at Leicester is designed to offer fast, responsive I/O.

Data Intensive 2 (formerly “Complexity”)
  • 272 Intel Xeon Sandy Bridge nodes with 128 GB RAM per node, 4352 cores in total (95 Tflop/s), connected via a non-blocking Mellanox FDR interconnect.

  • This cluster features an innovative switching architecture designed, built and delivered by Leicester University and Hewlett Packard.

The total storage available to both systems is in excess of 1PB.
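As a quick, purely illustrative cross-check of the core counts quoted above (a minimal Python sketch; the node counts are taken from the lists above, and the 16 cores per node for Data Intensive 2 is simply 4352 cores divided by 272 nodes):

# Leicester Data Intensive core counts, as quoted above.
clusters = {
    "DI main cluster": (136, 36),    # nodes, cores per node
    "DI 2 (Complexity)": (272, 16),  # 16 cores/node inferred from 4352 / 272
}

for name, (nodes, cores_per_node) in clusters.items():
    # Prints 4896 and 4352, matching the totals quoted above.
    print(f"{name}: {nodes * cores_per_node} cores")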

Further information is available on the web page or by emailing Leicester support.


Memory Intensive Service

The Memory Intensive Service is hosted by the University of Durham at the Institute for Computational Cosmology (ICC).

Memory Intensive 2.5x

  • 2 x 1.5TB login nodes with Intel Xeon Skylake 5120 processors (one FMA AVX-512 unit, 2.2GHz, 28 cores per node)

  • 147 compute nodes, each with 768 GB of RAM and 2 x Xeon 5120 (2.2GHz) processors per node, offering a total of 4116 cores.

  • The system is connected via Mellanox EDR in a 2:1 blocking configuration, with 333TB of fast I/O scratch space and 1PB of data space on Lustre.

Memory Intensive 2 (Formerly “Data Centric”)

  • 11,000 cores over the two clusters COSMA5 and COSMA6. The nodes offer 128GB of memory per node and are connected via a Mellanox FDR10 InfiniBand fabric in a 2:1 blocking configuration.

  • The IB fabric connects COSMA5 to a GPFS filesystem and COSMA6 to a Lustre filesystem, with the I/O performance of both being 10-11GB/s write and 5-6GB/s read.

 

More information on the Memory Intensive 2 system can be found here, and further enquiries about the Memory Intensive Service can be emailed to ICC Support.


Extreme Scaling Service

The Extreme Scaling Service is hosted by the University of Edinburgh. DiRAC Extreme Scaling (also known as Tesseract) is available to industry, commerce and academic researchers. General information on Tesseract, as well as the User Guide, is available here.

  • 844 nodes, each with two Intel Xeon Skylake 4116 processors (12 cores per socket, two sockets per node, FMA AVX-512, 2.2GHz base, 3.0GHz turbo) and 96GB RAM.

  • 2.4PB of Lustre storage and a hypercube Intel Omni-Path (OPA) interconnect.

  • The system is configured for codes with good to excellent strong scaling and vectorisation, and offers high-performance I/O and interconnect.

Further information on the Extreme Scaling Service is available by emailing DiRAC Support.


Our Services Supporting our Science

DiRAC operates within a framework of well-established science cases which have been fully peer reviewed to deliver a transformative research programme aimed at creating novel and improved computing techniques and facilities. We tailor our Services’ architectures towards solving these science problems and, by doing so, help underpin research covering the full remit of STFC’s astronomy, particle, nuclear and accelerator physics Science Challenges. Some brief illustrations of how our Services map onto our Science agenda can be found below; for more information please email the Project Office.

The Data Intensive Service addresses the problems associated with driving scientific discovery through the analysis of large data sets using a combination of modelling and simulation, e.g. the large-volume data sets from flagship astronomical satellites such as Planck and Gaia, and ground-based facilities such as the Square Kilometre Array (SKA).  One project using the Data Intensive Service is looking at breaking resonances between migrating planets.

The Memory Intensive Service supports detailed and complex simulations related to computational fluid dynamics problems, for example cosmological simulations of galaxy formation and evolution, which require access to very large amounts of memory (more than 300 terabytes) to enable codes to ‘follow’ structures as they form. The innovative design of this Service supports physically detailed simulations which can use an entire DiRAC machine for weeks or months at a time. More on the Virgo project, which uses the Memory Intensive Service, can be found here.

The Extreme Scaling Service supports codes that make full use of multi-petaflop HPC systems. DiRAC works with industry on the design of systems using Lattice QCD in theoretical particle physics as a driver.   This field of physics provides theoretical input on the properties of hadrons to assist with the interpretation of data from experiments such as the Large Hadron Collider. To find out more about one of the Lattice QCD projects using the Extreme Scaling Service see the 2017 Science Highlights page.


The DiRAC Data Management Plan can be found here.


Home