
National (Tier-1) Facilities

ARCHER, the UK national supercomputing service, offers a capability resource that allows researchers to run simulations and calculations requiring large numbers of processing cores working in a tightly coupled, parallel fashion. Based around a 118,080-core Cray XC system, the service includes a helpdesk staffed by HPC experts from EPCC with support from Cray Inc. Access is free at the point of use for academic researchers working in the EPSRC and NERC domains. Users can also purchase access at a variety of rates.


International Facilities

Some facilities around the world may also be accessible to UK users. The organisations below can provide such access.

  • PRACE - the pan-European HPC infrastructure, through which UK users can access some of the largest HPC systems in Europe.
  • DOE INCITE - The US Department of Energy makes access to its Leadership Computing facilities available to users worldwide through the INCITE programme.




Regional (Tier-2) Facilities

The Tier-2 layer of HPC forms a vital part of an integrated e-infrastructure landscape; it bridges the gap in capability between the university Viking system and ARCHER.

Facilities

  • (Coming soon!) NICE at Durham will be one of the EPSRC Tier-2 facilities. NICE will primarily comprise 32 IBM POWER9 dual-CPU nodes, each with 4 NVIDIA V100 GPUs and a high-performance interconnect. This is the same architecture as the US government's SUMMIT and SIERRA supercomputers, which occupied the top two places in a recently published list of the world's fastest supercomputers.
  • Cirrus at EPCC is one of the EPSRC Tier-2 HPC facilities. The main resource is a 10,080 core SGI/HPE ICE XA system. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
  • Isambard at GW4 is one of the EPSRC Tier-2 HPC facilities. Isambard provides multiple advanced architectures within the same system in order to enable evaluation and comparison across a diverse range of hardware platforms. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
  • Cambridge Service for Data Driven Discovery (CSD3) is one of the EPSRC Tier-2 HPC facilities. CSD3 is a multi-institution service underpinned by an innovative, petascale, data-centric HPC platform, designed specifically to drive data-intensive simulation and high-performance data analysis. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
  • Athena at HPC Midlands+ is one of the EPSRC Tier-2 HPC facilities. The main resource is a 14,336 core Huawei X6000 system supplied by Clustervision. The service also includes 5 POWER8 compute nodes with large amounts of RAM to support high performance data analysis. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
  • JADE (Joint Academic Data science Endeavour) is one of the EPSRC Tier-2 HPC facilities. The system design exploits the capabilities of NVIDIA's DGX-1 Deep Learning System which has eight of its newest Tesla P100 GPUs tightly coupled by its high-speed NVlink interconnect. The DGX-1 runs optimized versions of many standard machine learning software packages such as Caffe, TensorFlow, Theano and Torch. Free access is available to academic researchers working in the EPSRC domain and from some UK universities; academic users from other domains and institutions can purchase access. Industry access is available.
  • MMM Hub (Materials and Molecular Modelling Hub) - The theory and simulation of materials is one of the most thriving and vibrant areas of modern scientific research. Designed specifically for the materials and molecular modelling community, this Tier-2 supercomputing facility is available to HPC users all over the UK. The MMM Hub was established in 2016 with a £4m EPSRC grant awarded to collaborators the Thomas Young Centre (TYC) and the Science and Engineering South Consortium (SES). The MMM Hub is led by University College London on behalf of the eight collaborative partners who sit within the TYC and SES: Imperial, King's, QMUL, Oxford, Southampton, Kent, Belfast and Cambridge.
  • DiRAC - the STFC HPC facility for particle physics and astronomy researchers. It is currently made up of five systems with different architectures, including an extreme-scaling IBM BG/Q system, a large SGI/HPE UV SMP system, and a number of Intel Xeon multicore HPC systems. Free access is available to academic researchers working in the STFC domain; academic researchers from other domains can purchase access. Industry access is also available.

Tier-2 HPC is fundamental because it provides a diversity of computing architectures, driven by science needs that are not met by the national facilities or by universities.

The Tier-2 layer also trains skilled computational scientists and computational software developers, and provides users with easy, local access and training.