Viking is approaching end of life. Following the success of Viking, the University has invested in new hardware. Viking2 will be bigger, with more CPUs, more memory and more GPUs. In addition, Viking2 will be hosted in a carbon-negative data centre, allowing staff and students to complete their computational work sustainably. Viking2 will be located in Sweden, taking advantage of lower energy costs and the data centre's excellent sustainability credentials.
...
| | Viking2 | Viking1 |
| --- | --- | --- |
| CPU cores (compute nodes only) | 12,864 | 7,000 |
| Total standard compute nodes | 134 | 170 |
| Compute node processor | AMD EPYC3 | Intel Xeon 6138 |
| Cores per processor | 48 | 20 |
| Processors per node | 2 | 2 |
| Memory per compute node | 512 GB | 127 nodes: 192 GB; 33 nodes: 384 GB |
| High memory nodes | 2x 2 TB, 1x 4 TB | 2x 768 GB, 2x 1.5 TB |
| GPUs | 48 A40, 12 H100 | 8 V100 |
| Scratch (PB) | 1.5 | 2.5 |
| Warm storage (PB) | 2.0 | |
| Usable NVMe storage (TB) | 215 | 48 |
| Interconnect type | 100 Gb OPA | 100 Gb Mellanox |
Supporting statements for grant submission
If you are putting together a grant application, you can sometimes include in-kind contributions from the University. Viking2 has yet to be fully costed, but Viking1 was costed at 1p per CPU hour and the power price for Viking2 is not dissimilar.
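As a rough illustration only, an in-kind contribution can be estimated as cores × wall-clock hours × rate. The sketch below uses the Viking1 rate of 1p per CPU core-hour; the node size matches Viking2 (2 × 48 cores), but the job counts and durations are hypothetical and Viking2's final rate may differ.

```python
# Rough estimate of an in-kind HPC contribution for a grant application.
# Uses the Viking1 costing of 1p (GBP 0.01) per CPU core-hour; Viking2 has not
# yet been fully costed, so treat the result as indicative only.

RATE_PER_CORE_HOUR_GBP = 0.01  # 1p per CPU core-hour (Viking1 figure)

def in_kind_contribution(cores: int, hours: float, jobs: int = 1) -> float:
    """Estimated in-kind value in GBP for `jobs` runs, each using `cores` cores for `hours` hours."""
    return cores * hours * jobs * RATE_PER_CORE_HOUR_GBP

# Hypothetical workload: 100 jobs, each using one Viking2 node (96 cores) for 24 hours.
print(f"£{in_kind_contribution(cores=96, hours=24, jobs=100):,.2f}")  # -> £2,304.00
```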
Short statement
The University of York has invested £2.5 million in a new high performance compute cluster. The new cluster, Viking2, is a larger replacement for the current Viking HPC system and has been designed to meet a wide mix of research requirements. The new system will be housed at the EcoDatacentre in Sweden, taking advantage of its carbon-negative sustainability practices and 100% renewable energy sources. Heat generated by the datacentre is re-used to dry wood pellets for district heating.
It is made available to all researchers, including students, and is free of charge to use. It provides:
- 12,864 AMD cores
- 48 A40 and 12 H100 GPUs
- 1.5 PB of high performance working storage
- 215 TB of NVMe storage
- High performance networking
- The ability to burst into public cloud services
The technical facility is backed by staff support, including central and departmentally embedded posts, providing assistance with the use of HPC facilities, data management, research software engineering and code optimisation.
Researchers also have access to the full range of centrally provided services, detailed in the service catalogue at https://www.york.ac.uk/it-services/services/
Longer statement
The University of York is committed to enhancing its position as one of the world’s premier institutions for inspirational and life-changing research, and has made this the first of three key objectives in its 2014-20 strategy. Supporting research is one of five programmes in the IT strategy, alongside Departmental and Faculty IT, and as such we are able to offer a number of services to academics. Through doing this we hope to free up time and resources that can be funnelled solely towards research objectives. The University has recently invested £2.5 million in a new HPC facility, which will be larger than the current N8 (at 7,000 cores) and will provide academics with the latest computing infrastructure for all their computational workloads. This investment is indicative of the drive to give researchers access to the best facilities possible, and has the potential to be a transformative resource. As part of this investment we also offer a number of free courses in popular programming languages, along with help and support in accessing these facilities.
Additional support is provided through storage for datasets: we provide 1-2 TB of free storage for all principal investigators, and we will also be building a cost-effective storage solution for large datasets. Recognising the importance of impact in research, we also provide a dynamic web hosting platform and virtual machines, free of charge, allowing research to be published to the web in dynamic and interesting ways and allowing the public and other researchers to interact with it.
In addition to providing commodity and research services, we have a team dedicated to research computing. Alongside a centrally based Head of Research Computing, we have an embedded team comprising two research software engineers and an HPC Linux expert. These staff are based in Biology, Physics and Chemistry, but as part of a wider strategy they support research across the institution and are actively involved in central HPC development.
A full list of central services is provided in the service catalogue, available at https://www.york.ac.uk/it-services/services/