Services
HPC Resource Management
Managing CPU and storage allocations on National Facilities
CMS manage HPC resources on the following facilities on behalf of the ~300 HPC users in the atmospheric and polar research community:
- 640 million core-hours of ARCHER2 compute
- 655 TB of ARCHER2 work disk
- 9.5 PB of JASMIN storage
Access to ARCHER2 is through either National Capability or standard NERC research awards. ARCHER2 and JASMIN resource requests are reviewed by the NERC HPC Steering Committee. Contact CMS for advice on HPC availability and on estimating your compute-time and data needs.
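As a rough illustration of how a compute request translates into the core-hours quoted above, the short sketch below estimates a budget for a hypothetical set of runs. The job sizes are invented for illustration only; the one fixed figure is that ARCHER2 compute nodes have 128 cores each.

```python
# Rough core-hour estimate for a hypothetical ARCHER2 resource request.
# All job parameters below are illustrative assumptions, not a real award.

CORES_PER_NODE = 128  # each ARCHER2 compute node has 128 cores


def core_hours(nodes: int, wallclock_hours: float, n_runs: int) -> float:
    """Core-hours consumed by n_runs jobs of the given size and length."""
    return nodes * CORES_PER_NODE * wallclock_hours * n_runs


# Example: a 10-member ensemble of 50-node simulations, 24 hours each.
ensemble = core_hours(nodes=50, wallclock_hours=24, n_runs=10)
print(f"Estimated request: {ensemble:,.0f} core-hours")  # 1,536,000 core-hours
```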
Supporting the Met Office, EPCC, NERC and EPSRC on HPC Delivery
The UK atmospheric science and Earth System modelling community has access to several HPC platforms for running large numerical simulations and data analysis programs, notably:
- ARCHER2 Cray EX - the UKRI national service jointly funded by EPSRC/NERC
- Monsoon2 - the NERC/Met Office Cray XC40
- JASMIN - the super-data-cluster for large-scale data analysis
- PUMA - the NCAS-CMS system that provides workflow infrastructure and access to ARCHER2, JASMIN and other local systems
CMS provide and maintain the software infrastructure needed to run the Met Office Unified Model on ARCHER2 and work closely with CEDA to deliver JASMIN capability.
Contact CMS for information and advice on accessing these resources, and on the suitability of other platforms for your modelling project.