
HPC Resource Management

Managing CPU and storage allocations on national facilities

CMS manage HPC resources on behalf of the 556 HPC users in the atmospheric and polar research community across the following facilities:

  • 160 million core hours of ARCHER compute
  • 14 million core hours of NEXCS compute
  • 655 TB of ARCHER work disk
  • 4.7 PB of Research Data Facility (RDF) GPFS storage
  • 7.5 PB of JASMIN storage

Access to ARCHER is through either National Capability or standard NERC research awards. ARCHER, NEXCS, RDF and JASMIN resource requests are reviewed by the NERC HPC Steering Committee. Contact CMS for advice on HPC availability and on resourcing your compute time and data needs.

Supporting the Met Office, EPCC, NERC and EPSRC on HPC delivery

The UK atmospheric science and Earth System modelling community have several HPC platforms available on which to run large numerical simulations and data analysis programs, notably:

  • PUMA - the Reading-hosted system providing workflow infrastructure and access to ARCHER, MONSooN and NEXCS compute
  • ARCHER - the Cray XC30 UKRI national service, jointly funded by EPSRC and NERC
  • MONSooN - the joint NERC/Met Office Cray XC40
  • NEXCS - the NERC-only share of the Met Office Cray XC40
  • JASMIN - the super-data-cluster

CMS provide and maintain the software infrastructure needed to run the Met Office Unified Model on ARCHER and MONSooN, and work closely with CEDA to deliver JASMIN capability.
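
To give a flavour of what that infrastructure looks like from the user's side, below is a minimal sketch of launching a Unified Model suite with the Rose/Cylc tooling hosted on PUMA. It is illustrative only: it assumes Rose and Cylc are available on the login node, and the suite ID u-ab123 is a hypothetical placeholder rather than a real suite.

    # Illustrative sketch only: assumes Rose/Cylc are installed (as on PUMA)
    # and the user holds an account on the target HPC. The suite ID below
    # is a hypothetical placeholder.
    import subprocess
    from pathlib import Path

    suite_id = "u-ab123"                          # hypothetical suite ID
    suite_dir = Path.home() / "roses" / suite_id  # rosie's default working-copy location

    # Check the suite out of the suite repository into ~/roses/<suite-id>.
    subprocess.run(["rosie", "checkout", suite_id], check=True)

    # Validate and submit the suite from its working copy; Cylc then
    # schedules the model tasks on the remote HPC (e.g. ARCHER).
    subprocess.run(["rose", "suite-run"], cwd=suite_dir, check=True)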

Contact CMS for information and advice on accessing these resources, and on the suitability of other platforms for your modelling project.
