
HPC Resource Management

Managing CPU and storage allocations on national facilities

CMS manage HPC resources on behalf of 357 HPC users in the atmospheric and polar research community across the following facilities:

  • 160 million CPU-hours of ARCHER compute
  • 504 TB of ARCHER work disk
  • 2.4 PB of Research Data Facility (RDF) GPFS storage
  • 5.6 PB of JASMIN storage

Access to ARCHER is through either National Capability or standard NERC research awards. ARCHER, RDF and JASMIN resource requests are reviewed by the NERC HPC Steering Committee. Contact CMS for advice on HPC availability and on resourcing compute time and data needs.

Supporting the Met Office, EPCC, NERC and EPSRC on HPC delivery

The UK atmospheric science and Earth System modelling community have several HPC platforms available on which to run large numerical simulations and data analysis programs, notably:

  • PUMA, the Reading system that provides workflow infrastructure and access to ARCHER and MONSooN compute
  • ARCHER, the EPSRC/NERC Cray XC30
  • MONSooN, the NERC/Met Office Cray XC40
  • JASMIN, the super-data-cluster

CMS provide and maintain the software infrastructure needed to run the Met Office Unified Model on ARCHER and MONSooN, and work closely with CEDA to deliver JASMIN capability.

Contact CMS for information and advice on accessing these resources, and on the suitability of other platforms for your modelling project.