Changes between Version 12 and Version 13 of HPCResourceManagement
- Timestamp: 11/02/19 22:42:39
HPCResourceManagement
v12 → v13

== Managing CPU and storage allocations on national facilities == #managecpu

CMS manages HPC resources on behalf of the 357 HPC users in the atmospheric and polar research community on these facilities:

…

Access to ARCHER is through either National Capability or standard NERC research awards.

Removed (v12): ARCHER, RDF and JASMIN resource requests are reviewed by the NERC HPC Steering Committee. [wiki:/ContactUs Contact CMS] for advice on HPC availability, and on resourcing computer time and data needs.

Added (v13): ARCHER, NEXCS, RDF and JASMIN resource requests are reviewed by the NERC HPC Steering Committee. [wiki:/ContactUs Contact CMS] for advice on HPC availability, and on resourcing computer time and data needs.

== Supporting the Met Office, EPCC, NERC and EPSRC on HPC delivery == #supportdelivery

The UK atmospheric science and Earth System modelling community has several HPC platforms available on which to run large numerical simulations and data analysis programs, notably:

Removed (v12):
 * [wiki:/PumaService PUMA] the Reading system which provides workflow infrastructure and access to ARCHER and MONSooN compute
 * [https://www.archer.ac.uk/ ARCHER] the EPSRC/NERC Cray XC30
 * [http://collab.metoffice.gov.uk/twiki/bin/view/Support/WhatIsMONSooN MONSooN] the NERC/Met Office Cray XC40

Added (v13):
 * [wiki:/PumaService PUMA] the Reading system which provides workflow infrastructure and access to ARCHER, MONSooN and NEXCS compute
 * [https://www.archer.ac.uk/ ARCHER] Cray XC30 - the UKRI national service jointly funded by EPSRC and NERC
 * [http://collab.metoffice.gov.uk/twiki/bin/view/Support/WhatIsMONSooN MONSooN] the NERC/Met Office Cray XC40
 * [https://collab.metoffice.gov.uk/twiki/bin/view/Support/NEXCS NEXCS] the NERC-only share of the Met Office Cray XC40

Unmodified:
 * [http://www.jasmin.ac.uk/ JASMIN] the JASMIN super-data-cluster