Opened 9 months ago

Last modified 8 months ago

#3456 new help

fcm_make failure for suite u-br916

Reported by: NoelClancy Owned by: jules_support
Component: JULES Keywords:
Cc: Platform:
UM Version:

Description

Hi,

There is an fcm_make failure for suite u-br916 and I don't know what is wrong. Every compile step in the log below fails with `mpif90 ... : command not found`.
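The failure pattern is the same for every file: the message comes from the shell, not from the compiler, which means the `mpif90` wrapper itself could not be found before any Fortran was compiled. A minimal sketch of that behaviour, using a deliberately fake command name (not taken from the suite):

```shell
# Demonstrates what ": command not found" in the log means: the shell never
# located the compiler wrapper, so no Fortran was compiled at all.
# 'no_such_mpif90' is a deliberately nonexistent name, not from the ticket.
rc=0
no_such_mpif90 -c dummy.F90 2>/dev/null || rc=$?
echo "exit status: $rc"   # 127 means "command not found"
```

An exit status of 127 is the shell's conventional code for a command it could not find, which points at a missing PATH entry or module load in the task environment rather than at the JULES source itself.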

Environment variables are set for the netCDF Fortran bindings in

/apps/libs/netCDF/intel14/fortran/4.2/

You will also need to link your code against a compatible netCDF C library in

/apps/libs/netCDF/intel14/4.3.2/
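A quick way to check whether the failing task's environment can actually see the compiler and those netCDF paths might be something like the following (a hypothetical check run inside the task environment, not part of the suite itself):

```shell
# Hypothetical environment sanity check - none of this comes from the suite.
if command -v mpif90 >/dev/null 2>&1; then
  echo "mpif90 found: $(command -v mpif90)"
else
  echo "mpif90 NOT on PATH - check the compiler/MPI module loads in the task env"
fi

# The netCDF install directories mentioned above should exist and be readable:
for d in /apps/libs/netCDF/intel14/fortran/4.2/include \
         /apps/libs/netCDF/intel14/4.3.2/lib; do
  [ -d "$d" ] && echo "present: $d" || echo "missing: $d"
done
```

If `mpif90` is reported missing, the relevant module loads (compiler, MPI, netCDF) are the first thing to check in the environment set up for the fcm_make task.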

[FAIL] mpif90 -oo/water_constants_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/water_constants_mod_jls.F90: command not found
[FAIL] compile 0.0 ! water_constants_mod.o ← jules/src/params/standalone/water_constants_mod_jls.F90
[FAIL] mpif90 -oo/veg_param.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/veg_param_mod.F90: command not found
[FAIL] compile 0.0 ! veg_param.o ← jules/src/science/params/veg_param_mod.F90
[FAIL] mpif90 -oo/u_v_grid.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/u_v_grid.F90: command not found
[FAIL] compile 0.0 ! u_v_grid.o ← jules/src/control/standalone/var/u_v_grid.F90
[FAIL] mpif90 -oo/trif_vars_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/trif_vars_mod.F90: command not found
[FAIL] compile 0.0 ! trif_vars_mod.o ← jules/src/control/shared/trif_vars_mod.F90
[FAIL] mpif90 -oo/trif.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/trif_mod.F90: command not found
[FAIL] compile 0.0 ! trif.o ← jules/src/science/params/trif_mod.F90
[FAIL] mpif90 -oo/trifctl.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/trifctl.F90: command not found
[FAIL] compile 0.0 ! trifctl.o ← jules/src/control/shared/trifctl.F90
[FAIL] mpif90 -oo/timestep_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/timestep_mod.F90: command not found
[FAIL] compile 0.0 ! timestep_mod.o ← jules/src/control/standalone/var/timestep_mod.F90
[FAIL] mpif90 -oo/theta_field_sizes.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/theta_field_sizes_mod.F90: command not found
[FAIL] compile 0.0 ! theta_field_sizes.o ← jules/src/science/params/theta_field_sizes_mod.F90
[FAIL] mpif90 -oo/top_pdm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/top_pdm.F90: command not found
[FAIL] compile 0.0 ! top_pdm.o ← jules/src/control/shared/top_pdm.F90
[FAIL] mpif90 -oo/soil_ecosse_vars_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/soil_ecosse_vars_mod.F90: command not found
[FAIL] compile 0.0 ! soil_ecosse_vars_mod.o ← jules/src/control/shared/soil_ecosse_vars_mod.F90
[FAIL] mpif90 -oo/switches.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/switches.F90: command not found
[FAIL] compile 0.0 ! switches.o ← jules/src/control/shared/switches.F90
[FAIL] mpif90 -oo/solinc_data.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/solinc_data.F90: command not found
[FAIL] compile 0.0 ! solinc_data.o ← jules/src/control/standalone/var/solinc_data.F90
[FAIL] mpif90 -oo/sf_diags_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/surface/sf_diags_mod.F90: command not found
[FAIL] compile 0.0 ! sf_diags_mod.o ← jules/src/science/surface/sf_diags_mod.F90
[FAIL] mpif90 -oo/sind_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/sind.F90: command not found
[FAIL] compile 0.0 ! sind_mod.o ← jules/src/science/river_routing/shared/sind.F90
[FAIL] mpif90 -oo/rndm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/rndm.F90: command not found
[FAIL] compile 0.0 ! rndm.o ← jules/src/control/imogen/rndm.F90
[FAIL] mpif90 -oo/response.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/response.F90: command not found
[FAIL] compile 0.0 ! response.o ← jules/src/control/imogen/response.F90
[FAIL] mpif90 -oo/redis.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/redis.F90: command not found
[FAIL] compile 0.0 ! redis.o ← jules/src/control/imogen/redis.F90
[FAIL] mpif90 -oo/radf_co2.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/radf_co2.F90: command not found
[FAIL] compile 0.0 ! radf_co2.o ← jules/src/control/imogen/radf_co2.F90
[FAIL] mpif90 -oo/prognostics.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/prognostics.F90: command not found
[FAIL] compile 0.0 ! prognostics.o ← jules/src/control/shared/prognostics.F90
[FAIL] mpif90 -oo/precision_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/precision_mod.F90: command not found
[FAIL] compile 0.0 ! precision_mod.o ← jules/src/params/standalone/precision_mod.F90
[FAIL] mpif90 -oo/qsat_data_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/surface/qsat_data_mod.F90: command not found
[FAIL] compile 0.0 ! qsat_data_mod.o ← jules/src/science/surface/qsat_data_mod.F90
[FAIL] mpif90 -oo/pftparm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/pftparm_mod.F90: command not found
[FAIL] compile 0.0 ! pftparm.o ← jules/src/science/params/pftparm_mod.F90
[FAIL] mpif90 -oo/pdm_vars.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/pdm_vars.F90: command not found
[FAIL] compile 0.0 ! pdm_vars.o ← jules/src/control/shared/pdm_vars.F90
[FAIL] mpif90 -oo/planet_constants_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/planet_constants_mod_jls.F90: command not found
[FAIL] compile 0.0 ! planet_constants_mod.o ← jules/src/params/standalone/planet_constants_mod_jls.F90
[FAIL] mpif90 -oo/p_s_parms.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/p_s_parms.F90: command not found
[FAIL] compile 0.0 ! p_s_parms.o ← jules/src/control/shared/p_s_parms.F90
[FAIL] mpif90 -oo/ozone_vars.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/ozone_vars.F90: command not found
[FAIL] compile 0.0 ! ozone_vars.o ← jules/src/control/shared/ozone_vars.F90
[FAIL] mpif90 -oo/parkind1.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/utils/drhook_dummy/parkind1.F90: command not found
[FAIL] compile 0.0 ! parkind1.o ← jules/utils/drhook_dummy/parkind1.F90
[FAIL] mpif90 -oo/nvegparm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/nvegparm_mod.F90: command not found
[FAIL] compile 0.0 ! nvegparm.o ← jules/src/science/params/nvegparm_mod.F90
[FAIL] mpif90 -oo/nesterov.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/fire/nesterov_mod.F90: command not found
[FAIL] compile 0.0 ! nesterov.o ← jules/src/science/fire/nesterov_mod.F90
[FAIL] mpif90 -oo/orog.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/orog.F90: command not found
[FAIL] compile 0.0 ! orog.o ← jules/src/control/standalone/var/orog.F90
[FAIL] mpif90 -oo/metstats_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/util/metstats/metstats_mod.F90: command not found
[FAIL] compile 0.0 ! metstats_mod.o ← jules/src/util/metstats/metstats_mod.F90
[FAIL] mpif90 -oo/mcarthur.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/fire/mcarthur_mod.F90: command not found
[FAIL] compile 0.0 ! mcarthur.o ← jules/src/science/fire/mcarthur_mod.F90
[FAIL] mpif90 -oo/missing_data_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/missing_data_mod.F90: command not found
[FAIL] compile 0.0 ! missing_data_mod.o ← jules/src/params/standalone/missing_data_mod.F90
[FAIL] mpif90 -oo/lake_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/lake_mod.F90: command not found
[FAIL] compile 0.0 ! lake_mod.o ← jules/src/control/shared/lake_mod.F90
[FAIL] mpif90 -oo/jules_riversparm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/jules_riversparm_mod.F90: command not found
[FAIL] compile 0.0 ! jules_riversparm.o ← jules/src/science/river_routing/shared/jules_riversparm_mod.F90
[FAIL] mpif90 -oo/max_dimensions.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/max_dimensions.F90: command not found
[FAIL] compile 0.0 ! max_dimensions.o ← jules/src/control/shared/max_dimensions.F90
[FAIL] mpif90 -oo/jules_internal.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/jules_internal.F90: command not found
[FAIL] compile 0.0 ! jules_internal.o ← jules/src/control/shared/jules_internal.F90
[FAIL] mpif90 -oo/io_constants.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/io_constants.F90: command not found
[FAIL] compile 0.0 ! io_constants.o ← jules/src/params/standalone/io_constants.F90
[FAIL] mpif90 -oo/jules_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/jules_mod.F90: command not found
[FAIL] compile 0.0 ! jules_mod.o ← jules/src/control/shared/jules_mod.F90
[FAIL] mpif90 -oo/imogen_drive_vars.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/var/imogen_drive_vars.F90: command not found
[FAIL] compile 0.0 ! imogen_drive_vars.o ← jules/src/control/imogen/var/imogen_drive_vars.F90
[FAIL] mpif90 -oo/imogen_constants.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/imogen/imogen_constants.F90: command not found
[FAIL] compile 0.0 ! imogen_constants.o ← jules/src/params/imogen/imogen_constants.F90
[FAIL] compile —— ! drdat.o ← jules/src/control/imogen/drdat.F90
[FAIL] mpif90 -oo/imogen_time.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/imogen/imogen_time.F90: command not found
[FAIL] compile 0.0 ! imogen_time.o ← jules/src/params/imogen/imogen_time.F90
[FAIL] mpif90 -oo/imogen_clim.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/imogen/var/imogen_clim.F90: command not found
[FAIL] compile 0.0 ! imogen_clim.o ← jules/src/control/imogen/var/imogen_clim.F90
[FAIL] mpif90 -oo/getlon0_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/getlon0.F90: command not found
[FAIL] compile 0.0 ! getlon0_mod.o ← jules/src/science/river_routing/shared/getlon0.F90
[FAIL] mpif90 -oo/getlat0_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/getlat0.F90: command not found
[FAIL] compile 0.0 ! getlat0_mod.o ← jules/src/science/river_routing/shared/getlat0.F90
[FAIL] mpif90 -oo/forcing.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/forcing.F90: command not found
[FAIL] compile 0.0 ! forcing.o ← jules/src/control/standalone/var/forcing.F90
[FAIL] compile —— ! cropparm_io.o ← jules/src/science/params/cropparm_io_mod.F90
[FAIL] mpif90 -oo/fire_vars.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/fire_vars_mod.F90: command not found
[FAIL] compile 0.0 ! fire_vars.o ← jules/src/control/shared/fire_vars_mod.F90
[FAIL] mpif90 -oo/fire_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/fire/fire_mod.F90: command not found
[FAIL] compile 0.0 ! fire_mod.o ← jules/src/science/fire/fire_mod.F90
[FAIL] mpif90 -oo/fluxes.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/fluxes.F90: command not found
[FAIL] compile 0.0 ! fluxes.o ← jules/src/control/shared/fluxes.F90
[FAIL] mpif90 -oo/ecosse_param_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/ecosse_param_mod.F90: command not found
[FAIL] compile 0.0 ! ecosse_param_mod.o ← jules/src/science/params/ecosse_param_mod.F90
[FAIL] mpif90 -oo/dust_parameters_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/dust_parameters_mod_jls.F90: command not found
[FAIL] compile 0.0 ! dust_parameters_mod.o ← jules/src/params/standalone/dust_parameters_mod_jls.F90
[FAIL] compile —— ! clim_calc.o ← jules/src/control/imogen/clim_calc.F90
[FAIL] mpif90 -oo/errormessagelength_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/errormessagelength_mod.F90: command not found
[FAIL] compile 0.0 ! errormessagelength_mod.o ← jules/src/control/standalone/var/errormessagelength_mod.F90
[FAIL] mpif90 -oo/diag_swchs.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/diag_swchs.F90: command not found
[FAIL] compile 0.0 ! diag_swchs.o ← jules/src/params/standalone/diag_swchs.F90
[FAIL] mpif90 -oo/descent.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/descent_mod.F90: command not found
[FAIL] compile 0.0 ! descent.o ← jules/src/science/params/descent_mod.F90
[FAIL] mpif90 -oo/disaggregated_precip.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/disaggregated_precip.F90: command not found
[FAIL] compile 0.0 ! disaggregated_precip.o ← jules/src/control/standalone/var/disaggregated_precip.F90
[FAIL] compile —— ! c_z0h_z0m.o ← jules/src/science/params/c_z0h_z0m_mod.F90
[FAIL] mpif90 -oo/csigma.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/csigma_mod.F90: command not found
[FAIL] compile 0.0 ! csigma.o ← jules/src/science/params/csigma_mod.F90
[FAIL] mpif90 -oo/cropparm.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/cropparm_mod.F90: command not found
[FAIL] compile 0.0 ! cropparm.o ← jules/src/science/params/cropparm_mod.F90
[FAIL] mpif90 -oo/datetime_utils_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/datetime_utils_mod.F90: command not found
[FAIL] compile 0.0 ! datetime_utils_mod.o ← jules/src/control/shared/datetime_utils_mod.F90
[FAIL] mpif90 -oo/cosd_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/cosd.F90: command not found
[FAIL] compile 0.0 ! cosd_mod.o ← jules/src/science/river_routing/shared/cosd.F90
[FAIL] mpif90 -oo/conversions_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/conversions_mod_jls.F90: command not found
[FAIL] compile 0.0 ! conversions_mod.o ← jules/src/params/standalone/conversions_mod_jls.F90
[FAIL] mpif90 -oo/crop_vars_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/crop_vars_mod.F90: command not found
[FAIL] compile 0.0 ! crop_vars_mod.o ← jules/src/control/shared/crop_vars_mod.F90
[FAIL] mpif90 -oo/chemistry_constants_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/chemistry_constants_mod.F90: command not found
[FAIL] compile 0.0 ! chemistry_constants_mod.o ← jules/src/params/standalone/chemistry_constants_mod.F90
[FAIL] mpif90 -oo/ccarbon.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/ccarbon_mod.F90: command not found
[FAIL] compile 0.0 ! ccarbon.o ← jules/src/science/params/ccarbon_mod.F90
[FAIL] mpif90 -oo/coastal.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/coastal.F90: command not found
[FAIL] compile 0.0 ! coastal.o ← jules/src/control/standalone/var/coastal.F90
[FAIL] mpif90 -oo/calc_litter_flux_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/calc_litter_flux_mod.F90: command not found
[FAIL] compile 0.0 ! calc_litter_flux_mod.o ← jules/src/control/shared/calc_litter_flux_mod.F90
[FAIL] mpif90 -oo/c_topog.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_topog_mod.F90: command not found
[FAIL] compile 0.0 ! c_topog.o ← jules/src/science/params/c_topog_mod.F90
[FAIL] compile —— ! areaver_mod.o ← jules/src/science/river_routing/standalone/areaver_mod.F90
[FAIL] mpif90 -oo/canadian.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/fire/canadian_mod.F90: command not found
[FAIL] compile 0.0 ! canadian.o ← jules/src/science/fire/canadian_mod.F90
[FAIL] mpif90 -oo/c_sicehc.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_sicehc_mod.F90: command not found
[FAIL] compile 0.0 ! c_sicehc.o ← jules/src/science/params/c_sicehc_mod.F90
[FAIL] mpif90 -oo/c_rmol.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_rmol_mod.F90: command not found
[FAIL] compile 0.0 ! c_rmol.o ← jules/src/science/params/c_rmol_mod.F90
[FAIL] mpif90 -oo/c_surf.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_surf_mod.F90: command not found
[FAIL] compile 0.0 ! c_surf.o ← jules/src/science/params/c_surf_mod.F90
[FAIL] mpif90 -oo/c_grid2grid_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/river_routing/shared/c_grid2grid_mod.F90: command not found
[FAIL] compile 0.0 ! c_grid2grid_mod.o ← jules/src/science/river_routing/shared/c_grid2grid_mod.F90
[FAIL] mpif90 -oo/c_bvoc.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_bvoc_mod.F90: command not found
[FAIL] compile 0.0 ! c_bvoc.o ← jules/src/science/params/c_bvoc_mod.F90
[FAIL] mpif90 -oo/c_kappai.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/c_kappai_mod.F90: command not found
[FAIL] compile 0.0 ! c_kappai.o ← jules/src/science/params/c_kappai_mod.F90
[FAIL] mpif90 -oo/blend_h.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/science/params/blend_h_mod.F90: command not found
[FAIL] compile 0.0 ! blend_h.o ← jules/src/science/params/blend_h_mod.F90
[FAIL] mpif90 -oo/bl_option_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/params/standalone/bl_option_mod.F90: command not found
[FAIL] compile 0.0 ! bl_option_mod.o ← jules/src/params/standalone/bl_option_mod.F90
[FAIL] mpif90 -oo/bvoc_vars.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/bvoc_vars.F90: command not found
[FAIL] compile 0.0 ! bvoc_vars.o ← jules/src/control/shared/bvoc_vars.F90
[FAIL] mpif90 -oo/ancil_info.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/shared/ancil_info.F90: command not found
[FAIL] compile 0.0 ! ancil_info.o ← jules/src/control/shared/ancil_info.F90
[FAIL] mpif90 -oo/aero.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/aero.F90: command not found
[FAIL] compile 0.0 ! aero.o ← jules/src/control/standalone/var/aero.F90
[FAIL] mpif90 -oo/atm_fields_bounds_mod.o -c -DSCMA -DBL_DIAG_HACK -DINTEL_FORTRAN -I./include -I/apps/libs/netCDF/intel14/fortran/4.2/include -heap-arrays -fp-model precise -traceback /home/users/nmc/cylc-run/u-br916/share/fcm_make/preprocess/src/jules/src/control/standalone/var/atm_fields_bounds_mod.F90: command not found
[FAIL] compile 0.0 ! atm_fields_bounds_mod.o ← jules/src/control/standalone/var/atm_fields_bounds_mod.F90
[FAIL] ! aero.mod : depends on failed target: aero.o
[FAIL] ! aero.o : update task failed
[FAIL] ! ancil_info.mod : depends on failed target: ancil_info.o
[FAIL] ! ancil_info.o : update task failed
[FAIL] ! areaver_mod.mod : depends on failed target: areaver_mod.o
[FAIL] ! areaver_mod.o : depends on failed target: conversions_mod.mod
[FAIL] ! atm_fields_bounds_mod.mod: depends on failed target: atm_fields_bounds_mod.o
[FAIL] ! atm_fields_bounds_mod.o: update task failed
[FAIL] ! bl_option_mod.mod : depends on failed target: bl_option_mod.o
[FAIL] ! bl_option_mod.o : update task failed
[FAIL] ! blend_h.mod : depends on failed target: blend_h.o
[FAIL] ! blend_h.o : update task failed
[FAIL] ! bvoc_vars.mod : depends on failed target: bvoc_vars.o
[FAIL] ! bvoc_vars.o : update task failed
[FAIL] ! c_bvoc.mod : depends on failed target: c_bvoc.o
[FAIL] ! c_bvoc.o : update task failed
[FAIL] ! c_grid2grid_mod.mod : depends on failed target: c_grid2grid_mod.o
[FAIL] ! c_grid2grid_mod.o : update task failed
[FAIL] ! c_kappai.mod : depends on failed target: c_kappai.o
[FAIL] ! c_kappai.o : update task failed
[FAIL] ! c_rmol.mod : depends on failed target: c_rmol.o
[FAIL] ! c_rmol.o : update task failed
[FAIL] ! c_sicehc.mod : depends on failed target: c_sicehc.o
[FAIL] ! c_sicehc.o : update task failed
[FAIL] ! c_surf.mod : depends on failed target: c_surf.o
[FAIL] ! c_surf.o : update task failed
[FAIL] ! c_topog.mod : depends on failed target: c_topog.o
[FAIL] ! c_topog.o : update task failed
[FAIL] ! c_z0h_z0m.mod : depends on failed target: c_z0h_z0m.o
[FAIL] ! c_z0h_z0m.o : depends on failed target: max_dimensions.mod
[FAIL] ! calc_litter_flux_mod.mod: depends on failed target: calc_litter_flux_mod.o
[FAIL] ! calc_litter_flux_mod.o: update task failed
[FAIL] ! canadian.mod : depends on failed target: canadian.o
[FAIL] ! canadian.o : update task failed
[FAIL] ! ccarbon.mod : depends on failed target: ccarbon.o
[FAIL] ! ccarbon.o : update task failed
[FAIL] ! chemistry_constants_mod.mod: depends on failed target: chemistry_constants_mod.o
[FAIL] ! chemistry_constants_mod.o: update task failed
[FAIL] ! clim_calc.o : depends on failed target: missing_data_mod.mod
[FAIL] ! coastal.mod : depends on failed target: coastal.o
[FAIL] ! coastal.o : update task failed
[FAIL] ! conversions_mod.mod : depends on failed target: conversions_mod.o
[FAIL] ! conversions_mod.o : update task failed
[FAIL] ! cosd_mod.mod : depends on failed target: cosd_mod.o
[FAIL] ! cosd_mod.o : update task failed
[FAIL] ! crop_vars_mod.mod : depends on failed target: crop_vars_mod.o
[FAIL] ! crop_vars_mod.o : update task failed
[FAIL] ! cropparm.mod : depends on failed target: cropparm.o
[FAIL] ! cropparm.o : update task failed
[FAIL] ! cropparm_io.mod : depends on failed target: cropparm_io.o
[FAIL] ! cropparm_io.o : depends on failed target: max_dimensions.mod
[FAIL] ! csigma.mod : depends on failed target: csigma.o
[FAIL] ! csigma.o : update task failed
[FAIL] ! datetime_utils_mod.mod: depends on failed target: datetime_utils_mod.o
[FAIL] ! datetime_utils_mod.o: update task failed
[FAIL] ! descent.mod : depends on failed target: descent.o
[FAIL] ! descent.o : update task failed
[FAIL] ! diag_swchs.mod : depends on failed target: diag_swchs.o
[FAIL] ! diag_swchs.o : update task failed
[FAIL] ! disaggregated_precip.mod: depends on failed target: disaggregated_precip.o
[FAIL] ! disaggregated_precip.o: update task failed
[FAIL] ! drdat.o : depends on failed target: io_constants.mod
[FAIL] ! dust_parameters_mod.mod: depends on failed target: dust_parameters_mod.o
[FAIL] ! dust_parameters_mod.o: update task failed
[FAIL] ! ecosse_param_mod.mod: depends on failed target: ecosse_param_mod.o
[FAIL] ! ecosse_param_mod.o : update task failed
[FAIL] ! errormessagelength_mod.mod: depends on failed target: errormessagelength_mod.o
[FAIL] ! errormessagelength_mod.o: update task failed
[FAIL] ! fire_mod.mod : depends on failed target: fire_mod.o
[FAIL] ! fire_mod.o : update task failed
[FAIL] ! fire_vars.mod : depends on failed target: fire_vars.o
[FAIL] ! fire_vars.o : update task failed
[FAIL] ! fluxes.mod : depends on failed target: fluxes.o
[FAIL] ! fluxes.o : update task failed
[FAIL] ! forcing.mod : depends on failed target: forcing.o
[FAIL] ! forcing.o : update task failed
[FAIL] ! getlat0_mod.mod : depends on failed target: getlat0_mod.o
[FAIL] ! getlat0_mod.o : update task failed
[FAIL] ! getlon0_mod.mod : depends on failed target: getlon0_mod.o
[FAIL] ! getlon0_mod.o : update task failed
[FAIL] ! imogen_clim.mod : depends on failed target: imogen_clim.o
[FAIL] ! imogen_clim.o : update task failed
[FAIL] ! imogen_constants.mod: depends on failed target: imogen_constants.o
[FAIL] ! imogen_constants.o : update task failed
[FAIL] ! imogen_drive_vars.mod: depends on failed target: imogen_drive_vars.o
[FAIL] ! imogen_drive_vars.o : update task failed
[FAIL] ! imogen_time.mod : depends on failed target: imogen_time.o
[FAIL] ! imogen_time.o : update task failed
[FAIL] ! io_constants.mod : depends on failed target: io_constants.o
[FAIL] ! io_constants.o : update task failed
[FAIL] ! jules_internal.mod : depends on failed target: jules_internal.o
[FAIL] ! jules_internal.o : update task failed
[FAIL] ! jules_mod.mod : depends on failed target: jules_mod.o
[FAIL] ! jules_mod.o : update task failed
[FAIL] ! jules_riversparm.mod: depends on failed target: jules_riversparm.o
[FAIL] ! jules_riversparm.o : update task failed
[FAIL] ! lake_mod.mod : depends on failed target: lake_mod.o
[FAIL] ! lake_mod.o : update task failed
[FAIL] ! max_dimensions.mod : depends on failed target: max_dimensions.o
[FAIL] ! max_dimensions.o : update task failed
[FAIL] ! mcarthur.mod : depends on failed target: mcarthur.o
[FAIL] ! mcarthur.o : update task failed
[FAIL] ! metstats_mod.mod : depends on failed target: metstats_mod.o
[FAIL] ! metstats_mod.o : update task failed
[FAIL] ! missing_data_mod.mod: depends on failed target: missing_data_mod.o
[FAIL] ! missing_data_mod.o : update task failed
[FAIL] ! nesterov.mod : depends on failed target: nesterov.o
[FAIL] ! nesterov.o : update task failed
[FAIL] ! nvegparm.mod : depends on failed target: nvegparm.o
[FAIL] ! nvegparm.o : update task failed
[FAIL] ! orog.mod : depends on failed target: orog.o
[FAIL] ! orog.o : update task failed
[FAIL] ! ozone_vars.mod : depends on failed target: ozone_vars.o
[FAIL] ! ozone_vars.o : update task failed
[FAIL] ! p_s_parms.mod : depends on failed target: p_s_parms.o
[FAIL] ! p_s_parms.o : update task failed
[FAIL] ! parkind1.mod : depends on failed target: parkind1.o
[FAIL] ! parkind1.o : update task failed
[FAIL] ! pdm_vars.mod : depends on failed target: pdm_vars.o
[FAIL] ! pdm_vars.o : update task failed
[FAIL] ! pftparm.mod : depends on failed target: pftparm.o
[FAIL] ! pftparm.o : update task failed
[FAIL] ! planet_constants_mod.mod: depends on failed target: planet_constants_mod.o
[FAIL] ! planet_constants_mod.o: update task failed
[FAIL] ! precision_mod.mod : depends on failed target: precision_mod.o
[FAIL] ! precision_mod.o : update task failed
[FAIL] ! prognostics.mod : depends on failed target: prognostics.o
[FAIL] ! prognostics.o : update task failed
[FAIL] ! qsat_data_mod.mod : depends on failed target: qsat_data_mod.o
[FAIL] ! qsat_data_mod.o : update task failed
[FAIL] ! radf_co2.o : update task failed
[FAIL] ! redis.o : update task failed
[FAIL] ! response.o : update task failed
[FAIL] ! rndm.o : update task failed
[FAIL] ! sf_diags_mod.mod : depends on failed target: sf_diags_mod.o
[FAIL] ! sf_diags_mod.o : update task failed
[FAIL] ! sind_mod.mod : depends on failed target: sind_mod.o
[FAIL] ! sind_mod.o : update task failed
[FAIL] ! soil_ecosse_vars_mod.mod: depends on failed target: soil_ecosse_vars_mod.o
[FAIL] ! soil_ecosse_vars_mod.o: update task failed
[FAIL] ! solinc_data.mod : depends on failed target: solinc_data.o
[FAIL] ! solinc_data.o : update task failed
[FAIL] ! switches.mod : depends on failed target: switches.o
[FAIL] ! switches.o : update task failed
[FAIL] ! theta_field_sizes.mod: depends on failed target: theta_field_sizes.o
[FAIL] ! theta_field_sizes.o : update task failed
[FAIL] ! timestep_mod.mod : depends on failed target: timestep_mod.o
[FAIL] ! timestep_mod.o : update task failed
[FAIL] ! top_pdm.mod : depends on failed target: top_pdm.o
[FAIL] ! top_pdm.o : update task failed
[FAIL] ! trif.mod : depends on failed target: trif.o
[FAIL] ! trif.o : update task failed
[FAIL] ! trif_vars_mod.mod : depends on failed target: trif_vars_mod.o
[FAIL] ! trif_vars_mod.o : update task failed
[FAIL] ! trifctl.mod : depends on failed target: trifctl.o
[FAIL] ! trifctl.o : update task failed
[FAIL] ! u_v_grid.mod : depends on failed target: u_v_grid.o
[FAIL] ! u_v_grid.o : update task failed
[FAIL] ! veg_param.mod : depends on failed target: veg_param.o
[FAIL] ! veg_param.o : update task failed
[FAIL] ! water_constants_mod.mod: depends on failed target: water_constants_mod.o
[FAIL] ! water_constants_mod.o: update task failed

[FAIL] fcm make -f /work/scratch-pw/nmc/cylc-run/u-br916/work/1/fcm_make/fcm-make.cfg -C /home/users/nmc/cylc-run/u-br916/share/fcm_make -j 4 # return-code=255
2021-01-29T18:06:18Z CRITICAL - failed/EXIT
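The `: command not found` suffix on each [FAIL] line suggests the shell never actually ran `mpif90` at all — i.e. the compiler wrapper was not on PATH in the fcm_make environment (for example, an MPI/compiler module not loaded) — rather than a genuine compile error. A quick check, sketched for a generic environment-modules setup:

```shell
# Sketch: verify the compiler wrapper is visible before blaming the source code.
if command -v mpif90 >/dev/null 2>&1; then
    echo "mpif90 found at: $(command -v mpif90)"
else
    echo "mpif90 not on PATH - check the module load / env-script lines of the fcm_make task"
fi
```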

Change History (27)

comment:1 Changed 9 months ago by NoelClancy

Hi, I've noticed that the paths under /work/scratch may be out of date now. Would this stop the suite at the fcm stage?

OUTPUT_FOLDER='/work/scratch/nmc/fluxnet/u-br916/jules_output'
PLOT_FOLDER='/work/scratch/nmc/fluxnet/u-br916/jules_plot'

comment:2 Changed 9 months ago by NoelClancy

Also, what is the new pathway to the following?

SUITE_DATA='/group_workspaces/jasmin2/jules/pmcguire/fluxnet/kwilliam/suite_data/'

comment:3 Changed 9 months ago by pmcguire

Hi Noel:
The info about the new directory path for the FLUXNET data is another ticket of yours:
http://cms.ncas.ac.uk/ticket/3311

Yes, you'll probably have to update the scratch directories to scratch-nopw. Just make sure you use this for no-parallel-write (nopw) runs of JULES, like the JULES FLUXNET runs you have here.

Have you run the FLUXNET suite since JASMIN was upgraded from LSF to SLURM batch processing in the fall of 2020? If not, then maybe you should try to use rosie to check out and run the new version of the JULES FLUXNET suite u-al752. You probably should check it out as a copy (with a different suite number).
The tutorial for how to run this suite has also been updated:
https://research.reading.ac.uk/landsurfaceprocesses/software-examples/tutorial-rose-cylc-jules-on-jasmin/

This might help your compiling problems.

Patrick

comment:4 Changed 9 months ago by NoelClancy

I checked out a COPY of u-al752.
I made the changes as per the tutorial to this COPY (u-cb828) and it has completed the fcm_make part successfully and is currently running the JULES part on JASMIN.
I then want to run a Global JULES on JASMIN.

comment:5 Changed 9 months ago by NoelClancy

Although the FLUXNET suite completes the fcm_make part successfully, it starts JULES and completes many sites, but eventually some sites fail with the following error message:

Traceback (most recent call last):
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 439, in <module>
    main()
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 435, in main
    tmpfile_edit(out, options.geditor)
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 268, in tmpfile_edit
    proc = Popen(cmd, stderr=PIPE)
  File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

comment:6 Changed 9 months ago by pmcguire

Hi Noel:
That's great!
Which sites are failing?
Do the same sites fail every time that you try?
Can you retrigger the JULES run for the problem sites from the cylc GUI on cylc1.jasmin?
Can you find more useful error messages by looking in the .err and .out files for each site that fails?
Patrick
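For reference, under cylc 7 each task attempt's logs sit in a predictable place in the suite's run directory; a sketch with illustrative suite/task/cycle/try names (adjust to the failing site):

```shell
# Illustrative cylc-7 job-log layout; SUITE/TASK/CYCLE/TRY are example values.
SUITE=u-cb828
TASK=jules_US_Los      # hypothetical task name for the failing site
CYCLE=1
TRY=01
LOGDIR="$HOME/cylc-run/$SUITE/log/job/$CYCLE/$TASK/$TRY"
echo "err log: $LOGDIR/job.err"
echo "out log: $LOGDIR/job.out"
```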

comment:7 Changed 9 months ago by NoelClancy

Which sites are failing?
US_Los

Do the same sites fail every time that you try?
Just this site

Can you retrigger the JULES run for the problem sites from the cylc GUI on cylc1.jasmin?
I had to re-trigger several sites 2, 3, and 4 times, but having re-triggered US_Los 8 times it still fails.

comment:8 Changed 9 months ago by NoelClancy

.err varies between two messages

Traceback (most recent call last):
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 439, in <module>
    main()
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 435, in main
    tmpfile_edit(out, options.geditor)
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 268, in tmpfile_edit
    proc = Popen(cmd, stderr=PIPE)
  File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

cpu-bind=MASK - host494, task 0 0 [91790]: mask 0x4 set
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_rivers_props
[WARN] file:imogen.nml: skip missing optional source: namelist:imogen_anlg_vals_list
[WARN] file:urban.nml: skip missing optional source: namelist:jules_urban_switches
[WARN] file:jules_soilparm_cable.nml: skip missing optional source: namelist:jules_soilparm_cable
[WARN] file:urban.nml: skip missing optional source: namelist:urban_properties
[WARN] file:urban.nml: skip missing optional source: namelist:jules_urban2t_param
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_crop_props
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_irrig
[WARN] file:crop_params.nml: skip missing optional source: namelist:jules_cropparm
[WARN] file:jules_deposition.nml: skip missing optional source: namelist:jules_deposition_species(:)
[WARN] file:imogen.nml: skip missing optional source: namelist:imogen_run_list
[WARNING] WARN_JULES_TEMP_FIXES:
jules:#610 fix to the radiative roof coupling is not enabled: l_fix_moruses_roof_rad_coupling = .FALSE.
[WARNING] WARN_JULES_TEMP_FIXES:
Model run excludes a change from ticket um:#4581 as

l_fix_osa_chloro=.FALSE.
This will mean that chlorophyll used for the ocean albedo is
used in gm-3 when it should be mg m-3

[WARNING] init_ic: Provided variable 'gs' is not required, so will be ignored
[WARNING] init_ic: Provided variable 'sthzw' is not required, so will be ignored
[WARNING] init_ic: Provided variable 'zw' is not required, so will be ignored
[WARNING] init_ic: frac < frac_min at one or more points - reset to frac_min at those points
[WARNING] CHECK_UNAVAILABLE_OPTIONS: It is recommended that iscrntdiag = 0 in standalone until driving JULES with a decoupled variable is fully tested.
iscrntdiag = 1
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] next_time: Model is not fully spun up after maximum number of spinup cycles
[WARNING] next_time: Model failed to spin up - continuing with main run
slurmstepd: error: *** JOB 38228818 ON host494 CANCELLED AT 2021-02-02T02:51:16 DUE TO TIME LIMIT ***
2021-02-02T02:51:18Z CRITICAL - failed/EXIT

comment:9 Changed 9 months ago by pmcguire

Hi Noel:
Yes, it says that it was cancelled "due to time limit".
Which queue are you using? short-serial?
How much wallclock time did you request for each JULES run in that queue?
Maybe you can increase the amount of wallclock time?
Patrick

comment:10 Changed 9 months ago by pmcguire

Hi Noel:
After you increase the wallclock time, you can do a rose suite-run --reload, and then retrigger the JULES run for that one site that you're having problems with. You can look in the job file to make sure that the suite picks up the change in wallclock time.
Patrick
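The reload-then-retrigger sequence can be sketched as follows (suite and task names are illustrative; the guard lets the sketch exit cleanly where rose/cylc are not installed):

```shell
# Sketch of the reload-and-retrigger sequence; names are illustrative.
if ! command -v rose >/dev/null 2>&1; then
    echo "rose/cylc not on PATH - run this on a cylc client host"
    exit 0
fi
rose suite-run --reload                 # push the edited wallclock directive
cylc trigger u-cb828 'jules_US_Los.1'   # re-run the failed site task
```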

comment:11 Changed 9 months ago by NoelClancy

In /site/suite.rc.CEDA_JASMIN I see the following:

    [[JASMIN]]
        [[[job]]]
            submission polling intervals = PT1M
            execution polling intervals = PT1M

    [[JASMIN_LOTUS]]
        inherit = None, JASMIN
        [[[directives]]]
            --partition = short-serial-4hr
            # --constraint = "ivybridge128G"

    [[JULES_CEDA_JASMIN]]
        inherit = None, JASMIN_LOTUS
        [[[directives]]]
            --time = 2:00:00
            --ntasks = 1

    [[PLOTTING_CEDA_JASMIN]]
        inherit = None, JASMIN_LOTUS
        env-script = """
            eval $(rose task-env)
            export PATH=/apps/jasmin/metomi/bin:$PATH
            module load jaspy/2.7
            module list 2>&1
            env | grep LD_LIBRARY_PATH
        """
        # [[[remote]]]
        #     host = sci3
        [[[directives]]]
            --time = 08:00:00
            --ntasks = 1

comment:12 Changed 9 months ago by pmcguire

Hi Noel:
Yes, it's currently set for 8 hours of wallclock time, like you see.
If that's the main reason it's failing, then it should have failed after exactly 8 hours.
If you don't want it to fail, you can increase that wallclock time.
Patrick

comment:13 Changed 9 months ago by NoelClancy

Apologies for the above, but copy and paste does not work perfectly here.

It is not failing after exactly 8 hours. It fails in a much shorter timeframe; I would say it failed after about an hour and a quarter on some instances.

I think it's using the short-serial queue of 4 hours:

    [[JASMIN_LOTUS]]
        inherit = None, JASMIN
        [[[directives]]]
            --partition = short-serial-4hr
            # --constraint = "ivybridge128G"

In the JULES_CEDA_JASMIN section, is this the one that needs to be increased?

    [[JULES_CEDA_JASMIN]]
        inherit = None, JASMIN_LOTUS
        [[[directives]]]
            --time = 2:00:00
            --ntasks = 1

In the plotting section, is 8 hours enough?

    [[PLOTTING_CEDA_JASMIN]]
        [[[directives]]]
            --time = 08:00:00
            --ntasks = 1

comment:14 Changed 9 months ago by pmcguire

Hi Noel:
I'm sorry. I mis-read that.
It's set for 2 hours of wallclock time, and it's not in the short-serial queue. It's in the short-serial-4hr queue instead.
The plotting is set for 8 hours of wallclock time, but we're not worried about that right now.

Maybe it's possible to finish in less than or equal to 4 hours, in which case, you can keep the short-serial-4hr queue, and increase the wallclock time to 4 hours.

If it takes longer than 4 hours, you will need to switch to the short-serial queue.

We use the short-serial-4hr queue sometimes in order to reduce the queueing time. Sometimes the short-serial queue has long queueing times.
Patrick
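If the run can fit within four hours, one option consistent with the advice above is to keep the short-serial-4hr partition and raise only the JULES task's wallclock request (a sketch; section names as used in site/suite.rc.CEDA_JASMIN):

```
    [[JULES_CEDA_JASMIN]]
        inherit = None, JASMIN_LOTUS
        [[[directives]]]
            --time = 4:00:00
            --ntasks = 1
```

Otherwise, switch the `--partition` directive in JASMIN_LOTUS to short-serial and request the longer time there.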

comment:15 Changed 9 months ago by NoelClancy

I have already re-triggered US_Los for the 10th time; last night I was re-triggering it as often as every 1-2 hours. The 8 hours is for the plotting section, which I haven't got as far as yet.

comment:16 Changed 9 months ago by NoelClancy

I've changed it from 2 hours to 4 hours, but the job was already submitted earlier. How do I refresh the trigger/submission when it was already at submission before I increased the time?
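As a side note, the arithmetic behind "fits in the 4-hour partition" is easy to check; a small illustrative helper (not part of the suite — SLURM enforces this itself at submission time):

```python
def fits_partition(time_directive, partition_limit_hours):
    """True if a SLURM --time=H:MM:SS request fits within a partition limit."""
    parts = [int(p) for p in time_directive.split(":")]
    while len(parts) < 3:          # allow MM:SS or plain SS forms too
        parts.insert(0, 0)
    hours, minutes, seconds = parts
    return hours * 3600 + minutes * 60 + seconds <= partition_limit_hours * 3600

# The increased JULES request vs. the plotting request on short-serial-4hr:
print(fits_partition("4:00:00", 4))   # True
print(fits_partition("08:00:00", 4))  # False
```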

comment:18 Changed 9 months ago by pmcguire

Hi Noel
I think that as long as it hasn't started running yet, then you can do the reload and retrigger again. But I might be mistaken.

If I am mistaken and/or it has already started running, then you can kill the job for that site, and then a reload and a retrigger.
Patrick

comment:19 Changed 9 months ago by NoelClancy

Thanks for your response. I increased the time to 4 hours and then retriggered the suite. It's been submitted for almost 90 minutes now, so it's still not running. Do I need to let this FLUXNET suite (copy of u-al752) complete before I can see whether it resolves the compiling issues that I've had with the global suites?

comment:20 Changed 9 months ago by pmcguire

Hi Noel
You can start working on the global gridded suites now, too, while the FLUXNET suite is running.
If you had a global gridded suite running before, you'll have to update the global gridded suite so that it uses the same updated libraries for SLURM that were updated in the fall of 2020. Which global gridded suite are you trying to get working? Please start a new ticket, if you have issues with the global gridded suite. You might be able to find info on the CMS Helpdesk about other people who have updated their global gridded suites for JULES on JASMIN.
Patrick

comment:21 Changed 9 months ago by NoelClancy

Suite u-cb828 does not complete for US_Los; it stops at 2013:

(base) [nmc@cylc1 jules_output]$ cd /work/scratch-nopw/nmc/fluxnet/u-cb828/jules_output
(base) [nmc@cylc1 jules_output]$ ls -ltr US_Los*
-rw-r--r-- 1 nmc users 3120 Feb 1 14:56 US_Los-JULES_vn5.8-presc0.dump.20130101.0.nc
-rw-r--r-- 1 nmc users 0 Feb 1 18:51 US_Los-JULES_vn5.8-presc0.D.2013.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:10 US_Los-JULES_vn5.8-presc0.dump.spin1.20000102.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20010101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20020101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20030101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20040101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20050101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:11 US_Los-JULES_vn5.8-presc0.dump.spin1.20060101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20070101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20080101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20090101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20100101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20110101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:12 US_Los-JULES_vn5.8-presc0.dump.spin1.20120101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin1.20130101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin1.20140101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin2.20000102.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin2.20010101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin2.20020101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:13 US_Los-JULES_vn5.8-presc0.dump.spin2.20030101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20040101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20050101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20060101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20070101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20080101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:14 US_Los-JULES_vn5.8-presc0.dump.spin2.20090101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin2.20100101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin2.20110101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin2.20120101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin2.20130101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin2.20140101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:15 US_Los-JULES_vn5.8-presc0.dump.spin3.20000102.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20010101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20020101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20030101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20040101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20050101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:16 US_Los-JULES_vn5.8-presc0.dump.spin3.20060101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20070101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20080101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20090101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20100101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20110101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:17 US_Los-JULES_vn5.8-presc0.dump.spin3.20120101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin3.20130101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin3.20140101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin4.20000102.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin4.20010101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin4.20020101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:18 US_Los-JULES_vn5.8-presc0.dump.spin4.20030101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20040101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20050101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20060101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20070101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20080101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:19 US_Los-JULES_vn5.8-presc0.dump.spin4.20090101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.spin4.20100101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.spin4.20110101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.spin4.20120101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.spin4.20130101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.spin4.20140101.0.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:20 US_Los-JULES_vn5.8-presc0.dump.20000102.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:21 US_Los-JULES_vn5.8-presc0.D.2000.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:21 US_Los-JULES_vn5.8-presc0.dump.20010101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:22 US_Los-JULES_vn5.8-presc0.D.2001.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:22 US_Los-JULES_vn5.8-presc0.dump.20020101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:23 US_Los-JULES_vn5.8-presc0.D.2002.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:23 US_Los-JULES_vn5.8-presc0.dump.20030101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:24 US_Los-JULES_vn5.8-presc0.D.2003.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:24 US_Los-JULES_vn5.8-presc0.dump.20040101.0.nc
-rw-r--r-- 1 nmc users 4830632 Feb 3 23:25 US_Los-JULES_vn5.8-presc0.D.2004.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:25 US_Los-JULES_vn5.8-presc0.dump.20050101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:26 US_Los-JULES_vn5.8-presc0.D.2005.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:26 US_Los-JULES_vn5.8-presc0.dump.20060101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:27 US_Los-JULES_vn5.8-presc0.D.2006.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:27 US_Los-JULES_vn5.8-presc0.dump.20070101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:28 US_Los-JULES_vn5.8-presc0.D.2007.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:28 US_Los-JULES_vn5.8-presc0.dump.20080101.0.nc
-rw-r--r-- 1 nmc users 4830632 Feb 3 23:29 US_Los-JULES_vn5.8-presc0.D.2008.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:29 US_Los-JULES_vn5.8-presc0.dump.20090101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:30 US_Los-JULES_vn5.8-presc0.D.2009.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:30 US_Los-JULES_vn5.8-presc0.dump.20100101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:30 US_Los-JULES_vn5.8-presc0.D.2010.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:30 US_Los-JULES_vn5.8-presc0.dump.20110101.0.nc
-rw-r--r-- 1 nmc users 4830456 Feb 3 23:31 US_Los-JULES_vn5.8-presc0.D.2011.nc
-rw-r--r-- 1 nmc users 3120 Feb 3 23:31 US_Los-JULES_vn5.8-presc0.dump.20120101.0.nc
-rw-r--r-- 1 nmc users 4830632 Feb 3 23:32 US_Los-JULES_vn5.8-presc0.D.2012.nc

comment:22 Changed 9 months ago by NoelClancy

.err
cpu-bind=MASK - host240, task 0 0 [10370]: mask 0x200 set
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_rivers_props
[WARN] file:imogen.nml: skip missing optional source: namelist:imogen_anlg_vals_list
[WARN] file:urban.nml: skip missing optional source: namelist:jules_urban_switches
[WARN] file:jules_soilparm_cable.nml: skip missing optional source: namelist:jules_soilparm_cable
[WARN] file:urban.nml: skip missing optional source: namelist:urban_properties
[WARN] file:urban.nml: skip missing optional source: namelist:jules_urban2t_param
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_crop_props
[WARN] file:ancillaries.nml: skip missing optional source: namelist:jules_irrig
[WARN] file:crop_params.nml: skip missing optional source: namelist:jules_cropparm
[WARN] file:jules_deposition.nml: skip missing optional source: namelist:jules_deposition_species(:)
[WARN] file:imogen.nml: skip missing optional source: namelist:imogen_run_list
[WARNING] WARN_JULES_TEMP_FIXES:
jules:#610 fix to the radiative roof coupling is not enabled: l_fix_moruses_roof_rad_coupling = .FALSE.
[WARNING] WARN_JULES_TEMP_FIXES:
Model run excludes a change from ticket um:#4581 as
l_fix_osa_chloro=.FALSE.
This will mean that chlorophyll used for the ocean albedo is
used in gm-3 when it should be mg m-3
[WARNING] init_ic: Provided variable 'gs' is not required, so will be ignored
[WARNING] init_ic: Provided variable 'sthzw' is not required, so will be ignored
[WARNING] init_ic: Provided variable 'zw' is not required, so will be ignored
[WARNING] init_ic: frac < frac_min at one or more points - reset to frac_min at those points
[WARNING] CHECK_UNAVAILABLE_OPTIONS: It is recommended that iscrntdiag = 0 in standalone until driving JULES with a decoupled variable is fully tested.
iscrntdiag = 1
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] init_vars_tmp: lai_pft should be <= lai_bal_pft
[WARNING] next_time: Model is not fully spun up after maximum number of spinup cycles
[WARNING] next_time: Model failed to spin up - continuing with main run
slurmstepd: error: *** JOB 38663915 ON host240 CANCELLED AT 2021-02-04T07:10:46 DUE TO TIME LIMIT ***
2021-02-04T07:10:47Z CRITICAL - failed/EXIT

.out
Traceback (most recent call last):
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 439, in <module>
    main()
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 435, in main
    tmpfile_edit(out, options.geditor)
  File "/apps/jasmin/metomi/cylc-7.8.7/bin/cylc-cat-log", line 268, in tmpfile_edit
    proc = Popen(cmd, stderr=PIPE)
  File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

comment:23 Changed 9 months ago by NoelClancy

--partition = short-serial
--constraint = "ivybridge128G"

ran at 8 hours

comment:24 Changed 9 months ago by NoelClancy

The suite was re-started from scratch and has now completed the JULES part successfully. Running the make_plots part now.

My solution was to delete all the US_Los output files and start again.

comment:25 Changed 9 months ago by NoelClancy

Has now completed!
Ticket closed

comment:26 Changed 9 months ago by NoelClancy

Thanks very much for the recommendations on which queues and time limits to use. Hopefully the make_plots part will complete as well.

comment:27 Changed 8 months ago by NoelClancy

Hi, the plotting for the FLUXNET suite, u-cb828 (copy of u-bx723), is not working.
I've tried running the suite on both the short-serial and par-multi partitions, but there seems to be something else wrong.

    # --partition = short-serial-4hr
    # --partition = short-serial
    --partition = par-multi
    --constraint = "ivybridge128G"

The job.err and job.out are below.

job.err
cpu-bind=MASK - host097, task 0 0 [22328]: mask 0x10 set
2021-02-07T23:44:06Z CRITICAL - failed/EXIT

job.out
Suite : u-cb828
Task Job : 1/make_plots/01 (try 1)
User@Host: nmc@…

Currently Loaded Modulefiles:

1) jaspy/2.7/r20190715

comment:28 Changed 8 months ago by NoelClancy

Apologies, u-cb828 is a copy of u-al752 (not u-bx723)
