Opened 7 months ago

Last modified 6 months ago

#3494 new help

Generating dump files for GL7 suite for SLURM batch processing on JASMIN

Reported by: NoelClancy Owned by: jules_support
Component: JULES Keywords: JULES, spinup, JASMIN
Cc: Platform: JASMIN
UM Version:

Description

Dear CMS,

What changes would be necessary to a copy of u-bx723 (GL7 suite for SLURM batch processing on JASMIN) to run the suite from scratch, without using the previously generated spin-up dump file JULES-GL7.0.vn5.2.CRUNCEPv7.spinup_10.dump.18800101.0.nc referenced in app/jules/opt/rose-app-recon.conf?

In other words, to run a copy of the suite from the beginning and, in doing so, generate all of the spin-up dump files.

Change History (82)

comment:1 Changed 7 months ago by pmcguire

  • Component changed from UM Model to JULES
  • Keywords JULES, spinup, JASMIN added
  • Owner changed from um_support to jules_support
  • Platform set to JASMIN

comment:2 Changed 7 months ago by pmcguire

Hi Noel:
Several changes are necessary:
1) in your app/jules/rose-app.conf file. Instead of this:

[namelist:jules_initial]
!!const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
           =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file=.true.
use_file=19*.true.

do this:

[namelist:jules_initial]
const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
           =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file='$DUMP_FILE'
use_file=19*'$USE_FILE'

2) in your suite.rc file, change this:

{{ ('fcm_make => ' + ('fcm_make2 => ' if MODE_RUN == 'meto-xc40' else '') if BUILD else '') + ('RECON => spinup_01' if LSPINUP else 'RECON => S2' ) }}

to this:

{{ ('fcm_make => ' + ('fcm_make2 => ' if MODE_RUN == 'meto-xc40' else '') if BUILD else '') + ('spinup_01' if LSPINUP else 'S2' ) }}

3) in your suite.rc file, change this:

{% if i == 0 %}
     {% set INFILE = "${OUTPUT_FOLDER}/${RUN_ID_STEM}.RECON.dump.18600101.0.nc" %}
  {% else %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            INITFILE = "{{ INFILE }}"
            USE_FILE = '.true.'
            DUMPFILE = '.true.'

to this:

  {% if i == 0 %}
     {% set USE_FILE = ".false." %}
  {% else %}
     {% set USE_FILE = ".true." %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            INITFILE = "{{ INFILE }}"
            DUMPFILE = '.true.'

4) And in your rose-suite.conf file, you might want to increase your SPINCYCLES from 1 to say 10 or more.
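For reference, that change would look something like the sketch below. This assumes SPINCYCLES is defined in the [jinja2:suite.rc] section of rose-suite.conf, as is typical for cylc suite variables, so check where the variable actually lives in your copy of the suite:

[jinja2:suite.rc]
# number of spinup cycles to run before the main run (illustrative value)
SPINCYCLES=10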

These changes may not be all that you need, but it might be a good start. You might have to try it and fix any problems that might crop up.
Patrick

comment:3 Changed 7 months ago by NoelClancy

Thanks for that; however, the following error occurs:

[FAIL] cylc validate -o /tmp/tmpm2ndNd --strict u-cd180 # return-code=1, stderr=
[FAIL] Jinja2Error:
[FAIL] File "<template>", line 323, in top-level template code
[FAIL] UndefinedError: 'INFILE' is undefined
[FAIL] Context lines:
[FAIL] [[spinup_{{'%02d' % (i+1) }}]]
[FAIL] inherit = SPINUP
[FAIL] [[[environment]]]
[FAIL] INITFILE = "{{ INFILE }}" ←- Jinja2Error

comment:4 Changed 7 months ago by pmcguire

Hi Noel:
Maybe you can try to revise step 3 (from above), by changing:
3) in your suite.rc file, change this:

{% if i == 0 %}
     {% set INFILE = "${OUTPUT_FOLDER}/${RUN_ID_STEM}.RECON.dump.18600101.0.nc" %}
  {% else %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            INITFILE = "{{ INFILE }}"
            USE_FILE = '.true.'
            DUMPFILE = '.true.'

to this:

  {% if i == 0 %}
     {% set USE_FILE = ".false." %}
  {% else %}
     {% set USE_FILE = ".true." %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
     {% set INITFILE = "{{ INFILE }}" %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            DUMPFILE = '.true.'

Patrick

comment:5 Changed 7 months ago by NoelClancy

Thanks for that; it passes fcm_make now, but fails on the spin-up with the following error:

job.err

cpu-bind=MASK - host093, task 0 0 [23991]: mask 0x518 set
[FAIL] namelist:jules_initial=dump_file: DUMP_FILE: unbound variable
[FAIL] source: namelist:jules_initial
2021-03-16T10:39:10Z CRITICAL - failed/EXIT

job.out

Suite : u-cd180
Task Job : 18600101T0000Z/spinup_01/01 (try 1)
User@Host: nmc@…

Currently Loaded Modulefiles:

1) intel/cce/19.0.0 4) contrib/gnu/binutils/2.31
2) intel/fce/19.0.0 5) contrib/gnu/gcc/7.3.0
3) intel/19.0.0 6) eb/OpenMPI/intel/3.1.1

LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4
LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4:/home/users/siwilson/netcdf_par/3.1.1/intel.19.0.0/lib
2021-03-16T10:39:08Z INFO - started

comment:6 Changed 7 months ago by pmcguire

Hi Noel:
I had an extra underscore in DUMPFILE in step 1 above. Maybe you can try this step 1 instead?
1) in your app/jules/rose-app.conf file. Instead of this:

[namelist:jules_initial]
!!const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
           =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file=.true.
use_file=19*.true.

do this:

[namelist:jules_initial]
const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
           =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file='$DUMPFILE'
use_file=19*'$USE_FILE'

There might be other things that you need to get fixed in order for it to work right.
Patrick

comment:7 Changed 7 months ago by NoelClancy

It seems there may be something else as well; a similar error message occurs:

job.err.2

cpu-bind=MASK - host455, task 0 0 [211282]: mask 0x38c853 set
[FAIL] namelist:jules_initial=file: INITFILE: unbound variable
[FAIL] source: namelist:jules_initial
2021-03-16T14:52:19Z CRITICAL - failed/EXIT

job.out.2

Suite : u-cd180
Task Job : 18600101T0000Z/spinup_01/01 (try 1)
User@Host: nmc@…

Currently Loaded Modulefiles:

1) intel/cce/19.0.0 4) contrib/gnu/binutils/2.31
2) intel/fce/19.0.0 5) contrib/gnu/gcc/7.3.0
3) intel/19.0.0 6) eb/OpenMPI/intel/3.1.1

LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4
LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4:/home/users/siwilson/netcdf_par/3.1.1/intel.19.0.0/lib
2021-03-16T14:52:16Z INFO - started

comment:8 Changed 7 months ago by pmcguire

Hi Noel:
Can you look for all occurrences of the INITFILE string with grep -ir INITFILE from your ~/roses/u-bx723 directory?
Then, for each occurrence, open the file that contains it and study the surrounding lines to see why the INITFILE variable is unbound when the jules_initial namelist is read. You might have to change a few lines of code to get that variable defined properly for all instances.
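For example (using the suite directory mentioned above; substitute your own copy of the suite if it is named differently):

cd ~/roses/u-bx723
grep -ir INITFILE .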
Patrick

comment:9 Changed 7 months ago by NoelClancy

INITFILE and INFILE must be different

(base) [nmc@cylc1 u-cd180]$ grep -ir INITFILE
.svn/pristine/c7/c796e3028e802b63854b6aa72c808b20f92d28c8.svn-base:file='$INITFILE'
.svn/pristine/a0/a034a738c7aeeb21bd8a4ce851d37b0ce2be9d0b.svn-base: INITFILE = "${OUTPUT_FOLDER}/${RUN_ID_STEM}.${SPINDUMP}.dump.${DUMPTIME}.0.nc"
.svn/pristine/a0/a034a738c7aeeb21bd8a4ce851d37b0ce2be9d0b.svn-base: INITFILE = "{{ INFILE }}"
app/jules/rose-app.conf:file='$INITFILE'
Binary file .suite.rc.swp matches
suite.rc: INITFILE = "${OUTPUT_FOLDER}/${RUN_ID_STEM}.${SPINDUMP}.dump.${DUMPTIME}.0.nc"
suite.rc: {% set INITFILE = "{{ INFILE }}" %}

comment:10 Changed 7 months ago by NoelClancy

In the file app/jules/rose-app.conf:

[namelist:jules_initial]
const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
         =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file='$DUMPFILE'
file='$INITFILE'
nvars=19
total_snow=.false.
tpl_name=19*
use_file=19*'$USE_FILE'

However, looking at the following webpage
https://jules-lsm.github.io/vn5.4/namelists/initial_conditions.nml.html

I think that there are only two possibilities (logical T or F) for dump_file= and use_file=:
dump_file=.true. or .false.
use_file=19*.true. or 19*.false.

I think I should set dump_file=.false. because I am not using a dump from a previous JULES run.
I think I should set use_file=19*.false. so that each variable is set to the const_val which I have uncommented.
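So, if I understand correctly, the first spin-up cycle would effectively use something like the sketch below (just restating the values reasoned above; later cycles would switch use_file back to .true. via $USE_FILE):

[namelist:jules_initial]
const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
         =0.0,0.0,1.0,0.0,0.0,0.0,273.0
dump_file=.false.
use_file=19*.false.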

comment:11 Changed 7 months ago by pmcguire

Hi Noel:
Nice work!

Yes, it's true that you want $USE_FILE to be .false. for the initial spinup cycle. But once the initial spinup cycle is over and the 2nd, 3rd, etc. spinup cycles are running, you want $USE_FILE to be .true.; you also want it to be .true. for the main run.

But another thing you need to fix is for it to have the $INITFILE variable set properly for the different cases.
Patrick

comment:12 Changed 7 months ago by NoelClancy

So initially, in the file app/jules/rose-suite.conf:
use_file=19*.false.

Then once the first spin up cycle finished, stop the suite and re-start after changing
use_file=19*.true.

But what about the other file, app/jules/opt/rose-app-recon.conf, below?

[namelist:jules_initial]
!!const_val=50.0,50.0,0.1,20.0,10.0,0.0,0.75,272.0,273.0,0.5,3.0,250.0,
           =0.0,0.0,1.0,0.0,0.0,0.0,273.0
file='$ANCIL_DIREC/JULES-GL7.0.vn5.2.CRUNCEPv7.spinup_10.dump.18800101.0.nc'
tpl_name=29*
var_name='rgrain','rgrainl','canopy','cs','gs','snow_tile','sthuf',
        ='soiltemp','tstar_tile','sthzw','zw','rho_snow',
        ='snow_depth','snow_grnd','nsnow','snow_ds','snow_ice',
        ='snow_liq','tsnow'

[namelist:jules_output]
nprofiles=0

comment:13 Changed 7 months ago by pmcguire

Hi Noel:
You don't need to stop and restart the suite if you use use_file=19*'$USE_FILE' like I suggested that you try.
You just need to fix the $INITFILE issue.
Maybe you can try changing the step 3 above to this:
3) in your suite.rc file, change this:

{% if i == 0 %}
     {% set INFILE = "${OUTPUT_FOLDER}/${RUN_ID_STEM}.RECON.dump.18600101.0.nc" %}
  {% else %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            INITFILE = "{{ INFILE }}"
            USE_FILE = '.true.'
            DUMPFILE = '.true.'

to this:

  {% if i == 0 %}
     {% set USE_FILE = ".false." %}
     {% set INFILE = "tmp.nc" %}
     {% set INITFILE = "{{ INFILE }}" %}
  {% else %}
     {% set USE_FILE = ".true." %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
     {% set INITFILE = "{{ INFILE }}" %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            DUMPFILE = '.true.'

Patrick

comment:14 Changed 7 months ago by NoelClancy

Not sure how to fix the $INITFILE issue?

job.err
cpu-bind=MASK - host279, task 0 0 [25028]: mask 0x24a0 set
[FAIL] namelist:jules_initial=file: INITFILE: unbound variable
[FAIL] source: namelist:jules_initial
2021-03-19T13:03:46Z CRITICAL - failed/EXIT

job.out
Suite : u-cd180
Task Job : 18600101T0000Z/spinup_01/01 (try 1)
User@Host: nmc@…

Currently Loaded Modulefiles:

1) intel/cce/19.0.0 4) contrib/gnu/binutils/2.31
2) intel/fce/19.0.0 5) contrib/gnu/gcc/7.3.0
3) intel/19.0.0 6) eb/OpenMPI/intel/3.1.1

LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4
LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4:/home/users/siwilson/netcdf_par/3.1.1/intel.19.0.0/lib
2021-03-19T13:03:43Z INFO - started

comment:15 Changed 7 months ago by ros

Hi Noel, Patrick,

You have to be careful with your use of quotes and positioning in the suite.rc file. For a jinja variable to be expanded you shouldn't have quotes around the {{ }}. And for a variable to be exported in the task it must be set in the [[[environment]]] section. Give the following a try:

  {% if i == 0 %}
     {% set USE_FILE = ".false." %}
     {% set INFILE = "tmp.nc" %}
  {% else %}
     {% set USE_FILE = ".true." %}
     {% set INFILE = "${OUTPUT_FOLDER}/${ID_STEM2}%02d.dump.${SPIN_END}.0.nc" % i %}
  {% endif %}
    [[spinup_{{'%02d' % (i+1) }}]]
        inherit = SPINUP
        [[[environment]]]
            INITFILE = {{ INFILE }}
            USE_FILE = {{ USE_FILE }}
            DUMPFILE = .true.

Cheers,
Ros.

comment:16 Changed 7 months ago by NoelClancy

Thanks very much.

However, after running a rose suite-clean and then rose suite-run --new, the suite ran through fcm_make but then skipped the RECON stage and went straight on to SPINUP, where it failed with the following error (job.err).

Suite : u-cd180
Task Job : 18600101T0000Z/spinup_01/01 (try 1)
User@Host: nmc@…

Currently Loaded Modulefiles:

1) intel/cce/19.0.0 4) contrib/gnu/binutils/2.31
2) intel/fce/19.0.0 5) contrib/gnu/gcc/7.3.0
3) intel/19.0.0 6) eb/OpenMPI/intel/3.1.1

LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4
LD_LIBRARY_PATH=/apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/lib:/apps/contrib/gnu/gcc/7.3.0/lib64:/apps/contrib/gnu/gcc/deps:/apps/intel/2019//itac/2019.0.018/intel64/slib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/ipp/lib/intel64:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/compiler/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/mkl/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/tbb/lib/intel64/gcc4.7:/apps/intel/2019/debugger_2019/libipt/intel64/lib:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/lib/intel64_lin:/apps/intel/2019/compilers_and_libraries_2019.0.117/linux/daal/../tbb/lib/intel64_lin/gcc4.4:/home/users/siwilson/netcdf_par/3.1.1/intel.19.0.0/lib
2021-03-23T09:20:00Z INFO - started
[INFO] Running JULES in parallel MPI mode
[INFO] exec /apps/eb/software/OpenMPI/3.1.1-iccifort-2018.3.222-GCC-7.3.0-2.30/bin/mpirun /home/users/nmc/cylc-run/u-cd180/share/fcm_make/build/bin/jules.exe
{MPI Task 3} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 5} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 7} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 9} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 4} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 6} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 9} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 3} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 5} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 7} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 4} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 7} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 3} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 3} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 5} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 5} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 9} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 9} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 7} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 3} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 3} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 5} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 5} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 9} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 9} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 7} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 7} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 3} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 3} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 3} [INFO] init_surface_types: Soil is type #8
{MPI Task 3} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 3} [INFO] init_surface_types: Land ice is type #9
{MPI Task 7} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 7} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 7} [INFO] init_surface_types: Soil is type #8
{MPI Task 7} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 5} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 5} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 5} [INFO] init_surface_types: Soil is type #8
{MPI Task 5} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 9} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 9} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 9} [INFO] init_surface_types: Soil is type #8
{MPI Task 9} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 9} [INFO] init_surface_types: Land ice is type #9
{MPI Task 9} [INFO] init_surface_types: Urban is type #6
{MPI Task 9} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 3} [INFO] init_surface_types: Urban is type #6
{MPI Task 3} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 3} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 3} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 9} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 9} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 7} [INFO] init_surface_types: Land ice is type #9
{MPI Task 7} [INFO] init_surface_types: Urban is type #6
{MPI Task 7} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 7} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 7} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 5} [INFO] init_surface_types: Land ice is type #9
{MPI Task 5} [INFO] init_surface_types: Urban is type #6
{MPI Task 5} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 5} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 5} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 3} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 7} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 3} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 3} [INFO] init_time: No leap years
{MPI Task 5} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 9} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 3} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 9} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 9} [INFO] init_time: No leap years
{MPI Task 5} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 7} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 7} [INFO] init_time: No leap years
{MPI Task 7} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 3} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 3} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 3} [INFO] init_time: No spinup requested
{MPI Task 9} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 5} [INFO] init_time: No leap years
{MPI Task 5} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 7} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 7} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 7} [INFO] init_time: No spinup requested
{MPI Task 3} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 9} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 9} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 9} [INFO] init_time: No spinup requested
{MPI Task 5} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 5} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 5} [INFO] init_time: No spinup requested
{MPI Task 7} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 9} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 5} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 8} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 8} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 8} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 8} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 0} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 8} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 1} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 0} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 2} [INFO] READ_NML_JULES_MODEL_ENVIRONMENT: Reading JULES_MODEL_ENVIRONMENT namelist…
{MPI Task 0} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 0} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 6} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 8} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 1} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 4} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 4} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 1} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 8} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 8} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 1} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 8} [INFO] init_surface_types: Soil is type #8
{MPI Task 0} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 4} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 2} [INFO] jules_model_environment: Contents of namelist jules_model_environment
{MPI Task 0} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 4} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 1} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 1} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 6} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 8} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 8} [INFO] init_surface_types: Land ice is type #9
{MPI Task 2} [INFO] jules_model_environment: l_jules_parent = 0
{MPI Task 8} [INFO] init_surface_types: Urban is type #6
{MPI Task 8} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 2} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 8} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 8} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 4} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 4} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 4} [INFO] init_surface_types: Soil is type #8
{MPI Task 0} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 0} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 4} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 4} [INFO] init_surface_types: Land ice is type #9
{MPI Task 0} [INFO] init_surface_types: Soil is type #8
{MPI Task 0} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 1} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 1} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 1} [INFO] init_surface_types: Soil is type #8
{MPI Task 1} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 1} [INFO] init_surface_types: Land ice is type #9
{MPI Task 4} [INFO] init_surface_types: Urban is type #6
{MPI Task 4} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 4} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 4} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 2} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 2} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 0} [INFO] init_surface_types: Land ice is type #9
{MPI Task 0} [INFO] init_surface_types: Urban is type #6
{MPI Task 0} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 0} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 0} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 1} [INFO] init_surface_types: Urban is type #6
{MPI Task 1} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 1} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 1} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 8} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 6} [INFO] init_lsm: Reading JULES_LSM_SWITCH namelist…
{MPI Task 4} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 2} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 2} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 2} [INFO] init_surface_types: Soil is type #8
{MPI Task 6} [INFO] init_lsm: Land surface model selected is JULES
{MPI Task 4} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 4} [INFO] init_time: No leap years
{MPI Task 2} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 2} [INFO] init_surface_types: Land ice is type #9
{MPI Task 2} [INFO] init_surface_types: Urban is type #6
{MPI Task 2} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 6} [INFO] init_surface_types: Reading JULES_SURFACE_TYPES namelist…
{MPI Task 2} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 2} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 0} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 4} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 1} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 0} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 4} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 4} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 8} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 0} [INFO] init_time: No leap years
{MPI Task 4} [INFO] init_time: No spinup requested
{MPI Task 4} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 1} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 1} [INFO] init_time: No leap years
{MPI Task 0} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 1} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 0} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 0} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 1} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 1} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 0} [INFO] init_time: No spinup requested
{MPI Task 0} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 6} [INFO] init_surface_types: Using 5 natural PFTs and 0 crop PFTs
{MPI Task 2} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 1} [INFO] init_time: No spinup requested
{MPI Task 1} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 9} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 5} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 8} [INFO] init_time: No leap years
{MPI Task 8} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 7} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 3} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 2} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 2} [INFO] init_time: No leap years
{MPI Task 2} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 4} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 8} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 6} [INFO] init_surface_types: Using 4 non-veg surface types
{MPI Task 6} [INFO] init_surface_types: Soil is type #8
{MPI Task 2} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 2} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 6} [INFO] init_surface_types: Lake (inland water) is type #7
{MPI Task 6} [INFO] init_surface_types: Land ice is type #9
{MPI Task 6} [INFO] init_surface_types: Urban is type #6
{MPI Task 2} [INFO] init_time: No spinup requested
{MPI Task 2} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 6} [INFO] init_surface_types: No canyon type specified (URBAN-2T or MORUSES)
{MPI Task 6} [INFO] init_surface_types: No roof type specified (URBAN-2T or MORUSES)
{MPI Task 6} [INFO] init_surface: Reading JULES_SURFACE namelist…
{MPI Task 0} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 5} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 9} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 5} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 7} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 7} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 9} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 3} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 3} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 4} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 4} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 1} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 2} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 0} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 0} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 6} [INFO] init_time: Reading JULES_TIME namelist…
{MPI Task 1} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 7} [INFO] init_soil: Using 4 soil levels
{MPI Task 5} [INFO] init_soil: Using 4 soil levels
{MPI Task 8} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 8} [INFO] init_time: No spinup requested
{MPI Task 9} [INFO] init_soil: Using 4 soil levels
{MPI Task 3} [INFO] init_soil: Using 4 soil levels
{MPI Task 4} [INFO] init_soil: Using 4 soil levels
{MPI Task 1} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 2} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 2} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 5} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 5} [INFO] init_soil: van Genuchten model will be used
{MPI Task 8} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 0} [INFO] init_soil: Using 4 soil levels
{MPI Task 5} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 5} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 7} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 7} [INFO] init_soil: van Genuchten model will be used
{MPI Task 7} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 9} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 9} [INFO] init_soil: van Genuchten model will be used
{MPI Task 9} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 9} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 9} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 3} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 3} [INFO] init_soil: van Genuchten model will be used
{MPI Task 3} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 3} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 3} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 4} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 4} [INFO] init_soil: van Genuchten model will be used
{MPI Task 4} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 4} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 4} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 5} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 5} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 7} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 7} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 7} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 0} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 0} [INFO] init_soil: van Genuchten model will be used
{MPI Task 0} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 0} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 0} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 9} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 3} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 4} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 0} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 1} [INFO] init_soil: Using 4 soil levels
{MPI Task 2} [INFO] init_soil: Using 4 soil levels
{MPI Task 8} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 1} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 1} [INFO] init_soil: van Genuchten model will be used
{MPI Task 1} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 1} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 2} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 2} [INFO] init_soil: van Genuchten model will be used
{MPI Task 2} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 2} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 1} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 2} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 2} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 1} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 5} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 5} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 6} [INFO] init_time: Reading JULES_SPINUP namelist…
{MPI Task 9} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 9} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 3} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 3} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 5} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 7} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 3} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 4} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 4} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 9} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 7} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 0} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 0} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 7} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 4} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 0} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 1} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 1} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 2} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 2} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 1} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 8} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 5} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 5} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 5} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 6} [INFO] init_time: No leap years
{MPI Task 6} [INFO] init_time: Timestep is 3600 seconds
{MPI Task 2} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 9} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 9} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 3} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 3} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 9} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 7} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 7} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 4} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 4} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 4} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 6} [INFO] init_time: Main run start - 1860-01-01 00:00:00
{MPI Task 6} [INFO] init_time: Main run end - 1880-01-01 00:00:00
{MPI Task 6} [INFO] init_time: No spinup requested
{MPI Task 0} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 0} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 3} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 7} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 0} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 6} [INFO] init_radiation: Reading JULES_RADIATION namelist…
{MPI Task 1} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 1} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 1} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 2} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 2} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 2} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 5} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 5} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 9} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 9} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 3} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 3} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 7} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 7} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 4} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 4} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 6} [INFO] init_hydrology: Reading JULES_HYDROLOGY namelist…
{MPI Task 8} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 0} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 0} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 1} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 1} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 5} [INFO] init_rivers: No river routing selected
{MPI Task 5} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 2} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 2} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 9} [INFO] init_rivers: No river routing selected
{MPI Task 9} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 5} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 3} [INFO] init_rivers: No river routing selected
{MPI Task 5} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 5} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 7} [INFO] init_rivers: No river routing selected
{MPI Task 7} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 3} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 9} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 9} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 5} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 5} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 9} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 9} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 3} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 3} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 5} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 5} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 5} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 2} [INFO] init_rivers: No river routing selected
{MPI Task 2} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 3} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 3} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 4} [INFO] init_rivers: No river routing selected
{MPI Task 4} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 8} [INFO] init_soil: Using 4 soil levels
{MPI Task 0} [INFO] init_rivers: No river routing selected
{MPI Task 0} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 7} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 7} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 7} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 7} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 9} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 9} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 9} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 9} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 1} [INFO] init_rivers: No river routing selected
{MPI Task 1} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 5} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 5} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 2} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 2} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 3} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 3} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 3} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 3} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 4} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 4} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 4} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 0} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 0} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 0} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 7} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 7} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 7} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 7} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 9} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 9} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 1} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 1} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 1} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 1} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 5} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 2} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 2} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 2} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 6} [INFO] init_hydrology: TOPMODEL is on
{MPI Task 4} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 4} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 4} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 4} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 0} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 0} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 0} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 1} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 1} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 1} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 9} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 0} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 0} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 2} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 2} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 2} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 3} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 3} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 8} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 6} [INFO] init_soil: Reading JULES_SOIL namelist…
{MPI Task 4} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 7} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 7} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 1} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 1} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 7} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 0} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 0} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 2} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 2} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 3} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 4} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 4} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 1} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 4} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 6} [INFO] init_soil: Using 4 soil levels
{MPI Task 8} [INFO] init_soil: van Genuchten model will be used
{MPI Task 8} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 1} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 0} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 2} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 8} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 8} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 8} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 8} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 6} [INFO] init_soil: Soil levels: 0.1000000, 0.2500000, 0.6500000, 2.000000
{MPI Task 8} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 8} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 6} [INFO] init_soil: van Genuchten model will be used
{MPI Task 8} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 6} [INFO] init_soil: l_soil_sat_down = T - excess water is pushed down
{MPI Task 6} [INFO] init_soil: soilHc_method = 2 - following simplified Johansen (1975)
{MPI Task 6} [INFO] init_soil: l_tile_soil = F. Soil tiling is switched off: nsoilt = 1
{MPI Task 6} [INFO] init_vegetation: Reading JULES_VEGETATION namelist…
{MPI Task 6} [INFO] init_vegetation: irr_crop = 0: continuous irrigation
{MPI Task 8} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 8} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 8} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 6} [INFO] init_vegetation: Using can_rad_mod = 4 and can_model = 4
{MPI Task 6} [INFO] init_soil_biogeochem: Reading JULES_SOIL_BIOGEOCHEM namelist…
{MPI Task 6} [INFO] init_soil_biogeochem: Soil C is modelled using a single pool model with fixed C.
{MPI Task 8} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 8} [INFO] init_rivers: No river routing selected
{MPI Task 6} [INFO] init_soil_biogeochem: Q10 equation will be used for soil respiration
{MPI Task 6} [INFO] init_snow: Reading JULES_SNOW namelist…
{MPI Task 6} [INFO] init_snow: Using multi-layer snow scheme with 3 levels
{MPI Task 8} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 8} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 6} [INFO] init_rivers: Reading JULES_RIVERS namelist…
{MPI Task 8} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 6} [INFO] init_rivers: No river routing selected
{MPI Task 8} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 8} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 8} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 8} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 8} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 6} [INFO] switches_urban: Contents of namelist jules_urban_switches
{MPI Task 8} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 6} [INFO] switches_urban: l_moruses_albedo = F
{MPI Task 8} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 6} [INFO] switches_urban: l_moruses_emissivity = F
{MPI Task 8} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 6} [INFO] switches_urban: l_moruses_rough = F
{MPI Task 8} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 6} [INFO] switches_urban: l_moruses_storage = F
{MPI Task 6} [INFO] switches_urban: l_moruses_storage_thin = F
{MPI Task 6} [INFO] switches_urban: l_moruses_macdonald = F
{MPI Task 6} [INFO] switches_urban: - - - - - - end of namelist - - - - - -
{MPI Task 6} [INFO] urban_param: Contents of namelist jules_urban2t_param
{MPI Task 6} [INFO] urban_param: anthrop_heat_scale = 1.000000
{MPI Task 6} [INFO] urban_param: - - - - - - end of namelist - - - - - -
{MPI Task 6} [INFO] init_input_grid: Reading JULES_INPUT_GRID namelist…
{MPI Task 7} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 1} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 1} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 0} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 0} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 8} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 8} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 8} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 4} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 4} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 4} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 3} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 3} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 3} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 7} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 7} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 2} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 2} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 2} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 9} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 9} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 9} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 5} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 5} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 5} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 1} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 1} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 0} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 7} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 0} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 2} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 9} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 6} [INFO] init_input_grid: Size of input grid - 192 x 144
{MPI Task 3} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 5} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 4} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 8} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 6} [INFO] init_latlon: Reading JULES_LATLON namelist…
{MPI Task 6} [INFO] init_latlon: Getting latitude and longitude for the full input grid…
{MPI Task 6} [INFO] init_latlon: Data is on a grid - reading latitude and longitude from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc for reading
{MPI Task 5} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 9} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 1} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 7} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 8} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 6} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 2} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 1} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 1} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 1} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 0} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 5} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 5} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 5} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 7} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 7} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 7} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 2} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 9} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 9} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 9} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 8} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 8} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 6} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 2} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 2} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 8} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 6} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 6} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 3} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 4} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedn96e_coord2.nc
{MPI Task 3} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 3} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 3} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 0} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 4} [INFO] init_land_frac: Reading JULES_LAND_FRAC namelist…
{MPI Task 4} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 4} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 0} [INFO] init_land_frac: Getting land fraction for the full input grid…
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 0} [INFO] init_land_frac: Data is on a grid - reading land fraction from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc for reading
{MPI Task 7} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 1} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 2} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 7} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 7} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 9} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 3} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 1} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 1} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 1} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 5} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 7} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 0} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 8} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 2} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 2} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 2} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 6} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 3} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 3} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 5} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 5} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 9} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 9} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 4} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedCRU-NCEPv7.landfrac.nc
{MPI Task 3} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 9} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 5} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 6} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 0} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 6} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 0} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 0} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 6} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 4} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 4} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 4} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 7} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 7} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 1} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 1} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 2} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 2} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 8} [INFO] init_model_grid: Reading JULES_MODEL_GRID namelist…
{MPI Task 8} [INFO] init_model_grid: Setting up model grid variables…
{MPI Task 3} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 3} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 9} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 9} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 5} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 5} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 8} [INFO] init_model_grid: From the points specified, only land points will be modelled
{MPI Task 6} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 6} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 0} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 0} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 4} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 4} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 8} [INFO] decompose_domain: Decomposing domain across 10 available tasks
{MPI Task 8} [INFO] decompose_domain: Tasks are arranged as a grid of size 10 x 1
{MPI Task 9} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 9} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 5} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 5} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 2} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 2} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 4} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 4} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 1} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 1} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 3} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 3} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 8} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 8} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 0} [INFO] init_model_grid: Size of model grid - 778 x 1
{MPI Task 0} [INFO] init_model_grid: Selected grid contains 778 land points
{MPI Task 7} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 7} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 6} [INFO] init_model_grid: Size of model grid - 777 x 1
{MPI Task 6} [INFO] init_model_grid: Selected grid contains 777 land points
{MPI Task 3} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 3} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 7} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 7} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 9} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 9} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 5} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 5} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 1} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 1} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 8} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 2} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 2} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 4} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 4} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 8} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 6} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 3} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 9} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 5} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 1} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 7} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 3} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 9} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 5} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 1} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 7} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 2} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 8} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 4} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 2} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 4} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 8} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 6} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 0} [INFO] init_surf_hgt: Reading JULES_SURF_HGT namelist…
{MPI Task 0} [INFO] init_surf_hgt: Zero height selected - setting all heights to 0.0
{MPI Task 6} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 6} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 0} [INFO] init_frac: Reading JULES_FRAC namelist…
{MPI Task 0} [INFO] init_frac: Reading tile fractions from file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc for reading
{MPI Task 3} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 6} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 9} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 5} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 1} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 6} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 9} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 1} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 3} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 9} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 5} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 9} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 1} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 3} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 6} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 8} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 5} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 5} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 8} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 8} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 8} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 8} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 8} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 0} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 4} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 7} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 0} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 0} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 0} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 0} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 7} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 2} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.frac.nc
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 2} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 2} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 2} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 4} [INFO] init_soil_props: Reading JULES_SOIL_PROPS namelist…
{MPI Task 4} [INFO] init_soil_props: 'b_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'sathh_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'satcon_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'sm_sat_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'sm_crit_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'sm_wilt_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'hcap_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'hcon_const_z' will be read from file
{MPI Task 4} [INFO] init_soil_props: 'albsoil' will be read from file
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc for reading
{MPI Task 2} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 3} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 0} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 1} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 8} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 4} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 7} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 9} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 5} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 6} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.soil.dust.merge.nc
{MPI Task 3} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 8} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 3} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 1} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 3} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 3} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 9} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 8} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 8} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 8} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 2} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 2} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 1} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 1} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 1} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 9} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 9} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 9} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 0} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 2} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 2} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 5} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 5} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 5} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 7} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 7} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 6} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 6} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 0} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 5} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 7} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 7} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 6} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 6} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 0} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 0} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 4} [INFO] init_top: Reading JULES_TOP namelist…
{MPI Task 4} [INFO] init_top: 'fexp' will be set to a constant = 1.000000
{MPI Task 4} [INFO] init_top: 'ti_mean' will be read from file
{MPI Task 4} [INFO] init_top: 'ti_sig' will be read from file
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc for reading
{MPI Task 1} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 9} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 3} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 2} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 4} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 9} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 9} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 9} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 8} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 4} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 4} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 9} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 4} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 4} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 4} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 4} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 4} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 4} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 4} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 4} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 4} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 4} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 4} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 4} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 4} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 4} [INFO] init_drive: 'precip' will be read from file
{MPI Task 4} [INFO] init_drive: 't' will be read from file
{MPI Task 4} [INFO] init_drive: 'q' will be read from file
{MPI Task 4} [INFO] init_drive: 'u' will be read from file
{MPI Task 4} [INFO] init_drive: 'v' will be read from file
{MPI Task 4} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 3} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 3} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 3} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 3} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 3} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 3} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 3} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 3} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 3} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 3} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 3} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 3} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 3} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 3} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 3} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 3} [INFO] init_drive: 'precip' will be read from file
{MPI Task 3} [INFO] init_drive: 't' will be read from file
{MPI Task 3} [INFO] init_drive: 'q' will be read from file
{MPI Task 3} [INFO] init_drive: 'u' will be read from file
{MPI Task 3} [INFO] init_drive: 'v' will be read from file
{MPI Task 3} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 2} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 2} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 2} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 2} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 2} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 2} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 2} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 2} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 2} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 2} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 2} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 2} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 2} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 2} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 2} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 2} [INFO] init_drive: 'precip' will be read from file
{MPI Task 2} [INFO] init_drive: 't' will be read from file
{MPI Task 2} [INFO] init_drive: 'q' will be read from file
{MPI Task 2} [INFO] init_drive: 'u' will be read from file
{MPI Task 2} [INFO] init_drive: 'v' will be read from file
{MPI Task 2} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 5} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 5} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 5} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 5} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 5} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 5} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 1} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 1} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 1} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 1} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 1} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 1} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 1} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 1} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 1} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 1} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 1} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 1} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 1} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 1} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 1} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 1} [INFO] init_drive: 'precip' will be read from file
{MPI Task 1} [INFO] init_drive: 't' will be read from file
{MPI Task 1} [INFO] init_drive: 'q' will be read from file
{MPI Task 1} [INFO] init_drive: 'u' will be read from file
{MPI Task 1} [INFO] init_drive: 'v' will be read from file
{MPI Task 1} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 9} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 9} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 9} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 9} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 9} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 9} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 9} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 9} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 9} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 9} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 9} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 9} [INFO] init_drive: 'precip' will be read from file
{MPI Task 9} [INFO] init_drive: 't' will be read from file
{MPI Task 9} [INFO] init_drive: 'q' will be read from file
{MPI Task 9} [INFO] init_drive: 'u' will be read from file
{MPI Task 9} [INFO] init_drive: 'v' will be read from file
{MPI Task 9} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 7} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 7} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 7} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 7} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 7} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 7} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 6} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 6} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 6} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 6} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 6} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 6} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 6} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 6} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 6} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 6} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 6} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 6} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 6} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 6} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 6} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 6} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 6} [INFO] init_drive: 'precip' will be read from file
{MPI Task 6} [INFO] init_drive: 't' will be read from file
{MPI Task 6} [INFO] init_drive: 'q' will be read from file
{MPI Task 6} [INFO] init_drive: 'u' will be read from file
{MPI Task 6} [INFO] init_drive: 'v' will be read from file
{MPI Task 6} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 0} [INFO] file_ncdf_close: Closing file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.hydtop.nc
{MPI Task 0} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 0} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 0} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 0} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 0} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 0} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 0} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 0} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 0} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 0} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 0} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 0} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 0} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 0} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 0} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 0} [INFO] init_drive: 'precip' will be read from file
{MPI Task 0} [INFO] init_drive: 't' will be read from file
{MPI Task 0} [INFO] init_drive: 'q' will be read from file
{MPI Task 0} [INFO] init_drive: 'u' will be read from file
{MPI Task 0} [INFO] init_drive: 'v' will be read from file
{MPI Task 0} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 8} [INFO] init_agric: Reading JULES_AGRIC namelist…
{MPI Task 8} [INFO] init_agric: Zero agricultural fraction indicated
{MPI Task 8} [INFO] init_ancillaries: Reading JULES_CO2 namelist…
{MPI Task 8} [INFO] INIT_PFTPARM_JULES: Reading JULES_PFTPARM namelist…
{MPI Task 8} [INFO] INIT_NVEGPARM_JULES: Reading JULES_NVEGPARM namelist…
{MPI Task 8} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 8} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 8} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 8} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 8} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 8} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 8} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 8} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 8} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 8} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 8} [INFO] init_drive: 'precip' will be read from file
{MPI Task 8} [INFO] init_drive: 't' will be read from file
{MPI Task 8} [INFO] init_drive: 'q' will be read from file
{MPI Task 8} [INFO] init_drive: 'u' will be read from file
{MPI Task 8} [INFO] init_drive: 'v' will be read from file
{MPI Task 8} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 5} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 5} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 7} [INFO] init_fire: Reading FIRE_SWITCHES namelist…
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 7} [INFO] init_drive: Reading JULES_DRIVE namelist…
{MPI Task 5} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 5} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 5} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 5} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 5} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 5} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 5} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 5} [INFO] init_drive: 'precip' will be read from file
{MPI Task 5} [INFO] init_drive: 't' will be read from file
{MPI Task 5} [INFO] init_drive: 'q' will be read from file
{MPI Task 5} [INFO] init_drive: 'u' will be read from file
{MPI Task 5} [INFO] init_drive: 'v' will be read from file
{MPI Task 5} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 7} [INFO] init_drive: Using time templating to get drive file names
{MPI Task 7} [INFO] init_drive: Using variable name templating to get drive file names
{MPI Task 7} [INFO] init_drive: Downward LW and downward SW radiation are both provided directly
{MPI Task 7} [INFO] init_drive: Diffuse radiation will be set as a constant
{MPI Task 7} [INFO] init_drive: Precipitation components will be derived from total precipitation
{MPI Task 7} [INFO] init_drive: Horizontal components of wind given directly
{MPI Task 7} [INFO] init_drive: 'sw_down' will be read from file
{MPI Task 7} [INFO] init_drive: 'lw_down' will be read from file
{MPI Task 7} [INFO] init_drive: 'precip' will be read from file
{MPI Task 7} [INFO] init_drive: 't' will be read from file
{MPI Task 7} [INFO] init_drive: 'q' will be read from file
{MPI Task 7} [INFO] init_drive: 'u' will be read from file
{MPI Task 7} [INFO] init_drive: 'v' will be read from file
{MPI Task 7} [INFO] init_drive: 'pstar' will be read from file
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_%y4_n96e.nc
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_swdown_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_%y4_n96e.nc
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_lwdown_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_%y4_n96e.nc
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_rain_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_%y4_n96e.nc
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_tair_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_%y4_n96e.nc
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_qair_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_%y4_n96e.nc
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_uwind_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_%y4_n96e.nc
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_vwind_1860_n96e.nc for reading
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 0} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 1} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 5} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 2} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 4} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 8} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 6} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=21600
{MPI Task 9} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 7} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 3} [INFO] file_ts_open: Detected period=-2 for template /gws/nopw/j04/jules/dataCRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_%y4_n96e.nc
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data
CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_press_1860_n96e.nc for reading
{MPI Task 0} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 0} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 0} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 3} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 1} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 5} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 7} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 9} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 3} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 5} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 1} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 2} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 3} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 7} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 8} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 4} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 5} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 6} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED namelist…
{MPI Task 9} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 9} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 1} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 7} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 6} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 4} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 2} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 8} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 6} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 4} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 2} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 8} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=-2
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 0} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 0} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 0} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 0} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 9} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 3} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 3} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 4} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 3} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 4} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 1} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 9} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 4} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 7} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 7} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 6} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 6} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 8} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 5} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 5} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 2} [INFO] init_prescribed_data: Reading JULES_PRESCRIBED_DATASET namelist…
{MPI Task 2} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 9} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 8} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 7} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 1} [INFO] init_prescribed_data: Using single file for all data times
{MPI Task 5} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 2} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 6} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 8} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 1} [INFO] file_ts_open: Opening time series with data_period=-1
{MPI Task 3} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 4} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 7} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 8} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 6} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 2} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 9} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 5} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixedqrparm.veg.func.nc for reading
{MPI Task 1} [INFO] file_ncdf_open: Opening file /gws/nopw/j04/jules/data/Ancillary/n96e/GL7/fixed
qrparm.veg.func.nc for reading
{MPI Task 0} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 3} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 8} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 6} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 7} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 5} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 1} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 9} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 2} [INFO] init_ic: Reading JULES_INITIAL namelist…
{MPI Task 4} [INFO] init_ic: Reading JULES_INITIAL namelist…

comment:17 Changed 7 months ago by grenville

Hi Noel

I can't find /home/users/nmc/cylc-run/u-cd180.

Grenville

comment:18 Changed 7 months ago by pmcguire

Hi Noel:
Yes, the code changes I gave you were to change the suite so that it skips the RECON stage, and does a spin up from ideal conditions instead.
Patrick

comment:19 Changed 7 months ago by pmcguire

Hi Noel:
It looks like you have rerun your suite with --new (and it is still running, with the new spinup_01 in the queue) since your comment number 16 where you had the error message. So your earlier log files have been erased. If you want us to look at log files, then maybe you can just retrigger the various apps in the Cylc GUI as needed.

Also, what you pasted above is your job.out file, and not your job.err file.

And if it's a big long file like that, maybe you can just paste the end part where there is an error message? If you don't erase your log files, then we can look at the full log files pretty easily.

Your new spinup_01 just crashed. I am looking at the log files now.
Patrick

comment:20 Changed 7 months ago by pmcguire

Hi Noel:
This is the end of your new spinup_01 job.err file:

{MPI Task 1} [FATAL ERROR] init_ic: Error reading namelist JULES_INITIAL (IOSTAT=64 IOMSG=input conversion error, unit 1, file /work/scratch-pw/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/./initial_conditions.nml, line 9, position 22)

{MPI Task 3} [FATAL ERROR] init_ic: Error reading namelist JULES_INITIAL (IOSTAT=64 IOMSG=input conversion error, unit 1, file /work/scratch-pw/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/./initial_conditions.nml, line 9, position 22) …

etc.

Patrick

Last edited 7 months ago by pmcguire

comment:21 Changed 7 months ago by pmcguire

Hi Noel:
It says that line 9 of the /work/scratch-pw/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/./initial_conditions.nml has an error.
So if you look at that file, this is line 9:

var='rgrain','rgrainl','canopy','cs','gs','snow_tile','sthuf',
't_soil','tstar_tile','sthzw','zw','rho_snow','snow_depth',
'snow_grnd','nsnow','snow_ds','snow_ice','snow_liq','tsnow',
var_name='rgrain','rgrainl','canopy','cs','gs','snow_tile','sthuf',
't_soil','tstar_tile','sthzw','zw','rho_snow','snow_depth',
'snow_grnd','nsnow','snow_ds','snow_ice','snow_liq','tsnow',

Patrick

comment:22 Changed 7 months ago by pmcguire

Hi Noel:
Sorry. That was line 10.
Line 9 is:

use_file=19*'.false.',

And position 22 of that is the ',' (comma).

Patrick

comment:23 Changed 7 months ago by pmcguire

Hi Noel:
I see that you are working on this now and changing the suite's rose-app.conf file.
For it to work right, you need to keep the variable $USE_FILE. I think the problem is with the quotes somehow.
The use_file variable in the cylc-run initial_conditions.nml file should not have the quotes in it. It should take the setting from $USE_FILE without quotes, so that the cylc-run initial_conditions.nml file ends up with either use_file=19*.false., for spinup_01 or use_file=19*.true., for successive spinups and for the main run.
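For example (a minimal sketch; the rest of the jules_initial section stays as it is), the line in app/jules/rose-app.conf would become:

[namelist:jules_initial]
use_file=19*$USE_FILE

which should then be rendered in the spinup_01 initial_conditions.nml as use_file=19*.false., and as use_file=19*.true., for the later spinups and the main run. If dump_file is still quoted as '$DUMP_FILE', the same treatment probably applies to it.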

Does this help?
Patrick

comment:24 Changed 7 months ago by NoelClancy

I see the commas in the initial_conditions.nml in the cylc-run

/home/users/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/initial_conditions.nml

However, I don't know why the commas are added here.

There are also other differences:
file='$INITFILE' in roses
file='tmp.nc', in cylc-run
and
use_file=19*'$USE_FILE' in roses
use_file=19*'.false.', in cylc-run

Do you mean that the use_file in cylc-run should be:
use_file =19*.false.,

If so, how can that be remedied?
Could I change the use_file in roses to be:
use_file =19*$USE_FILE

comment:25 Changed 7 months ago by pmcguire

Hi Noel:
I think that might work. Yes.

If it doesn't work, then look at the log files and the .nml files, when you reload and retrigger the spinup_01 app, to see how you can try again to fix it.
Patrick

comment:26 Changed 7 months ago by NoelClancy

Thanks, but when I do a "rose suite-clean" and a "rose suite-run --new"
I get the following message

[FAIL] [Errno 16] Device or resource busy: '/work/scratch-pw/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/.panfs.469510ac.1616499738764547000'
(base) [nmc@cylc1 u-cd180]$

comment:27 Changed 7 months ago by NoelClancy

Not sure how to get past this for u-cd180, but I've made the same changes to a copy of the suite to see if it all works.

comment:28 Changed 7 months ago by pmcguire

Hi Noel:
I don't know what is wrong. Do you have any files open independently from the /work/scratch-pw/nmc/cylc-run/u-cd180/work/18600101T0000Z/spinup_01/ directory?

I am glad you got it working by making a copy of the suite.

When I suggested to 'reload and retrigger the spinup_01' app, this did not mean also to do a rose suite-clean and a rose suite-run --new. If you are only changing the namelists, sometimes it can be much quicker to 'reload (the suite) and retrigger the app' than it is to do a rose suite-clean and a rose suite-run --new. And if you only reload the suite and retrigger the app, then you don't delete all the log information that is useful for debugging (by yourself or me or other people).

You're right that it is much cleaner to do a rose suite-clean and a rose suite-run --new, and sometimes that is useful in order to fix other problems. But in this case it has probably created a new problem, since the Device or resource busy error is now stopping you from doing it.

Can you try to do the reload and retrigger next time instead?
Patrick

Last edited 7 months ago by pmcguire

comment:29 Changed 7 months ago by NoelClancy

I had other directories in /work/scratch-pw/nmc open, but not anymore.
Next time I will try to reload and retrigger; is that done with the GUI rather than the command line?

However, with suite u-cd187 the following error messages occur:

job.err
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[23580,1],9]
Exit code: 174


[FAIL] rose-jules-run <<'__STDIN__'
[FAIL]
[FAIL] '__STDIN__' # return-code=174
2021-03-23T12:59:49Z CRITICAL - failed/EXIT

job.out
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.


{MPI Task 0} [INFO] write_dump: latitude
{MPI Task 0} [INFO] write_dump: longitude

When looking at the files in the cylc-run work directory
/home/users/nmc/cylc-run/u-cd187/work/18600101T0000Z/spinup_01/initial_conditions.nml

I notice the file='tmp.nc', where there are commas.
If I change file='$INITFILE' to file=$INITFILE in roses,
would that remove the commas in the cylc-run file?

&jules_initial
const_val=50.0,50.0,0.1,1.0,100.0,10.0,0.75,272.0,273.0,0.5,3.0,250.0,
0.0,0.0,1.0,0.0,0.0,0.0,273.0,
dump_file=.false.,
file='tmp.nc',
nvars=19,
total_snow=.false.,
tpl_name=19*,
use_file=19*.false.,
var='rgrain','rgrainl','canopy','cs','gs','snow_tile','sthuf',
't_soil','tstar_tile','sthzw','zw','rho_snow','snow_depth',
'snow_grnd','nsnow','snow_ds','snow_ice','snow_liq','tsnow',
var_name='rgrain','rgrainl','canopy','cs','gs','snow_tile','sthuf',
't_soil','tstar_tile','sthzw','zw','rho_snow','snow_depth',
'snow_grnd','nsnow','snow_ds','snow_ice','snow_liq','tsnow',
/

comment:30 Changed 7 months ago by pmcguire

Hi Noel:
Those are single-quotes around the 'tmp.nc', not commas. I think those are needed. The single-quotes around .false. might not be needed because .false. is a reserved word instead of a generic string.

For the reload and retrigger, the reload is done from the command line (after saving changes to the suite with your editor):

rose suite-run --reload

and the retrigger of whichever apps (fcm_make, recon, spinup_01, jules, main, etc.) need it is done in the GUI by right-clicking on the app in question and then selecting the retrigger menu option.
Patrick

comment:31 Changed 7 months ago by pmcguire

Hi Noel:
I have been looking over your log files and your setup.
Your error messages suggest that spinup_01 succeeded, although there was an error message at the end.
But it wrote a dump file:

/work/scratch-pw/nmc/u-cd187/JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc

I don't know yet if that dump file was at the beginning of spinup or at the end of spin up.

You might also find the suite.rc.processed file of interest, since it shows spinup_01, spinup_02, etc., and the calling sequence:

~nmc/cylc-run/u-cd187/suite.rc.processed 

Patrick

comment:32 Changed 7 months ago by pmcguire

Hi Noel:
I take that back, maybe it only partially succeeded, with the spinup_01 app.
The job.err file also has this message:

Image              PC                Routine            Line        Source
jules.exe          00000000007D4AE3  Unknown               Unknown  Unknown
libpthread-2.17.s  00007FD50FFB4630  Unknown               Unknown  Unknown
jules.exe          0000000000657973  qsat_mod_mp_qsat_         118  qsat_mod.F90
jules.exe          000000000077A040  sf_stom_mod_mp_sf         520  sf_stom_jls_mod.F90
jules.exe          000000000073E9B0  physiol_mod_mp_ph         941  physiol_jls_mod.F90
jules.exe          000000000065B4C6  sf_expl_l_mod_mp_         812  sf_expl_jls.F90
jules.exe          00000000005B4670  surf_couple_expli         511  surf_couple_explicit_mod.F90
jules.exe          000000000040FDA6  control_                  479  control.F90
jules.exe          000000000040CC68  MAIN__                    129  jules.F90
jules.exe          000000000040CB92  Unknown               Unknown  Unknown
libc-2.17.so       00007FD50F9F5555  __libc_start_main     Unknown  Unknown
jules.exe          000000000040CAA9  Unknown               Unknown  Unknown

So maybe it wrote the initial start dump, and then crashed during the first spinup (spinup_01), before writing the final dump for spinup_01 and before it could get to spinup_02.
Patrick

comment:33 Changed 7 months ago by pmcguire

Hi Noel:
Maybe the first easy thing to do is to print out the log info for every time step to job.out.
That way, you can see how many time steps it is before it fails.
So in your roses/app/jules/rose-app.conf file, change print_step=240 to print_step=1.
Then do a reload of the suite (from the command line) and a retrigger of the spinup_01 app.
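(For context, and if I remember the namelists right, print_step sits alongside the timestep settings in the app config, so the change is just:

[namelist:jules_time]
timestep_len=3600
print_step=1

with everything else in that section left as it is.)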

If it makes it past the first time step or two, then that is good news, I guess.

You can also try to look at the start dump NETCDF file to see if its ozone fields are reasonable.

Then another thing you could do (if it makes it past the 1st time step) is to try only to run for 1 time step (3600 seconds is your timestep_len), and then write an end dump file to disk, to see if the end dump still has reasonable numbers in it for everything.

If it doesn't make it past the first time step, it is somewhat more difficult. But we can figure out a way.
Patrick

comment:34 Changed 7 months ago by NoelClancy

I changed print_step=240 to print_step=1.
It failed again; how can I see if it made it past the first step?

comment:35 Changed 7 months ago by pmcguire

Hi Noel:
You can look in the job.out file for spinup_01. It should have printed info about each time step near the end if it made it past the first step.

You can compare it to the job.out file for the previous run of spinup_01. If there is nothing new, then it didn't make it past the first time step.

Each run's logs are in the log directory, with 01, 02, etc. subdirectories (that is, each run's logs are there if you don't do a clean and --new).
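For example (suite name and cycle point as in this ticket; NN is a link to the latest submission):

ls ~/cylc-run/u-cd187/log/job/18600101T0000Z/spinup_01/
01  02  NN
ls ~/cylc-run/u-cd187/log/job/18600101T0000Z/spinup_01/02/
job  job.err  job.out  job.status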

Have you looked at the start dump file yet from the previous run (or from this run)? How do the ozone parameters look?

Patrick

comment:36 Changed 7 months ago by NoelClancy

The job.out indicates that Timestep 2 started; however, there are no additional log subdirectories 01, 02, etc.

{MPI Task 9} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: snow_ice
{MPI Task 0} [INFO] write_dump: snow_liq
{MPI Task 1} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: tsnow
{MPI Task 8} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 4} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 7} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: rgrainl
{MPI Task 5} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: frac
{MPI Task 2} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: b
{MPI Task 0} [INFO] write_dump: sathh
{MPI Task 0} [INFO] write_dump: satcon
{MPI Task 0} [INFO] write_dump: sm_sat
{MPI Task 0} [INFO] write_dump: sm_crit
{MPI Task 0} [INFO] write_dump: sm_wilt
{MPI Task 6} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] write_dump: hcap
{MPI Task 0} [INFO] write_dump: hcon
{MPI Task 0} [INFO] write_dump: albsoil
{MPI Task 0} [INFO] write_dump: fexp
{MPI Task 0} [INFO] write_dump: ti_mean
{MPI Task 0} [INFO] write_dump: ti_sig
{MPI Task 0} [INFO] write_dump: frac_agr
{MPI Task 0} [INFO] write_dump: co2_mmr
{MPI Task 0} [INFO] write_dump: latitude
{MPI Task 0} [INFO] write_dump: longitude
{MPI Task 0} [INFO] file_ncdf_close: Closing file /work/scratch-pw/nmc/u-cd187/JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc
{MPI Task 3} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] init: Initialisation is complete


Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.


There are no o3 related fields in the dump file.

comment:37 Changed 7 months ago by pmcguire

Hi Noel:
It says that 'next_time: Timestep: 2'. So I guess this means that 'this_time', if that variable exists, is 'Timestep: 1'. Since it doesn't have any 'next_time: Timestep: 3', then probably there is no 'this_time: Timestep: 1'.

I don't know if that logic would hold water with further investigation. But if it does, then I would suspect that it failed during the first time step, before the second time step.

I thought you were adding ozone or something to the dump files. Sorry. I was confused.

But you can still look at the dump file and see if it makes sense. The dump file uses 1D land points, not a 2D grid. So it's not easy to inspect with ncview.

But you can use ncdump filename.nc | more or ncdump filename.nc > tmp.txt ; vi tmp.txt to look at the files.

Or you could write an iris Python script to convert the 1D NETCDF files to 2D NETCDF files.

Or you can write an iris Python script to make geographic plots of the variables in the 1D NETCDF dump files.
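As a rough sketch of that second option (using netCDF4 and matplotlib rather than iris; the filename and variable names are taken from elsewhere in this ticket and may need adjusting):

#!/usr/bin/env python
# Scatter-plot one variable from a 1D land-points JULES dump on lon/lat.
import netCDF4 as nc
import matplotlib.pyplot as plt

dump = nc.Dataset("JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc")
lat = dump.variables["latitude"][:]     # shape (land,)
lon = dump.variables["longitude"][:]    # shape (land,)
tsoil = dump.variables["t_soil"][0, :]  # top soil layer, shape (land,)

plt.scatter(lon, lat, c=tsoil, s=4)
plt.colorbar(label="t_soil (top layer, K)")
plt.title("spinup_01 start dump")
plt.savefig("t_soil_land_points.png")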

Patrick

comment:38 Changed 7 months ago by pmcguire

Hi Noel:
I just got my suite u-cc615 to do offline/standalone JULES almost through the RECON step of the run, which means that I got most of the ancillaries and driving data and namelists configured almost or all the way right.

But I saw the same/similar errors that you are seeing in your suite:

in: ~/cylc-run/u-cc615/log/job/19790101T0000Z/RECON/22/job.out

{MPI Task 0} [INFO] init: Initialisation is complete

Primary job terminated normally, but 1 process returned a non-zero exit code. 
Per user-direction, the job has been aborted.

and:

in: ~/cylc-run/u-cc615/log/job/19790101T0000Z/RECON/22/job.err

forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image              PC                Routine            Line        Source
jules.exe          0000000000804E73  Unknown               Unknown  Unknown
libpthread-2.17.s  00007FA98EB99630  Unknown               Unknown  Unknown
jules.exe          000000000066E293  qsat_mod_mp_qsat_         118  qsat_mod.F90
jules.exe          00000000007A5607  sf_stom_mod_mp_sf         671  sf_stom_jls_mod.F90
jules.exe          00000000007656C6  physiol_mod_mp_ph         961  physiol_jls_mod.F90
jules.exe          0000000000677465  sf_expl_l_mod_mp_         796  sf_expl_jls.F90
jules.exe          00000000005C74D7  surf_couple_expli         497  surf_couple_explicit_mod.F90
jules.exe          0000000000410568  control_                  571  control.F90
jules.exe          000000000040CCD3  MAIN__                    136  jules.F90
jules.exe          000000000040CB92  Unknown               Unknown  Unknown
libc-2.17.so       00007FA98E5DA555  __libc_start_main     Unknown  Unknown
jules.exe          000000000040CAA9  Unknown               Unknown  Unknown
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

  Process name: [[40658,1],9]
  Exit code:    174
--------------------------------------------------------------------------
[host435.jc.rl.ac.uk:111775] PMIX ERROR: NO-PERMISSIONS in file gds_dstore.c at line 702
[host435.jc.rl.ac.uk:111775] PMIX ERROR: NO-PERMISSIONS in file gds_dstore.c at line 702
[host435.jc.rl.ac.uk:111775] PMIX ERROR: NO-PERMISSIONS in file gds_dstore.c at line 711
[FAIL] rose-jules-run <<'__STDIN__'
[FAIL]
[FAIL] '__STDIN__' # return-code=174
2021-03-23T16:59:44Z CRITICAL - failed/EXIT


So, we're having similar problems right now.

Patrick

comment:39 Changed 7 months ago by pmcguire

Hi Noel:
I just posted a ticket similar to my comment above ( http://cms.ncas.ac.uk/ticket/3494#comment:38 ) to the CMS Helpdesk as a new ticket #3501 .

Patrick

Last edited 7 months ago by pmcguire

comment:40 Changed 7 months ago by NoelClancy

Even though there are no o3 variables in the spinup dump files, ozone will affect the spinup.
The suite u-bx723 uses fixed phenology. I need to modify a copy of the suite to enable phenology, but that suite is failing for me. I want to be able to generate my own spinup dumps from scratch.

Was the issue raised in #3501 solved? It's also a problem for me.

comment:41 Changed 7 months ago by NoelClancy

In /app/jules/rose-app.conf

[namelist:jules_spinup]
max_spinup_cycles=0
!!nvars=1
!!spinup_end='1999-01-01 00:00:00'
!!spinup_start='1989-01-01 00:00:00'
!!terminate_on_spinup_fail=.false.
!!tolerance=0.0
!!use_percent=.true.
!!var='smcl'

This was never modified.
I assume that max_spinup_cycles needs to be set to the same value as SPINCYCLES in rose-suite.conf, and the other lines uncommented.

comment:42 Changed 7 months ago by NoelClancy

Just tried that and it is still failing, with the same error message as in ticket #3501.

comment:43 Changed 7 months ago by pmcguire

Hi Noel:
No, the issue I posted in ticket #3501 hasn't been solved yet.
Patrick

comment:44 Changed 7 months ago by pmcguire

Hi Noel:
I am not sure if you are right about max_spinup_cycles=0.
The suite.rc file looks like it has its own alternative logic for spinup. You can check that logic by running it once we get the RECON fixed, and see if it goes through spinup cycles of the right length even with max_spinup_cycles=0.
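Roughly speaking (my sketch of the usual layout of these GL7 suites; worth checking against your copy), the two settings live in different places: SPINCYCLES is a Jinja2 variable in rose-suite.conf that drives the suite-level spinup_01, spinup_02, ... cycling, while max_spinup_cycles in app/jules/rose-app.conf controls JULES' own internal spinup loop, which this suite leaves switched off:

# rose-suite.conf  (suite-level cycling, used by the Jinja2 in suite.rc)
[jinja2:suite.rc]
SPINCYCLES=10

# app/jules/rose-app.conf  (JULES-internal spinup, not used here)
[namelist:jules_spinup]
max_spinup_cycles=0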
Patrick

comment:45 Changed 7 months ago by pmcguire

Hi Noel:
For ticket #3501, I was able to get past the RECON phase, to the spinup phase.
It turned out that I had some of the namelist time ranges set incorrectly for the driving data and the prescribed data.
It still crashes in the spinup phase, and sometimes in the RECON phase, but I am making progress.
It has also been helpful to set up output profiles at the time-frequency of the driving data, and to make sure that the output NETCDF file has an ending time before the point at which the suite currently crashes, so that the NETCDF file is completed properly, and you can see what was going on before the crash.
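As a very rough illustration only (profile name and variable are made up, and the member names should be checked against the output profiles already in app/jules/rose-app.conf), a debugging profile at the driving-data frequency might look something like:

[namelist:jules_output_profile(2)]
nvars=1
output_main_run=.true.
output_period=21600
output_spinup=.true.
output_type='M'
profile_name='debug'
var='gpp_gb'
var_name=''

remembering to bump nprofiles in [namelist:jules_output] to match.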

After making changes, you can usually just reload the suite and retrigger the RECON app, to see if the changes helped at all.

It's also helpful to sometimes make a backup copy of your ~/cylc-run/u-cd187 directory, in case you need to do a rose suite-run --new, which resets everything in that directory (including the log files). The log files are useful for me to help you figure out what is going on. Also, you can change the name of the output files every time you run a suite, for the same reason.
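For example, something along these lines before doing a --new run:

cp -pr ~/cylc-run/u-cd187 ~/cylc-run/u-cd187.backup-$(date +%Y%m%d)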

BTW, I have been told that when we do a rose suite-run --new, we don't need to also do a rose suite-clean, since the former also does the latter.
Patrick

comment:46 Changed 7 months ago by NoelClancy

Thanks for the response; do you mean the following data_start and data_end values are set incorrectly?
If so, do I need to change my time ranges, and what are the correct ones?

/app/jules/rose-app.conf

[namelist:jules_drive]
data_end='2017-01-01 00:00:00'
data_period=21600
data_start='1860-01-01 00:00:00'
diff_frac_const=0.0
file='$DATA_DIREC/CRU-NCEP-v7/CRU-NCEP-v7-n96e/cruncepv7_%vv_%y4_n96e.nc'

[namelist:jules_prescribed_dataset(1)]
data_end='1861-01-01 00:00:00'
data_period=-2
data_start='1860-01-01 00:00:00'
file='$ANCIL_DIREC/qrparm.veg.func.nc'

[namelist:jules_prescribed_dataset(2)]
data_end='1861-01-01 00:00:00'
data_period=-1
data_start='1860-01-01 00:00:00'
file='$ANCIL_DIREC/qrparm.veg.func.nc'

[namelist:jules_prescribed_dataset(3)]
data_end='2015-01-01 00:00:00'
data_period=-2
data_start='1860-01-01 00:00:00'
file='$ANCIL_TIME_DIREC/co2_mmr_1860_2015.n96e.nc'

comment:47 Changed 7 months ago by pmcguire

Hi Noel:
I didn't say the time settings were wrong for you.
I just said that for my other suite, they might have been part of the problem.
By the looks of your time settings that you post here, they look ok. The times match up.

One other thing is that possibly another of your namelist settings is not set properly for a recon and spinup from idealized conditions. You might check that all of the idealized initial conditions that you have in jules_initial make sense. Neither you nor I have ever really used the u-bb316 suite that this suite is derived from to spin up from idealized conditions. Maybe one of those settings is not in a good range to start the suite from.
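For reference, pairing up the const_val and var lists from the initial_conditions.nml quoted earlier gives (my pairing of the two lists, worth double-checking):

rgrain=50.0     rgrainl=50.0     canopy=0.1      cs=1.0            gs=100.0
snow_tile=10.0  sthuf=0.75       t_soil=272.0    tstar_tile=273.0  sthzw=0.5
zw=3.0          rho_snow=250.0   snow_depth=0.0  snow_grnd=0.0     nsnow=1.0
snow_ds=0.0     snow_ice=0.0     snow_liq=0.0    tsnow=273.0

A few of these might be worth a second look, for example whether nsnow=1.0 together with snow_depth=0.0 and snow_tile=10.0 is a consistent snow state to start from.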

There might be other namelist settings that are wrong too.

But I still think it is worthwhile (if you can get the recon or spinup to work for at least one time step) to make an output profile that properly saves the output to a NETCDF file to disk before it crashes.

Patrick

comment:48 Changed 7 months ago by NoelClancy

When comparing to an earlier suite

(base) [nmc@cylc1 u-cc806]$ ls -ltr
total 12324864
-rw-r--r-- 1 nmc users 8301124 Mar 3 11:43 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.RECON.dump.18600101.0.nc
-rw-r--r-- 1 nmc users 8301124 Mar 3 11:43 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.RECON.dump.18600102.0.nc
-rw-r--r-- 1 nmc users 8301124 Mar 3 11:47 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc
-rw-r--r-- 1 nmc users 8301124 Mar 3 12:27 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18700101.0.nc
-rw-r--r-- 1 nmc users 8301124 Mar 3 13:06 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18800101.0.nc
-rw-r--r-- 1 nmc users 8301124 Mar 3 13:08 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.S2.dump.18600101.0.nc

I get the first spinup_01 dump file, spinup_01.dump.18600101.0.nc, which is the same size as the other files:
-rw-r--r-- 1 nmc users 8301124 Mar 29 13:27 JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc

The RECON step is bypassed in the suite which attempts to run the spinup from scratch.

comment:49 Changed 7 months ago by pmcguire

Hi Noel:
The dump files are 1D land points files in JULES, which might be of some use. Maybe you can use ncview to look at the data in them or ncinfo to study the variables in them.

But it would be better if you can change the suite so that it outputs the output profiles after 1 or more successful time steps. The output profiles for JULES suites can be configured to save 2D geographic files instead of 1D land-points-only files. This makes the output much easier to interpret, so that you can pinpoint which grid cell (if any, all, or none) is causing problems, and for which variable.

The first dump file for spinup probably should be the same as before. But what might be important is what happens in the model after one or more successful time step(s) (the time steps might be 1800 seconds = 30 mins).
Patrick

comment:50 Changed 7 months ago by NoelClancy

Not sure what this means; in the job.err I get
forrtl: severe (174): SIGSEGV, segmentation fault occurred

ncinfo file.nc
(plot) [nmc@cylc1 u-cd187]$ ncinfo JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc
<type 'netCDF4._netCDF4.Dataset'>
root group (NETCDF3_CLASSIC data model, file format NETCDF3):

dimensions(sizes): land(7771), tile(9), sclayer(1), scpool(1), soil(4), snow(3), type(9), scalar(1)
variables(dimensions): float32 canopy(tile,land), float32 cs(scpool,sclayer,land), float32 gs(land), float32 snow_tile(tile,land), float32 t_soil(soil,land), float32 tstar_tile(tile,land), float32 sthuf(soil,land), float32 sthzw(land), float32 zw(land), float32 rgrain(tile,land), float32 rho_snow(tile,land), float32 snow_depth(tile,land), float32 snow_grnd(tile,land), float32 nsnow(tile,land), float32 snow_ds(snow,tile,land), float32 snow_ice(snow,tile,land), float32 snow_liq(snow,tile,land), float32 tsnow(snow,tile,land), float32 rgrainl(snow,tile,land), float32 frac(type,land), float32 b(soil,land), float32 sathh(soil,land), float32 satcon(soil,land), float32 sm_sat(soil,land), float32 sm_crit(soil,land), float32 sm_wilt(soil,land), float32 hcap(soil,land), float32 hcon(soil,land), float32 albsoil(land), float32 fexp(land), float32 ti_mean(land), float32 ti_sig(land), float32 frac_agr(land), float32 co2_mmr(scalar), float32 latitude(land), float32 longitude(land)

job.err
{MPI Task 3} [WARNING] init_output: No output profiles given - output will not be generated for this run
forrtl: severe (174): SIGSEGV, segmentation fault occurred

job.out
{MPI Task 4} [INFO] init: Initialisation is complete
{MPI Task 0} [INFO] write_dump: cs
{MPI Task 0} [INFO] write_dump: gs
{MPI Task 0} [INFO] write_dump: snow_tile
{MPI Task 0} [INFO] write_dump: t_soil
{MPI Task 0} [INFO] write_dump: tstar_tile
{MPI Task 0} [INFO] write_dump: sthuf
{MPI Task 0} [INFO] write_dump: sthzw
{MPI Task 0} [INFO] write_dump: zw
{MPI Task 0} [INFO] write_dump: rgrain
{MPI Task 0} [INFO] write_dump: rho_snow
{MPI Task 0} [INFO] write_dump: snow_depth
{MPI Task 0} [INFO] write_dump: snow_grnd
{MPI Task 0} [INFO] write_dump: nsnow
{MPI Task 9} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 9} [INFO] next_time: Spinup cycle: 1
{MPI Task 0} [INFO] write_dump: snow_ds
{MPI Task 1} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00

{MPI Task 0} [INFO] file_ncdf_close: Closing file /work/scratch-pw/nmc/u-cd187/JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.spin1.18600101.0.nc


Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.


{MPI Task 0} [INFO] init: Initialisation is complete
{MPI Task 0} [INFO] next_time: Timestep: 2; Started at: 1860-01-01 01:00:00
{MPI Task 0} [INFO] next_time: Spinup cycle: 1

comment:51 Changed 7 months ago by pmcguire

Hi Noel:
Can you try to modify your suite so that it outputs data for only the first time step of the RECON and/or the spinup? How long is your timestep? Make sure that this is only for the first time step, so that the file closes before it crashes during the 2nd time step.

The next_time messages above say that the next time is Timestep: 2. I am not sure whether it is even finishing Timestep: 1 properly. Maybe it doesn't get to finish the first timestep before it crashes, in which case the suggestion in the previous paragraph won't help that much. But on the other hand, maybe it is indeed finishing the first timestep, in which case you can use ncview to view the NETCDF file's variable values, which can be quite helpful.

Patrick

comment:52 Changed 7 months ago by NoelClancy

"Can you try to modify your suite so that it outputs data for only the first time step of the RECON and/or the spinup?" How?

[namelist:jules_output_profile(1)]
output_spinup=.true.

How long is your timestep?
print_step=1
timestep_len=3600

I do not think it finished the first time step, because it failed within seconds of starting to run in the GUI. What is the shortest timestep_len? Can I set it to 1 second?

comment:53 Changed 7 months ago by pmcguire

Hi Noel:
The long holiday weekend is starting. So I won't be able to help out more until it's over. Please wait to send tickets until the long weekend's over. OK?

You can try for a timestep_len of 1 second if you want. I am not sure if it will work or not. But it's possible.

To make a jules_output_profile for one time step, you need to put in start and stop times for the output profile that differ by the number of seconds in timestep_len. There might be other things you need to do, too, to get the output profile after 1 recon or spinup time step.

But again, make sure that your output is on a 2D geographic grid instead of 1D land points, by setting the land_only flag to .false.; there might be other changes you need to make to ensure it's a 2D grid. You can reset it to 1D later if you want, especially if you need to save disk space or make it faster.
Patrick

comment:54 Changed 7 months ago by NoelClancy

Can someone else in NCAS CMS pick up the ticket while you are away?

The output profile does not have a start or stop time in app/jules/rose-app.conf:
[namelist:jules_output_profile(1)]
file_period=-2
nvars=33
output_main_run=.true.
output_period=-2
output_spinup=.true.
output_type=33*'M'
profile_name='Annual'

Do you mean I need to insert these lines?

I didn't understand that I needed to set the land_only flag to .false.
I've done that now, thanks.

comment:55 Changed 6 months ago by pmcguire

Hi Noel:
The documentation for the output.nml namelists is here:
http://jules-lsm.github.io/vn5.2/namelists/output.nml.html

Yes, you can add the appropriate variable settings for timestep_len, output_start, and output_end. It might be good to read that documentation for more information.

Patrick

comment:56 Changed 6 months ago by NoelClancy

To make a jules_output_profile for one time step, are the following changes correct?

[namelist:jules_model_grid]
land_only=.false.

[namelist:jules_output_profile(1)]
output_end='1860-01-01 00:00:01'
output_start='1860-01-01 00:00:00'
output_main_run=.true.

[namelist:jules_time]
print_step=1
timestep_len=1

[namelist:jules_output] # No changes
dump_period=10
nprofiles=3
output_dir='$OUTPUT_FOLDER'
run_id='$ID_STEM'

[namelist:jules_spinup] # No changes
max_spinup_cycles=0
!!nvars=1
!!spinup_end='1880-01-01 00:00:00'
!!spinup_start='1860-01-01 00:00:00'
!!terminate_on_spinup_fail=.false.
!!tolerance=-3.0
!!use_percent=.true.
!!var='smcl'

After making the above changes, spinup_01 runs for only 3 s (T-start = 17:29:28, T-finish = 17:29:31).
But maybe I have done something wrong?

comment:57 Changed 6 months ago by pmcguire

Hi Noel:
Those settings appear to be right.

I would guess that it should take only a few seconds to simulate 1 time step of the land surface with JULES.

Did you use ncview to open the JULES NETCDF file? It should be on a 2D geographical grid (instead of a 1D land-only vector), so it should be easy to inspect and interpret the numerical values for the different variables with ncview. You can also double click on the map to look at plots of the time series. There should be only 1-2 points in the time series for this run.

For future runs, you might try to look at longer time series by changing output_end and timestep_len appropriately, as long as you close the file before JULES crashes. You might want to change the name of the output file each time you run, so that you don't overwrite your previous runs.
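For example, with timestep_len set back to 3600, a sketch of an output profile that captures just the first hour might look like this (keeping your existing nvars, var, and output_type lists; the profile name and times are only illustrative, and changing the profile name also changes the output filename, so earlier runs are not overwritten):

[namelist:jules_output_profile(1)]
profile_name='Debug1hour'
output_main_run=.true.
output_spinup=.true.
output_start='1860-01-01 00:00:00'
output_end='1860-01-01 01:00:00'
output_period=3600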

Patrick

comment:58 Changed 6 months ago by NoelClancy

Thanks for the response.

I did use ncview but can see no map.

ncview JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.18600101.0.nc
Note: 7771 missing values were eliminated along axis "land"; index= 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 29638145…

ncview JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.dump.spin1.18600101.0.nc
Note: 7771 missing values were eliminated along axis "land"; index= 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 14204417…

comment:59 Changed 6 months ago by pmcguire

Hi Noel:
These files are dump files.
I think all dump files are on a 1D land-only grid, instead of a 2D geographic grid.
So that's probably why you can't see any geographic map of the variables' values.

The output profiles that you created (see above) are probably on a 2D geographic grid, especially if you
chose land_only=.false..
And if these files are closed properly before the JULES crash, then you can see those values.
These output profiles will probably be in a file with a filename that doesn't have the word 'dump' in it.

You can also use ncview sometimes to look at 1D land-only files, but it is not easy. One thing you can do is open up the variable window in ncview, and double click on one of the grid cells in the 1D vector, and that will show a time series for that grid cell. It might be helpful if you use the magnification option in ncview to zoom in on the grid cell before you click on it to make a time series. Otherwise, it's hard to click on the grid cell. But for a dump file, there is only 1 value in a time series for a grid cell, so maybe making the time series is not all that useful, and hovering with the mouse over the grid cell is enough to show the variable's value for the different grid cells.
Patrick

comment:60 Changed 6 months ago by NoelClancy

Those two files are the only ones produced by my suite. There are no files produced without the word 'dump'.

It also says "All values are missing", "no scan axis", and "max and min both 0 for variable", so I'm not sure if these files are of any use.

comment:61 Changed 6 months ago by pmcguire

Hi Noel:
I don't know right now why you don't have any output profiles.

You can practice using ncview on the output profiles that I made with my southern England suite, which crashes a short time into the run.
See the u-cc615_1proc suite in my directory for the filenames and settings of its output profiles. See ticket #3501 for more details.

Patrick

comment:62 Changed 6 months ago by NoelClancy

I noticed the following differences

u-cc615_1proc/app/jules/rose-app.conf has:
[namelist:jules_output]
dump_period=10
nprofiles=3

while u-cd180/app/jules/rose-app.conf has:
[namelist:jules_output]
dump_period=1
nprofiles=1

Is it necessary to also make these changes for u-cd180?

Also, my u-cd180/app/jules/opt/rose-app-spinup.conf is as follows:

[namelist:jules_output]
nprofiles=0

[namelist:jules_prescribed]
n_datasets=2

But I noticed additional information in
u-cc615_1proc/app/jules/opt/rose-app-spinup.conf
under the following headings:
[namelist:jules_prescribed]
[namelist:jules_initial]
[namelist:jules_time]
[namelist:jules_output]
[namelist:jules_output_profile(1)]

Do I need these details in u-cd180/app/jules/opt/rose-app-spinup.conf?

comment:63 Changed 6 months ago by pmcguire

Hi Noel:
Yes, if your spinup app has nprofiles=0, then it won't write any output profiles to disk.

You can change nprofiles in your spinup app to however many output profiles you have defined, and then it should work just fine. Then you can do a rose suite-run --reload and retrigger your spinup app in the cylc GUI.
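For example, if there is one output profile defined, the spinup optional config would just need something like this (a sketch):

[namelist:jules_output]
nprofiles=1

followed by the rose suite-run --reload and a retrigger of the spinup task.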
Patrick

comment:64 Changed 6 months ago by NoelClancy

Thanks, I will try that.
Also, earlier in this ticket you recommended that I make the following change in /app/jules/rose-app.conf

[namelist:jules_time]
print_step=1

However, in u-cc615_1proc this change is in
u-cc615_1proc/app/jules/opt/rose-app-spinup.conf
[namelist:jules_time]
print_step=1

and not in u-cc615_1proc/app/jules/rose-app.conf
[namelist:jules_time]
print_step=240

Can I just double-check whether this change should be implemented in
/app/jules/rose-app.conf
or
/app/jules/opt/rose-app-spinup.conf?

comment:65 Changed 6 months ago by pmcguire

Hi Noel:
You can use whatever print_step value you want for the jules spinup app and/or the main jules app. If you don't set a value for it in the jules spinup app, it will use the value from the main jules app as default.

If you use a value of print_step=1, then it prints information to the stdout log file every time step, which can be useful if you are debugging and there are crashes, since you can then see at which time step it crashes. But it will also produce very long log files, which might overburden the system, your quota, or your available disk space.
Patrick

comment:66 Changed 6 months ago by NoelClancy

OK, I will leave it as it is for now, then.
There are also extra details in
u-cc615_1proc/app/jules/opt/rose-app-recon.conf
that are not in
u-cd180/app/jules/opt/rose-app-recon.conf

However, since my suite is not going through the RECON step, I assume that I do not need to worry about this.

comment:67 Changed 6 months ago by NoelClancy

ncview JULES-GL7.0.vn5.3.CRUNCEPv7SLURM.spinup_01.Annual.1860.nc

calculating min and maxes for canht.
calculating min and maxes for lai.
calculating min and maxes for npp_pft.
calculating min and maxes for gpp.
calculating min and maxes for et_stom.
calculating min and maxes for snow_depth.
calculating min and maxes for landCoverFrac.
Note: 192 missing values were eliminated along axis "x"; index= 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 139818798405882…
Note: 192 missing values were eliminated along axis "x"; index= 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 139818798405882…
Note: 192 missing values were eliminated along axis "x"; index= 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 139818798405882…

"All values are missing"
Magnification does nothing

Also, in u-cc615_1proc/app/jules/rose-app.conf
[namelist:jules_initial]
# These values don't include 4 values (one for each of the 4 soil layers) for the sthuf and t_soil variables

How could I overcome this issue in u-cd180/app/jules/rose-app.conf if the suite is to use constant values?

comment:68 Changed 6 months ago by pmcguire

Hi Noel:
Congratulations! At least a small congratulations! You were able to make a new output profile for the spinup. But as you can see, it has all NaNs in it or something like that, so maybe it failed to write the actual variable values for the first time step.

Normally, you can use the arrow buttons in ncview to switch between time steps. But you only have one time step, so the arrow buttons aren't much use.

If you mouse-over the different grid cells in the geographic plots, you can see the Lat Lon values for each grid cell, and that each grid cell has NaN values. Normally, these would be filled with the variable values for the different variables. You already figured out how to switch variables by pressing the variable name buttons.

I don't know what is wrong right now. But you can see from the end of your stdout file:

~nmc/cylc-run/u-cd180/log/job/18600101T0000+01/spinup_01/02/job.out

that the output profile NETCDF file was opened, but that it was never closed. It failed before it could finish the first time step.

You can also see that the stderr file:

~nmc/cylc-run/u-cd180/log/job/18600101T0000+01/spinup_01/02/job.err

has this warning before it crashes later on:

{MPI Task 7} [WARNING] Jin11_check_inputs: 10m wind speed out of range 0.0 - 100.0 ms-1

Maybe then you can look at the driving data with ncview, to see what the wind values are for the first time step in 1860?

Maybe you can also try to look at the 2D NETCDF output files with ncview for a suite that doesn't spin up from scratch, just to get practice with ncview, if you haven't done something like that already? You can also do the same for the RECON output of that example suite of mine for the southern UK, that I told you about, to see examples of where the files might be messed up, if you can get beyond the first time step.

Patrick

comment:69 Changed 6 months ago by NoelClancy

No need to congratulate me. I don't really understand what has been done or achieved by all of this or where to go from here. I'll try to see if I can ncview the driving data. Thanks for trying to help.

comment:70 Changed 6 months ago by pmcguire

Hi Noel:
Did you figure out if the abnormal winds were causing the problem? Were the winds really abnormal?

You said that "you don't really understand what has been done or achieved by all of this or where to go from here". We're trying to figure out why it's crashing on the first time step. I hope you understand that much. Or is there something else that needs elucidating?

Patrick

comment:71 Changed 6 months ago by NoelClancy

cd /gws/nopw/j04/jules/data/CRU-NCEP-v7/CRU-NCEP-v7-n96e
ncview cruncepv7_uwind_1860_n96e.nc displayed range: -14.6135 to 15.6403 m/s
ncview cruncepv7_vwind_1860_n96e.nc displayed range: -17.3846 to 12.0692 m/s

but the warning is
{MPI Task 7} [WARNING] Jin11_check_inputs: 10m wind speed out of range 0.0 - 100.0 ms-1

Can I use this driving wind data to start the run from initial conditions?
Can I set any value below zero to 0 ms-1?

Not sure what else I don't understand, just completely overwhelmed I think.

comment:72 Changed 6 months ago by pmcguire

Hi Noel:
So the wind speeds are not that extreme or out of range. Do you also see the geographic maps of the wind speed?
You probably want to keep the negative wind components.
Maybe you can study the driving data documentation for JULES to see what wind speed ranges are appropriate and how to set the wind inputs? There are options to use x,y components of the wind, or just the magnitude of the wind speed, I think.
And even better, maybe you can use grep to find that warning message in the JULES FORTRAN source code? Then, once you find it, you can try to figure out why JULES is complaining about the wind speeds?
Patrick

comment:73 Changed 6 months ago by NoelClancy

Yes, I can see the geographical maps.

[WARNING] Jin11_check_inputs: 10m wind speed out of range 0.0 - 100.0 ms-1
I don't know if I need to change the wind speeds or not, or how to change them.

There is no detailed information at all at
http://jules-lsm.github.io/vn5.2/namelists/drive.nml.html
It doesn't mention what the appropriate ranges are or how to set them.
Where is the documentation you are referring to?

The following commands return nothing.
cd /home/users/nmc/roses/u-cd180
grep -r Jin11_check_inputs
How can I grep the JULES FORTRAN source code?

comment:74 Changed 6 months ago by pmcguire

Hi Noel:
Yes, that's the documentation I was referring to. The driving data documentation might not mention the appropriate ranges, but I think it talks about using either 'wind' or 'u & v' for the wind. If it thinks you are using 'wind', then it might require that the wind values be non-negative.
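Roughly, the two alternatives in the jules_drive var list look like this (a sketch only; the entries must still match what is actually in your driving files):

# Option 1: wind components, which can be negative (what your suite appears to use):
var='sw_down','lw_down','precip','t','q','u','v','pstar'
# Option 2: a single wind speed magnitude, which must be non-negative:
var='sw_down','lw_down','precip','t','q','wind','pstar'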

The JULES FORTRAN code is not in your Rose/Cylc suite, so if you grep for it there, it won't find anything.
You need to find out where the JULES FORTRAN code is by looking in your suite settings. It might already be checked out and installed on JASMIN in the jules GWS. Or maybe you need to check out a copy of the JULES FORTRAN code yourself on to JASMIN, and then you can grep through the code that way.

Patrick

comment:75 Changed 6 months ago by NoelClancy

I must be using u and v
ncview cruncepv7_uwind_1860_n96e.nc displayed range: -14.6135 to 15.6403 m/s
ncview cruncepv7_vwind_1860_n96e.nc displayed range: -17.3846 to 12.0692 m/s
so the southerly and westerly components can be negative at times, and I don't understand the warning message, given that it is the u and v variables that are requested.
In the app/jules/rose-app.conf,
var='sw_down','lw_down','precip','t','q','u','v','pstar'
var_name='Incoming_Short_Wave_Radiation',
         ='Incoming_Long_Wave_Radiation','Total_Precipitation',
         ='Temperature','Air_Specific_Humidity','U_wind_component',
         ='V_wind_component','Pression'

So how does it think that I am using 'wind', and why does it give this warning message?
{MPI Task 7} [WARNING] Jin11_check_inputs: 10m wind speed out of range 0.0 - 100.0 ms-1

I can't find where the suite settings say where the FORTRAN code is. What could I grep for in the suite settings that might lead me to the location of the FORTRAN code?
I've looked in the jules GWS but cannot find it yet.

comment:76 Changed 6 months ago by pmcguire

Hi Noel:
It says in your suite in the fcm_make settings, in the file:

~nmc/roses/u-cd180/app/fcm_make/file/fcm-make.cfg

that it gets the JULES FORTRAN source code from:

include = $JULES_FCM/etc/fcm-make/make.cfg@$JULES_REVISION

If you look in your file:

~nmc/roses/u-cd180/rose-suite.conf

you will see that the settings are JULES_FCM='fcm:jules.x_tr' and JULES_REVISION='13249'.

This means that the JULES FORTRAN code is located at:

include = fcm:jules.x_tr/etc/fcm-make/make.cfg@13249

In order to check out a copy of this with MOSRS on cylc1.jasmin.ac.uk, you can:
1) log in to cylc1 if you haven't already. Make sure you enter your MOSRS password when prompted.
2) mkdir ~/jules_source
3) cd ~/jules_source
4) fcm checkout fcm:jules.x_tr@13249 r13249

It should give a long list of the FORTRAN files that it is checking out a copy of.

Then after this, you can do:
1) cd ~/jules_source
2) grep -r Jin11_check_inputs   (like you tried already, but in a different directory)
3) look at the files that the grep finds, using vi
4) I guess you need to study the source code to find out why it's using the Jin parametrisation:

"Calculates the open sea albedo using the parameterisation of Jin et al. 2011, Optics Express (doi:10.1364/OE.19.026429)"

5) Maybe the wind inputs for the Jin parametrisation are set wrong or something?
Patrick

comment:77 Changed 6 months ago by NoelClancy

The file says the 10m wind speed should be between 0.0 and 100.0 ms-1:

! check the wind speed, should be between 0.0, and not too large:
l_ws_10m_capped = .FALSE.
DO ipt = 1, nd_points

  IF (ws_10m_in(ipt) < 0.0) THEN
    l_ws_10m_capped = .TRUE.
    ws_10m(ipt) = 0.0
  ELSE IF (ws_10m_in(ipt) > 100.0) THEN
    l_ws_10m_capped = .TRUE.
    ws_10m(ipt) = 100.0
  ELSE
    ws_10m(ipt) = ws_10m_in(ipt)
  END IF

END DO

IF (l_ws_10m_capped) THEN
  ! Use ereport with a -ve status to issue warnings
  errcode = -1
  CALL ereport('Jin11_check_inputs', errcode, &
               '10m wind speed out of range 0.0 - 100.0 ms-1')
END IF

Since the ranges I am getting are:
ncview cruncepv7_uwind_1860_n96e.nc displayed range: -14.6135 to 15.6403 m/s
ncview cruncepv7_vwind_1860_n96e.nc displayed range: -17.3846 to 12.0692 m/s
I suppose I need to convert any negative value to zero.

The spin-up is from 1860 through to 1880, so I assume that I would need to convert all negative values in the following files to 0 ms-1

ncview cruncepv7_uwind_1860_n96e.nc
ncview cruncepv7_vwind_1860_n96e.nc
ncview cruncepv7_uwind_1870_n96e.nc
ncview cruncepv7_vwind_1870_n96e.nc
ncview cruncepv7_uwind_1880_n96e.nc
ncview cruncepv7_vwind_1880_n96e.nc

How would I convert all negative values in these files to 0 m/s?

comment:78 Changed 6 months ago by NoelClancy

I won't have permission to convert these values in the driving data.
I could copy the driving data to my own folder, make the changes, and point the driving-data path in my suite to the new location, but the driving data is 8.0T, so maybe that's not a good idea.

Could I modify the checked-out version
fcm checkout fcm:jules.x_tr@13249 r13249
in my home directory, by changing the wind speed range check to allow negative values, and then run the suite from there?

comment:79 Changed 6 months ago by pmcguire

Hi Noel:
I traced a bit of the JULES code for this.
I think you need to study the code more too before you try changing things.

I see that in the file: src/science/radiation/Jin11_osa_mod.F90, the routine Jin11_osa
uses the wind speed for the sea points to calculate the sea albedo. We're not doing sea points,
so I don't know right now why this routine is called.

The wind speed is previously computed in the routine:

src/control/standalone/control.F90

by passing through the routine:

src/science/radiation/ftsa_jls_mod.F90

This is the code from that control.F90 routine:

IF ( i_sea_alb_method == 3 ) THEN
  ! Calculate the 10m wind speed.
  IF (a_step == 1) THEN
    ! the u/v at 10m not calculated yet, so make a first guess:
    ws10m(:,:) = 2.0
  ELSE
    ws10m(:,:) = SQRT(sf_diag%u10m(:,:)**2 + sf_diag%v10m(:,:)**2)
  END IF
END IF

You can see that the wind speed ws10m should never be less than 0.0, since the sqrt() function is non-negative.
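For example, taking the most extreme u and v components you reported above (about 15.6 and -17.4 m/s), the largest possible 10 m wind speed magnitude would only be about sqrt(15.6**2 + 17.4**2), roughly 23.4 m/s, which is well inside the 0.0 - 100.0 ms-1 range, so the driving data itself should not be triggering the capping.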

Or maybe there is some NaN problem with sf_diag%u10m(:,:) or sf_diag%v10m(:,:)?
Just a suggestion.
Patrick

comment:80 Changed 6 months ago by NoelClancy

Thanks for that. I can see that, despite negative u or v values, the resulting wind speed must always be positive (or zero), but only because the SQRT calculation was pointed out to me. I don't know how you managed to trace it back; I didn't know where or how to find that.

I think there may be too many issues on JASMIN to have a realistic chance of running a GL7 suite from spinup there anytime soon. I have only some months of my PhD remaining and still no GL7 results. I'll ask BC and PLV in today's meeting to see if I can try to find an easier solution on MONSOON.

comment:81 Changed 6 months ago by NoelClancy

Hi Patrick:
I think there was a mix-up between lw_down and sw_down and between sw_down and sw_net in the suite being discussed in ticket #3501. Did you use the same script in u-bx723? If so, would the same issue exist in the copies I made of u-bx723?
Noel

Last edited 6 months ago by pmcguire (previous) (diff)

comment:82 Changed 6 months ago by pmcguire

Hi Noel:
Yes, I guess it might be possible that your suite could also have the same mix-up as my suite, but I doubt it. Your suite is a GL7 suite on an N96 grid, which originally came from Andy Wiltshire and Carolina Duran Rojas. They (or somebody else) made the driving data, not me.
The suite that I have been trying to get running uses driving data that I extracted from a PRIMAVERA AMIP GL6R suite on an N1280 grid, run by Annette Osprey.
Furthermore, this suite worked with the same driving data without doing the spinning up from scratch from idealized values.
So it's unlikely that the same error was made by both them and me.

But it might be easy to check this yourself. Just use ncview and/or ncinfo to look at the variable names and values for the short-wave fluxes in the driving data files. The short-wave flux might have either a downwelling-shortwave name or a net-shortwave name. If the short-wave flux data is downwelling, then the values should be non-negative, whereas if it is net, then it can be negative. The jules_drive variable name in the ~/roses/app/jules/rose-app.conf etc. file(s) should then be sw_down or sw_net, whichever matches the data that's in the driving data files.
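For example (the shortwave filename below is only a guess based on the %vv filename pattern in your jules_drive namelist, so substitute whatever the file is actually called):

cd /gws/nopw/j04/jules/data/CRU-NCEP-v7/CRU-NCEP-v7-n96e
ncinfo cruncepv7_swdown_1860_n96e.nc     # check the short-wave variable's name and units
ncview cruncepv7_swdown_1860_n96e.nc     # check whether the displayed range ever goes negative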
Patrick

Note: See TracTickets for help on using tickets.