Opened 3 years ago

Closed 3 years ago

#1915 closed help (answered)

Archiving directly into the /nerc space

Reported by: valerio
Owned by: um_support
Component: Archiving
Keywords:
Cc:
Platform: ARCHER
UM Version: 8.4


Dear CMS helpdesk,

I've been trying to set up automatic archiving of my ARCHER runs into the /nerc/n02/n02/valerio space. I attempted this on a very short (1-day) vn8.4 job (xmqhd), following the directions on your wiki page. However, the files are still being saved in my /work space at the end of the run. I get this message at the end of the leave file:

 ==================================== SERVER ================================== 
qsserver: Starting...

The job is not operational 

Machine: LINUXMPP  archsys: true  archhector: true
Hector achiving has been selected for this run

qsserver: The following variables were set up
qsserver: RUNID=xmqhd
qsserver: ARCHIVEDIR=/nerc/n02/n02/valerio/archive/xmqhd

Files in directory ARCHIVEDIR = /nerc/n02/n02/valerio/archive/xmqhd
total 0
Server process:... Ending
0+1 records in
0+1 records out
380 bytes (380 B) copied, 0.00141234 s, 269 kB/s

The full leave file can be found here: /home/n02/n02/valerio/output/xmqhd000.xmqhd.d16189.t160913.leave

I have noticed that the xmqhd job has two branches in the "Central Script Modifications" section of "FCM options for atmosphere and reconfiguration" in the UMUI: the one mentioned on your wiki page (fcm:um_br/dev/jeff/vn8.4_hector_monsoon_archiving/src) as well as another one (fcm:um_br/dev/luke/vn8.4_hector_monsoon_archiving_ff2pp/src).

I tried running the job with both branches included, and also with only one of them included (the one you recommend), but there was no difference between the two runs.

Also, I have noticed that an xmqhd folder is actually created in /nerc/n02/n02/valerio/archive, but it is empty.

Thanks in advance,


Change History (5)

comment:1 Changed 3 years ago by ros

Hi Valerio,

You are not running the job for long enough. After 1 day there is nothing to archive: it can't archive the most recent files, as these would be needed to continue the run.

Try running for 2 or 3 days and you should find the .pa stream gets archived.
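To illustrate the behaviour described above, here is a minimal sketch (not the actual qsserver code; the directories and file names below are throwaway placeholders): the archiver moves every file in a stream except the newest, which the model still needs in order to restart.

```shell
WORKDIR=$(mktemp -d)      # stands in for /work/n02/n02/<user>/<runid>
ARCHIVEDIR=$(mktemp -d)   # stands in for /nerc/n02/n02/<user>/archive/<runid>

# Fake a .pa stream for the first three days of a run
touch "$WORKDIR"/xmqhda.pa19991201 \
      "$WORKDIR"/xmqhda.pa19991202 \
      "$WORKDIR"/xmqhda.pa19991203

# The date-stamped names sort chronologically, so the last one is the newest
newest=$(ls "$WORKDIR"/xmqhda.pa* | sort | tail -n 1)

for f in "$WORKDIR"/xmqhda.pa*; do
    if [ "$f" != "$newest" ]; then
        mv "$f" "$ARCHIVEDIR"/   # archive everything except the restart file
    fi
done

ls "$ARCHIVEDIR"   # days 1 and 2 archived; day 3 stays in WORKDIR
```

With a 1-day run there is only one file in the stream, so under this rule nothing qualifies for archiving, which matches what was seen in the original report.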


comment:2 Changed 3 years ago by valerio

Hi Ros,

Thanks for your reply. I ran the job for 5 days and it did work! However, out of the 5 pp files generated, only 4 were archived in the /nerc space and one of them (corresponding to the final day of the run) was left in the /work space (details below).

valerio@eslogin005:/nerc/n02/n02/valerio/archive/xmqhd> ls -l
total 1915264
-rw-r--r-- 1 valerio n02 490287744 Jul 11 14:37 xmqhda.pa19991201.pp
-rw-r--r-- 1 valerio n02 490287744 Jul 11 14:41 xmqhda.pa19991202.pp
-rw-r--r-- 1 valerio n02 490287744 Jul 11 14:46 xmqhda.pa19991203.pp
-rw-r--r-- 1 valerio n02 490287744 Jul 11 14:50 xmqhda.pa19991204.pp

valerio@eslogin005:/work/n02/n02/valerio/xmqhd> ls -l
total 3938384
drwxr-sr-x 2 valerio n02       4096 Jul  8 15:46 bin
drwxr-sr-x 2 valerio n02       4096 Jul  8 22:40 history_archive
drwxr-sr-x 2 valerio n02      12288 Jul 11 14:30 pe_output
-rw-r--r-- 1 valerio n02 3047272448 Jul 11 11:35 xmqhd.astart
-rw-r--r-- 1 valerio n02       9078 Jul 11 14:53 xmqhd.list
-rw-r--r-- 1 valerio n02        348 Jul 11 14:50 xmqhd.requests
-rw-r--r-- 1 valerio n02     708452 Jul 11 14:53 xmqhd.stash
-rw-r--r-- 1 valerio n02        376 Jul 11 14:30
-rw-r--r-- 1 valerio n02      47578 Jul 11 14:31 xmqhd.xhist
-rw-r--r-- 1 valerio n02  977133568 Jul 11 14:53 xmqhda.pa19991205
-rw-r--r-- 1 valerio n02    9224224 Jul 11 14:31 xmqhda.pb1999dec

Is this normal? At the end of the day it's not a lot of hassle to move one single file, as long as my /work space doesn't reach its quota!
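For the record, the manual move is a single mv once the run has completely finished and the file is no longer needed for a restart. A sketch using throwaway placeholder directories (the real paths would be the /work and /nerc ones in the listings above):

```shell
WORK=$(mktemp -d)   # stands in for /work/n02/n02/valerio/xmqhd
NERC=$(mktemp -d)   # stands in for /nerc/n02/n02/valerio/archive/xmqhd

touch "$WORK"/xmqhda.pa19991205   # the file the server left behind

# Safe once the run has fully finished and no restart is needed
mv "$WORK"/xmqhda.pa19991205 "$NERC"/
```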

Thanks again,


comment:3 Changed 3 years ago by grenville


It won't archive the last file - it has no way of knowing whether the run will continue (that's why your 1-day run didn't archive anything).


comment:4 Changed 3 years ago by valerio

Hi Grenville,

That makes sense. Thanks again for your help,


comment:5 Changed 3 years ago by grenville

  • Resolution set to answered
  • Status changed from new to closed