# Changes between Version 27 and Version 28 of WRF4GWRFReforecast

Timestamp: May 2, 2013 6:41:55 PM
The monitoring and maintenance of an experiment can be shared by different users, provided they connect their WRF4G installations to the same database.

== Postprocess ==

If the computing resources and the model configuration are stable enough, monitoring a running experiment is not time consuming. However, if the experiment is large (e.g. 30 years of daily reforecasts), postprocessing the output produced by WRF4G can be a complicated and error-prone task. Here we provide some guidelines to follow to successfully postprocess a reforecast experiment like this one. In this tutorial case it should not be too complicated, since it covers only one month.

=== 1st step: Generate daily CF-compliant files ===

Generating CF-compliant files from the WRF4G raw data should not be complicated, since it amounts to applying a dictionary to the variable names and attributes. In practice, however, we want to do several other things:

 * Merge the files, apart from translating them to CF-compliant netCDF.
 * Perform operations on the available fields: change units, de-accumulate, average, etc.
 * Compute derived fields.
 * Split files by variable and vertical level.
 * Delete spin-up and/or overlapping periods and concatenate the files correctly.

To deal with these tasks we created a tool written in Python called [http://www.meteo.unican.es/wiki/cordexwrf/SoftwareTools/WrfncXnj WRFnc extract and join], published under a GNU open license and documented on that web page. What we need now is to write a small shell script that calls this Python tool with the proper arguments. First, we need to define the paths of the different files.
A useful practice is to write a small file called "dirs" with these paths, something like:

{{{
BASEDIR="/oceano/gih/work/sw_ncep"
EXPDIR="${BASEDIR}/data/raw"
DATADIR="${BASEDIR}/data"
SCRIPTDIR="${BASEDIR}/scripts"
DIAGDIR="${BASEDIR}/diags"
FIGSDIR="${BASEDIR}/figures"
POSTDIR="${BASEDIR}/data/post_fullres"
POST2DIR="${BASEDIR}/data/post_ih"
OBSDIR="${BASEDIR}/data/obs"
}}}

Now, writing "source dirs" in our postprocess script, we can easily load these variables.
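As a minimal, self-contained sketch of how a postprocess script could load these paths: the "dirs" file is recreated here only so the example runs on its own, and the commented wrfncxnj loop is a hypothetical placeholder, not the tool's documented interface.

{{{
#!/bin/bash
# Recreate a shortened "dirs" file so this sketch is self-contained;
# in the real workflow the file already exists alongside the scripts.
cat > dirs <<'EOF'
BASEDIR="/oceano/gih/work/sw_ncep"
EXPDIR="${BASEDIR}/data/raw"
POSTDIR="${BASEDIR}/data/post_fullres"
EOF

# Load the common path definitions with a single line.
source ./dirs

echo "Raw WRF4G output: ${EXPDIR}"
echo "CF-compliant output: ${POSTDIR}"

# Hypothetical sketch of the per-day loop; see the WRFnc extract and join
# wiki for the tool's actual command-line options.
# for day in "${EXPDIR}"/*; do
#   python wrfncxnj.py ... # real arguments go here
# done
}}}

Every later postprocess script can then start with "source dirs", so a path change has to be made in one place only.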