This page serves as documentation for the development of BACY and SEVIRI. The subsequent paragraphs give some explanations of BACY; former, outdated information can be found under old_bacy. The handling of SEVIRI data, radar data, COSMO and the LETKF is described on other pages. Please fill in your experiment description on this page.
Description: It interpolates boundary conditions from the global ICON model to the regional COSMO model. The two models differ, among other things, in the grids on which they are implemented.
Description: The routine integrates the model and computes a first guess (the time step is 1 hour).
Description: The routine applies the Data Assimilation and computes the prediction
The corresponding scripts are located in scripts/ and are called
The corresponding config file is config/config_forecast.sh. Major variables are:
After computing the forecasts, i.e. the model output files lfff* and lfff*p, one should evaluate the forecasts by comparing them to the observations. Since the forecasts live in model space whereas the observations live in observation space, the forecasts have to be transformed into observation space by applying the observation operator H to the forecast files. This can be done by running the MEC (Model Equivalent Calculator). For KENDA this is done easily by calling the datool tool mec_cosmo. Its options are:
mec_cosmo produces verOBSTYPE_DATE.nc files which can be found in the output directory mec/output .
The subsequent statistical evaluation can be performed with tools developed by Felix Fundel, which you find on the web page DIY Verify 2.0 under http://oflxs464.dwd.de/~ffundel/ or directly under http://184.108.40.206:1111/users/ffundel/test/diyVeri/ . There, you should fill in the form and choose the paths of your verOBSTYPE_DATE.nc files as the feedback directories of your experiments. Then copy the script on the right-hand side into an editor on the machine lce and save it.
If you already have passwordless access to the department server oflxs464, you can run the script immediately in your shell. If not, you should set up passwordless access, e.g. following https://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/ . In this context it may be helpful to note that the password of the department server is identical to your DWD-wide account password and hence, in principle, different from the password on the HPC system (e.g. the lce machine). As an example, the following steps set up passwordless access:
ssh-keygen -t rsa
(leave passphrase empty)
ssh ahutt@oflxs464 mkdir -p .ssh
cat .ssh/id_rsa.pub | ssh ahutt@oflxs464 'cat >> .ssh/authorized_keys'
ssh ahutt@oflxs464 "chmod 700 .ssh; chmod 640 .ssh/authorized_keys"
If your run script is called e.g. ./run_eval.sh, then type
After the script has finished, one can visualize the results for upper-air data, e.g. TEMP, on http://220.127.116.11:1111/users/ffundel/fdbk_temp_cont/ and for surface data on http://18.104.22.168:1111/users/ffundel/fdbk_cont/ .
The tool obs_err_stat is located in the path of the letkf binary, e.g. in /e/uhome/hreich/3dvar_orig/3dvar/build/LINUX64.cray-gnu-debug/bin/ . It needs a namelist that should be located in COSMO/feedback/EXPID, where EXPID (e.g. 1001.01) is the experiment identification previously chosen in the file config_cycle.sh. The namelist is an ASCII file, should be called namelist_obs_stat and should look like this:
&OBS_STAT
  netcdf_template = 'ekf_OBSTYP___YYMMDDHHMMSS_.nc'
  obstyp_name = 'TEMP AIREP PILOT' !'DRIBU'
  ! obstyp_name = 'SYNOP' !'DRIBU'
  date_ini = '20140517060000'
  date_end = '20140530000000'
  interval = 1.0   ! set to -1.0 to switch off
  !use_passive = .true.
  area = 19
/
It accesses the ekf-files (feedback files) in the sub-directories named after the date, e.g. 20140525030000. Let us call such a directory DATE. To this end, it is necessary to generate soft links from COSMO/feedback/EXPID to the ekf-files in these directories. This can be done in several ways; one way to do this for each DATE:
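One way to generate the links is a small shell loop. The following is only a sketch: the EXPID value, the directory layout and the ekf file names are assumptions based on the examples above, and for demonstration it builds a toy tree in a temporary directory; in practice, point FEEDBACK_DIR at your real COSMO/feedback/EXPID.

```shell
#!/bin/sh
# Sketch: link the ekf feedback files from each DATE subdirectory into
# COSMO/feedback/EXPID itself, so that obs_err_stat finds them next to
# its namelist. Layout and file names are assumptions from the examples.
# For demonstration a toy tree is built in a temporary directory; replace
# FEEDBACK_DIR by your real COSMO/feedback/EXPID (EXPID e.g. 1001.01).
FEEDBACK_DIR=$(mktemp -d)
mkdir -p "$FEEDBACK_DIR/20140525030000"
touch "$FEEDBACK_DIR/20140525030000/ekf_TEMP_20140525030000.nc"

cd "$FEEDBACK_DIR" || exit 1
for DATE in 20????????????; do          # DATE directories like 20140525030000
    [ -d "$DATE" ] || continue
    for f in "$DATE"/ekf_*; do
        [ -e "$f" ] || continue
        ln -sf "$f" .                   # soft link into FEEDBACK_DIR itself
    done
done
ls -l ekf_*
```

The relative links resolve because the DATE directories sit directly below FEEDBACK_DIR; if your feedback files live elsewhere, use absolute paths in the ln call instead.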
If you want to have a look into the laf- or lff-files using grads, you first have to produce a ctl-file. This can be done in the following way: you need the scripts 'cut_fields' and 'grib2ctl.pl' in the directory where your laf- or lff-files are stored, i.e. go into this directory and
cp /e/uhome/hreich/tmp/cut_fields .
cp /e/uhome/hreich/tmp/grib2ctl.pl .
Run 'cut_fields' via
./cut_fields <filename (e.g.:laf20140525010000.det)> <filename.out (e.g.:laf20140525010000.det.out)>
The result is three files: filename.out, filename.out.ctl and filename.out.idx