reading:kenda [Inverse Problems and Data Assimilation Wiki and International Community]

KENDA Documentation Page

This page serves as a documentation site for the development of BACY and SEVIRI. The subsequent paragraphs give some explanations on BACY; former, now outdated information can be found under old_bacy. The handling of SEVIRI data, radar data, COSMO and the LETKF is covered on other pages. Please fill in your experiment description on this page.

Abbreviations in BACY files

  • laf local, analysis, full domain
  • lff local, forecast, full domain
  • lbc old abbreviation (from the GME model)
  • lbff boundary files
  • int2lm interpolation to the local model
  • lm local model
  • PBS relevant for the queueing system; processors
  • igfff global equivalent of lff; provides the forecast data used for lbff via int2lm
  • iefff help field for ICON
  • hhl help field for ICON, height of half levels
  • YUCHKDAT diagnostic fields in the COSMO output, initial and boundary data
  • YUPRHUMI humidity
  • YUPRMASS surface pressure, vertical winds
  • YUVERIF verification of observations
  • YUSPECIF namelist specification
  • YU… data in ASCII format
  • ekf output of the LETKF: observations and model equivalents, feedback files
  • cdfin contains conventional observations without model values
  • lmstat verifies observations in the forecast window, currently every hour or at observation times; to be replaced by mec

Description of the different tools in the BACY script


int2lm

Description: int2lm interpolates the boundary conditions from the global ICON model to the regional COSMO model. The two models differ, among other things, in their grids.

Input files:

  • igfff00?00000 located in /e/uscratch/for3cre/IEDA001/data/2014052503/COSMO_DE/ and soft-linked to RUN_DIR/bcdata/$DATE , see script run_int2lm

Output files:

  • generates laf- and lbf-files located in RUN_DIR_COSMO/det (for the deterministic run) and RUN_DIR_COSMO/ens0?? (for the ensemble members)
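The soft-linking of the igfff input files mentioned above can be sketched as follows. This is only an illustration of what run_int2lm does: the SRC and RUN_DIR paths below are scratch placeholders standing in for the real for3cre data directory and run directory.

```shell
# Sketch of the boundary-data linking performed by run_int2lm.
# SRC and RUN_DIR are placeholder paths for this demo, not the real ones.
DATE=2014052503
SRC=/tmp/bacy_demo_src/$DATE/COSMO_DE
RUN_DIR=/tmp/bacy_demo_run
mkdir -p "$SRC" "$RUN_DIR/bcdata/$DATE"
touch "$SRC/igfff00000000"                     # dummy global forecast file
ln -sf "$SRC"/igfff* "$RUN_DIR/bcdata/$DATE"/  # link all igfff files into bcdata
ls -l "$RUN_DIR/bcdata/$DATE"
```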


cosmo

Description: The routine integrates the COSMO model and computes a first guess (the time step is 1 hour).

Input files:

  • laf-files: analysis, generated previously either by int2lm (after the first run) or by LETKF (during cycling)
  • lbf-files: boundary files generated previously
  • cdfin-files: observation data
  • blklsttmp: blacklist file, mask to remove unwanted observations

Output files:

  • lff-files: first guess (forecast) files in grib format (viewable with plot_grib)
  • fof-files: feedback files, model equivalents of the observations
  • YU…: internal COSMO files
  • INPUT_…: configuration files for COSMO containing the namelists
  • M_…: meteograms, i.e. results at the last time instance at certain grid points


letkf

Description: The routine applies the data assimilation (LETKF) and computes the analysis.

Input files:

  • lff-files: prediction (1h), taken from COSMO
  • fof-files: feedback files taken from COSMO

Output files:

  • laf-files: analysis files
  • ekf-files: feedback files, model equivalents of analysis results
  • LETKF.o…: summary of processed data, including the type of observations considered in the run

Forecasts in BACY

The corresponding scripts are located in scripts/ and are called

  • run_forecast : major script controlling the forecast
  • run_cosmo_forecast : COSMO forecast script

The corresponding config file is config/config_forecast.sh . Major variables are:

  • DATE_INI : initial time of forecast cycling, e.g., 20140516030000 means year 2014 May 16 at 3am.
  • DATE_END : final time of forecast, e.g., 20140615000000
  • FCTIME : maximum forecast lead time in seconds, e.g., 86400 s means 24 h. Forecast steps of 1 hour are assumed.
  • FCINT : interval in seconds between successive forecast start times, e.g., 21600 means 6 h. In other words, the start times run from DATE_INI to DATE_END in intervals of FCINT .
  • BDINT : interval in seconds at which ICON boundary files are available, e.g., 21600 means 6 hours. Typically BDINT divides FCINT and is often chosen as BDINT = FCINT .
  • LBCTIME : maximum time for which int2lm provides lateral boundary conditions, e.g., 86400 means 24 hours. It is reasonable to choose LBCTIME >= FCINT, and often one chooses LBCTIME = FCINT .
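A consistent set of values in config/config_forecast.sh, matching the examples above, might then look like the following excerpt. This is illustrative only; variable names are taken from the text, the values are the examples given there.

```shell
# Illustrative excerpt of config/config_forecast.sh (example values only)
DATE_INI=20140516030000   # start of forecast cycling: 2014 May 16, 03 UTC
DATE_END=20140615000000   # end of forecast cycling
FCTIME=86400              # 24 h maximum forecast lead time (1 h steps assumed)
FCINT=21600               # a new forecast starts every 6 h
BDINT=21600               # ICON boundary files available every 6 h (BDINT = FCINT)
LBCTIME=86400             # int2lm provides boundary conditions for up to 24 h
```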

After computing the forecasts, i.e. the model output files lfff* and lfff*p, one should evaluate the forecasts by comparing them to the observations. Since the forecasts live in model space whereas the observations live in observation space, the forecasts have to be transformed into observation space by applying the observation operator H to the forecast files. This is done by running the MEC (Model Equivalent Calculator). For KENDA this can be done easily by calling the datool mec_cosmo . Its options are:

  • ekf_dir : the directory where you find sub-directories with the corresponding ekf-files, e.g. cosmo_letkf/feedback/1002.27/ .
  • lff_dir : the directory where you find sub-directories with the corresponding lff-files, e.g. cosmo_letkf/data/1002.27
  • startdate enddate : e.g. 20140523130000 20140524140000
  • options:
    • -i : verification (output) interval in seconds (default: 1 hour = 3600)
    • -t : verification period in seconds, equivalent to FCTIME (default: 12 hours = 43200)
    • -s : interval between forecast starts in seconds, equivalent to FCINT (default: 6 hours = 21600)

mec_cosmo produces verOBSTYPE_DATE.nc files which can be found in the output directory mec/output .

The subsequent statistical evaluation can be performed with tools developed by Felix Fundel, available on the web page DIY Verify 2.0 under http://oflxs464.dwd.de/~ffundel/ . There, fill in the mask and choose the paths of your verOBSTYPE_DATE.nc files as the feedback directories of your experiments. Then copy the script on the right-hand side into an editor on the machine lce and save the script.

If you already have passwordless access to the department server oflxs464, you can run the script immediately in your shell. If not, you should set up passwordless access, e.g. following https://www.tecmint.com/ssh-passwordless-login-using-ssh-keygen-in-5-easy-steps/ . In this context, it may be helpful to note that the password of the department server is identical to your DWD-wide account password and hence, in principle, different from the password on the HPC system (e.g. the lce machine). As an example, the following steps set up passwordless access:

  ssh-keygen -t rsa 

(leave passphrase empty)

  ssh ahutt@oflxs464 mkdir -p .ssh     
  cat .ssh/id_rsa.pub | ssh ahutt@oflxs464 'cat >> .ssh/authorized_keys'     
  ssh ahutt@oflxs464 "chmod 700 .ssh; chmod 640 .ssh/authorized_keys"  

If your run script is called e.g. ./run_eval.sh, then type

  ./run_eval.sh
After the script has finished, one can visualize the results on the corresponding pages for upper air data, e.g. TEMP, and for surface data.

Verification of results


obs_err_stat

The tool obs_err_stat is located in the path of the letkf binary, e.g. in /e/uhome/hreich/3dvar_orig/3dvar/build/LINUX64.cray-gnu-debug/bin/ . It needs a namelist that should be located in COSMO/feedback/EXPID , where EXPID (e.g. 1001.01) is the experiment identification previously chosen in the file config_cycle.sh . The namelist is an ASCII file, should be called namelist_obs_stat and should look like this:

netcdf_template  = 'ekf_OBSTYP___YYMMDDHHMMSS_.nc'	
obstyp_name      = 'TEMP AIREP PILOT' !'DRIBU'
!  obstyp_name      = 'SYNOP' !'DRIBU' 	
date_ini         = '20140517060000' 
date_end         = '20140530000000'
interval         = 1.0 !set to -1.0 to switch off
!use_passive      = .true.
area             = 19

It accesses the ekf-files (feedback files) in the sub-directories named after the date, e.g. 20140525030000 . Let us call this directory DATE. It is therefore necessary to generate soft links from COSMO/feedback/EXPID to the ekf-files in those directories. This can be done in several ways; one way is, for each DATE:

  1. cd DATE
  2. for file in ekf* ; do ln -s DATE/$file ../$file; done
  3. cd ..
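The per-DATE steps above can also be done for all date directories in one loop. The sketch below demonstrates this in a scratch directory; the path and file names are made up for the demo, the loop itself is what you would run inside COSMO/feedback/EXPID .

```shell
# Demo: link the ekf-files from every DATE sub-directory into the current
# directory, as obs_err_stat expects. Scratch paths are made up for the demo.
mkdir -p /tmp/feedback_demo/20140525030000
touch /tmp/feedback_demo/20140525030000/ekf_TEMP_20140525030000.nc
cd /tmp/feedback_demo
for DATE in [0-9]*/; do                 # every date-named sub-directory
  for f in "${DATE}"ekf*; do
    [ -e "$f" ] && ln -sf "$f" .        # link target stays relative: DATE/ekf...
  done
done
ls -l ekf_*
```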


ctl files for grads

If you want to look into the laf- or lff-files using grads, you first have to produce a ctl-file. This can be done in the following way: you need the scripts 'cut_fields' and 'grib2ctl.pl' in the directory where your laf- or lff-files are stored. E.g. go into this directory and

cp /e/uhome/hreich/tmp/cut_fields .
cp /e/uhome/hreich/tmp/grib2ctl.pl .

Run 'cut_fields' via

./cut_fields <filename (e.g.:laf20140525010000.det)> <filename.out (e.g.:laf20140525010000.det.out)>

The result is three files: filename.out, filename.out.ctl and filename.out.idx

look inside netcdf files

  • ncdump : shows the structure and the unformatted content in ASCII; call it as ncdump FILE | less
  • netcdf2asci : gives some information on a netcdf file and generates a valuable .info file in ASCII format; call it as netcdf2asci and then provide the netcdf file name.

look inside grib files

  • grib_dump : shows the detailed structure in ASCII, with data content
  • grib_ls : shows the rough structure, without data content
reading/kenda.txt · Last modified: 2017/06/19 16:18 by hutt