Data assimilation is the technique by which observations are combined with an NWP product (the first guess or background forecast) and their respective error statistics to provide an improved estimate (the analysis) of the atmospheric (or oceanic, Jovian, whatever) state. Variational (Var) data assimilation achieves this through the iterative minimization of a prescribed cost (or penalty) function. Differences between the analysis and observations/first guess are penalized (damped) according to their perceived error. The difference between three-dimensional (3D-Var) and four-dimensional (4D-Var) data assimilation is the use of a numerical forecast model in the latter.
The MMM Division of NCAR supports a unified (global/regional, multi-model, 3/4D-Var) model-space variational data assimilation system (WRF-Var) for use by NCAR staff and collaborators. It is also freely available to the general community, together with further documentation, test results, plans, etc., from the WRF-Var web page http://www.wrf-model.org/development/group/WG4. The documentation you are reading is the "Users Guide" for those interested in downloading and running the code; it also forms the documentation for the online tutorial. The online WRF-Var tutorial is recommended for people who are
· Potential users of WRF-Var who want to learn how to run WRF-Var by themselves;
· New users who plan to attend the NCAR WRF-Var tutorial - we recommend that you work through this online tutorial before you come to NCAR, whether or not you are able to register for the practice sessions; it will help you understand the lectures much better;
· Users looking for references to diagnostics, namelist options, etc. - look for the 'Miscellanies' and 'Trouble Shooting' sections on each page.
If you are a new WRF-Var user, this tutorial is designed to take you through the WRF-Var-related programs step by step. If you are familiar with 3/4D-Var systems, you may find useful information here too, as the WRF-Var implementation of 3/4D-Var contains a number of unique capabilities (e.g. multiple background error models, WRF-framework based parallelism/IO, direct radar reflectivity assimilation). If you do not know anything about WRF-Var, you should first read the WRF-Var tutorial presentations available from the WRF-Var web page http://www.wrf-model.org/development/group/WG4.
In this WRF-Var tutorial, you will learn how to run the various components of the WRF-Var system. In the online tutorial, you are supplied with a test case including the following input data: a) an observation file, b) a WRF NETCDF background file (a previous forecast used as a first guess of the analysis), and c) background error statistics (a climatological estimate of errors in the background file). In your own work, you will need to create these input files yourself.
Various components of the WRF-Var system are shown in blue in the sketch below, together with their relationship with the rest of the WRF system.
Before using your own data, we suggest that you start by running through the WRF-Var related programs at least once using the supplied test case. This serves two purposes: first, you can learn how to run the programs with data we have tested ourselves, and second, you can test whether your computer is adequate to run the entire modeling system. After you have done this tutorial, you can try running WRF-Var with your own data.
WARNING: It is impossible to test every code upgrade with every permutation of computer, compiler, number of processors, case, namelist option, etc. The namelist options that are supported are indicated in "Registry.wrfvar", and these are the default options. WRF-Var may be run with options other than the defaults by specifying their values via a "wrapper" script. Sample wrapper scripts are included in the "var/scripts/wrappers" directory. To run WRF-Var, adopt a suitable wrapper script from this directory and modify it for your case.
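As a concrete illustration, a minimal wrapper script might look like the sketch below. The variable names follow the con200 example discussed later in this tutorial; the paths are illustrative, the driver script name is an assumption, and the final invocation is left commented out, so adapt everything to your own release and case.

```shell
# Minimal wrapper-script sketch (illustrative values; adjust for your case).
export REL_DIR=$HOME/code                 # parent code directory (assumed path)
export WRFVAR_DIR=$REL_DIR/WRFV3          # WRF-Var source tree (assumed path)
export DAT_DIR=$HOME/data                 # input data directory (assumed path)
export INITIAL_DATE=2007010200
export FINAL_DATE=2007010200
export NL_NTMAX=200                       # non-default namelist options go in as NL_<NAME>
# Finally, invoke the standard driver script (path/name is an assumption):
# $WRFVAR_DIR/var/scripts/da_run_suite.ksh
```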
As a professional courtesy, we request that you include the following reference in any publications that make use of any component of the community WRF-Var system:
Barker, D. M., W. Huang, Y.-R. Guo, and Q. N. Xiao, 2004: A Three-Dimensional Variational (3DVAR) Data Assimilation System For Use With MM5: Implementation and Initial Results. Mon. Wea. Rev., 132, 897-914.
As you go through the online tutorial, you will download program tar files and data to your local computer, and compile and run the code there. Do you know what machine you are going to use to run the WRF-Var related programs? What compilers do you have on that machine?
Running WRF-Var requires a Fortran 90 compiler. We currently support the following platforms: IBM, DEC, SGI, PC/Linux (with Portland Group compiler), Cray-X1, and Apple G4/G5. Please let us know if this does not meet your requirements, and we will attempt to add other machines to our list of supported architectures as resources allow. Although we are interested to hear of your experiences modifying compile options, we do not yet recommend making changes to the configure file used to compile WRF-Var.
We recommend you follow the online tutorial in the order of the sections listed below. This tutorial does not cover parts of the larger WRF system that are required if you wish to go beyond the test case supplied here; e.g. the WRF Standard Initialization (SI) or WPS, and the real preprocessor, are needed to create your own background field.
The online tutorial is broken down into the following sections.
a) Download Test Data: This page describes how to access test data sets to run WRF-Var.
b) Download Source Code: This page describes how to access source code.
c) WRF-Var observation pre-processor (obsproc): This page describes how to create an observation file for subsequent use in WRF-Var, and plot observation distributions.
d) Setting up WRF-Var: In this part of the tutorial you will compile the codes that form the WRF-Var system.
e) Single Observation Test for WRF-Var: In this part of the tutorial you will run single-observation tests to examine the response of the WRF-Var analysis.
f) Run WRF-Var for con200 Case Study: In this section, you will learn how to run WRF-Var for a test case.
g) WRF-Var Diagnostics: WRF-Var produces a number of diagnostic files that contain useful information on how the assimilation has performed. This section will introduce you to some of these files, and what to look for.
h) Updating WRF lateral boundary conditions: Before using WRF-Var analysis as the initial conditions for a WRF forecast, the lateral boundary file must be modified to take account of the differences between first guess and analysis.
i) Additional WRF-Var exercises: In this section you will learn more about WRF-Var, e.g. how to run single-observation tests, the response to background error scaling and variance, minimization aspects, and how to suppress a particular type of data in WRF-Var.
This page describes how to access test datasets required to run WRF-Var. If you do not know anything about WRF-Var, you should first read the Introduction section.
Required Data
The WRF-Var system requires three input files to run: a) a WRF first guess input format file output from either WPS (cold-start) or WRF itself (warm-start), b) observations (in BUFR or ASCII little_r format), and c) a background error statistics file (containing background error covariances, currently calculated via the NMC method using the "gen_be" utility of WRF-Var). The use of these three data files in WRF-Var is described later in the tutorial.
The following table summarizes the above info:
Input Data                  | Format                         | Created By
--------------------------- | ------------------------------ | --------------------------------------
First Guess                 | NETCDF                         | WRF Preprocessing System (WPS) or WRF
Observations                | ASCII (PREPBUFR also possible) | Observation Preprocessor (OBSPROC)
Background Error Statistics | Binary                         | WRF-Var gen_be utility
In the online tutorial, example input files are given. In your own work after the tutorial, you will need to create these input data sets yourselves.
Downloading Test Data
In the online tutorial, you will store data in a directory defined by the environment variable $DAT_DIR. This directory can be at any location, and you must have read and write access to it. To run this case, you should have at least 100 MB of space available on your machine (possibly much more for other cases). Type
setenv DAT_DIR dat_dir
Here, "dat_dir" is your own chosen directory where you aim to store the WRF-Var input data. Create this directory, if it does not exist, and type
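The setenv syntax above is for csh-type shells. If you use a bash-type shell instead, the equivalent commands are (the directory name here is purely illustrative):

```shell
# bash/ksh equivalent of the csh "setenv" shown above
export DAT_DIR=$PWD/wrfvar_data   # choose your own location
mkdir -p "$DAT_DIR"               # create it if it does not exist
cd "$DAT_DIR"
```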
cd $DAT_DIR
Download the tutorial test data for the "con200" case, valid at 00 UTC 02 January 2007, from
http://www2.mmm.ucar.edu/wrf/src/data/WRFV3-Var-testdata.tar.gz
Once you have downloaded the "WRFV3-Var-testdata.tar.gz" file to $DAT_DIR, extract it by typing
gunzip WRFV3-Var-testdata.tar.gz
tar -xvf WRFV3-Var-testdata.tar
You should then see a number of data files that will be used in the tutorial. To save space, type
rm -rf WRFV3-Var-testdata.tar
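On systems where tar supports the -z flag (most modern versions of GNU and BSD tar), the gunzip and tar steps above can be combined into one command, which also leaves no intermediate .tar file to clean up. The snippet below demonstrates this on a small throw-away archive rather than the real test data, so it can be checked anywhere:

```shell
# Extract a compressed archive in one step (no .tar file left behind).
mkdir -p demo && echo "hello" > demo/file.txt
tar -czf demo.tar.gz demo       # build a small sample archive
rm -rf demo
tar -xzvf demo.tar.gz           # one-step equivalent of gunzip + tar -xvf
cat demo/file.txt               # prints: hello
```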
What next?
Note: You may find the following three sub-directories/files under “$DAT_DIR”
ob/2007010200/ob.little_r                 ### Observation data in "little_r" format
rc/2007010200/wrfinput_d01 & wrfbdy_d01   ### First guess & lateral boundary files
be/be.dat                                 ### Background error file
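A quick way to confirm this layout is a small shell check. Placeholder files are created here so the loop can be demonstrated anywhere; with the real test data you would skip the mkdir/touch lines:

```shell
# Sanity-check the expected $DAT_DIR layout (tutorial layout assumed).
DAT_DIR=${DAT_DIR:-$PWD/demo_dat}
# Placeholders for demonstration only; the real data comes from the tarball.
mkdir -p "$DAT_DIR/ob/2007010200" "$DAT_DIR/rc/2007010200" "$DAT_DIR/be"
touch "$DAT_DIR/ob/2007010200/ob.little_r" \
      "$DAT_DIR/rc/2007010200/wrfinput_d01" \
      "$DAT_DIR/rc/2007010200/wrfbdy_d01" \
      "$DAT_DIR/be/be.dat"
for f in ob/2007010200/ob.little_r rc/2007010200/wrfinput_d01 \
         rc/2007010200/wrfbdy_d01 be/be.dat; do
  [ -f "$DAT_DIR/$f" ] && echo "found: $f" || echo "MISSING: $f"
done
```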
Download Source Code
Download the WRF-Var source code from
http://www2.mmm.ucar.edu/wrf/src/data/WRFV3-Var.tar.gz
Once you have downloaded the source code file "WRFV3-Var.tar.gz", the next step is to extract it by typing
gunzip WRFV3-Var.tar.gz
tar -xvf WRFV3-Var.tar
You should see a sub-directory "var". Place this directory under your WRFV3 directory (which you previously downloaded). The subdirectory "var" is the main sub-directory holding the source code for both WRF-Var and "obsproc", the observation pre-processor for WRF-Var observation data input. To save space, execute the following command.
rm -rf WRFV3-Var.tar
WRF-Var Observation Preprocessor (OBSPROC)
Having successfully downloaded the test data and the source code, the first step is to process the observation input for WRF-Var by compiling and running the observation pre-processor (OBSPROC).
Compiling OBSPROC code
The WRF-Var observation preprocessor resides in the "var/obsproc" directory:
cd var/obsproc
Read the instructions in the "README" file to compile OBSPROC, and proceed by typing
make
Once this is complete (a minute or less on most machines), you can check for the presence of the OBSPROC executable by issuing following command (from “var/obsproc” directory)
ls -l src/3dvar_obs.exe
Running OBSPROC:
OK, so now you have compiled OBSPROC. Before running "3dvar_obs.exe", create the desired namelist file namelist.3dvar_obs (see README.namelist for details).
For your reference, a file named "namelist.3dvar_obs.wrfvar-tut" has already been created in the "var/obsproc" directory. Thus, proceed as follows.
cp namelist.3dvar_obs.wrfvar-tut namelist.3dvar_obs
edit namelist.3dvar_obs
In this namelist file, all you need to change is the full path and name of the observation file (ob.little_r), depending upon where you downloaded it and where it finally resides.
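For reference, the edit amounts to pointing the observation-file entry of the namelist at your own path. A hypothetical fragment is shown below; the record and variable names are assumptions based on typical OBSPROC namelists, so check README.namelist for the exact names in your release:

```
&record1
 obs_gts_filename = '/your/path/to/ob/2007010200/ob.little_r',
/
```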
To run OBSPROC, type
3dvar_obs.exe >&! 3dvar_obs.out
Looking at WRF-Var OBSPROC output.
Once 3dvar_obs.exe has completed successfully, you will see an observation data file, obs_gts.3dvar, in the "obsproc" directory. This is the input observation file for WRF-Var. Before running WRF-Var, you may like to learn more about the various types of data you are aiming to assimilate for this case, their geographical distribution, etc. The file is in ASCII format, so you can easily view it. For a graphical view of its contents, the "MAP_plot" utility plots the data distribution for each type of observation. To use it, proceed as follows.
1) cd MAP_plot;
2) Modify the script Map.csh to set the time window and the full path of the input observation file (obs_gts.3dvar). Specifically, set the following strings in the script:
TIME_WINDOW_MIN = '2007010121'
TIME_ANALYSIS = '2007010200'
TIME_WINDOW_MAX = '2007010203'
OBSDATA = ./obs_gts.3dvar
3) Type
Map.csh
Note: the configure.user file appropriate for your computer system will be picked up automatically when you type Map.csh;
4) When the job has completed, you will have a gmeta file, gmeta.{analysis_time}, corresponding to analysis_time=2007010200. It contains plots of the data distribution for each type of observation in the OBS data file obs_gts.3dvar. To view it, type
idt gmeta.2007010200
It displays (panel by panel) the geographical distribution of the various types of data listed in the header of "obs_gts.3dvar". Below is the geographic distribution of "sonde" observations for this case.
Saving the necessary file for WRF-Var and cleaning OBSPROC
In this tutorial, we store data in the directory defined by the environment variable $DAT_DIR. After successfully creating your own observation file (obs_gts.3dvar), move it to $DAT_DIR using the command (from the "obsproc" directory)
mv obs_gts.3dvar $DAT_DIR/ob/2007010200/ob.ascii
Finally, to clean up the "var/obsproc" directory, type the following command in that same directory
make clean
Miscellanies:
1) If you run 3dvar_obs.exe and do not obtain the file obs_gts.3dvar, check the 3dvar_obs.out file to see where the program aborted. Usually there is information in this file telling you what went wrong;
2) If you run 3dvar_obs.exe and get the error 'Error in NAMELIST record 2' in the 3dvar_obs.out file, check whether your namelist.3dvar_obs file matches the Makefile settings. Either the namelist.3dvar_obs file or the Makefile needs to be modified; then re-compile and re-run the job;
3) From the *.diag files, you may find which observation report caused the job to fail;
4) In most cases, job failure is caused by incorrect input file names, or by the analysis time, time window, etc. specified in namelist.3dvar_obs;
5) If you still cannot figure out the trouble, please inform us and send us your input files, including the namelist file and the printed file 3dvar_obs.out.
What next?
OK, you have now created the observation file and looked at some plots of observations, now you are ready to move on to setting up WRF-Var.
In this part of the tutorial you will compile WRF-Var code.
Compile WRF-Var code
Proceed as follows while in the main WRFV3 directory:
setenv WRF_DA_CORE 1
source var/build/setup.csh
./configure
You will then see a list of options, depending on whether you want to run on a single processor, with different compilers, etc. Choose the option that suits your requirements.
In this tutorial session, you will be running WRF-Var in single processor mode. Therefore, enter 1 (single-threaded) at the prompt. This will automatically create a file configure.wrf in the main WRFV3 directory. Check it with the following command:
ls -l configure.wrf
We recommend you run WRF-Var in single-processor mode first; later, if you want, you can run WRF-Var on distributed-memory machines by recompiling it appropriately.
To compile the WRF-Var code, type (from the main WRFV3 directory)
./compile all_wrfvar
Successful compilation of "all_wrfvar" will produce several executables in the "var/da" directory, including "da_wrfvar.exe". You can list these executables by issuing the command (from the main WRFV3 directory)
ls -l var/da/*exe
After successful compilation of WRF-Var, you are ready to run WRF-Var for the test case. The WRF-Var system is run through a suitable wrapper script, which activates various standard scripts residing in the "WRFV3/var/scripts" directory.
Thus, to run any case (such as the tutorial test case in this exercise), you write a suitable wrapper script and execute it. For example, the wrapper script for running "con200", the tutorial test case, is located at:
WRFV3/var/scripts/wrappers/da_run_suite_wrapper_con200.ksh
You need to modify this wrapper script to define the various job details via environment variables. You can also run WRF-Var with namelist options other than the pre-set defaults, which are defined in the WRFV3/Registry/Registry.wrfvar file, by defining the desired environment variables for your own application. Examples include changing the data directory, source code location, experiment run area, and the east-west, south-north, and vertical dimensions of your domain. All non-default namelist options set via the wrapper script will show up in the WRF-Var namelist file (namelist.input), which is created in the run directory ($RUN_DIR) defined via the wrapper script.
Note: As a rule, any WRF-Var namelist option should be set in the wrapper script using upper-case letters preceded by the string "NL_". For example, for the "con200" case the grid size in the x-direction is 200000 m, and the corresponding WRF-Var namelist variable name is "dx". This is specified in the wrapper script as "export NL_DX=200000".
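The naming rule can be illustrated with a tiny shell helper. The helper itself is hypothetical, purely to show the transformation; in practice you simply write the export line by hand:

```shell
# Hypothetical helper: derive the wrapper-script export line for a given
# WRF-Var namelist variable, following the NL_ + upper-case rule above.
nl_export() {
  printf 'export NL_%s=%s\n' "$(echo "$1" | tr '[:lower:]' '[:upper:]')" "$2"
}
nl_export dx 200000     # prints: export NL_DX=200000
nl_export e_vert 28     # prints: export NL_E_VERT=28
```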
The WRF-Var system uses the WRF framework to define and perform parallel and I/O functions. This is fairly transparent in the WRF-Var code. At run time, WRF-Var requires you to specify the grid dimensions, which are communicated to the WRF-Var system via the wrapper script; for this case:
export NL_E_WE=45
export NL_E_SN=45
export NL_E_VERT=28
Thus, you need to change these parameters to run your own case.
Note: If these grid dimensions do not match the dimensions written in the first guess (FG) input file, WRF-Var will abort with an appropriate diagnostic message.
Having compiled WRF-Var and familiarized yourself with setting the various environment variables in the wrapper script, it's time to run your test case.
In this section, you will learn how to run WRF-Var using observations and a first guess from a low-resolution (200 km) CONUS domain (see below).
By this stage you have successfully created the three input files (first guess, observation, and background error statistics files in directory $DAT_DIR) required to run WRF-Var. You have also successfully downloaded and compiled the WRF-Var code. If this is correct, we are ready to learn how to run WRF-Var.
The Case
The data for this case is valid at 00 UTC 2nd January 2007. The first guess comes from the NCEP global final analysis system (FNL), passed through the WRF-WPS and real packages. The first-guess level 18 potential temperature field is shown below, illustrating the domain:
The intention of running this test case is to provide a simplified, computationally cheap application in order to train potential users of WRF-Var (this case uses only a small fraction of WRF-Var capabilities).
Note: Various NCL scripts are included in WRF-Var under the var/graphics/ncl sub-directory to display results.
Changes required in wrapper script to run con200 case:
The following environment variables need to be set in the WRFV3/var/scripts/wrappers/da_run_suite_wrapper_con200.ksh script, depending upon your case.
REL_DIR - Full path of the parent code directory
WRFVAR_DIR - Full directory path of wrfvar under $REL_DIR
DAT_DIR - Full path of data input holding ob, be, rc directories
REGION - String representing region under $DAT_DIR
OB_DIR - Full path for Observation directory
RC_DIR - Full path for “FG” directory
BE_DIR - Full path for “background error (BE)” directory
INITIAL_DATE - Initial date for running WRF-Var
FINAL_DATE - Final date for running WRF-Var
The following environment variables are optional, with default values shown in brackets:
EXPT - Experiment ID (expt)
RUN_DIR - Full path for run directory ($DAT_DIR/$REGION/expt/test_$NUM_PROC)
FC_DIR - Analysis output directory ($DAT_DIR/$REGION/expt/fc)
Note: Since output is written in $RUN_DIR, you must ensure that this directory has proper write permission.
Run WRF-Var
Once you have set the necessary environment variables in the "da_run_suite_wrapper_con200.ksh" script, you can run this case by typing
da_run_suite_wrapper_con200.ksh
in WRFV3/var/scripts/wrappers subdirectory or from the directory where your modified wrapper script is located.
Successful completion of the job results in a number of output diagnostic files in the ${RUN_DIR} directory. The final analysis file (wrfinput_d01) will appear under the $FC_DIR/2007010200 directory. The various textual diagnostic output files will be explained in the next section (WRF-Var Diagnostics); here, we merely wish to run WRF-Var for this case.
In order to understand the role of various important WRF-Var options, try re-running WRF-Var with different namelist options set via the wrapper script. For example, make the WRF-Var convergence criterion more stringent by reducing the value of "EPS", e.g. by setting "NL_EPS=0.0001" in your wrapper script. If WRF-Var has not converged by the maximum number of iterations (NTMAX=200), you may need to increase NTMAX by setting "export NL_NTMAX=500" in your wrapper script and run the case again. The last section of this tutorial deals with many such additional WRF-Var exercises.
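Concretely, the two experiments suggested above amount to adding lines like these to your wrapper script (values taken from the text; treat them as starting points, not recommendations):

```shell
# Experiment 1: tighter convergence criterion
export NL_EPS=0.0001
# Experiment 2: raise the iteration cap if minimization hits NTMAX
export NL_NTMAX=500
```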
Note: You may like to change the "RUN_DIR" environment variable to store results separately for each run. Setting the option "CLEAN=true" in your wrapper script saves a lot of space, as it removes the contents of the "working" directory.
What next?
Having run WRF-Var, you should now spend time looking at some of the diagnostic output files created by WRF-Var.
WRF-Var produces a number of diagnostic files that contain useful information on how the data assimilation has performed. This section will introduce you to some of these files, and what to look for.
Which are the important diagnostic files to look for?
Having run WRF-Var, it is important to check a number of output files to see if the assimilation appears sensible. Change directory to where you ran this case:
cd ${RUN_DIR}/2007010200/wrfvar
ls -l
You will see something like the following:
total 10880
-rw-r--r-- 1 rizvi ncar 280 Mar 18 15:41 cost_fn
-rw-r--r-- 1 rizvi ncar 247 Mar 18 15:41 grad_fn
-rw-r--r-- 1 rizvi ncar 9048641 Mar 18 15:41 gts_omb_oma
-rw-r--r-- 1 rizvi ncar 4156 Mar 18 15:41 index.html
-rw-r--r-- 1 rizvi ncar 1942 Mar 18 15:41 namelist.input
-rw-r--r-- 1 rizvi ncar 76986 Mar 18 15:41 namelist.output
drwxr-xr-x 2 rizvi ncar 65536 Mar 18 15:41 rsl
-rw-r--r-- 1 rizvi ncar 22730 Mar 18 15:41 statistics
drwxr-xr-x 2 rizvi ncar 65536 Mar 18 15:41 trace
drwxr-xr-x 3 rizvi ncar 65536 Mar 18 15:41 working
The contents of some useful diagnostic files are as follows:
cost_fn & grad_fn: These files hold (in ASCII format) the WRF-Var cost and gradient function values, respectively, for the first and last iterations. However, if you run with "NL_CALCULATE_CG_COST_FN=true", they list these values for every iteration, which is useful for visualization. The following NCL script may be used to plot the contents of "cost_fn" and "grad_fn" if they were generated with "NL_CALCULATE_CG_COST_FN=true":
"WRFV3/var/graphics/ncl/plot_cost_grad_fn.ncl"
Note: Make sure you remove the first two records (header) in "cost_fn" and "grad_fn" before plotting. You also need to specify the directory name for these two files in the script.
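Removing the two header records can be done with a one-line sed, writing a copy for plotting rather than editing the originals. The snippet below demonstrates this on a small mock file; "cost_fn.plot" is an illustrative name:

```shell
# Create a mock cost_fn with a two-line header followed by data,
# then strip the header into a copy suitable for plotting.
printf 'header line 1\nheader line 2\n1 1.234\n2 0.567\n' > cost_fn
sed '1,2d' cost_fn > cost_fn.plot   # delete lines 1-2, keep the rest
cat cost_fn.plot                    # prints the two data lines only
```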
gts_omb_oma: Holds (in ASCII format) information on each assimilated observation: its value, quality, observation error, observation minus background (OMB), and observation minus analysis (OMA). This information is very useful for verification purposes (of both analyses and forecasts).
namelist.input: The WRF-Var input namelist file. It displays all the non-default options you defined. If any of your namelist options did not appear in this file, check its name against "WRFV3/Registry/Registry.wrfvar".
namelist.output: Consolidated list of all the namelist options used.
rsl: Directory containing the standard WRF-Var output from individual processors. It contains a host of information on the number of observations, minimization, timings, etc. Additional diagnostics may be printed in these files by including various "print" WRF-Var namelist options. To learn more about these additional "print" options, search for the string "print_" in "WRFV3/Registry/Registry.wrfvar".
statistics: Text file containing OMB (O-B) and OMA (O-A) statistics (minimum, maximum, mean, and standard deviation) for each observation type and variable. This information is very useful in diagnosing how WRF-Var has used the different components of the observing system. Also contained are the analysis minus background (A-B) statistics, i.e. statistics of the analysis increments for each model variable at each model level. This information is very useful for checking the range of analysis increment values found in the analysis, and where they occur in WRF model grid space.
Where is the final analysis file?
The final analysis file resides at "$FC_DIR/2007010200/wrfinput_d01". It is in WRF (NETCDF) format, and may be visualized using the following NCL script:
"WRFV3/var/graphics/ncl/WRF-Var_plot.ncl"
You need to specify the analysis_file name, its full path, etc. The details are listed as comments in the script.
As an example, if you want to display the U-component of the analysis at level 18, execute the following command after modifying the script "WRFV3/var/graphics/ncl/WRF-Var_plot.ncl":
ncl WRF-Var_plot.ncl
It will display a plot like the following:
You may like to change the variable name, level, etc. in this script to display the variable of your choice at the desired sigma level.
Take time to look through the textual output files to ensure you understand how WRF-Var has performed. For example,
How closely has WRF-Var fitted individual observation types? Look at the statistics file to compare the O-B and O-A statistics.
How big are the analysis increments? Again, look in the statistics file for the minimum/maximum values of A-B for each variable at various levels. This gives you a feel for the impact of the assimilated observations in modifying the first guess.
How long did WRF-Var take to converge? Did it really converge? You can answer these questions by looking at the rsl files, which indicate the number of iterations WRF-Var took to converge. If this equals the maximum number of iterations specified in the namelist (NTMAX) or its default value (200) set in "WRFV3/Registry/Registry.wrfvar", the analysis solution did not converge. If so, you may like to increase the value of "NTMAX" and rerun your case to ensure that convergence is achieved.
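One quick way to check is to grep the rsl output for the iteration count. The exact log wording varies between releases, so the pattern below is an assumption; it is demonstrated here on a two-line mock log rather than a real rsl file:

```shell
# Mock rsl-style log and a grep to count recorded iterations.
# In a real run you would point grep at the rsl output instead, e.g.
#   grep -ci iteration rsl/rsl.out.0000
# (adjust the pattern to your release's actual log wording).
printf 'Iteration    1\nIteration    2\n' > mock_rsl.out
grep -ci 'iteration' mock_rsl.out     # prints: 2
```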
Plotting WRF-Var analysis increments
A good visual way of seeing the impact of assimilating observations is to plot the analysis increments (i.e. the analysis minus first guess difference). There are many graphics packages (e.g. RIP, NCL, GRADS) that can do this. The plot of level-18 theta increments below was produced using an NCL script, located at
"WRFV3/var/graphics/ncl/WRF-Var_plot.ncl"
You need to modify this script to set the full paths of the first_guess and analysis files. You may also modify the display level by setting "kl", and the variable to display by setting "var". Further details are given in the script.
If you want to display the increment of potential temperature at level 18, modify "WRFV3/var/graphics/ncl/WRF-Var_plot.ncl" suitably and execute the following command from "WRFV3/var/graphics/ncl":
ncl WRF-Var_plot.ncl
The resulting plot looks as follows:
Note: The larger the analysis increments, the greater the data impact in that region.
What next?
OK, you have run WRF-Var, checked the diagnostics, and are confident things are OK. Now you are ready to run NWP forecasts using the WRF model.
Before running an NWP forecast with the WRF model, you must modify the tendencies within the lateral boundary condition file to make them consistent with the new WRF-Var initial conditions (analysis). This is absolutely essential: the lateral boundary tendencies (in the wrfbdy_d01 file) were originally generated to be consistent with the first guess, but WRF-Var has changed the initial condition (at t=0), so the initial tendencies in wrfbdy_d01 need to be updated accordingly to account for the change at the initial time.
This is a simple procedure performed by the WRF-Var utility "da_update_bc".
Note: Make sure you have "da_update_bc.exe" in the "WRFV3/var/da" directory. This executable is created automatically when you compile the WRF-Var system.
Running UPDATE_BC
"da_update_bc.exe" is run via the same wrapper script you used for con200, by adding the following line to your wrapper script:
export RUN_UPDATE_BC=true
With this option, the standard script "WRFV3/var/scripts/da_run_update_bc.ksh" will be activated to update the contents of "wrfbdy_d01". The new lateral boundary file will be written to the same location, with the same name. You may like to check its creation date by issuing the following command:
ls -ltr $RC_DIR/2007010200/wrfbdy_d01
What next?
Once you are able to run all these programs successfully, and have spent some time looking at the variety of diagnostic output produced, we hope you will have some confidence in handling the WRF-Var system programs when you start your own cases. Following are some additional WRF-Var exercises to help you learn more about WRF-Var.
In WRF-Var, when you activate the pseudo-observation capability by defining
export NL_NUM_PSEUDO=1
WRF-Var generates a single observation with a pre-specified innovation (Observation minus First Guess) value at a desired location. For example, to place a "U" observation at grid co-ordinates (23, 23), level 14, with an observation error of 1 m/s and an innovation of 1.0 m/s, include the following options in your con200 wrapper script and run it:
export NL_NUM_PSEUDO=1
export NL_PSEUDO_VAR="u" # Can be "u u t t q"
export NL_PSEUDO_X=23.0
export NL_PSEUDO_Y=23.0
export NL_PSEUDO_Z=14.0
export NL_PSEUDO_ERR=1.0 # Should be "1.0 1.0 1.0 1.0 0.001"
export NL_PSEUDO_VAL=1.0 # Should be "1.0 1.0 1.0 1.0 0.001"
Also set
export runplot_psot=1
Note: You may also like to change your "RUN_DIR" to generate output in a separate directory. In the sample wrapper script for con200, all these parameters are already set but commented out, so you need only remove the relevant comment marks.
After you have run this new wrapper script successfully, look at the various diagnostic files to understand the response to a single "U" observation near the middle of the domain. You can display the analysis increments as you did previously, or run the same wrapper script setting:
export runplot_psot=2
This will activate standard NCL scripts residing in "WRFV3/var/graphics/ncl" and generate the following three files in your new $RUN_DIR/plotpsot directory:
xy_psot1.pdf --- XY cross-section
xz_psot1.pdf --- XZ cross-section
yz_psot1.pdf --- YZ cross-section
Note: You may like to repeat this exercise for other observations, like temperature ("t"), surface pressure ("ps"), specific humidity ("q"), etc.