WRF Model Version 2: How to Run the Model?

How to run?

After successful compilation, it is time to run the model. You can do so from either the run/ directory or the test/case_name directory. In either case, you should see the executables (ideal.exe or real.exe, and wrf.exe), linked files (mostly for real-data cases), and one or more namelist.input files in the directory.

Idealized, real-data, two-way nested, and one-way nested runs are explained on this page. Read on. The moving nest is explained on a separate page. For information on how to set up and make analysis- and observation-nudging runs, see the User's Guide.

a. Idealized case

Suppose you chose to compile the test case em_squall2d_x; now type 'cd test/em_squall2d_x' or 'cd run'.

Edit the namelist.input file (see README.namelist in the WRFV2/run/ directory or its Web version) to change the length of integration, frequency of output, size of domain, time step, physics options, and other parameters.
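As a sketch, the kinds of entries one typically edits look like this (the values below are illustrative, not the distributed defaults for em_squall2d_x):

```
&time_control
 run_hours        = 1,        ! length of integration
 history_interval = 10,       ! minutes between history outputs
/

&domains
 time_step        = 3,        ! model time step in seconds
 e_we             = 202,      ! west-east domain size in grid points
 dx               = 250,      ! grid spacing in meters
/
```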

If there is a script in the test case directory called run_me_first.csh, run it first by typing:

run_me_first.csh

This links some data files that you might need to run the case.

To run the initialization program, type

ideal.exe

This will generate a wrfinput_d01 file in the same directory.

Note:

- In general, ideal.exe cannot be run in parallel; OpenMP is OK.

- ideal.exe for the quarter_ss case can be run in MPI.

- For parallel compiles, run ideal.exe on a single processor.

To run the model, type

wrf.exe

or variations such as

wrf.exe >& wrf.out &

Note:

- Two-dimensional ideal cases cannot be run in parallel; OpenMP is OK.

- The execution command may differ for MPI runs on different machines, e.g. mpirun.

After successful completion, you should see wrfout_d01_0001-01-01* and wrfrst* files, depending on how the output namelist variables are specified.

b. Real-data case

Type 'cd test/em_real' or 'cd run'; this is where you will run both the WRF initialization program, real.exe, and the WRF model, wrf.exe.

Running a real-data case requires first successfully running the WRF Standard Initialization program. Make sure the wrf_real_input_em.* files from the Standard Initialization are in this directory (you may link the files into this directory).

NOTE: You do not have to rerun V2.1.2 real.exe in order to run V2.2 wrf.exe.

Edit namelist.input for dates and domain size. Also edit the time step, output options, and physics options.

Type 'real.exe' to run; this will produce the wrfinput_d01 and wrfbdy_d01 files. In a real-data case, both files are required.

Run the WRF model by typing 'wrf.exe'.

A successful run should produce one or several output files named like wrfout_d01_yyyy-mm-dd_hh:mm:ss. For example, if you start the model at 1200 UTC on January 24, 2000, then your first output file should have the name:

wrfout_d01_2000-01-24_12:00:00

It is always good to check the times written to the output file by typing:

ncdump -v Times wrfout_d01_2000-01-24_12:00:00

You may have other wrfout files depending on the namelist options (for example, how often you split the output files, set with the namelist option frames_per_outfile). You will also create restart files if the restart frequency (restart_interval in the namelist.input file) is set within your total integration length. Restart files have names like
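As an illustrative sketch (values are examples, not recommendations), the relevant &time_control entries look like:

```
&time_control
 history_interval   = 180,      ! write history every 180 minutes
 frames_per_outfile = 1,        ! one output time per wrfout file
 restart            = .false.,  ! this is not a restart run
 restart_interval   = 1440,     ! write a restart file every 1440 minutes
/
```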

wrfrst_d01_yyyy-mm-dd_hh:mm:ss

For DM (distributed memory) parallel systems, some form of the mpirun command will be needed. For example, on a Linux cluster, the commands may look like:

mpirun -np 4 real.exe

mpirun -np 4 wrf.exe

On IBM systems, the commands are

poe real.exe

poe wrf.exe

in a batch job, and

poe real.exe -rmpool 1 -procs 4

poe wrf.exe -rmpool 1 -procs 4

for interactive runs.
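For DM builds, each MPI task writes its own log file (rsl.out.* and rsl.error.* in standard WRF distributed-memory runs), and a quick way to confirm completion is to look for the success message in the rank-0 log. The snippet below simulates such a log file for illustration; in a real run, wrf.exe writes it itself:

```shell
# Simulate a rank-0 log for illustration; a real DM run produces rsl.out.0000.
echo "d01 2000-01-25_12:00:00 wrf: SUCCESS COMPLETE WRF" > rsl.out.0000

# Count occurrences of the success message; a completed run prints it once.
grep -c "SUCCESS COMPLETE WRF" rsl.out.0000
```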

How to Make a Two-way Nested Run?


V2.1 supports a two-way nest option in both the 3-D idealized cases (quarter_ss and b_wave) and real-data cases. The model can handle multiple domains at the same nest level (no overlapping nests) or multiple nest levels (telescoping).

Most of the options needed to start a nest run are handled through the namelist. All variables in the namelist.input file that have multiple columns of entries need to be edited with caution. The following are the key namelist variables to modify:

start_ and end_year/month/day/hour/minute/second: these control the nest start and end times

input_from_file: whether a nest requires an input file (e.g. wrfinput_d02). This is typically used for real-data cases.

fine_input_stream: which fields from the nest input file are used in the nest initialization. The fields to be used are defined in the Registry/Registry.EM file. Typically they include static fields (such as terrain and landuse) and time-varying, masked land surface fields (such as skin temperature, soil temperature, and soil moisture).
= 0: all fields from the nest input are used.
= 2: only static and time-varying, masked land surface fields are used.

max_dom: setting this to a number > 1 will invoke nesting. For example, if you want to have one coarse domain and one nest, set this variable to 2.

grid_id: domain identifier, used in the wrfout naming convention.

parent_id: use the grid_id to define the parent_id number for a nest.

i_parent_start/j_parent_start: lower-left corner starting indices of the nest domain in its parent domain. You should find these numbers in your WPS or SI's namelist file.

parent_grid_ratio: integer ratio of the parent grid spacing to the nest grid spacing. If feedback is off, this ratio can be even or odd; if feedback is on, it must be odd.

parent_time_step_ratio: the time-step ratio for the coarse and nest domains; it may differ from parent_grid_ratio. For example, you may run a coarse domain at 30 km and a nest at 10 km, so parent_grid_ratio is 3. But you do not have to use 180 s for the coarse domain and 60 s for the nest: you may use, for example, 45 s or 90 s for the nest by setting this variable to 4 or 2.

feedback: this option takes the values of prognostic variables in the nest and overwrites the values in the coarse domain at the coincident points. This is why an odd parent_grid_ratio is currently required with this option.

smooth_option: a smoothing option for the parent domain, used when feedback is on. Three options are available: 0 - no smoothing; 1 - 1-2-1 smoothing; 2 - smoothing-desmoothing.
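Putting these together, a two-domain configuration might look like the following sketch. The dimensions and starting indices are purely illustrative; note the odd parent_grid_ratio, required here because feedback is on:

```
&domains
 time_step              = 90,
 max_dom                = 2,
 e_we                   = 100, 112,
 e_sn                   = 100, 112,
 grid_id                = 1,   2,
 parent_id              = 1,   1,
 i_parent_start         = 1,   31,
 j_parent_start         = 1,   17,
 parent_grid_ratio      = 1,   3,
 parent_time_step_ratio = 1,   3,
 feedback               = 1,
 smooth_option          = 0,
/
```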

3-D Idealized Cases

For 3-D idealized cases, no additional input files are required. The key is the specification of the namelist.input file: the model interpolates all variables required in the nest from the coarse-domain fields. Set

input_from_file = F, F

Real Data Cases

For real-data cases, three input options are supported. The first is similar to running the idealized cases: all fields for the nest are interpolated from the coarse domain (namelist variable input_from_file set to F for each domain). The disadvantage of this option is obvious: one will not benefit from the higher-resolution static fields (such as terrain, landuse, and so on).

The second option is to set input_from_file = T for each domain, which means that the nest will have a nested wrfinput file to read in (similar to MM5 nest option IOVERW = 1).

The third option is to set input_from_file = T and fine_input_stream = 2. This allows atmospheric fields to be interpolated from the parent domain but uses static fields and masked, time-varying fields from the nest input file (similar to MM5 nest option IOVERW = 2).
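For example, the third option corresponds to &time_control entries like the following sketch (column 1 is the coarse domain):

```
&time_control
 input_from_file   = .true., .true.,
 fine_input_stream = 0,      2,
/
```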

Options 2 and 3 require running WPS/SI with two or more nest domains requested.

To prepare for the nested run, first follow the instructions in program WPS or SI to create nest domain files. In addition to the files available for domain 1 (met_em.d01.yyyy-mm-dd_hh:mm:ss, or wrf_real_input_em.d01.* from SI, for all time periods), you should have a file from WPS or SI for domain 2, valid at the first time period of your model run. Say you have created WPS output for a model run that starts at 1200 UTC January 24, 2000, using 6-hourly data; you should then have these files from WPS:

met_em.d01.2000-01-24_12:00:00
met_em.d01.2000-01-24_18:00:00
met_em.d01.2000-01-25_00:00:00
met_em.d01.2000-01-25_06:00:00
met_em.d01.2000-01-25_12:00:00

If you use the nested option in WPS, you should have one more file:

met_em.d02.2000-01-24_12:00:00

- Edit the namelist.input file for all columns relevant to your nested run. Set max_dom to the total number of domains (coarse plus nests) you would like to run.

- Run real.exe; it will create wrfinput_d0n for each domain, and wrfbdy_d01 for all time periods for domain 1.

- Run wrf.exe as usual.

How to Make a One-way Nested Run?

Starting from WRF 2.0.2 (released on June 3, 2004), WRF supports a one-way nested option. One-way nesting is defined as a finer-grid-resolution run made as a subsequent run after the coarser-grid run, driven by coarse-grid output as initial and lateral boundary conditions, together with input from higher-resolution terrestrial fields (e.g. terrain, landuse, etc.).

When one-way nesting is used, the coarse-to-fine grid ratio is restricted only to being an integer; an integer less than 5 is recommended.

A one-way nested run involves these steps:

1) Make a coarse grid run
2) Make temporary fine grid initial condition (only a single time period is required)
3) Run program ndown, with the coarse-grid WRF model output and the fine-grid input, to generate fine-grid initial and boundary conditions
4) Make the fine grid run

To compile, choose an option that supports nesting.

Step 1: Make a coarse grid run

This is no different from any single-domain WRF run described above.

Step 2: Make a temporary fine grid initial condition file

The purpose of this step is to ingest higher resolution terrestrial fields and corresponding land-water masked soil fields.

Before doing this step, one should have run WPS (or SI) requesting one coarse and one nest domain, for the one time period at which one wants to start the one-way nested run. This should generate a WPS/SI output file for the nested domain (domain 2), named met_em.d02.* (or wrf_real_input_em.d02.* from SI).

- Rename met_em.d02.* to met_em.d01.*. (Move the original domain 1 WPS output files to a different place before you do this.)
- Edit the namelist.input file for this domain (pay attention to column 1 only) and enter the correct start time, grid dimensions, and physics options.
(Alternatively, you could follow the steps for a two-way nested run and run real.exe to produce wrfinput_d01 and wrfinput_d02.)
- Run real.exe for this domain; this will produce a wrfinput_d01 file.
- Rename this wrfinput_d01 file to wrfndi_d02.
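The file bookkeeping in this step can be sketched as follows. Placeholder files (created with touch) stand in for the actual WPS and real.exe output, and the date is just an example:

```shell
# Stand-in for the WPS nest-domain file (normally produced by WPS):
touch met_em.d02.2000-01-24_12:00:00

# Rename the nest file so real.exe treats it as domain 1:
mv met_em.d02.2000-01-24_12:00:00 met_em.d01.2000-01-24_12:00:00

# ... edit namelist.input, then run real.exe; it writes wrfinput_d01 ...
touch wrfinput_d01    # stand-in for the real.exe output

# Rename the result to the name ndown expects:
mv wrfinput_d01 wrfndi_d02
```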

Step 3: Make the final fine grid initial and boundary condition files

- Edit namelist.input again; this time one needs to edit two columns: one for the dimensions of the coarse grid and one for the fine grid. Note that the boundary condition frequency (namelist variable interval_seconds) is the time in seconds between the coarse-grid output times.
- Run ndown.exe, with inputs from the coarse grid wrfout files, and wrfndi_d02 file generated from Step 2 above. This will produce wrfinput_d02 and wrfbdy_d02 files.
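As a sketch (values are illustrative), the namelist entries that matter to ndown include:

```
&time_control
 interval_seconds  = 10800,     ! seconds between coarse-grid wrfout times
/

&domains
 max_dom           = 2,
 e_we              = 100, 112,
 e_sn              = 100, 112,
 parent_grid_ratio = 1,   3,
/
```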

Note that one may run program ndown under MPI, if it was compiled with MPI support. For example,

mpirun -np 4 ndown.exe

Step 4: Make the fine grid WRF run

- Rename wrfinput_d02 and wrfbdy_d02 to wrfinput_d01 and wrfbdy_d01, respectively.
- Edit namelist.input one more time; this time the entries are for the fine grid only.
- Run WRF for this grid.
