


Single Domain Run

Recall the flowchart from the initial practice exercise. You will follow the same process for this run, but will have the opportunity to make modifications to the namelist files.



**Click on the links to jump to the desired section.**

1. Set-up and Run Geogrid
2. Set-up and Run Ungrib
3. Set-up and Run Metgrid
4. Prepare to Run Real & WRF
5. Run the Real Program
6. Run the WRF Model
Check your output


1. Set-up and Run Geogrid

The geogrid program defines the map projection, as well as the geographic location, size, and resolution of the domain. It also interpolates static fields (e.g., topography height, landuse category, etc.) to the domain.


  1. Make sure you are in the wps directory.
    cd /glade/derecho/scratch/$USER/practice_exercises/wps
  2. Before starting this case, move all files created in the initial practice session into a new directory, and remove any files that are no longer needed.
    mkdir co_blizzard
    mv met_em* FILE* geo_em.d01.nc co_blizzard
    rm GRIBFILE*

  3. Use vi, emacs, or gedit to edit namelist.wps and configure the simulation domain.


    Note:

    Some namelist parameters require a value for each domain, while others only need to be specified once and are understood to apply to all domains. You do not need to remove entries in columns 2+ when configuring a single domain; setting max_dom to 1 tells the code to ignore any entries in columns 2+.

    • You are creating a single domain over the eastern half of the United States.
    • The domain size is 100 x 100 grid points, with a grid resolution of 30 km.
    • We will use a Lambert conformal projection.
    • Make the following changes to the &share and &geogrid records in the namelist.wps file (these are the two namelist records used by the geogrid.exe program).


    &share
    max_dom = 1,

    &geogrid
    e_we = 100,
    e_sn = 100,
    dx = 30000
    dy = 30000
    map_proj = 'lambert',
    ref_lat = 38
    ref_lon = -90
    truelat1 = 30.0
    truelat2 = 60.0
    stand_lon = -90
    geog_data_path = '/glade/work/wrfhelp/WPS_GEOG'
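
    As a quick sanity check on the values above, the physical extent of the domain follows from the grid dimensions and spacing; a minimal shell sketch:

    ```shell
    # Physical west-east extent of the domain: (e_we - 1) * dx.
    # With e_we = 100 grid points and dx = 30000 m:
    echo $(( (100 - 1) * 30000 ))   # 2970000 m, i.e. about 2970 km
    ```

    The south-north extent is the same, since e_sn and dy use matching values.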





    Note:

    For additional namelist.wps details, see the WPS Chapter of the WRF-ARW Users' Guide, or the best practice namelist for WPS – both can be found by clicking on "Useful Links" at the top of this page.




  4. Before running geogrid.exe, preview the domain to make sure it is correct. Use the plotgrids_new.ncl script to do this.



    Note:

    Make sure you have X-forwarding installed to be able to view plots directly from Derecho. If you do not have X-forwarding capability on your system, the WRF instructors are not able to help with this; you will need to work with a systems administrator at your institution to get it installed.

    • In the command line, type:
      ncl util/plotgrids_new.ncl

    • You should see the following plot appear on the screen:
    • To close the X11 display window, click on the image, or type 'ctrl-c'.



    Notes:

    • The output type is x11. In a real application, if you wanted to output another type, you could modify the plotgrids_new.ncl script. For example, to output a .pdf file, uncomment the line:
      type = "pdf"
      and run the script again. This creates a wps_show_dom.pdf file.

    • To open the .pdf file on Derecho, use the command
      display wps_show_dom.pdf
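
      If you prefer not to edit the script by hand, the uncommenting can also be done from the command line. NCL uses ';' for comments, so stripping a leading ';' activates the line; a hedged sed sketch (the exact commented form inside plotgrids_new.ncl may differ slightly):

      ```shell
      # Demonstrate removing an NCL comment marker (';') from the output-type
      # line, shown against a literal string so the transformation is concrete:
      printf '; type = "pdf"\n' | sed 's/^; *//'   # -> type = "pdf"
      ```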



  5. Now run the geogrid.exe program to generate the geographical data file "geo_em.d01.nc."
    ./geogrid.exe

    If successful, you should see "Successful completion of geogrid" at the end of the print out. You should also now have the file geo_em.d01.nc in your wps directory.



    Notes:

    • A file called geogrid.log is automatically generated upon running geogrid.exe. This file can be useful for tracking errors, if needed.

    • If you run geogrid in a different environment in the future, the on-screen printout may include a list of optional fields not processed by geogrid. This is okay; those are just optional fields you are likely not interested in using.






    2. Set-up and Run Ungrib

    Model input data are often in "Grib" format. The ungrib program reads these data, extracts the meteorological fields, and then writes those fields to an intermediate file format that can be read by the next program (metgrid).

    The case we are using for this simulation is a severe storm outbreak from March 2023 in the Midwest and Southeast United States. There were more than 1000 severe weather reports, including 188 tornado reports. The data we are using are final analysis (FNL) data from the GFS model.

    Ungrib uses the &share and &ungrib records in namelist.wps. Nothing needs to be changed in the &ungrib record for this simulation.



    1. Make the following changes to the &share namelist record to ingest data for the correct time period (if interested, see additional information about these namelist parameters).

      Again, there is no need to delete the second column; ungrib only reads the first column since max_dom=1.


      &share

      start_date = '2023-03-31_00:00:00',
      end_date = '2023-04-01_00:00:00',
      interval_seconds = 21600



      Notes:

      • interval_seconds refers to the number of seconds between data intervals in the gribbed data. Our GFS FNL data are provided every 6 hours, so we set interval_seconds to 21600.

      • When running cases for your own research, try to use high frequency data, if possible.
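
      The number of analysis times ungrib will process follows directly from these settings; a quick check:

      ```shell
      # 24-hour period, one file every interval_seconds = 21600 s (6 hours),
      # endpoints inclusive:
      echo $(( 24 * 3600 / 21600 + 1 ))   # 5 analysis times
      ```

      This matches the five FILE:* intermediate files ungrib creates below.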




    2. Link the GRIB data into the current directory using the script link_grib.csh
      ./link_grib.csh /glade/campaign/mmm/wmr/wrf_tutorial/input_data/severe_outbreak/fnl_20230

      Notes:

      • There is no "." at the end of this command because you are running a script, and not linking files with a command.
      • Take care to link to FILES and not to a directory. Only the root of the file names is needed, not a list of individual files; once you have typed the name of the directory that holds the grib files, hit the 'tab' key and auto-complete will provide the prefix of the files, which is all that is necessary.


      You should now have files beginning with the prefix GRIBFILE.AA* (one for each time period).
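
      For reference, link_grib.csh assigns an incrementing three-letter suffix to each linked file; a small sketch of the names it produces for five input files:

      ```shell
      # Emulate the GRIBFILE naming convention used by link_grib.csh:
      for suffix in AAA AAB AAC AAD AAE; do
        echo "GRIBFILE.$suffix"
      done
      ```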


    3. Link the correct Vtable to the wps/ directory, renaming it the generic name "Vtable" so the ungrib program recognizes it when processing (recall input data for this case is GFS FNL, so use the GFS Vtable). Vtables are used to determine which fields are necessary to extract from a grib file (not all fields are needed).
      ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable

    4. Ungrib the input data by running the ungrib.exe program:
      ./ungrib.exe

      If successful, you should see "Successful completion of ungrib" at the end of the print out, and the following files should have been created:

      FILE:2023-03-31_00
      FILE:2023-03-31_06
      FILE:2023-03-31_12
      FILE:2023-03-31_18
      FILE:2023-04-01_00

      These files now have the data in intermediate format, ready for the metgrid.exe process.



      Note:

      A file called ungrib.log is automatically generated upon running ungrib.exe. This file can be useful for tracking errors, if needed.





    3. Set-up and Run Metgrid

    The metgrid program horizontally interpolates meteorological data (extracted by the ungrib program) to simulation domains (defined by geogrid), and creates the input data files necessary to run the WRF model.




    Note:

    The "&metgrid" namelist record is used for this process; however, since we kept this case simple, no namelist changes are needed at this point.

    To run the metgrid program, type:

    ./metgrid.exe

    If successful, you should see "Successful completion of metgrid" at the end of the print out. You should now see the following files:

    met_em.d01.2023-03-31_00:00:00.nc
    met_em.d01.2023-03-31_06:00:00.nc
    met_em.d01.2023-03-31_12:00:00.nc
    met_em.d01.2023-03-31_18:00:00.nc
    met_em.d01.2023-04-01_00:00:00.nc




    Notes:

    • As in geogrid and ungrib, metgrid.exe automatically generates a log file called "metgrid.log" that can be useful for tracking errors.
    • Notice the '.nc' at the end of the files. This indicates these files are in 'netCDF' format. If interested, for a quick look at all variables in the file, type:
      ncdump -h met_em.d01.2023-03-31_00:00:00.nc
      If interested in viewing all fields in the met_em* files, use the ncview tool:
      ncview met_em.d01*
      To quit ncview, simply type "ctrl-c".





    4. Prepare to Run Real & WRF

    1. Make sure you are in the WRF/test/em_real directory.
      cd /glade/derecho/scratch/$USER/practice_exercises/wrf/test/em_real

    2. Using vi, emacs, or gedit, edit the namelist.input file to reflect the domain and date information of your case:

      Once again, there is no need to delete the second column. Programs real.exe and wrf.exe only read the first column since max_dom=1.



      &time_control
      run_hours = 24,
      run_minutes = 0,
      start_year = 2023,
      start_month = 03,
      start_day = 31,
      start_hour = 00,
      end_year = 2023,
      end_month = 04,
      end_day = 01,
      end_hour = 00
      interval_seconds = 21600
      history_interval = 180,
      frames_per_outfile = 1,
      restart = .false.,
      restart_interval = 720,
       
      &domains
      time_step = 150
      max_dom = 1,
      e_we = 100,
      e_sn = 100,
      e_vert = 45,
      num_metgrid_levels = 34
      num_metgrid_soil_levels = 4
      dx = 30000
      dy = 30000
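
      A note on time_step: a common WRF rule of thumb (not a strict limit) is to keep the time step at or below 6 times dx expressed in kilometers:

      ```shell
      # Rule-of-thumb upper bound for the time step, with dx = 30 km:
      echo $(( 6 * 30 ))   # 180 seconds; the 150 s used here stays below it
      ```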

       

      Notes:

      • How did we know to set "num_metgrid_levels" to 34 and "num_metgrid_soil_levels" to 4? In the command line, try the commands:
        ncdump -h ../../../wps/met_em.d01.2023-03-31_00:00:00.nc | grep -i num_metgrid_levels

        ncdump -h ../../../wps/met_em.d01.2023-03-31_00:00:00.nc | grep -i num_metgrid_soil_levels

        We want to set these parameters to the values found in the input (met_em*) files.

      • Setting restart_interval = 720 results in a restart output file created every 12 hours. Since this is not a restart run, make sure to set "restart=.false.". Restart runs are discussed in a later exercise.
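
      Both history_interval and restart_interval are specified in minutes; converting confirms the output cadence described above:

      ```shell
      # Convert the namelist intervals from minutes to hours:
      echo $(( 180 / 60 ))   # 3  -> wrfout history files every 3 hours
      echo $(( 720 / 60 ))   # 12 -> wrfrst restart files every 12 hours
      ```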




      Note:

      For additional namelist.input details, see the Namelist Variables chapter of the WRF-ARW Users' Guide, or the best practice namelist for WRF. Both can be found by clicking on "Useful Links" at the top of this page.





    5. Run the Real Program

    The real program uses horizontally-interpolated meteorological data (met_em* files from wps) and vertically interpolates them for use with the WRF model. It creates initial condition files and a lateral boundary file that are used by WRF.


    1. Link the metgrid output files from /wps to the current directory:
      ln -sf ../../../wps/met_em* .
    2. Run real.exe (which uses the met_em* files as input) to produce the model initial condition and lateral boundary files. Since the real and wrf executables were built for parallel computing, you will have to submit a batch script to run them. The runreal.sh script is included in your test/em_real directory. Issue the following command:
      qsub runreal.sh
      This should run very quickly. To check if it's done, use the ls command to see if you have rsl.error.* and rsl.out.* files in your directory. If so, the following command should show the "success" message at the bottom.
      tail rsl.out.0000

      SUCCESS COMPLETE REAL_EM INIT
      and you should now have the initial condition and boundary files in your running directory.
      wrfbdy_d01
      wrfinput_d01


      Note:

      If you see a message like
      "MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1" or "BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES",
      your run was not successful and you need to check your rsl.error.0000 file for the cause.
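
      One quick way to hunt for the cause is to grep the rsl files for common failure markers. The sketch below runs against a mock log so the command is concrete (the sample log line is hypothetical); on Derecho, run the grep directly in your em_real directory:

      ```shell
      # Create a mock rsl.error-style log and count lines with failure markers:
      log=$(mktemp)
      printf 'FATAL CALLED FROM FILE: <stdin>  LINE: 1\n' > "$log"
      grep -ciE 'fatal|error' "$log"   # 1
      rm -f "$log"
      ```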








    6. Run the WRF Model


    The WRF model uses the initial condition file(s) (wrfinput_d0*) and boundary condition file (wrfbdy_d01) generated by the real program to perform the model integration, using user-specified options provided in namelist.input (e.g., physics options).


    1. Run wrf.exe (which uses the wrfbdy_d01 and wrfinput_d01 files as input) to perform the model simulation and forecast. Since the wrf executable was built for parallel computing, you will have to submit a batch script to run it. The runwrf.sh script is included in your test/em_real directory. Issue the following command:
      qsub runwrf.sh


      Note:

      If you are interested, take a look at the runwrf.sh script to see what is included. The following line indicates we are using a single node with 36 processors for the WRF simulation.
      #PBS -l select=1:ncpus=36:mpiprocs=36



      Running a 24-hour simulation for this case should only take a couple of minutes to complete (once it's out of the queue and running) since we have a small, coarse domain. Wait about one minute and then check the status of your request using the following command:
      qstat -u $USER
      If your executable is still in the queue, or is running, it will show up after you issue the above "qstat" command. If nothing shows up, check to see if you have rsl.* files, and if so, check the end of those files using the tail command, and you should see the "success" message.
      tail rsl.out.0000

      SUCCESS COMPLETE WRF
      When the simulation is complete, you should have the following files.
      wrfout_d01_2023-03-31_00:00:00
      wrfout_d01_2023-03-31_03:00:00
      wrfout_d01_2023-03-31_06:00:00
      wrfout_d01_2023-03-31_09:00:00
      wrfout_d01_2023-03-31_12:00:00
      wrfout_d01_2023-03-31_15:00:00
      wrfout_d01_2023-03-31_18:00:00
      wrfout_d01_2023-03-31_21:00:00
      wrfout_d01_2023-04-01_00:00:00
       

      Note:

      Recall you set the namelist to generate restart files every 12 hours, so you should also see the following files (which are necessary later when running the 'restart' practice case):
      wrfrst_d01_2023-03-31_12:00:00
      wrfrst_d01_2023-04-01_00:00:00
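
      As a sanity check, the number of history files follows from the namelist: a 24-hour run with history_interval = 180 (minutes) and frames_per_outfile = 1 writes one wrfout file per output time, including the initial time:

      ```shell
      # Expected number of wrfout files for a 24-hour run with 3-hourly output:
      echo $(( 24 * 60 / 180 + 1 ))   # 9 files
      ```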




    Check your output:

    • Take a look at the log files (rsl.out.0000). You can use commands such as 'gedit', 'vi', 'cat', 'more', or 'tail' to look at the full file or at various parts of it, e.g.:
      cat rsl.out.0000
    • Try the netcdf data browser "ncview" to examine your wrf output files.
      ncview wrfout_d01*


      Notes:

      • ncview can be used to view multiple files, but only for 1 domain at a time. This case only runs a single domain, but keep this rule in mind later when running with nests.

      • All output fields are available to view. You can view one field at a time, click through time periods, and, for variables in the 4d column, also look at each level. The following are fields you may be interested in:
        RAINC: Accumulated total cumulus precipitation (under the list of "3d vars")
        RAINNC: Accumulated total grid scale precipitation (3d)
        SNOWC: Snow coverage (3d)
        SNOWNC: Accumulated total grid scale snow and ice (3d)
        PSFC: Surface pressure (3d)
        Q2/T2: Water Vapor/Temperature at 2m above ground (3d)
        U10/V10: X/Y Component of wind (speed) at 10m above ground (3d)
        MU: Perturbation dry air mass in column (3d)
        PH: Perturbation geopotential (under the list of "4d vars")
        QVAPOR/QCLOUD/QRAIN/QICE/QSNOW: Vapor/Cloud water/Rain water/Ice/Snow mixing ratio (4d)
        U/V: X/Y-wind component (speed) (4d)



    • Generate graphics with one of the supplied packages (you may do this now, or you may wait until you have practiced running the model a little more, and perhaps use these tools later in the week).




    *Organization Suggestion*

    In some later exercises, you are asked to use files you have already created. To preserve those files, we recommend making a separate directory for each exercise you complete in which to store your files (so they are not over-written). For example, now would be a good time to create a directory called something like "single_domain" (do this inside the test/em_real/ directory):
    mkdir single_domain
    and then copy the necessary files into that directory:
    cp wrfbdy_d01 wrfinput_d01 namelist.input wrfout* single_domain
    Now later, when asked to use those files, simply copy them back into the working directory. For example:
    cp single_domain/wrfbdy_d01 .


    WRF Tutorial Exercises



    Continue to More Exercises

    If you plan to attempt more exercises right now, you can access the case studies menu by clicking here.