Single Domain Case - Ungrib

Unpack input GRIB data (ungrib.exe)

We plan on using this case study for a number of different model runs, including examples of adding SST data to a model run. So let's unpack the data first before we get started.

1. If you have not already created a directory in which to place your input data, do so now. For organizational purposes, create this directory at the same level as your WPS/ and WRF/ directories.

mkdir DATA

2. Download the Matthew case study data and place it in the DATA/ directory. When you unpack the tar file, you will get a directory called matthew/:

tar -xf matthew_1deg.tar.gz
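
As a quick sanity check (run from within the DATA/ directory), list the unpacked files; you should see nine GFS analysis files, fnl_20161006_00_00.grib2 through fnl_20161008_00_00.grib2:

ls matthew/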

GFS (Global Forecast System) is a model product from NCEP.

Type: GRIB2 data

Resolution:
    1-degree global data
    Output frequency: 6-hourly
    27 pressure levels (1000-10 hPa; excluding the surface)

The data are available for the period 2016-10-06_00 to 2016-10-08_00 (data frequency is 6-hourly).

3. Let's get familiar with the data by running g2print on one of the data files:

./util/g2print.exe ../DATA/matthew/fnl_20161006_00_00.grib2 >& g2print.log
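
g2print writes a record-by-record inventory of the GRIB file, listing (roughly) the field, level, and valid time of each record. Page through the log with:

less g2print.log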

4. Link in the GFS Vtable:

ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable

Peruse the ungrib/Variable_Tables/Vtable.GFS file to see which fields we are going to try to unpack from the GRIB files.
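
Each row of a Vtable maps a GRIB parameter and level code to the field name that ungrib will write to the intermediate files. For a quick look:

cat ungrib/Variable_Tables/Vtable.GFS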

5. Link in the GRIB data using the link_grib.csh script:

./link_grib.csh ../DATA/matthew/fnl

    This will create the following links:

GRIBFILE.AAA -> ../DATA/matthew/fnl_20161006_00_00.grib2
GRIBFILE.AAB -> ../DATA/matthew/fnl_20161006_06_00.grib2
GRIBFILE.AAC -> ../DATA/matthew/fnl_20161006_12_00.grib2
GRIBFILE.AAD -> ../DATA/matthew/fnl_20161006_18_00.grib2
GRIBFILE.AAE -> ../DATA/matthew/fnl_20161007_00_00.grib2
GRIBFILE.AAF -> ../DATA/matthew/fnl_20161007_06_00.grib2
GRIBFILE.AAG -> ../DATA/matthew/fnl_20161007_12_00.grib2
GRIBFILE.AAH -> ../DATA/matthew/fnl_20161007_18_00.grib2
GRIBFILE.AAI -> ../DATA/matthew/fnl_20161008_00_00.grib2

6. Edit namelist.wps, and set the following:
    Note: For detailed explanations of these variables, as well as suggestions for best practices, see the Best Practices WPS Namelist page, or Chapter 3 of the WRF User's Guide.

max_dom = 1
start_date = '2016-10-06_00:00:00',
end_date = '2016-10-08_00:00:00',
interval_seconds = 21600,
prefix = 'FILE',
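
For context, a minimal sketch of how these settings fit into the namelist records: max_dom, start_date, end_date, and interval_seconds live in the &share record, while prefix lives in the &ungrib record (entries for the other programs, such as &geogrid, are omitted here):

&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2016-10-06_00:00:00',
 end_date = '2016-10-08_00:00:00',
 interval_seconds = 21600,
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

Here wrf_core = 'ARW' and out_format = 'WPS' are the usual defaults; only the entries listed in step 6 need to change for this case.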

7. Run ungrib to create the intermediate files:

./ungrib.exe

    This will create the following files:

FILE:2016-10-06_00
FILE:2016-10-06_06
FILE:2016-10-06_12
FILE:2016-10-06_18
FILE:2016-10-07_00
FILE:2016-10-07_06
FILE:2016-10-07_12
FILE:2016-10-07_18
FILE:2016-10-08_00

The log file (ungrib.log) created during this step is very important. It contains information about the fields that were found in the input files and the levels on which those fields are available.
Peruse this log to make sure all the fields in the Vtable were found. Additionally, it can be used for troubleshooting if the ungrib process does not complete successfully.
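
The exact wording of ungrib's log messages varies between WPS versions, but two quick checks are to look for the per-time inventory blocks and for a completion message:

grep -i "inventory" ungrib.log
grep -i "successful" ungrib.log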

8. Familiarize yourself with the intermediate files:

./util/rd_intermediate.exe FILE:2016-10-06_00
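
As with g2print above, it can be easier to capture the output in a file and page through it:

./util/rd_intermediate.exe FILE:2016-10-06_00 >& rd_intermediate.log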



Below are the output intermediate files generated by this step.
If you have problems generating these files, you can use this set to continue to the next step:
Intermediate files



Read more about the availability of other GFS data sets.



While we are unpacking data, let's get some SST data too.

Note: We won't be using SST data for the single-domain case, but it will be used later if you are interested in running the sst-update case. For now, just store this data so that you will have it later. If you are not interested in running the sst-update case, then you can skip this section.

1. Download the SST data and place it in the ../DATA/ directory. Once you unpack the data, you should have a directory called matthew_sst/, containing the individual SST data files:

tar -xf matthew_sst.tar.gz

2. Link in the SST Vtable:

ln -sf ungrib/Variable_Tables/Vtable.SST Vtable

3. Link in the SST GRIB data:

./link_grib.csh ../DATA/matthew_sst/rtg_sst_grb

4. Edit namelist.wps, and set the following:

start_date = '2016-10-06_00:00:00',
end_date = '2016-10-08_00:00:00',
interval_seconds = 21600,
prefix = 'SST',

5. Run ungrib to create the SST intermediate files:

./ungrib.exe
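
If the run completes, ungrib should write SST intermediate files at the same nine times as the GFS files:

SST:2016-10-06_00
SST:2016-10-06_06
SST:2016-10-06_12
SST:2016-10-06_18
SST:2016-10-07_00
SST:2016-10-07_06
SST:2016-10-07_12
SST:2016-10-07_18
SST:2016-10-08_00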



You may have noticed that the SST data are available only once per day, yet we set interval_seconds to 21600 (6-hourly). By doing this, ungrib will temporally interpolate the available SST data to a 6-hourly frequency.

Normally there is no advantage to temporally interpolating data at this stage. However, because the SST data are available only once per day, while the Matthew data set is available every 6 hours, interpolating the SST data to the same frequency ensures that the two can be used together later.
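
As far as we know, ungrib's temporal interpolation is simple linear interpolation in time. As a sketch, assuming the daily SST analyses are valid at 00 UTC, the interpolated 06 UTC field would be weighted as:

SST(2016-10-06_06) ≈ 0.75 * SST(2016-10-06_00) + 0.25 * SST(2016-10-07_00)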



Below are the output intermediate files generated by this step.
If you have problems generating these files, you can use this set to continue to the next step:
Intermediate files


Read more about the availability of other SST data sets.


Set up the model domain