The practical exercises on this page are designed to complement the lectures given at the 2018 MPAS-Atmosphere tutorial in Boulder, Colorado. The lecture slides from this tutorial may be accessed through the links below.
This web page is intended to serve as a guide through the practice exercises of this tutorial. Exercises are split into five sections, corresponding to the five practical sessions of the tutorial. The details of the exercises in each section can be viewed by clicking on the headers.
All necessary input files (except source code to be downloaded as described in this tutorial) may be downloaded in a single .tar.gz file.
While going through this practice guide, you may find it helpful to have a copy
of the MPAS-Atmosphere Users' Guide available in another window.
Click here
to open the Users' Guide in a new window.
Throughout the tutorial, we'll be working directly in our home directories for simplicity. We'll also need to ensure that the following modules are loaded before beginning:
lab101.mmm.ucar.edu:/classroom/users/class101>module list

Currently Loaded Modules:
  1) ncarenv/1.0       3) gnu/5.2.0           5) mpich/3.0.4     7) netcdf/3.6.3    9) ncl/6.2.0
  2) ncarbinlibs/1.1   4) ncarcompilers/1.0   6) pnetcdf/1.5.0   8) pio/1.9.19     10) ncview/2.1.5
Generally, we would need to install an MPI implementation (e.g., MPICH, OpenMPI, or MVAPICH) as well as parallel-netCDF, netCDF4, and PIO libraries ourselves — if these didn't already exist on our system — before proceeding with these exercises.
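For reference only, if we were setting up these libraries ourselves, the MPAS build expects the MPI compiler wrappers to be in the shell path and the NETCDF, PNETCDF, and PIO environment variables to point to the library installation directories. A minimal tcsh sketch might look like the following, where the paths are placeholders for wherever the libraries were actually installed:

$ setenv NETCDF /path/to/netcdf
$ setenv PNETCDF /path/to/parallel-netcdf
$ setenv PIO /path/to/pio
$ setenv PATH /path/to/mpich/bin:${PATH}

On the classroom machines, the modules listed above take care of all of this for us.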
In this session, our goal is to obtain a copy of the MPAS source code directly from the MPAS GitHub repository. We'll then compile both the init_atmosphere_model and atmosphere_model programs before using the former to begin the processing of time-invariant, terrestrial fields for use in a real-data simulation. While this processing is taking place, in another terminal window, we'll use any extra time to practice preparing idealized initial conditions.
Since the default shell on the classroom machines is tcsh, all shell commands throughout this tutorial will be written for tcsh. However, if you prefer a different shell and don't mind mentally translating, e.g., setenv commands to export commands, then feel free to switch shells.
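For example, a tcsh command such as

$ setenv FNAME x1.10242.static.nc

used later in this guide would be written in bash as

$ export FNAME=x1.10242.static.nc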
As described in the lectures, the MPAS code is distributed directly from the GitHub repository where it is developed. While it's possible to navigate to https://github.com/MPAS-Dev/MPAS-Model and obtain the code by clicking on the "Releases" tab, it's much faster to clone the repository directly on the command-line. From within our $HOME directory, we can:
$ git clone https://github.com/MPAS-Dev/MPAS-Model.git
Cloning the MPAS-Model repository may take a few seconds, and at the end of the process, the output to the terminal should look something like the following:
Cloning into 'MPAS-Model'...
remote: Counting objects: 38647, done.
remote: Total 38647 (delta 3), reused 3 (delta 3), pack-reused 38643
Receiving objects: 100% (38647/38647), 16.22 MiB | 14.53 MiB/s, done.
Resolving deltas: 100% (30112/30112), done.
Checking out files: 100% (1537/1537), done.
We should now have an MPAS-Model directory:
$ ls -l MPAS-Model
total 48
-rw-r--r-- 1 class101 cbet 3131 Jul 28 16:22 INSTALL
-rw-r--r-- 1 class101 cbet 2311 Jul 28 16:22 LICENSE
-rw-r--r-- 1 class101 cbet 24651 Jul 28 16:22 Makefile
-rw-r--r-- 1 class101 cbet 2555 Jul 28 16:22 README.md
drwxr-xr-x 14 class101 cbet 4096 Jul 28 16:22 src
drwxr-xr-x 4 class101 cbet 4096 Jul 28 16:22 testing_and_setup
That's it! We now have a copy of the latest release of the MPAS source code.
As mentioned in the lectures, there are two "cores" that need to be compiled from the same MPAS source code: the init_atmosphere core and the atmosphere core.
Before beginning the compilation process, it's worth verifying that we have MPI compiler wrappers in our path, and that environment variables pointing to the netCDF, parallel-netCDF, and PIO libraries are set:
$ which mpif90
/usr/local/cmpwrappers/mpif90
$ which mpicc
/usr/local/cmpwrappers/mpicc
$ echo $NETCDF
/usr/local/netcdf-3.6.3-gfortran
$ echo $PNETCDF
/usr/local/parallel-netcdf-1.5.0-gfortran
$ echo $PIO
/usr/local/pio-1.9.19-gcc
If all of the above commands were successful, your shell environment should be sufficient to allow the MPAS cores to be compiled.
We'll begin by compiling the init_atmosphere core, producing an executable named init_atmosphere_model.
After changing to the MPAS-Model directory
$ cd MPAS-Model
we can issue the following command to build the init_atmosphere core with the gfortran compiler:
$ make gfortran CORE=init_atmosphere PRECISION=single
Note the inclusion of PRECISION=single on the build command; the default is to build MPAS cores as double-precision executables, but single-precision executables require less memory, produce smaller output files, run faster, and — at least for MPAS-Atmosphere — seem to produce results that are no worse than double-precision executables.
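For reference, a default double-precision executable could be built by simply omitting the PRECISION option:

$ make gfortran CORE=init_atmosphere

For this tutorial, though, we'll stick with single precision.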
After issuing the make command, above, compilation should take several minutes, and if the compilation was successful, the end of the build process should have produced messages like the following:
*******************************************************************************
MPAS was built with default single-precision reals.
Debugging is off.
Parallel version is on.
Papi libraries are off.
TAU Hooks are off.
MPAS was built without OpenMP support.
MPAS was built with .F files.
The native timer interface is being used
Using the PIO 1.x library.
*******************************************************************************
If compilation of the init_atmosphere core was successful, we should also have an executable file named init_atmosphere_model:
$ ls -l
drwxr-xr-x 2 class101 cbet 4096 Jul 28 16:40 default_inputs
-rwxr-xr-x 1 class101 cbet 6801488 Jul 28 16:40 init_atmosphere_model
-rw-r--r-- 1 class101 cbet 3131 Jul 28 16:22 INSTALL
-rw-r--r-- 1 class101 cbet 2311 Jul 28 16:22 LICENSE
-rw-r--r-- 1 class101 cbet 24651 Jul 28 16:22 Makefile
-rw-r--r-- 1 class101 cbet 1181 Jul 28 16:40 namelist.init_atmosphere
-rw-r--r-- 1 class101 cbet 2555 Jul 28 16:22 README.md
drwxr-xr-x 14 class101 cbet 4096 Jul 28 16:40 src
-rw-r--r-- 1 class101 cbet 660 Jul 28 16:40 streams.init_atmosphere
drwxr-xr-x 4 class101 cbet 4096 Jul 28 16:22 testing_and_setup
Note that default namelist and streams files for the init_atmosphere core have also been generated as part of the compilation process: these are the files named namelist.init_atmosphere and streams.init_atmosphere.
Now, we're ready to compile the atmosphere core. If we try this without first cleaning the common infrastructure code, e.g.,
$ make gfortran CORE=atmosphere PRECISION=single
we would get an error like the following:
*******************************************************************************
The MPAS infrastructure is currently built for the init_atmosphere_model core.
Before building the atmosphere core, please do one of the following.

To remove the init_atmosphere_model executable and clean the MPAS infrastructure, run:
    make clean CORE=init_atmosphere_model

To preserve all executables except atmosphere_model and clean the MPAS infrastructure, run:
    make clean CORE=atmosphere

Alternatively, AUTOCLEAN=true can be appended to the make command to force a clean,
build a new atmosphere_model executable, and preserve all other executables.
*******************************************************************************
After compiling one MPAS core, we need to clean up the shared infrastructure before compiling a different core. We can clean all parts of the infrastructure that are needed by the atmosphere core by running the following command:
$ make clean CORE=atmosphere
Then, we can proceed to compile the atmosphere core in single-precision with:
$ make gfortran CORE=atmosphere PRECISION=single
The compilation of the atmosphere_model executable can take 10 minutes or longer to complete. Similar to the compilation of the init_atmosphere core, a successful compilation of the atmosphere core should give the following message:
*******************************************************************************
MPAS was built with default single-precision reals.
Debugging is off.
Parallel version is on.
Papi libraries are off.
TAU Hooks are off.
MPAS was built without OpenMP support.
MPAS was built with .F files.
The native timer interface is being used
Using the PIO 1.x library.
*******************************************************************************
Now, our MPAS-Model directory should contain an executable file named atmosphere_model, as well as default namelist.atmosphere and streams.atmosphere files:
$ ls -lL
-rwxr-xr-x 1 class101 cbet 10272664 Jul 28 17:02 atmosphere_model
-rwxr-xr-x 1 class101 cbet 1588376 Jul 28 17:02 build_tables
-rw-r--r-- 1 class101 cbet 20580056 Jul 28 16:59 CAM_ABS_DATA.DBL
-rw-r--r-- 1 class101 cbet 18208 Jul 28 16:59 CAM_AEROPT_DATA.DBL
drwxr-xr-x 2 class101 cbet 4096 Jul 28 17:02 default_inputs
-rw-r--r-- 1 class101 cbet 261 Jul 28 16:59 GENPARM.TBL
-rwxr-xr-x 1 class101 cbet 6801488 Jul 28 16:40 init_atmosphere_model
-rw-r--r-- 1 class101 cbet 3131 Jul 28 16:22 INSTALL
-rw-r--r-- 1 class101 cbet 15022 Jul 28 16:59 LANDUSE.TBL
-rw-r--r-- 1 class101 cbet 2311 Jul 28 16:22 LICENSE
-rw-r--r-- 1 class101 cbet 24651 Jul 28 16:22 Makefile
-rw-r--r-- 1 class101 cbet 1728 Jul 28 17:02 namelist.atmosphere
-rw-r--r-- 1 class101 cbet 1181 Jul 28 16:40 namelist.init_atmosphere
-rw-r--r-- 1 class101 cbet 543744 Jul 28 16:59 OZONE_DAT.TBL
-rw-r--r-- 1 class101 cbet 536 Jul 28 16:59 OZONE_LAT.TBL
-rw-r--r-- 1 class101 cbet 708 Jul 28 16:59 OZONE_PLEV.TBL
-rw-r--r-- 1 class101 cbet 2555 Jul 28 16:22 README.md
-rw-r--r-- 1 class101 cbet 847552 Jul 28 16:59 RRTMG_LW_DATA
-rw-r--r-- 1 class101 cbet 1694976 Jul 28 16:59 RRTMG_LW_DATA.DBL
-rw-r--r-- 1 class101 cbet 680368 Jul 28 16:59 RRTMG_SW_DATA
-rw-r--r-- 1 class101 cbet 1360572 Jul 28 16:59 RRTMG_SW_DATA.DBL
-rw-r--r-- 1 class101 cbet 4417 Jul 28 16:59 SOILPARM.TBL
drwxr-xr-x 14 class101 cbet 4096 Jul 28 17:02 src
-rw-r--r-- 1 class101 cbet 1203 Jul 28 17:02 stream_list.atmosphere.diagnostics
-rw-r--r-- 1 class101 cbet 907 Jul 28 17:02 stream_list.atmosphere.output
-rw-r--r-- 1 class101 cbet 9 Jul 28 17:02 stream_list.atmosphere.surface
-rw-r--r-- 1 class101 cbet 1303 Jul 28 17:02 streams.atmosphere
-rw-r--r-- 1 class101 cbet 660 Jul 28 16:40 streams.init_atmosphere
drwxr-xr-x 4 class101 cbet 4096 Jul 28 16:22 testing_and_setup
-rw-r--r-- 1 class101 cbet 11511 Jul 28 16:59 VEGPARM.TBL
Note, also, the presence of files named stream_list.atmosphere.*. These are lists of output fields that are referenced by the streams.atmosphere file.
The new files like CAM_ABS_DATA.DBL, LANDUSE.TBL, RRTMG_LW_DATA, etc. are look-up tables and other data files used by physics schemes in MPAS-Atmosphere. These files should all be symbolic links to files in the src/core_atmosphere/physics/physics_wrf/files/ directory.
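As a quick check, listing one of these files without dereferencing links should confirm that it points into that directory; for example,

$ ls -l LANDUSE.TBL

should show LANDUSE.TBL pointing to src/core_atmosphere/physics/physics_wrf/files/LANDUSE.TBL.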
If we have init_atmosphere_model and atmosphere_model executables, then compilation of both MPAS-Atmosphere cores was successful, and we're ready to proceed to the next sub-section.
For our first MPAS simulation in this tutorial, we'll use a quasi-uniform 240-km mesh. Since we'll later be working with a variable-resolution mesh, it will be helpful to maintain separate sub-directories for these two simulations.
To begin processing of the static, terrestrial fields for the 240-km quasi-uniform mesh, let's change to our home directory and make a new sub-directory named 240km_uniform:
$ cd ${HOME}
$ mkdir 240km_uniform
$ cd 240km_uniform
We can find a copy of the 240-km quasi-uniform mesh at /classroom/wrfhelp/mpas/meshes/x1.10242.grid.nc. To save disk space, we'll symbolically link this file into our directory:
$ ln -s /classroom/wrfhelp/mpas/meshes/x1.10242.grid.nc .
We will also need the init_atmosphere_model executable, as well as copies of the namelist.init_atmosphere and streams.init_atmosphere files. We'll make a symbolic link to the executable, and copies of the other files (since we will be changing them):
$ ln -s ${HOME}/MPAS-Model/init_atmosphere_model .
$ cp ${HOME}/MPAS-Model/namelist.init_atmosphere .
$ cp ${HOME}/MPAS-Model/streams.init_atmosphere .
Before running the init_atmosphere_model program, we'll need to set up the namelist.init_atmosphere file as described in the lectures:
&nhyd_model
    config_init_case = 7
/
&data_sources
    config_geog_data_path = '/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/'
    config_landuse_data = 'USGS'
    config_topo_data = 'GTOPO30'
/
&preproc_stages
    config_static_interp = true
    config_native_gwd_static = true
    config_vertical_grid = false
    config_met_interp = false
    config_input_sst = false
    config_frac_seaice = false
/
Note, in particular, the setting of the path to the geographical data sets, and the settings to enable only the processing of static data and the "GWD" static fields.
After editing the namelist.init_atmosphere file, it will also be necessary to tell the init_atmosphere_model program the name of our input grid file, as well as the name of the "static" output file that we would like to create. We do this by editing the streams.init_atmosphere file, where we first set the name of the grid file for the "input" stream:
<immutable_stream name="input" type="input" filename_template="x1.10242.grid.nc" input_interval="initial_only"/>
and then set the name of the static file to be created for the "output" stream:
<immutable_stream name="output" type="output" filename_template="x1.10242.static.nc" packages="initial_conds" output_interval="initial_only" />
When interpolating geographical fields to a mesh, the "surface" stream can be ignored (but you should not delete the "surface" stream, or the init_atmosphere_model program will complain).
After editing the streams.init_atmosphere file, we can begin the processing of the static, time-invariant fields for the 240-km quasi-uniform mesh:
$ ./init_atmosphere_model
The processing of the static, time-invariant fields will take some time, perhaps up to an hour. In order to verify that the processing is proceeding as expected, we can open another terminal window and change directories to the 240km_uniform directory to check on the progress of the init_atmosphere_model program:
$ cd 240km_uniform
$ tail -f log.init_atmosphere.0000.out
We should expect to see lines of output similar to the following being printed every few seconds:
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/15601-16800.01201-02400
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/16801-18000.01201-02400
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/18001-19200.01201-02400
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/19201-20400.01201-02400
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/20401-21600.01201-02400
/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/topo_30s/21601-22800.01201-02400
If lines similar to the above are being periodically written, we can kill the tail process with CTRL-C before proceeding to the next sub-section.
Important note: If lines of output are not being periodically written to the terminal when running the tail command as described above, look for a file named log.init_atmosphere.0000.err and determine what went wrong before going on to the next sub-section! Exercises in the second session of this tutorial will require the successful processing of static, time-invariant fields!
While the init_atmosphere_model program is processing static, time-invariant fields for a real-data simulation in the 240km_uniform directory, we can try initializing and running an idealized simulation in a separate directory.
The init_atmosphere_model program supports the creation of idealized initial conditions for several test cases.
For each of these idealized cases, prepared input files are available on the MPAS-Atmosphere download page. We'll go through the process of creating initial conditions for the idealized supercell case; the general process is similar for the other two cases.
The prepared input files for MPAS-Atmosphere idealized cases may be found by going to the MPAS Homepage and clicking on the "MPAS-Atmosphere download" link at the left. Then, clicking on the "Configurations for idealized test cases" link will take you to the download links for idealized cases.
Although the input files for these idealized cases can be downloaded through the web page, we can also download, e.g., the supercell input files with wget once we know the URL. Beginning from your home directory, you can download the supercell input file archive and unpack it with:
$ cd ${HOME}
$ wget http://www2.mmm.ucar.edu/projects/mpas/test_cases/v5.1/supercell.tar.gz
$ tar xzvf supercell.tar.gz
After changing to the resulting supercell directory, the README file will give an overview of how to initialize and run the test case.
Once the initial conditions have been created as described in the README file, it's helpful to verify that there were no errors; the end of the log.init_atmosphere.0000.out file should look something like the following:
-----------------------------------------
Total log messages printed:
   Output messages = 293
   Warning messages = 10
   Error messages = 0
   Critical error messages = 0
-----------------------------------------
Assuming there were no errors, the simulation can be run using up to 4 MPI tasks with the mpiexec command:
$ mpiexec -n 4 ./atmosphere_model
The simulation may take about an hour to run. During one of the later practical sessions, if you would like to revisit the simulation to see the results, running the supercell.ncl NCL script should produce plots of the time evolution of several model fields:
$ ncl supercell.ncl
Having compiled both the init_atmosphere and atmosphere cores, and having begun the interpolation of time-invariant, terrestrial fields to create a "static" file for real-data simulations, in this session we'll interpolate atmospheric and land-surface fields to create complete initial conditions for an MPAS simulation. We'll also process 10 days' worth of SST and sea-ice data to update these fields periodically as the model runs. Then, we'll start a five-day simulation, which will take around an hour to complete. Once the model has started running, we can use any extra time to compile the convert_mpas utility program.
In the previous session, we started the process of interpolating static geographical fields to the 240-km, quasi-uniform mesh in the 240km_uniform directory. If this interpolation was successful, this directory should contain two new files: log.init_atmosphere.0000.out and x1.10242.static.nc.
$ cd ${HOME}/240km_uniform
$ ls -l log.init_atmosphere.0000.out x1.10242.static.nc
The end of the log.init_atmosphere.0000.out file should show that there were no errors:
-----------------------------------------
Total log messages printed:
   Output messages = 3952
   Warning messages = 7
   Error messages = 0
   Critical error messages = 0
-----------------------------------------
We can also make a plot of the terrain elevation field in the x1.10242.static.nc file with an NCL script named plot_terrain.ncl. We'll discuss the use of NCL scripts for visualization in a later session, so for now we can simply execute the following commands to plot the terrain field:
$ cp /classroom/wrfhelp/mpas/ncl_scripts/plot_terrain.ncl .
$ setenv FNAME x1.10242.static.nc
$ ncl plot_terrain.ncl
Running the script may take up to a minute, but the result should be the display of a figure that looks like the following:
Now that we have convinced ourselves that the processing of static, geographical fields was successful, we can proceed to interpolate atmospheric and land-surface initial conditions for our 240-km simulation. Generally, it is necessary to use the ungrib component of the WRF Pre-Processing System to prepare atmospheric datasets in the intermediate format used by MPAS. For the purposes of this tutorial, we will simply assume that these data have already been processed with the ungrib program and placed in /classroom/wrfhelp/mpas/met_data/.
To interpolate the NCEP GFS data valid at 0000 UTC on 10 September 2014 to our 240-km mesh, we will symbolically link the GFS:2014-09-10_00 intermediate file into our working directory:
$ ln -s /classroom/wrfhelp/mpas/met_data/GFS:2014-09-10_00 .
Then, we will need to edit the namelist.init_atmosphere as described in the lectures to instruct the init_atmosphere_model program to interpolate the GFS data to our mesh. The critical items to set in the namelist.init_atmosphere file are:
&nhyd_model
    config_init_case = 7
    config_start_time = '2014-09-10_00:00:00'
/
&dimensions
    config_nvertlevels = 41
    config_nsoillevels = 4
    config_nfglevels = 38
    config_nfgsoillevels = 4
/
&data_sources
    config_met_prefix = 'GFS'
    config_landuse_data = 'USGS'
    config_topo_data = 'GTOPO30'
    config_use_spechumd = false
/
&vertical_grid
    config_ztop = 30000.0
    config_nsmterrain = 1
    config_smooth_surfaces = true
    config_dzmin = 0.3
    config_nsm = 30
    config_tc_vertical_grid = true
/
&preproc_stages
    config_static_interp = false
    config_native_gwd_static = false
    config_vertical_grid = true
    config_met_interp = true
    config_input_sst = false
    config_frac_seaice = true
/
You may find it easier to copy the default namelist.init_atmosphere file from ${HOME}/MPAS-Model/namelist.init_atmosphere before making the edits highlighted above.
When editing the namelist.init_atmosphere file, the key changes are to the starting time for our real-data simulation, the prefix of the intermediate file that contains the GFS atmospheric and land-surface fields, and the pre-processing stages that will be run. For this simulation, we'll also use just 41 vertical layers in the atmosphere, rather than the default of 55, so that the simulation will run faster.
After editing the namelist.init_atmosphere file, we will also need to set the name of the input file in the streams.init_atmosphere file to the name of the "static" file that we just produced:
<immutable_stream name="input" type="input" filename_template="x1.10242.static.nc" input_interval="initial_only"/>
and we will also need to set the name of the output file, which will be the MPAS real-data initial conditions file:
<immutable_stream name="output" type="output" filename_template="x1.10242.init.nc" packages="initial_conds" output_interval="initial_only" />
Once we've made the above changes to the namelist.init_atmosphere and streams.init_atmosphere files, we can run the init_atmosphere_model program:
$ ./init_atmosphere_model
The program should take just a minute or two to run, and the result should be an x1.10242.init.nc file and a new log.init_atmosphere.0000.out file. Assuming no errors were reported at the end of the log.init_atmosphere.0000.out file, we can plot, e.g., the skin temperature field in the x1.10242.init.nc file using the plot_tsk.ncl NCL script:
$ cp /classroom/wrfhelp/mpas/ncl_scripts/plot_tsk.ncl .
$ setenv FNAME x1.10242.init.nc
$ ncl plot_tsk.ncl
Again, we'll examine the use of NCL scripts for visualization later, so for now we only need to verify that a plot like the following is produced:
If a plot like the above is produced, then we have completed the generation of an initial conditions file for our 240-km real-data simulation!
Because MPAS-Atmosphere — at least, when run as a stand-alone model — does not contain prognostic equations for the SST and sea-ice fraction, these fields would remain constant if not updated from an external source; this is, of course, not realistic, and it will generally impact the quality of longer model simulations. Consequently, for MPAS-Atmosphere simulations longer than roughly a week, it is typically necessary to periodically update the sea-surface temperature (SST) field in the model. For real-time simulations, this is generally not an option, but it is feasible for retrospective simulations, where we have observed SST analyses available to us.
In order to create an SST update file, we will make use of a sequence of intermediate files containing SST and sea-ice analyses that are available in /classroom/wrfhelp/mpas/met_data. Before proceeding, we'll link all of these SST intermediate files into our working directory:
$ ln -sf /classroom/wrfhelp/mpas/met_data/SST* .
Following Section 7.2.2 of the MPAS-Atmosphere Users' Guide, we must edit the namelist.init_atmosphere file to specify the range of dates for which we have SST data, as well as the frequency at which the SST intermediate files are available. The key namelist options that must be set are shown below; other options can be ignored.
&nhyd_model
    config_init_case = 8
    config_start_time = '2014-09-10_00:00:00'
    config_stop_time = '2014-09-20_00:00:00'
/
&data_sources
    config_sfc_prefix = 'SST'
    config_fg_interval = 86400
/
&preproc_stages
    config_static_interp = false
    config_native_gwd_static = false
    config_vertical_grid = false
    config_met_interp = false
    config_input_sst = true
    config_frac_seaice = true
/
Note in particular that we have set the config_init_case variable to 8! This is the initialization case used to create surface update files, instead of real-data initial condition files.
As before, we also need to edit the streams.init_atmosphere file, this time setting the name of the SST update file to be created by the "surface" stream, as well as the frequency at which this update file should contain records:
<immutable_stream name="surface" type="output" filename_template="x1.10242.sfc_update.nc" filename_interval="none" packages="sfc_update" output_interval="86400"/>
Note that we have set both the filename_template and output_interval attributes of the "surface" stream. The output_interval should match the interval specified in the namelist for the config_fg_interval variable. The other streams ("input" and "output") can remain unchanged — the input file should still be set to the name of the static file.
After setting up the namelist.init_atmosphere and streams.init_atmosphere files, we're ready to run the init_atmosphere_model program:
$ ./init_atmosphere_model
After the job completes, as before, the log.init_atmosphere.0000.out file should report that there were no errors. Additionally, a file named x1.10242.sfc_update.nc should have been created.
We can confirm that the x1.10242.sfc_update.nc file contains the requested valid times for the SST and sea-ice fields by printing the "xtime" variable from the file; in MPAS, the "xtime" variable in a netCDF file is always used to store the times at which records in the file are valid.
$ ncdump -v xtime x1.10242.sfc_update.nc
This ncdump command should print the following times:
xtime = "2014-09-10_00:00:00 ", "2014-09-11_00:00:00 ", "2014-09-12_00:00:00 ", "2014-09-13_00:00:00 ", "2014-09-14_00:00:00 ", "2014-09-15_00:00:00 ", "2014-09-16_00:00:00 ", "2014-09-17_00:00:00 ", "2014-09-18_00:00:00 ", "2014-09-19_00:00:00 ", "2014-09-20_00:00:00 " ;
If the times shown above were printed, then we have successfully created an SST and sea-ice
update file for MPAS-Atmosphere. As a final check, we can use
the NCL script from /classroom/wrfhelp/mpas/ncl_scripts/plot_delta_sst.ncl to plot the difference in
the SST field between the last time in the surface update file and the first time in the file.
Because the surface update file contains no information about the latitude and longitude of
grid cells, we need to specify two environment variables when running this script: one to give
the name of the surface update file, and the second to give the name of the grid file, which
contains cell latitude and longitude fields:
$ cp /classroom/wrfhelp/mpas/ncl_scripts/plot_delta_sst.ncl .
$ setenv FNAME x1.10242.sfc_update.nc
$ setenv GNAME x1.10242.static.nc
$ ncl plot_delta_sst.ncl
In this plot, we have masked out the SST differences over land, since the values of the field over land are not representative of actual SST differences, but may represent differences in, e.g., skin temperature or 2-m air temperature, depending on the source of the SST analyses.
Assuming that an initial condition file and a surface update file have been created, we're ready to run the MPAS-Atmosphere model itself!
Working from our 240km_uniform directory, we'll begin by creating a symbolic link to the atmosphere_model executable that we compiled in ${HOME}/MPAS-Model. We'll also make copies of the namelist.atmosphere and streams.atmosphere files. Recall from when we compiled the model that there are stream_list.atmosphere.* files that accompany the streams.atmosphere file; we'll copy these as well.
$ ln -s ${HOME}/MPAS-Model/atmosphere_model .
$ cp ${HOME}/MPAS-Model/namelist.atmosphere .
$ cp ${HOME}/MPAS-Model/streams.atmosphere .
$ cp ${HOME}/MPAS-Model/stream_list.atmosphere.* .
You'll also recall that there were many look-up tables and other data files that are needed by various physics parameterizations in the model. Before we can run MPAS-Atmosphere, we'll need to create symbolic links to these files as well in our working directory:
$ ln -s ${HOME}/MPAS-Model/src/core_atmosphere/physics/physics_wrf/files/* .
Before running the model, we will need to edit the namelist.atmosphere file, which is the namelist
for the MPAS-Atmosphere model itself. The default namelist.atmosphere is set up for a 120-km mesh;
in order to run the model on a different mesh, there are several key parameters that depend on
the model resolution: the model timestep, config_dt, and the mixing length scale, config_len_disp.
To tell the model the date and time at which integration will begin (important, e.g., for computing
solar radiation parameters), and to specify the length of the integration to be run, there are
two other parameters that must be set for each simulation: config_start_time and config_run_duration.
Lastly, when running the MPAS-Atmosphere model in parallel, we must tell the model the prefix
of the filenames that contain mesh partitioning information for different MPI task counts; this is
done with the config_block_decomp_file_prefix option.
Accordingly, we must edit at least these parameters in the namelist.atmosphere file; other parameters are described in Appendix B of the MPAS-Atmosphere Users' Guide, and do not necessarily need to be changed:
&nhyd_model
    config_dt = 1200.0
    config_start_time = '2014-09-10_00:00:00'
    config_run_duration = '5_00:00:00'
    config_len_disp = 240000.0
/
&decomposition
    config_block_decomp_file_prefix = 'x1.10242.graph.info.part.'
/
For a relatively coarse mesh like the 240-km mesh, we can also call the radiation schemes less frequently to allow the model to run a little more quickly. Let's set the radiation calling interval to 1 hour:
&physics
    config_radtlw_interval = '01:00:00'
    config_radtsw_interval = '01:00:00'
/
After changing the parameters shown above in the namelist.atmosphere file, we must also set the name of the initial condition file, the name of the surface update file, and the interval at which we would like to read the surface update file in the streams.atmosphere file:
<immutable_stream name="input" type="input" filename_template="x1.10242.init.nc" input_interval="initial_only"/> <stream name="surface" type="input" filename_template="x1.10242.sfc_update.nc" filename_interval="none" input_interval="86400"> <file name="stream_list.atmosphere.surface"/> </stream>
Having set up the namelist.atmosphere and streams.atmosphere files, we're almost ready to run the model in parallel. You'll recall from the lectures that we need to supply a graph partition file to tell MPAS how the horizontal domain is partitioned among MPI tasks. Accordingly, we'll create a symbolic link to the partition file for 4 MPI tasks for the 240-km mesh in our working directory:
$ ln -s /classroom/wrfhelp/mpas/meshes/x1.10242.graph.info.part.4 .
As a general rule, a mesh with N grid columns should be run on at most N/160 processors in MPAS-Atmosphere to make efficient use of the processors. So, for the mesh in this exercise, which has 10242 grid columns, we could in principle use up to about 64 MPI tasks while still making relatively efficient use of the computing resources. For this exercise, however, we only have 4 cores available to us.
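For reference, graph partition files like this one are created with the METIS gpmetis program from a mesh's graph.info file. We won't need to do this in the tutorial, since the partition file for 4 tasks is provided, but assuming gpmetis were installed and the x1.10242.graph.info file were available, a partitioning for, say, 8 MPI tasks could be generated with something like:

$ gpmetis x1.10242.graph.info 8

which would write a file named x1.10242.graph.info.part.8.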
Now, we should be able to run MPAS using 4 MPI tasks:
$ mpiexec -n 4 ./atmosphere_model
Once the model begins to run, it will write information about its progress to the log.atmosphere.0000.out file. From another terminal window, we can use tail -f to follow the updates to the log file as the model runs:
$ tail -f log.atmosphere.0000.out
If the model has started up successfully and begun to run, we should see messages like this
Begin timestep 2014-09-10_06:00:00
 --- time to run the LW radiation scheme L_RADLW =T
 --- time to run the SW radiation scheme L_RADSW =T
 --- time to run the convection scheme L_CONV =T
 --- time to apply limit to accumulated rainc and rainnc L_ACRAIN =F
 --- time to apply limit to accumulated radiation diags. L_ACRADT =F
 --- time to calculate additional physics_diagnostics =F
 split dynamics-transport integration 3
 global min, max w -0.177687 0.512867
 global min, max u -112.127 112.008
 Timing for integration step: 5.78077 s
written for each timestep that the model takes. If not, a log.atmosphere.0000.err file may have been created, and if so, it may have messages to indicate what might have gone wrong.
The model simulation will take about an hour to complete. As it runs, you may notice that several netCDF output files are being created, with names beginning with history, diag, and restart.
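For example, a quick listing in the run directory (the exact set of files and timestamps will depend on how far the simulation has progressed) can be obtained with:

$ ls -l history*nc diag*nc

with restart.*.nc files also appearing as the simulation reaches its restart output times.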
If the model has successfully begun running, we can move to the next sub-section to obtain and compile the convert_mpas utility for interpolating model output files to a lat-lon grid for quick visualization.
We saw in earlier sections that NCL scripts may be used to visualize fields in MPAS netCDF files. Although these scripts can produce publication-quality figures, many times all that is needed is a way to quickly inspect the fields in a file. The convert_mpas program is designed to interpolate fields from the unstructured MPAS mesh to a regular lat-lon grid for easy checking, e.g., with ncview.
As with the MPAS-Model code, the convert_mpas code may be obtained most easily by cloning a GitHub repository. Working from our home directory, we'll download the code in this way:
$ cd ${HOME}
$ git clone https://github.com/mgduda/convert_mpas.git
$ cd convert_mpas
The convert_mpas utility is compiled via a simple Makefile, and in general, build settings can be adjusted by editing the Makefile in the main convert_mpas directory. On these classroom machines, the netCDF library version is quite old, so we'll need to make one change in the Makefile, deleting the string "-DHAVE_NF90_INQ_VARIDS".
We also need to make one small code change for these classroom machines: in the src/file_output.F file, we need to replace the string "NF90_NETCDF4" with "NF90_64BIT_OFFSET".
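If you prefer to make these two edits from the command line rather than in a text editor, something along these lines (assuming GNU sed with the -i option) should work from within the convert_mpas directory:

$ sed -i 's/-DHAVE_NF90_INQ_VARIDS//' Makefile
$ sed -i 's/NF90_NETCDF4/NF90_64BIT_OFFSET/' src/file_output.F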
After making these two changes, we can build the convert_mpas utility with the make command:
$ make
If compilation was successful, there should now be an executable file named convert_mpas. So that we can run this program from any of our working directories, we'll add our current directory to our shell ${PATH} environment variable:
$ setenv PATH ${HOME}/convert_mpas:${PATH}
You may like to take a few minutes to read through the README.md file to familiarize yourself with the usage of the convert_mpas program. We'll have a chance to exercise the convert_mpas utility in the next practical session.
In the previous sessions, we were able to make a five-day, real-data simulation on a 240-km quasi-uniform mesh. In this session, we'll practice preparing a variable-resolution, 240km – 48km mesh with the grid_rotate tool, after which we'll proceed as we did for the quasi-uniform case to process static, terrestrial fields on this variable-resolution mesh.
As we saw in the first session, processing the static fields can take some time, so while this processing is taking place, we'll practice using the convert_mpas tool to interpolate output from our 240-km quasi-uniform simulation to a lat-lon grid for quick visualization.
Once the static field processing has completed for the 240km – 48km variable-resolution mesh, we'll interpolate the initial conditions fields and start a simulation on this mesh. This simulation will take several hours to complete, so we'll want to get this running so that it will have completed by the next session.
A common theme with MPAS is the need to obtain source code from git repositories hosted on GitHub. The grid_rotate utility is no exception in this regard. One difference, however, is that the grid_rotate code is contained in a repository that houses many different tools for MPAS cores, including the MPAS-Albany Land Ice core, the MPAS-Ocean core, and the MPAS-Seaice core.
The grid_rotate utility is found in the MPAS-Tools repository, which we can clone into our home directory with the following commands:
$ cd ${HOME} $ git clone https://github.com/MPAS-Dev/MPAS-Tools.git
Within the MPAS-Tools repository, the grid_rotate code resides in the mesh_tools/grid_rotate sub-directory. On these classroom machines, we can change to this directory and build the grid_rotate utility with the make command:
$ cd MPAS-Tools/mesh_tools/grid_rotate $ make
The netCDF library on these machines is quite old, so there may be some build warnings about missing nc-config commands; in this particular case, these can be ignored, but in general, one should be concerned with build warnings.
So that we can more easily run the grid_rotate program from any working directory, we'll add our current working directory to our shell ${PATH}.
$ setenv PATH ${HOME}/MPAS-Tools/mesh_tools/grid_rotate:${PATH}
Now that we have a means to rotate the location of refinement in a variable-resolution MPAS mesh, we can begin the process of preparing such a mesh. For this tutorial, we'll be using a mesh that refines from 240-km grid spacing down to about 48-km grid spacing over an elliptic region.
When creating initial conditions and running the 240-km quasi-uniform simulation, we worked in a sub-directory named 240km_uniform. For our variable-resolution simulation, we'll work in a sub-directory named 240-48km_variable to keep our work separated. Let's create the 240-48km_variable directory and link in the variable-resolution mesh from /classroom/wrfhelp/mpas/meshes into that directory:
$ cd ${HOME}
$ mkdir 240-48km_variable
$ cd 240-48km_variable
$ ln -s /classroom/wrfhelp/mpas/meshes/x5.30210.grid.nc .
The x5.30210.grid.nc mesh contains 30210 grid cells and refines by a factor of five (from 240-km grid spacing to 48-km grid spacing). The refined region is centered over (0.0 lat, 0.0 lon), and we can get a quick idea of the size of the refinement by plotting contours of the approximate mesh resolution with the mesh_resolution.ncl NCL script:
$ cp /classroom/wrfhelp/mpas/ncl_scripts/mesh_resolution.ncl .
$ setenv FNAME x5.30210.grid.nc
$ ncl mesh_resolution.ncl
When running the mesh_resolution.ncl script, a plot like that below should be produced.
Using the grid_rotate tool, we can change the location and orientation of this refinement to an area of our choosing. Before running the grid_rotate program, we need a copy of the input namelist that is read by this program:
$ cp ${HOME}/MPAS-Tools/mesh_tools/grid_rotate/namelist.input .
As described in the lectures, we'll set the parameters in the namelist.input file to relocate the refined region in the mesh. For example, to move the refinement over South America, we might use:
&input
   config_original_latitude_degrees = 0
   config_original_longitude_degrees = 0
   config_new_latitude_degrees = -19.5
   config_new_longitude_degrees = -62
   config_birdseye_rotation_counter_clockwise_degrees = 90
/
Feel free to place the refinement wherever you would like!
After making changes to the namelist.input file, we can rotate the x5.30210.grid.nc file to create a new "grid" file with any name we choose. For example:
$ grid_rotate x5.30210.grid.nc SouthAmerica.grid.nc
As with the original x5.30210.grid.nc file, we can plot the approximate resolution of our rotated grid file with commands like the following:
$ setenv FNAME SouthAmerica.grid.nc
$ ncl mesh_resolution.ncl
Of course, the name SouthAmerica.grid.nc should be replaced with the name that you have chosen for your rotated grid when setting the FNAME environment variable. In our example, the resulting plot looks like the one below.
It may take some iteration to get the refinement just where you would like it. You can go back and edit the namelist.input file, re-run the grid_rotate program, and re-run the ncl mesh_resolution.ncl command until you are satisfied!
Interpolating static, geographical fields to a variable-resolution MPAS mesh works in essentially the same way as it does for a quasi-uniform mesh. So, the steps taken in this section should seem familiar — they mirror what we did in Section 1.3!
To begin the process, we'll symbolically link the init_atmosphere_model executable into our 240-48km_variable directory, and we'll copy the default namelist.init_atmosphere and streams.init_atmosphere files as well:
$ ln -s ${HOME}/MPAS-Model/init_atmosphere_model .
$ cp ${HOME}/MPAS-Model/namelist.init_atmosphere .
$ cp ${HOME}/MPAS-Model/streams.init_atmosphere .
As before, we'll set the following variables in the namelist.init_atmosphere file:
&nhyd_model
    config_init_case = 7
/
&data_sources
    config_geog_data_path = '/classroom/wrfhelp/GEOG_DATA/WPS_GEOG/'
    config_landuse_data = 'USGS'
    config_topo_data = 'GTOPO30'
/
&preproc_stages
    config_static_interp = true
    config_native_gwd_static = true
    config_vertical_grid = false
    config_met_interp = false
    config_input_sst = false
    config_frac_seaice = false
/
In editing the streams.init_atmosphere file, we'll choose the filename for the "input" stream to match whatever name we chose for our rotated grid file:
<immutable_stream name="input" type="input" filename_template="SouthAmerica.grid.nc" input_interval="initial_only"/>
and we'll set the filename for the "output" stream to follow the same naming convention, but using "static" instead of "grid":
<immutable_stream name="output" type="output" filename_template="SouthAmerica.static.nc" packages="initial_conds" output_interval="initial_only" />
With these changes, we can run the init_atmosphere_model program:
$ ./init_atmosphere_model
As before, processing of the static, geographical fields may take up to an hour.
While the static, geographical fields are being processed for our 240km – 48km variable-resolution mesh, we can experiment with the use of the convert_mpas utility to interpolate output from our 240-km, quasi-uniform simulation.
If you didn't have a chance to download and compile the convert_mpas program back in Section 2.4, you will need to go back to that section before proceeding with the rest of this section.
In another terminal window, we'll change to the 240km_uniform directory, and we may also need to add the directory containing the convert_mpas utility to our shell ${PATH} variable again:
$ cd ${HOME}/240km_uniform
$ setenv PATH ${HOME}/convert_mpas:${PATH}
In the 240km_uniform directory, we should see output files from MPAS that are named history*nc and diag*nc:
$ ls -l history*nc diag*nc
-rw-r--r-- 1 class101 cbet 4526928 Jul 28 21:03 diag.2014-09-10_00.00.00.nc
-rw-r--r-- 1 class101 cbet 4526928 Jul 28 21:04 diag.2014-09-10_03.00.00.nc
...
-rw-r--r-- 1 class101 cbet 4526928 Jul 28 21:26 diag.2014-09-14_21.00.00.nc
-rw-r--r-- 1 class101 cbet 4526928 Jul 28 21:26 diag.2014-09-15_00.00.00.nc
-rw-r--r-- 1 class101 cbet 94457824 Jul 28 21:03 history.2014-09-10_00.00.00.nc
-rw-r--r-- 1 class101 cbet 94457824 Jul 28 21:04 history.2014-09-10_06.00.00.nc
...
-rw-r--r-- 1 class101 cbet 94457824 Jul 28 21:25 history.2014-09-14_18.00.00.nc
-rw-r--r-- 1 class101 cbet 94457824 Jul 28 21:26 history.2014-09-15_00.00.00.nc
As a first try, we can interpolate the fields from the final model "history" file, history.2014-09-15_00.00.00.nc, to a 0.5-degree lat-lon grid with the command:
$ convert_mpas history.2014-09-15_00.00.00.nc
The interpolation should take less than a minute, and the result should be a file named latlon.nc. Let's take a look at this file with the ncview utility:
$ ncview latlon.nc
Depending on which field you've chosen to view first, you may need to change the coordinate axes that are displayed in ncview using the "Axes" menu.
After setting the coordinate axes in the "Axes" menu, we can, e.g., view the qv field, which may look something like the plot below at the lowest model level.
When we're done viewing the latlon.nc file, we'll delete it. The convert_mpas utility attempts to append to existing latlon.nc files when converting a new MPAS netCDF file, and this often doesn't work as expected, so to avoid confusion, it's best to remove the latlon.nc file we just created with
$ rm latlon.nc
before converting another MPAS output file.
As we just saw, it can take upwards of a minute to interpolate a single MPAS "history" file (at least, on these classroom machines). If we are only interested in a small subset of the fields in an MPAS file, we can restrict the interpolation to just those fields by providing a file named include_fields in our working directory.
Using an editor of your choice, create a new text file named include_fields with the following contents:
u10
v10
precipw
After saving the file, we can re-run the convert_mpas program as before to produce a latlon.nc file with just the 10-meter surface winds and precipitable water:
$ convert_mpas history.2014-09-15_00.00.00.nc $ ncview latlon.nc
The convert_mpas program should have taken just a few seconds to run; if not, make sure the latlon.nc file was deleted before re-running convert_mpas. The only fields in the new latlon.nc file should be "precipw", "u10", and "v10".
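If you'd like to confirm which fields ended up in the file, the netCDF header can be printed with:

$ ncdump -h latlon.nc

and, as noted above, the only fields listed should be precipw, u10, and v10.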
Rather than converting just a single file at a time, we can convert multiple MPAS output files and write the results to the same latlon.nc file. The key to doing this is to use the first command-line argument to convert_mpas to specify a file that contains horizontal (SCVT) mesh information, and to let subsequent command-line arguments specify MPAS netCDF files that contain fields to be remapped.
To interpolate the "precipw", "u10", and "v10" fields for our entire 5-day simulation, we can use the following commands:
$ rm latlon.nc
$ convert_mpas x1.10242.init.nc history.*.nc
$ ncview latlon.nc
Note, again, that when multiple command-line arguments are given, the first command-line argument to convert_mpas is used only to obtain SCVT mesh information, and all remaining command-line arguments specify netCDF files with fields to be interpolated!
When viewing a netCDF file with multiple time records, you can step through the time periods in ncview with the "forward" and "backward" buttons to the right and left of the "pause" button:
As a final exercise in viewing MPAS output via the convert_mpas and ncview programs, we'll zoom in on a part of the globe, looking at the 250 hPa vertical vorticity field over East Asia on a 1/10-degree lat-lon grid.
To do this, we will first need to edit the include_fields file so that it contains just the following line:
vorticity_250hPa
Then, we'll create a new file named target_domain with the editor of our choice. This file should contain the following lines:
startlat=30
endlat=60
startlon=90
endlon=150
nlat=300
nlon=600
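With these values, the target grid spans 30 degrees of latitude (from 30 to 60) with nlat=300 points and 60 degrees of longitude (from 90 to 150) with nlon=600 points, i.e., 30/300 = 60/600 = 0.1 degrees between grid points, which gives the 1/10-degree grid mentioned above.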
After editing the include_fields file and creating the target_domain file, we can remove the old latlon.nc file and use the convert_mpas program to interpolate the 250 hPa vorticity field from our diag.*.nc files:
$ rm latlon.nc
$ convert_mpas x1.10242.init.nc diag*nc
$ ncview latlon.nc
If the processing of the static, geographical fields for the 240km – 48km variable-resolution mesh has not yet finished, feel free to experiment more with the convert_mpas program!
Once the processing of the static, geographical fields has completed in the 240-48km_variable directory, we should have a new "static" file as well as a log.init_atmosphere.0000.out file. In our example, the static file is named SouthAmerica.static.nc, but your file should have whatever name was specified in the definition of the "output" stream in the streams.init_atmosphere file.
Before proceeding, it's a good idea to verify that there were no errors reported in the log.init_atmosphere.0000.out file. We can also make a plot of the terrain height field as we did for the 240-km quasi-uniform simulation:
$ cp /classroom/wrfhelp/mpas/ncl_scripts/plot_terrain.ncl .
$ setenv FNAME SouthAmerica.static.nc
$ ncl plot_terrain.ncl
Once we have convinced ourselves that the static, geographical processing was successful, we can proceed as we did for the 240-km quasi-uniform simulation. We'll create a symbolic link to the NCEP GFS intermediate file valid at 0000 UTC on 10 September 2014:
$ ln -s /classroom/wrfhelp/mpas/met_data/GFS:2014-09-10_00 .
As before, we'll set up the namelist.init_atmosphere file to instruct the init_atmosphere_model program to interpolate meteorological and land-surface initial conditions:
&nhyd_model
    config_init_case = 7
    config_start_time = '2014-09-10_00:00:00'
/
&dimensions
    config_nvertlevels = 41
    config_nsoillevels = 4
    config_nfglevels = 38
    config_nfgsoillevels = 4
/
&data_sources
    config_met_prefix = 'GFS'
    config_landuse_data = 'USGS'
    config_topo_data = 'GTOPO30'
    config_use_spechumd = false
/
&vertical_grid
    config_ztop = 30000.0
    config_nsmterrain = 1
    config_smooth_surfaces = true
    config_dzmin = 0.3
    config_nsm = 30
    config_tc_vertical_grid = true
/
&preproc_stages
    config_static_interp = false
    config_native_gwd_static = false
    config_vertical_grid = true
    config_met_interp = true
    config_input_sst = false
    config_frac_seaice = true
/
Besides the namelist.init_atmosphere file, we must also edit the XML I/O configuration file, streams.init_atmosphere, to specify the name of the static file that will serve as input to the init_atmosphere_model program, as well as the name of the MPAS-Atmosphere initial condition file to be created. Specifically, we must set the filename_template for the "input" stream to the name of our static file:
<immutable_stream name="input" type="input" filename_template="SouthAmerica.static.nc" input_interval="initial_only"/>
and we must set the name of the initial condition file to be created in the "output" stream:
<immutable_stream name="output" type="output" filename_template="SouthAmerica.init.nc" packages="initial_conds" output_interval="initial_only" />
After editing the namelist.init_atmosphere and streams.init_atmosphere files, and linking the intermediate file to our run directory, we can run the init_atmosphere_model program.
$ ./init_atmosphere_model
Once the init_atmosphere_model program finishes, as always, it is best to verify that there were no error messages reported in the log.init_atmosphere.0000.out file. We should also have a file whose name matches the filename we specified for the "output" stream in the streams.init_atmosphere file. In our example, we should have a file named SouthAmerica.init.nc.
If all was successful, we are ready to run a variable-resolution MPAS simulation. For this simulation, we won't prepare an SST and sea-ice surface update file for the model; however, if you would like to create one by following what was done in Section 2.2, feel free to do so!
Assuming that an initial condition file, and, optionally, a surface update file have been created, we're ready to run a variable-resolution, 240km – 48km simulation with refinement over a part of the globe that interests us.
As a first step, you may recall that we need to create symbolic links to the atmosphere_model executable, as well as to the physics look-up tables, in our working directory:
$ ln -s ${HOME}/MPAS-Model/atmosphere_model .
$ ln -s ${HOME}/MPAS-Model/src/core_atmosphere/physics/physics_wrf/files/* .
We will also need copies of the namelist.atmosphere, streams.atmosphere, and stream_list.atmosphere.* files:
$ cp ${HOME}/MPAS-Model/namelist.atmosphere .
$ cp ${HOME}/MPAS-Model/streams.atmosphere .
$ cp ${HOME}/MPAS-Model/stream_list.atmosphere.* .
Next, we will need to edit the namelist.atmosphere file. Recall from
before that the default namelist.atmosphere is set up for a 120-km mesh;
in order to run the model on a different mesh, there are several key parameters that depend on
the model resolution:
The model timestep, config_dt must be set to a value that is appropriate for the smallest grid distance in the mesh. In our case, we need to choose a time step that is suitable for a 48-km grid distance; a value of 240 seconds is reasonable. In the lecture on MPAS dynamics and physics, we will say more about how to choose the model timestep.
Similarly, we need to set the mixing length-scale according to the smallest nominal grid distance in a variable-resolution mesh, so we can set config_len_disp to 48000 meters.
To tell the model the date and time at which integration will begin, and to specify the length
of the integration to be run, there are two other parameters that must be set for each simulation:
In order to make maximal use of the time between this practical session and the next, we'll try to run a 10-day simulation starting on 2014-09-10_00.
Lastly, when running the MPAS-Atmosphere model in parallel, we must tell the model the prefix
of the filenames that contain mesh partitioning information for different MPI task counts:
For the 240km – 48km variable-resolution mesh, we can find a mesh partition file with 4 partitions at /classroom/wrfhelp/mpas/meshes/x5.30210.graph.info.part.4. Let's symbolically link that file to our run directory now before specifying a value of 'x5.30210.graph.info.part.' for the config_block_decomp_file_prefix option in our namelist.atmosphere file.
$ ln -s /classroom/wrfhelp/mpas/meshes/x5.30210.graph.info.part.4 .
Before reading further, you may wish to try making the changes to the namelist.atmosphere yourself. Once you've done so, you can verify that the options highlighted below match the changes you have made.
&nhyd_model
    config_dt = 240.0
    config_start_time = '2014-09-10_00:00:00'
    config_run_duration = '10_00:00:00'
    config_len_disp = 48000.0
/
&decomposition
    config_block_decomp_file_prefix = 'x5.30210.graph.info.part.'
/
After changing the parameters shown above in the namelist.atmosphere file, we must also set the name of the initial condition file in the streams.atmosphere file:
<immutable_stream name="input" type="input" filename_template="SouthAmerica.init.nc" input_interval="initial_only"/>
Optionally, if you decided to create an SST update file for this variable-resolution simulation in the previous section, you can also edit the "surface" stream in the streams.atmosphere file to specify the name of that surface update file, as well as the interval at which the model should read updates from it:
<stream name="surface" type="input" filename_template="SouthAmerica.sfc_update.nc" filename_interval="none" input_interval="86400"> <file name="stream_list.atmosphere.surface"/> </stream>
Having set up the namelist.atmosphere and streams.atmosphere files, we're ready to run the model in parallel:
$ mpiexec -n 4 ./atmosphere_model
Once the model begins to run, you may like to open another terminal window, change to the 240-48km_variable directory, and "tail" the log.atmosphere.0000.out file with the "-f" option to follow the progress of the MPAS simulation:
$ tail -f log.atmosphere.0000.out
If the model has started up successfully and begun to run, we should see messages like this:
Begin timestep 2014-09-10_00:12:00
 --- time to run the LW radiation scheme L_RADLW =F
 --- time to run the SW radiation scheme L_RADSW =F
 --- time to run the convection scheme L_CONV =T
 --- time to apply limit to accumulated rainc and rainnc L_ACRAIN =F
 --- time to apply limit to accumulated radiation diags. L_ACRADT =F
 --- time to calculate additional physics_diagnostics =F
 split dynamics-transport integration 3
 global min, max w -1.06611 0.722390
 global min, max u -117.781 118.251
 Timing for integration step: 4.18681 s
This simulation will take several hours to run. Now may be a convenient time to ask any questions that you may have!
In this session, we'll restart a simulation from the end of the five-day, 240-km quasi-uniform simulation that we ran in an earlier session. Before restarting the simulation, though, we'll define new output streams.
Optionally, before restarting the simulation, we can also try adding a list of sounding locations for the simulation.
After launching the restart simulation with new output streams (and perhaps some sounding output), we'll practice using some NCL scripts to visualize the first five days of output from the variable-resolution simulation.
As we saw in the lectures, setting up a simulation to restart from a previously run simulation is relatively simple. We'll begin by making sure that we're in the directory where we ran the original, 5-day 240-km simulation:
$ cd ${HOME}/240km_uniform
Before making any changes to the namelist.atmosphere file, we will first check to see which restart files are available; if our original simulation completed successfully, we should have one restart file per day, valid at the end of each of the five days of the simulation:
$ ls -l restart*nc
-rw-r--r-- 1 class101 cbet 196953464 Jul 29 12:26 restart.2014-09-11_00.00.00.nc
-rw-r--r-- 1 class101 cbet 196953464 Jul 29 12:31 restart.2014-09-12_00.00.00.nc
-rw-r--r-- 1 class101 cbet 196953464 Jul 29 12:35 restart.2014-09-13_00.00.00.nc
-rw-r--r-- 1 class101 cbet 196953464 Jul 29 12:40 restart.2014-09-14_00.00.00.nc
-rw-r--r-- 1 class101 cbet 196953464 Jul 29 12:44 restart.2014-09-15_00.00.00.nc
In principle, we could restart our simulation from any of these files, but we'll begin from the last available file, valid at 0000 UTC on 15 September.
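If you'd like to confirm the valid time stored in a restart file before editing the namelist, the xtime variable in the file records it; for example:
$ ncdump -v xtime restart.2014-09-15_00.00.00.nc | grep xtime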
There are just two changes that need to be made to the namelist.atmosphere file: we need to set the starting time of the simulation to 2014-09-15_00:00:00, and we need to set the config_do_restart option to true:
&nhyd_model
    config_start_time = '2014-09-15_00:00:00'
/

&restart
    config_do_restart = true
/
We can keep all of the other namelist options as they were, including the config_run_duration option — restarting the simulation and running for another five days should be fine, since we had a total of 10 days of SST and sea-ice information in our surface update file.
Before running the model, we'll make a few other changes as described in the next sections!
We may also like to write out some additional fields from our restart simulation at a higher temporal frequency. You may recall from the lectures that we can define new output streams in the streams.atmosphere file.
We will provide an example here of writing out the 10-meter AGL horizontal winds every 60 minutes, but you can feel free to look through Appendix D of the MPAS-Atmosphere Users' Guide to see if there are any other fields that you may like to write out.
Let's define a new output stream named "sfc_winds" in the streams.atmosphere file:
<stream name="sfc_winds"
        type="output"
        filename_template="surface_winds.$Y-$M-$D_$h.nc"
        filename_interval="output_interval"
        output_interval="60:00" >

        <var name="u10"/>
        <var name="v10"/>
</stream>
This new stream must be defined somewhere between the opening <streams> and </streams> tags in the streams.atmosphere file.
Note in the stream definition above that, by our choice of "filename_template", the files that are written will have names like surface_winds.2014-09-15_00.nc, surface_winds.2014-09-15_01.nc, and so on. Also, MPAS will create a new file at each output time, rather than packing many time records into the same file; this is controlled by the "filename_interval" option. Finally, note that the stream will be written every 60 minutes; since the output times all fall on the hour, the minutes and seconds are unnecessary in the filenames. Of course, we would want to add $m and $s to our filename template if we were writing this stream more frequently than once per hour.
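Once the restart simulation is underway (see the following sections), a simple sanity check — assuming the filename_template above and a restart beginning at 0000 UTC 15 September — is to list the new files and confirm that only the u10 and v10 fields were written to them:
$ ls -l surface_winds.*.nc
$ ncdump -h surface_winds.2014-09-15_01.nc | grep -E "u10|v10"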
In the next section, we'll describe the steps to write out vertical profiles at specific (lat,lon) locations before we run our restart simulation.
It may at times also be helpful to write out vertical profiles of basic model fields at specified (lat,lon) locations from a model simulation. In MPAS terminology, this is described as writing "soundings" from the simulation.
In principle, we can write out soundings for as many locations as we would like. Here, we'll describe how to write out soundings from three locations: Boulder, Colorado; McMurdo Station, Antarctica; and New Delhi, India.
We will need to specify the locations for the soundings in a text file named sounding_locations.txt in our working directory:
40.0   -105.25   Boulder
28.7     77.2    NewDelhi
-77.85  166.67   McMurdo
The three columns of the file contain the latitude, longitude, and name of the sounding. The latitude and longitude are given as decimal degrees, and the name cannot contain whitespace (since it will be used to name output files).
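If you'd rather not create the file by hand in an editor, a here-document works just as well; the sketch below simply writes the same three locations shown above:
$ cat > sounding_locations.txt << EOF
40.0 -105.25 Boulder
28.7 77.2 NewDelhi
-77.85 166.67 McMurdo
EOF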
For this exercise, if you would prefer to write out soundings for different locations, feel free to change the list of locations as you would like!
Lastly, we need to specify how frequently these soundings will be written during the model simulation. This is done through the config_sounding_interval option in the namelist.atmosphere file. By default, this option can be found near the bottom of the file. Let's choose to write soundings every three hours:
&soundings
    config_sounding_interval = '3:00:00'
/
Having set the output interval for soundings, we're ready to run our restart simulation.
Now that we've edited our namelist.atmosphere file for our restart run, we've added a new output stream with surface winds to the streams.atmosphere file, and we've specified a list of locations for which profiles will be written every three hours, we can run the model with 4 MPI tasks:
$ mpiexec -n 4 ./atmosphere_model
In another terminal window, it's a good idea to change to the 240km_uniform directory and check several points: that the model restarted from 0000 UTC on 15 September (rather than from the original start time), that the new surface_winds.*.nc files are being written every hour, and that sounding files are being created for each of our three locations every three hours.
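A few quick commands cover these checks; the exact names of the sounding files may differ, so the last command simply searches for one of the station names:
$ grep 2014-09-15 log.atmosphere.0000.out | head
$ ls -l surface_winds.*.nc
$ ls | grep -i boulder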
If all of the above check out, our restart simulation is successfully running! In the next section, we'll look at a couple of new NCL scripts for plotting model output.
Now that our restart simulation is running, we can examine in more detail an example NCL script for plotting horizontal fields as individual cells. To try out this script, we can work with output from our 240km – 48km variable-resolution simulation.
In order to plot horizontal fields, we can begin by ensuring that we are in the 240-48km_variable directory before copying over the latlon_cells.ncl script:
$ cd ${HOME}/240-48km_variable
$ cp /classroom/wrfhelp/mpas/ncl_scripts/latlon_cells.ncl .
This script reads the name of the MPAS netCDF file with SCVT mesh information from an environment variable named GNAME (roughly, "G" means "grid"). The MPAS netCDF file that contains the actual field to be plotted is identified by the environment variable FNAME (roughly, "F" means "field").
We'll plot a field from the diag.2014-09-11_00.00.00.nc file. The mesh information can be found in the MPAS initial conditions file, which may be named SouthAmerica.init.nc, though you may have chosen a different name corresponding to a different location of refinement. We'll set the GNAME and FNAME environment variables before proceeding:
$ setenv GNAME SouthAmerica.init.nc
$ setenv FNAME diag.2014-09-11_00.00.00.nc
Remember, your initial conditions file may have a different name!
By default, the latlon_cells.ncl script plots the lowest-model-level wind speed, and the output is saved to a PNG image in a file named cell_plot.png. Running the script should produce a plot, which you can view with the eog command.
$ ncl latlon_cells.ncl
$ eog cell_plot.png
Since we are working with a variable-resolution mesh, we may like to set a plotting window to cover only part of the refined region in our simulation. We can set a plotting window in the latlon_cells.ncl script by editing the section of the script that looks like this:
; The bounding box for the plot
mapLeft   = -180.0
mapRight  =  180.0
mapBottom =  -90.0
mapTop    =   90.0
Feel free to set the plotting window however you would like for your variable-resolution simulation. For example, we might set the plotting window to cover just South America before re-running the script, so that the plot focuses on the refined region.
Finally, we may like to plot other fields. We can see which fields were written to the diag.*.nc files using ncdump:
$ ncdump -h diag.2014-09-11_00.00.00.nc
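The full header can be rather long; if we only want the names of the fields available for plotting, we can filter the output for the variable declarations (fields appear as float or double, depending on the precision with which the model was built):
$ ncdump -h diag.2014-09-11_00.00.00.nc | grep -E "float|double"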
If, for example, we would like to plot Convective Available Potential Energy (CAPE), we could edit the latlon_cells.ncl script near the comment "The field to be plotted" so that it looked like the following:
; The field to be plotted
h = f->cape(0,:)

; The field name
field_name = "CAPE"
field_units = "[J/kg]"

; The range of field values to plot (start, stop, number)
cnLevels = fspan(0,3000,50)
Generally, we will need to assign the variable to be plotted, specify the plotting name and units for the field, and specify a range of values for the field. Making these changes and re-running the NCL script should produce a plot of CAPE over our chosen plotting window.
Now that you have a feel for how to edit the latlon_cells.ncl script, feel free to try plotting other fields in any remaining time in this session!
In this final session, we'll get some experience in making source-code changes to add a passive tracer to the model. Although the exercises in this session are independent of those in other sessions, it may be best to finish the exercises from the previous sessions first, since we will be changing the model source code and re-compiling the model executable.
We'll begin by defining our new passive tracer in the Registry.xml file for the init_atmosphere core. Let's change directories to the copy of the MPAS-Model source code in our ${HOME} directory:
$ cd ${HOME}/MPAS-Model
Opening up the src/core_init_atmosphere/Registry.xml file with our editor of choice, we will look for the var_array section named "scalars"; it should look something like this:
<var_array name="scalars" type="real" dimensions="nVertLevels nCells Time">
    <var name="qv" array_group="moist"/>
    <var name="qc" array_group="moist"/>
    <var name="qr" array_group="moist"/>
</var_array>
As we saw in the lectures, we will need to add our new passive tracers here. Let's add a new passive tracer named "mytracer1". The array_group that we assign to this tracer can be anything other than "moist". By convention, we'll call the array group for this new tracer "passive". After adding our new tracer, the "scalars" var_array should look like this:
<var_array name="scalars" type="real" dimensions="nVertLevels nCells Time">
    <var name="qv" array_group="moist"/>
    <var name="qc" array_group="moist"/>
    <var name="qr" array_group="moist"/>
    <var name="mytracer1" array_group="passive"/>
</var_array>
After making this change to the Registry.xml file, we'll clean and re-compile the init_atmosphere core. Recall that, after we make changes to the Registry.xml file, we need to "clean" the build first so that the changes take effect. Later, when we modify only Fortran source code, there will be no need to "clean" before re-compiling. The following two commands should suffice:
$ make clean CORE=init_atmosphere
$ make gfortran CORE=init_atmosphere PRECISION=single
If compilation was successful, we should now have a new init_atmosphere_model executable. At this point, if we were to create new initial conditions, the "mytracer1" field would be present in the initial conditions, but it would be zero everywhere. We next need to add code to define the initial conditions for "mytracer1".
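One quick way to confirm that the executable was actually rebuilt is to check its timestamp:
$ ls -l init_atmosphere_model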
To define the initial value of our new tracer, use your preferred editor to open the src/core_init_atmosphere/mpas_init_atm_cases.F file. Search for these two lines of code that call the routine to initialize real-data test cases:
call init_atm_case_gfs(block_ptr, mesh, nCells, nEdges, nVertLevels, fg, state, &
                       diag, diag_physics, config_init_case, block_ptr % dimensions, block_ptr % configs)
An appropriate place to add a call to our tracer initialization routine would be directly after the call to init_atm_case_gfs. Let's add a new line of code below this to call a new subroutine, which we'll name init_tracers.
call init_atm_case_gfs(block_ptr, mesh, nCells, nEdges, nVertLevels, fg, state, &
                       diag, diag_physics, config_init_case, block_ptr % dimensions, block_ptr % configs)
call init_tracers(mesh, state)
Notice that this new routine (which we have yet to define!) takes as input two arguments that are MPAS "pools".
As a start, we will define the init_tracers subroutine so that it simply writes a message to the log file. At the bottom of the src/core_init_atmosphere/mpas_init_atm_cases.F file, but before the line "end module init_atm_cases", let's add the following Fortran code:
subroutine init_tracers(mesh, state)

    implicit none

    type (mpas_pool_type), intent(in) :: mesh
    type (mpas_pool_type), intent(inout) :: state

    call mpas_log_write('====== Handling tracer initialization ======')

end subroutine init_tracers
After saving our code changes, we can re-compile the init_atmosphere core with:
$ make gfortran CORE=init_atmosphere PRECISION=single
Again, because we have not changed the Registry.xml file since we last re-compiled, there's no need to "make clean".
Compilation should take just a minute, and if it completes successfully, we're ready to try out our modified code, even though it won't actually create initial conditions for our passive tracer yet. So that we don't overwrite files from any of our previous sessions, we'll work in a new directory, passive_tracer, inside our home directory:
$ cd ${HOME}
$ mkdir passive_tracer
$ cd passive_tracer
We'll test out our passive tracer on the 240-km uniform mesh, for which we have already created a "static" file. Now that we have experience in creating initial conditions, the following procedure should seem familiar:
$ ln -s ${HOME}/240km_uniform/x1.10242.static.nc .
$ ln -s /classroom/wrfhelp/mpas/met_data/GFS:2014-09-10_00 .
$ ln -s ${HOME}/MPAS-Model/init_atmosphere_model .
$ cp ${HOME}/MPAS-Model/namelist.init_atmosphere .
$ cp ${HOME}/MPAS-Model/streams.init_atmosphere .
As in Section 2.1, we'll edit the namelist.init_atmosphere so that it looks like the following:
&nhyd_model
    config_init_case = 7
    config_start_time = '2014-09-10_00:00:00'
/

&dimensions
    config_nvertlevels = 41
    config_nsoillevels = 4
    config_nfglevels = 38
    config_nfgsoillevels = 4
/

&data_sources
    config_met_prefix = 'GFS'
    config_landuse_data = 'USGS'
    config_topo_data = 'GTOPO30'
    config_use_spechumd = false
/

&vertical_grid
    config_ztop = 30000.0
    config_nsmterrain = 1
    config_smooth_surfaces = true
    config_dzmin = 0.3
    config_nsm = 30
    config_tc_vertical_grid = true
/

&preproc_stages
    config_static_interp = false
    config_native_gwd_static = false
    config_vertical_grid = true
    config_met_interp = true
    config_input_sst = false
    config_frac_seaice = true
/
We'll then edit the filename that will be read by the "input" stream in the streams.init_atmosphere file so that it matches the name of our "static" file:
<immutable_stream name="input"
                  type="input"
                  filename_template="x1.10242.static.nc"
                  input_interval="initial_only"/>
Also as in Section 2.1, we'll change the name of the initial conditions file to be created:
<immutable_stream name="output"
                  type="output"
                  filename_template="x1.10242.init.nc"
                  packages="initial_conds"
                  output_interval="initial_only" />
We should now be able to run the init_atmosphere_model executable to produce an x1.10242.init.nc file:
$ ./init_atmosphere_model
If there were no errors in the log.init_atmosphere.0000.out file, we can check for the presence of our passive tracer in the x1.10242.init.nc file:
$ ncdump -h x1.10242.init.nc | grep mytracer1
Besides the creation of the x1.10242.init.nc file, we should also look for this message in the log.init_atmosphere.0000.out file:
====== Handling tracer initialization ======
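Rather than scanning the whole log file by eye, we can search for the message directly:
$ grep "Handling tracer initialization" log.init_atmosphere.0000.out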
If all was successful, we can go back to the MPAS-Model directory and add code to provide initial conditions for our passive tracer. Let's change back to the MPAS source code directory:
$ cd ${HOME}/MPAS-Model
Using our favorite editor, let's open the src/core_init_atmosphere/mpas_init_atm_cases.F file and change the code for our init_tracers subroutine so that it looks like the following:
subroutine init_tracers(mesh, state)

    implicit none

    type (mpas_pool_type), intent(in) :: mesh
    type (mpas_pool_type), intent(inout) :: state

    integer, pointer :: index_mytracer1
    integer, pointer :: nCells
    integer, pointer :: nVertLevels
    integer :: k, iCell
    real (kind=RKIND), dimension(:), pointer :: latCell
    real (kind=RKIND), dimension(:), pointer :: lonCell
    real (kind=RKIND), dimension(:,:,:), pointer :: scalars

    call mpas_log_write('====== Handling tracer initialization ======')

    call mpas_pool_get_dimension(mesh, 'nCells', nCells)
    call mpas_pool_get_dimension(mesh, 'nVertLevels', nVertLevels)
    call mpas_pool_get_dimension(state, 'index_mytracer1', index_mytracer1)

    call mpas_pool_get_array(state, 'scalars', scalars)
    call mpas_pool_get_array(mesh, 'latCell', latCell)
    call mpas_pool_get_array(mesh, 'lonCell', lonCell)

    do iCell = 1, nCells
        do k = 1, nVertLevels
            scalars(index_mytracer1,k,iCell) = &
                0.5 * (sin(latCell(iCell)*4.0) * &
                       sin(lonCell(iCell)*8.0) + 1.0)
        end do
    end do

end subroutine init_tracers
After saving our changes, we can re-compile the init_atmosphere core once again:
$ make gfortran CORE=init_atmosphere PRECISION=single
We can now try creating new initial conditions with our updated init_tracers routine. Let's change back to the passive_tracer directory, remove the old x1.10242.init.nc file, and create a new initial conditions file:
$ cd ${HOME}/passive_tracer
$ rm x1.10242.init.nc
$ ./init_atmosphere_model
If the creation of the new x1.10242.init.nc file was successful, you may like to try using the convert_mpas utility to see what our new "mytracer1" field looks like!
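As a rough sketch — assuming the convert_mpas utility has been built and is available in our path — a typical invocation interpolates the fields in the init file to a regular lat-lon grid (by default, the output is usually written to a file named latlon.nc), which can then be browsed with ncview:
$ convert_mpas x1.10242.init.nc
$ ncview latlon.nc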
Now that we have a new passive tracer in our initial conditions, we'll need to edit the Registry.xml file for the atmosphere core so that the model itself will know about the new tracer.
Let's begin by changing back to the model source code directory:
$ cd ${HOME}/MPAS-Model
We can edit the src/core_atmosphere/Registry.xml file, recalling from the lectures that we not only need to define the tracer, but we also need to define an array that will hold the tendencies for the tracer. Even if we don't supply any sources or sinks, the model needs memory to be allocated for the tracer tendency, e.g., for use in the scalar transport routines.
In the src/core_atmosphere/Registry.xml file, search for the word "var_array" several times until you come to the definition of the scalars var_array tag, which should begin with lines that look like this:
<var_array name="scalars" type="real" dimensions="nVertLevels nCells Time">
    <var name="qv" array_group="moist"
         units="kg kg^{-1}"
         description="Water vapor mixing ratio"/>
Let's add a new XML var tag inside the var_array tag to define our new "mytracer1" tracer; this new tag may look something like the following:
<var name="mytracer1" array_group="passive"
     units="kg kg^{-1}"
     description="Our first passive tracer"/>
Remember: it's important that the passive tracer belongs to an array_group other than "moist"; here, we've just used the string "passive", though in principle you could choose another name.
Similarly, we need to define a tendency array for our new passive tracer. Searching for the word "scalars_tend", we should find a section of XML for the scalars_tend var_array that begins with the following lines:
<!-- scalar tendencies -->
<var_array name="scalars_tend" type="real" dimensions="nVertLevels nCells Time">
    <var name="tend_qv" name_in_code="qv" array_group="moist"
         units="kg m^{-3} s^{-1}"
         description="Tendency of water vapor mass per unit volume divided by d(zeta)/dz"/>
We'll define a new var entry in this var_array for our passive tracer tendency; this new entry might look something like the following:
<var name="tend_mytracer1" name_in_code="mytracer1" array_group="passive"
     units="kg m^{-3} s^{-1}"
     description="Tendency of our first tracer mass per unit volume divided by d(zeta)/dz"/>
Don't forget to set the value for array_group to "passive"!
With these changes to the Registry.xml file, we can re-compile the atmosphere core. But, we'll first need to "clean" the code, since the last core that we compiled was the init_atmosphere core.
$ make clean CORE=atmosphere
$ make gfortran CORE=atmosphere PRECISION=single
If compilation was successful, we can change directories to our passive_tracer directory and prepare to run MPAS:
$ cd ${HOME}/passive_tracer
You may recall from Section 2.3 that we need the MPAS executable, atmosphere_model, as well as a number of other files in order to run a simulation. Let's link or copy these files as we did in Section 2.3:
$ ln -s ${HOME}/MPAS-Model/atmosphere_model .
$ cp ${HOME}/MPAS-Model/namelist.atmosphere .
$ cp ${HOME}/MPAS-Model/streams.atmosphere .
$ cp ${HOME}/MPAS-Model/stream_list.atmosphere.* .
$ ln -s ${HOME}/MPAS-Model/src/core_atmosphere/physics/physics_wrf/files/* .
Now, you can proceed to set up the namelist.atmosphere and streams.atmosphere files as you did in Section 2.3 before starting the model simulation using 4 MPI tasks.
The entire set of scalars is written to the default "history" files, so as soon as the first several output files are available, you can try making plots of our new "mytracer1" passive tracer!
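For example, assuming the default history filenames from Section 2.3 (something like history.2014-09-10_00.00.00.nc), a quick check that the tracer is present in the first output file might be:
$ ncdump -h history.2014-09-10_00.00.00.nc | grep mytracer1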