3. Building MPAS¶
3.1 Prerequisites¶
The following software packages are required to build MPAS:

- Compatible C and Fortran compilers
- The PIO parallel I/O library, used to read and write model fields
- The standard netCDF library, required by the PIO library
- The parallel-netCDF library from Argonne National Laboratory
- An MPI installation such as MPICH or OpenMPI
Additional Requirements¶
All libraries must be compiled with the same compilers that will be used to build MPAS. 3.2 Compiling I/O Libraries (below) summarizes the basic procedure for installing the required I/O libraries for MPAS.
Set the environment variables PIO, PNETCDF, and NETCDF to their respective root installation directories. Doing this allows the MPAS makefiles to find the PIO, parallel-netCDF, and netCDF include files and libraries.
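For example, in a bash-like shell these variables might be set as follows (the installation prefixes here are hypothetical; use the directories where the libraries were actually installed on your system, and use setenv instead of export under csh/tcsh):

```shell
# Hypothetical installation prefixes -- adjust to your system
export NETCDF=$HOME/libs/netcdf
export PNETCDF=$HOME/libs/pnetcdf
export PIO=$HOME/libs/pio
```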
MPAS-Atmosphere v5.0 introduces the capability to use hybrid parallelism using MPI and OpenMP. However, the use of OpenMP should be considered experimental and generally does not offer any performance advantage. The primary reason for releasing a shared-memory capability is to make this code available to collaborators for future development.
Note
There is no option to build a serial version of the MPAS executables.
3.2 Compiling I/O Libraries¶
Note
MPAS developers cannot assume responsibility for third-party libraries.
The instructions provided in this section for installing libraries have been successfully used by MPAS developers, but due to differences in library versions, compilers, and system configurations, it is recommended that users consult documentation provided by individual library vendors should problems arise during installation.
The most tested versions of these libraries are netCDF 4.4.x and parallel-netCDF 1.8.x. Users are strongly encouraged to use either the latest PIO 2.x version from the ParallelIO GitHub Repository, or PIO versions 1.7.1 or 1.9.23, as other versions have not been tested or are known not to work with MPAS. The netCDF and parallel-netCDF libraries must be installed before building the PIO library.
3.2.1 NetCDF¶
Download the netCDF library from Unidata. The downloads page provides detailed instructions for building the netCDF C and Fortran libraries; both the C and Fortran interfaces are needed by PIO. If netCDF-4 support is desired, the zlib and HDF5 libraries will need to be installed prior to building netCDF.
Note
Before proceeding to compile PIO, the NETCDF environment variable should be set to the netCDF root installation directory.
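The general shape of a netCDF installation into a single prefix is sketched below; the paths are hypothetical, and configure options vary by system, so consult Unidata's build instructions for details. The build commands are shown as comments, since the netcdf-fortran build must be pointed at the netCDF C library installed in the first step:

```shell
# Hypothetical installation prefix for both netCDF C and Fortran libraries
export NETCDF=$HOME/libs/netcdf

# In the netcdf-c source directory:
#   ./configure --prefix=$NETCDF
#   make && make install
#
# In the netcdf-fortran source directory (it must find the C library above):
#   CPPFLAGS=-I$NETCDF/include LDFLAGS=-L$NETCDF/lib ./configure --prefix=$NETCDF
#   make && make install
```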
3.2.2 Parallel-NetCDF¶
The parallel-netCDF library may be downloaded from the Parallel-netcdf GitHub page.
Note
Before proceeding to compile PIO, the PNETCDF environment variable should be set to the parallel-netCDF root installation directory.
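A typical parallel-netCDF build follows the same pattern; the MPI compiler variables shown in the comments are the ones parallel-netCDF's configure script conventionally reads, but check the INSTALL notes for your version. The prefix is hypothetical:

```shell
# Hypothetical installation prefix for parallel-netCDF
export PNETCDF=$HOME/libs/pnetcdf

# In the parallel-netCDF source directory:
#   ./configure --prefix=$PNETCDF MPICC=mpicc MPIF77=mpif77 MPIF90=mpif90
#   make && make install
```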
3.2.3 PIO¶
Either the PIO 1.x or 2.x library versions may be used, beginning with MPAS v5.2. The two major versions have slightly different APIs; by default, the MPAS build system assumes the PIO 1.x API, but the PIO 2.x library versions may be used by adding the USE_PIO2=true option when compiling MPAS as described in 3.3 Compiling MPAS. If compiling with the PIO 1.x library versions, users are strongly encouraged to choose either PIO 1.7.1 or PIO 1.9.23, as other 1.x versions may not work.
The PIO 2.x library versions support integrated performance timing with the GPTL library; however, the MPAS infrastructure does not currently provide calls to initialize this library when it is used in PIO 2.x. Therefore, it is recommended to add -DPIO_ENABLE_TIMING=OFF when running the cmake command to build PIO 2.x versions.
After PIO is built and installed, the PIO environment variable should be set to the directory where PIO was installed. Recent versions of PIO support the specification of an installation prefix, while some older versions do not, in which case the PIO environment variable should be set to the directory where PIO was compiled.
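A PIO 2.x cmake build might be sketched as follows. Apart from -DPIO_ENABLE_TIMING=OFF, the cmake variable names in the comments are assumptions; check the ParallelIO documentation for the exact options your PIO version expects. The installation prefix is hypothetical:

```shell
# Hypothetical installation prefix for PIO
export PIO=$HOME/libs/pio

# In a build directory inside the PIO source tree (NETCDF and PNETCDF
# set as in the previous subsections):
#   CC=mpicc FC=mpif90 cmake .. \
#       -DCMAKE_INSTALL_PREFIX=$PIO \
#       -DNetCDF_PATH=$NETCDF \
#       -DPnetCDF_PATH=$PNETCDF \
#       -DPIO_ENABLE_TIMING=OFF
#   make && make install
```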
3.3 Compiling MPAS¶
Note
Before compiling MPAS, the NETCDF, PNETCDF, and PIO environment variables must be set to the library installation directories as described in the previous section.
All information about compilers, compiler flags, etc., is contained in the top-level Makefile; the MPAS code uses only the make utility for compilation, with no separate configuration step before building the code. Each supported combination of compilers (i.e., a configuration) is included in the Makefile as a separate make target, and the user selects among these configurations by running make with the name of a build target specified on the command-line, e.g., to build the code using the GNU Fortran and C compilers:
> make gfortran
Some available targets are listed in the table below. Additional targets can be added by editing the Makefile in the top-level directory.
| Target | Fortran Compiler | C Compiler | MPI Wrappers |
|---|---|---|---|
| xlf | xlf90 | xlc | mpxlf90 / mpcc |
| pgi | pgf90 | pgcc | mpif90 / mpicc |
| ifort | ifort | gcc | mpif90 / mpicc |
| gfortran | gfortran | gcc | mpif90 / mpicc |
| llvm | flang | clang | mpifort / mpicc |
| bluegene | bgxlf95_r | bgxlc_r | mpxlf95_r / mpxlc_r |
The name of the model core to build must be specified. The MPAS framework supports multiple cores - a shallow water model, an ocean model, a land-ice model, a non-hydrostatic atmosphere model, and a non-hydrostatic atmosphere initialization core - therefore the build process must be told which core to build. This is done by either setting the environment variable CORE to the name of the model core to build, or by specifying the core to be built explicitly on the command-line when running make. For the atmosphere core, for example, one may run either
> setenv CORE atmosphere
> make gfortran
or
> make gfortran CORE=atmosphere
If the CORE environment variable is set and a core is specified on the command-line, the command-line value takes precedence; if no core is specified, either on the command line or via the CORE environment variable, the build process will stop with an error message stating such. Assuming compilation is successful, the model executable, named ${CORE}_model (e.g., atmosphere_model), should be created in the top-level MPAS directory.
To obtain a list of available cores, run the top-level Makefile without setting the CORE environment variable or passing the core via the command-line. An example of the output from this can be seen below.
> make
make[1]: Entering directory '/scratch/MPAS-Release'
Usage: make target CORE=[core] [options]
Example targets:
ifort
gfortran
xlf
pgi
Available Cores:
atmosphere
init_atmosphere
landice
ocean
seaice
sw
test
Available Options:
DEBUG=true - builds debug version. Default is optimized version.
USE_PAPI=true - builds version using PAPI for timers. Default is off.
TAU=true - builds version using TAU hooks for profiling. Default is off.
AUTOCLEAN=true - forces a clean of infrastructure prior to build new core.
GEN_F90=true - Generates intermediate .f90 files through CPP, and builds
with them.
TIMER_LIB=opt - Selects the timer library interface to be used for
profiling the model. Options are:
TIMER_LIB=native - Uses native built-in timers in MPAS
TIMER_LIB=gptl - Uses gptl for the timer interface instead of the native
interface
TIMER_LIB=tau - Uses TAU for the timer interface instead of the native
interface
OPENMP=true - builds and links with OpenMP flags. Default is to not use
OpenMP.
USE_PIO2=true - links with the PIO 2 library. Default is to use the
PIO 1.x library.
PRECISION=single - builds with default single-precision real kind.
Default is to use double-precision.

Ensure that NETCDF, PNETCDF, PIO, and PAPI (if USE_PAPI=true) are environment
variables that point to the absolute paths for the libraries.
************ ERROR ************
No CORE specified. Quitting.
************ ERROR ************
3.4 Selecting a Single-precision Build¶
Beginning with version 2.0, MPAS-Atmosphere can be compiled and run in single-precision, offering faster model execution and smaller input and output files. Beginning with version 5.0, the model precision selection can be made on the command-line, with no need to edit the Makefile. To compile a single-precision MPAS-Atmosphere executable, add PRECISION=single to the build command, e.g.,
> make gfortran CORE=atmosphere PRECISION=single
Single- or double-precision input files may be used, regardless of which precision the MPAS-Atmosphere cores were compiled with. In general, the MPAS infrastructure should detect the precision of input files, but the precision of files in an input stream can be specified by adding the precision attribute to the stream definition as described in the Optional Stream Attributes section.
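A hypothetical stream definition with an explicit precision attribute might look like the following; the stream name, filename template, and interval here are purely illustrative, and only the precision attribute is the point of the example:

```xml
<!-- Hypothetical output stream declaring its files as single-precision -->
<stream name="output"
        type="output"
        precision="single"
        filename_template="history.$Y-$M-$D_$h.$m.$s.nc"
        output_interval="6:00:00" />
```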
3.5 Linking with an external ESMF library¶
Beginning with MPAS v8.4.0, the Make build system provides a new option, MPAS_ESMF, to simplify linking with an external Earth System Modeling Framework (ESMF) library. Although users of stand-alone MPAS-Atmosphere will generally see no benefit to using the full, external ESMF library, linking with this library supports ongoing development work to couple MPAS-Atmosphere with other Earth-system component models.
If MPAS_ESMF is not specified, it takes on its default value of embedded. Equivalently, MPAS_ESMF=embedded may be specified in the build command. In this case, MPAS will use its built-in (“embedded”) subset of an older ESMF library for time management.
If MPAS_ESMF is specified as external (i.e., MPAS_ESMF=external is specified in the build command), then the environment variable $ESMFMKFILE will be used by the MPAS build system to discover details of the full, external ESMF library to be used. If MPAS has been successfully compiled and linked with the full, external ESMF library, the build summary will indicate this with a message:
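For example, an external-ESMF build might look like the following sketch; the esmf.mk path is hypothetical, since its location depends on how ESMF was configured and installed:

```shell
# Hypothetical ESMF installation; the location of esmf.mk varies by build
export ESMFMKFILE=/opt/esmf/lib/esmf.mk

# Then build MPAS against the external ESMF library:
#   make gfortran CORE=atmosphere MPAS_ESMF=external
```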
MPAS was built with default single-precision reals.
Debugging is off.
Parallel version is on.
Using the mpi_f08 module.
Papi libraries are off.
TAU Hooks are off.
MPAS was built without OpenMP support.
MPAS was built without OpenMP-offload GPU support.
MPAS was built without OpenACC accelerator support.
MPAS was not linked with the MUSICA-Fortran library.
MPAS has been linked with the Scotch graph partitioning library.
Position-dependent code was generated.
MPAS was built with .F files.
The native timer interface is being used
Using the SMIOL library.
MPAS was built with an external ESMF library using ESMFMKFILE
3.6 Linking with the MUSICA library¶
To support the development of a chemistry capability for MPAS-Atmosphere (tentatively named “CheMPASA” — Chemistry for MPAS-A), MPAS v8.4.0 introduces an option in the Make build system to link with the Multi-Scale Infrastructure for Chemistry and Aerosols (MUSICA) library. After installing the MUSICA library — including its Fortran interface - ensure that the $PKG_CONFIG_PATH environment variable contains MUSICA’s pkgconfig directory so that pkg-config can find the musica-fortran package. If the MUSICA Fortran interface was successfully installed and findable by pkg-config, running
> pkg-config --modversion musica-fortran
should print the MUSICA library version. Linking with the MUSICA library when building MPAS is accomplished by simply adding MUSICA=true to the build command. If MPAS has been successfully compiled and linked with the MUSICA library, the build summary will indicate this with a message:
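Putting the pieces together, a MUSICA-enabled build might be sketched as follows; the MUSICA installation prefix is hypothetical, and the pkgconfig directory is typically found under lib/ in the installation tree:

```shell
# Hypothetical MUSICA installation prefix
export PKG_CONFIG_PATH=$HOME/musica/lib/pkgconfig:$PKG_CONFIG_PATH

# Verify the package is visible, then build MPAS with MUSICA linked in:
#   pkg-config --modversion musica-fortran
#   make gfortran CORE=atmosphere MUSICA=true
```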
MPAS was built with default single-precision reals.
Debugging is off.
Parallel version is on.
Using the mpi_f08 module.
Papi libraries are off.
TAU Hooks are off.
MPAS was built without OpenMP support.
MPAS was built without OpenMP-offload GPU support.
MPAS was built without OpenACC accelerator support.
MPAS was linked with the MUSICA-Fortran library version 0.12.2.
MPAS has been linked with the Scotch graph partitioning library.
Position-dependent code was generated.
MPAS was built with .F files.
The native timer interface is being used
Using the SMIOL library.
MPAS was built with the embedded ESMF timekeeping library.
When running MPAS-Atmosphere, the log files will also indicate that the executable was compiled with the MUSICA library:
Compile-time options:
Build target: gnu
OpenMP support: no
OpenACC support: no
Default real precision: single
Compiler flags: optimize
I/O layer: SMIOL
MUSICA support: yes
- MICM version: 3.9.0
SCOTCH support: no
Note that the version of the Model Independent Chemistry Module (MICM), which is provided by MUSICA, is independent of the MUSICA library version.
3.7 Linking with the PT-Scotch graph partitioning library¶
Starting with the v8.4.0 release, MPAS also provides an option to enable online graph partitioning using the PT-Scotch library. To be able to use this feature, the PT-Scotch library must first be installed on the system, and then MPAS must be linked with the PT-Scotch library.
Instructions for building PT-Scotch are available on the Scotch website. The minimum supported PT-Scotch version is v7.0.8. As MPAS does not presently support 64-bit integer indices, PT-Scotch must also be built with 32-bit integers to maintain compatibility with MPAS. To use the distributed graph partitioning provided by PT-Scotch, you will need to pass an appropriate MPI installation path during the Scotch build. Other system prerequisites are Bison and Flex; a more recent version of Flex (>v2.6.4) is recommended, and the Scotch documentation also references a requirement for a Bison version > 3.4.
After building and installing PT-Scotch (v7.0.8 or later), the MPAS atmosphere and init_atmosphere cores can be linked with Scotch by setting the SCOTCH environment variable to the installation path prior to building MPAS.
> export SCOTCH=/path/to/scotch/installation
> make nvhpc CORE=atmosphere
If MPAS has been successfully compiled and linked with the PT-Scotch library, the build summary will indicate this with a message:
MPAS was built with default single-precision reals.
Debugging is off.
Parallel version is on.
Using the mpi_f08 module.
Papi libraries are off.
TAU Hooks are off.
MPAS was built without OpenMP support.
MPAS was built without OpenMP-offload GPU support.
MPAS was built without OpenACC accelerator support.
MPAS was not linked with the MUSICA-Fortran library.
MPAS has been linked with the Scotch graph partitioning library.
Position-dependent code was generated.
MPAS was built with .F files.
The native timer interface is being used
Using the SMIOL library.
MPAS was built with the embedded ESMF timekeeping library.
When running MPAS-Atmosphere, the log files will also indicate that the executable was compiled with the PT-Scotch library:
Compile-time options:
Build target: gnu
OpenMP support: no
OpenACC support: no
Default real precision: single
Compiler flags: optimize
I/O layer: SMIOL
MUSICA support: no
SCOTCH support: yes
See Online Graph Partitioning with PT-SCOTCH for instructions on how to use the PT-Scotch online graph partitioning feature with MPAS.
3.8 Cleaning¶
To remove all files that were created when the model was built, including the model executable itself, make may be run for the “clean” target:
> make clean
The core to be cleaned is specified by the CORE environment variable, or by specifying a core explicitly on the command-line with CORE=.
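As with building, the two forms are equivalent, with the command-line value taking precedence if both are given:

```shell
# Clean the previously built atmosphere core
export CORE=atmosphere
#   make clean
# or, equivalently, without setting the environment variable:
#   make clean CORE=atmosphere
```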