WRF Software Testimonials
I am a new user of WRF. I had little difficulty
porting to our IA32 and IA64 Linux clusters and running ideal and real
(including benchmark) test cases. I am quite impressed by the volume and
quality of the documentation and by the quick response of the WRF help desk.
Art Mirin
Leader, Scientific Computing Group
Center for Applied Scientific Computing
Lawrence Livermore National Laboratory
The WRF software has been a valuable component of the DoD High Performance Computing Modernization Program's Applications Benchmark Test Suite for the past two years. Portability of the software has been excellent, and the "configure" tool makes WRF easy to install on virtually every modern HPC platform. These characteristics of WRF, as well as its excellent scalability in parallel environments, enable a very useful comparison of HPC platforms of interest to the DoD.
Tom Oppe
Applications Programmer/Analyst
Engineer Research and Development Center
Major Shared Resource Center (ERDC MSRC)
My first exposure to WRF was in a project
where I was tasked with coupling it with an ocean model running on a different
grid, and providing it to several different users, each with different needs
(different physics options, different dynamical cores, and different boundary
conditions and time/length scales). We
were running on several different platforms, including the new (at the time)
Itanium-64 clusters, and coupling across the TeraGrid. No small feat!
We found that WRF's modular software
structure was both easy to understand and well-documented. The user community was helpful when we did need a hand. Questions about the
physics and what variables needed to be passed to the ocean model using
different dynamical cores were answered by researchers at NCAR whose expertise
is unparalleled, and questions about the software design and how best to pass
data in parallel were also well addressed by the WRF software engineers.
In all, we've had a good experience with WRF,
and continue to enjoy not only its support as a leading research tool by the
atmospheric science community, but its well-designed software structure, too.
Christopher Moore
Research Scientist, Oceanography
University of Washington/JISAO/NOAA-PMEL
My first experience with the WRF software
infrastructure was in the GRAPES project, which aimed to develop a new
numerical weather prediction model for the China Meteorological Administration.
The infrastructure appeared to be state-of-the-art, as it included advanced
mechanisms for both distributed- and shared-memory parallel processing.
The WRF software infrastructure was used as the
driver layer of the GRAPES model. The user interface is well defined and
very easy to understand. Variables could be introduced easily with the registry
mechanism, and almost all the functions we needed were available. It greatly
eased the effort of writing the parallel code.
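For readers unfamiliar with the registry mechanism mentioned above: WRF's Registry file describes each model variable in a single table entry, from which the build system generates the variable's declaration and its history/restart/input I/O code. The entry below is an illustrative sketch only; the variable name and attribute values are hypothetical, and the exact columns should be checked against the Registry documentation for your WRF version.

```
# Hypothetical Registry "state" entry (columns are illustrative):
# table  type  symbol  dims  use   ntl  stagger  io   dname    description       units
state    real  my_qx   ikj   misc  1    -        irh  "MY_QX"  "example scalar"  "kg kg-1"
```

Here the `io` field (`irh`) would mark the variable for input, restart, and history streams, so no hand-written I/O code is needed.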
The software is quite portable: I installed the
model on an IBM SP and several Intel-based clusters without any problem, and the
performance was quite encouraging.
Zhiyan Jin
Chinese Academy of Meteorological Sciences
The entire WRF development team should be
congratulated on bringing a flexible, portable, and high-performance NWP system
to the meteorological community in so short a time. I am particularly
impressed by the portability of the software to so many different systems and
the high computational performance on diverse computer architectures. I am
optimistic that we will continue to see improvements in forecast skill and
basic understanding of the atmosphere since the efforts of many people can be
focused on a common software platform.
Jim Tuccillo
Meteorologist and Sr. Applications Analyst
Linux Networx
In general, I have been very pleased with the
ease with which we have been able to download the WRF software and run it on
our computers at GFDL, as well as the ease with which we have been able to
introduce new physics packages into the model. Scientifically, we have been
impressed with its skill as a cloud-system-resolving model, in which we run WRF
at the resolution of individual cumulus clouds and their associated systems.
Leo J. Donner
Geophysical Fluid Dynamics Laboratory/NOAA
The WRF software infrastructure appears to be
state-of-the-art insofar as it includes advanced mechanisms for automatically
generating source code during compilation.
This greatly facilitates user modification of the code to add new
physics packages.
Moreover, such mechanisms allow new arrays
and variables to be introduced easily, with their declaration and interaction
with history/restart/input files being
coded automatically. The software
appears to perform well on a variety of platforms.
The documentation was very helpful, concise
and lucid, and the support from "WRFHelp" was excellent.
Vaughan Phillips
Geophysical Fluid Dynamics Laboratory/NOAA
Our group is using
WRF-chem for research applications that require several hundred prognostic
scalars. It has been relatively
straightforward to port chemistry and aerosol modules into the WRF framework
and to add code to permit aerosol-radiation-cloud feedback processes. NCAR has been very supportive in this effort
and has made several modifications to the registry that reduced the amount of
code needed for hundreds of scalars and made simulating atmospheric chemistry
more efficient.
Jerome Fast
Pacific Northwest National Laboratory
My research involves
the study of atmospheres of other planets and moons. Instead of writing our own
atmospheric model for research purposes, we decided to adopt and modify WRF to
be both a global and a more general planetary atmospheric model. Making WRF into a global model involved
modifying parts of the dynamical core, and the cleanly written and very modular
WRF code made those modifications easy. The code is
clear, easy to understand, and flexible for a variety of purposes, including
ours, which the model designers probably never anticipated! The flexible, modular format of the model
code made the additional modifications necessary to produce a planetary
atmospheric model easy as well. The
notes intended for terrestrial scientists to implement their own physics
parameterization schemes were clear and useful for us as well in designing
schemes appropriate to other atmospheres.
Attending the WRF
support meetings has also been an enormous help, and everyone involved in the
project has been helpful and quick to respond to any questions we have
had. All in all, it is a pleasure to
work with the WRF model and the people associated with it.
Anthony Toigo
Research Associate
Cornell University
Ithaca, NY
From a Cray X1/X1E performance
perspective, the flexibility inherent in the WRF software architecture makes it
very easy to use the Cray X1/X1E vector supercomputer efficiently. The array storage order and loop structuring
allow effective use of both the shared-memory X1/X1E streaming features over bands
of latitudes and the fast vector hardware along the length of each latitude. Using simple runtime options, WRF allows
these parameters to be easily tuned for each forecast domain.
Peter Johnsen
Meteorologist, Applications Engineering
Cray, Inc.
WRF is readily ported to a variety of
architectures due to its supported makefile and configuration menus. The input
format and problem setup are straightforward, allowing experiments of a wide
variety of domain sizes and physics options to be easily conducted. The
parallel scaling is excellent on both SMP and clusters, even for modest sized
problems.
Mark Straka
National Center for Supercomputing Applications
I have used the WRF I/O infrastructure to build
a GRIB (version 1) input/output module. The defined and documented I/O API that
is part of WRF made this task possible. Because the interface was so well
documented, I was able to write the entire module without any input from the
WRF developers. My opinion is that it is extremely important to have
well-defined and documented APIs within a software architecture such as WRF so
that it can be easily extended. So, thank you!
The success of this model lies in the seamless
introduction of object-oriented architecture into scientific computing. The
registry mechanism of the model provides a sophisticated dynamic memory
management facility for different types of variables and also makes it
flexible to add new schemes. The layer and tile framework used in the WRF parallelisation
eased the effort of porting the model to our hybrid-architecture
supercomputer PARAM PADMA.
Browser-based code documentation is a boon to
modelers studying the code.
Amit Kesarkar
Computational Atmospheric Science Group
Centre for Development of Advanced Computing (CDAC)
Pune, India
The WRF I/O framework makes it easy to add data
formats to the software without modifying I/O calls inside the model. It also provides a way for other models to adopt I/O modules easily.
Muqun “Kent” Yang
National Center for Supercomputing Applications
The WRF software architecture from the
initialization to the actual model has been truly well designed for easy use
and easy modification. We are using WRF/Chem for air chemistry studies and air
quality forecasting in Latin America, also incorporating our own chemistry modules.
Thanks to its modular structure, these modifications have been very easy to
implement. Whenever a problem arose, the support was quick and accurate. Good
stuff....
Rainer Schmitz
Forschungszentrum Karlsruhe, IMK-IFU
University of Chile
The WRF build infrastructure simplified our
development of tools for parallel post processing of WRF output files in two
ways: by simplifying the configure and build process of WRF, and by abstracting
I/O functionality into a single place where it can be studied free of any
simulation code. WRF is an example of
how software organization can vastly reduce the complexity of programming and
maintaining not just the model itself, but up/down-stream applications as well.
Jace Mogill
ROTANG
I support several weather forecasting and
climate simulation codes on SGI systems so that our end-users can be productive
with the least amount of trouble. I
find WRF to be one of the easiest packages to build and use of the many I deal
with in my work.
Dr. Gerardo Cisneros
Scientist
SGI
What I find most impressive about WRF is its
portability. My colleagues at FSL and I
have run WRF on various machines and it always installs with minimal problems.
Jacques Middlecoff
Forecast Systems Laboratory/NOAA
If you would like to provide a testimonial on the WRF software for inclusion here, please
contact John Michalakes at michalak@ucar.edu.