1. Introduction

The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. NOAA’s operational model suite for numerical weather prediction (NWP) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader Weather Enterprise (including government, industry, and academia). For more information about the UFS, visit the UFS Portal.

The UFS includes multiple applications that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.0.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User’s Guide and supported through a community forum. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastic physics perturbation schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability.

This documentation provides a Quick Start Guide designed for use on Level 1 systems or as an overview of the workflow. It also provides a Container-Based Quick Start Guide for running the SRW Application in a container and a detailed guide for running the SRW App on any supported platform. Additionally, this User’s Guide provides an overview of the release components and details on how to customize or modify different portions of the workflow.

The SRW App v2.0.0 citation is as follows and should be used when presenting results based on research conducted with the App:

UFS Development Team. (2022, June 23). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854

1.1. How to Use This Document

This guide instructs both novice and experienced users on downloading, building, and running the SRW Application. Please post questions in the UFS Forum.

Throughout the guide, this presentation style indicates shell commands and options, code examples, etc.

Variables presented as AaBbCc123 in this User’s Guide typically refer to variables in scripts, names of files, and directories.

File paths or code that include angle brackets (e.g., build_<platform>_<compiler>) indicate that users should insert options appropriate to their SRW App configuration (e.g., build_orion_intel).


1.2. Prerequisites for Using the SRW Application

1.2.1. Background Knowledge Prerequisites

The instructions in this documentation assume that users have certain background knowledge:

  • Familiarity with LINUX/UNIX systems

  • Command line basics

  • System configuration knowledge (e.g., compilers, environment variables, paths, etc.)

  • Numerical Weather Prediction (concepts of parameterizations: physical, microphysical, convective)

  • Meteorology (in particular, meteorology at the scales being predicted: 25km, 13km, and 3km resolutions)

Additional background knowledge in the following areas could be helpful:

  • High-Performance Computing (HPC) Systems for those running the SRW App on an HPC system

  • Programming (particularly Python) for those interested in contributing to the SRW App code

  • Creating an SSH Tunnel to access HPC systems from the command line

  • Containerization

  • Workflow Managers/Rocoto

1.2.2. Software/Operating System Requirements

The UFS SRW Application has been designed so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. NOAA Level 1 & 2 systems already have these prerequisites installed. However, users working on other systems must ensure that the following requirements are installed on their system:

Minimum Platform Requirements:

  • POSIX-compliant UNIX-style operating system

  • >82 GB disk space

    • 53 GB of input data for a standard collection of global data, or “fix” data (topography, climatology, observational data), needed for a short 12-hour test forecast on the CONUS 25km domain. See data download instructions in Section 7.3.

    • 8 GB for HPC-Stack full installation

    • 3 GB for ufs-srweather-app installation

    • 1 GB for boundary conditions for a short 12-h test forecast on the CONUS 25km domain. See data download instructions in Section 7.3

    • 17 GB for a 12-hour test forecast on the CONUS 25km domain with model output saved hourly (see Section 4.6)

  • Fortran compiler released since 2018

    • gfortran v9+ or ifort v18+ are the only ones tested, but others may work.

  • C and C++ compilers compatible with the Fortran compiler

    • gcc v9+, icc v18+, and clang v9+ (macOS, native Apple clang, LLVM clang, GNU) have been tested

  • Python v3.6+, including prerequisite packages jinja2, pyyaml and f90nml

    • Python packages scipy, matplotlib, pygrib, cartopy, and pillow are required for users who would like to use the provided graphics scripts.

  • Perl 5

  • git v2.12+

  • curl

  • wget

  • Lmod
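
Before attempting a build on a non-Level 1 system, it can be worth confirming that the tools above are present. The following sketch only reports what is missing; it installs nothing, and the tool list is taken directly from the requirements above:

```shell
# Report which of the minimum-requirement tools are on PATH
# (tool list taken from the platform requirements above).
for tool in gfortran gcc python3 perl git curl wget; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "MISSING: $tool"
  fi
done
# Confirm the Python version meets the v3.6+ requirement.
python3 -c 'import sys; assert sys.version_info >= (3, 6), "need Python 3.6+"'
```

Any line starting with MISSING indicates a prerequisite to install before proceeding.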

The following software is also required to run the SRW Application, but the HPC-Stack (which contains the software libraries necessary for building and running the SRW App) can be configured to build these requirements:

  • CMake v3.20+

  • MPI (MPICH, OpenMPI, or other implementation)

    • Only MPICH or OpenMPI can be built with HPC-Stack. Other implementations must be installed separately by the user (if desired).

For MacOS systems, some additional software packages are needed. When possible, it is recommended that users install and/or upgrade this software (along with the software listed above) using the Homebrew package manager for MacOS. See Chapter 6.3 for further guidance on installing these prerequisites on MacOS.

  • bash v4.x

  • GNU compiler suite v.11 or higher with gfortran

  • cmake

  • make

  • coreutils
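
On a MacOS system with Homebrew available, the packages above can typically be installed in one pass. This sketch prints the install commands rather than running them; the formula names are assumptions, so confirm each one with `brew search <name>` before installing:

```shell
# Print (rather than run) the Homebrew commands for the MacOS
# prerequisites listed above; formula names are assumptions.
for pkg in bash gcc cmake make coreutils; do
  echo "brew install $pkg"   # drop the echo to actually install
done
```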

Optional but recommended prerequisites for all systems:

  • Conda for installing/managing Python packages

  • Bash v4+

  • Rocoto Workflow Management System (1.3.1)

  • Python packages scipy, matplotlib, pygrib, cartopy, and pillow for graphics
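
With conda available, one way to provide the required and optional Python packages is a dedicated environment. This is a sketch, not a supported install procedure; the environment name "srw_python" is an arbitrary choice, not an SRW convention:

```shell
# Assemble the package lists named above and print a conda command
# that would install them (env name "srw_python" is arbitrary).
workflow_pkgs="jinja2 pyyaml f90nml"
graphics_pkgs="scipy matplotlib pygrib cartopy pillow"
echo "conda create -n srw_python -c conda-forge python=3.8 $workflow_pkgs $graphics_pkgs"
# Drop the echo to actually create the environment, then:
#   conda activate srw_python
```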

1.3. SRW App Components Overview

1.3.1. Pre-processor Utilities and Initial Conditions

The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with orography and surface climatology files for that grid. One pre-processing utility converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in Chapter 5.1, in the UFS_UTILS Technical Documentation, and in the UFS_UTILS Scientific Documentation.

1.3.2. Forecast Model

The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (FV3) dynamical core configured with a Limited Area Model (LAM) capability (Black et al. [BAB+21]). The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS Weather Model can be found here.

Common Community Physics Package

The Common Community Physics Package (CCPP) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The SRW App v2.0.0 release includes four physics suites.

Data Format

The SRW App supports the use of external model data in GRIB2, NEMSIO, and netCDF format when generating initial and boundary conditions. The UFS Weather Model ingests initial and lateral boundary condition files produced by chgres_cube.

1.3.3. Unified Post-Processor (UPP)

The Unified Post Processor (UPP) processes raw output from a variety of numerical weather prediction (NWP) models. In the SRW App, it converts data output from netCDF format to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the UPP User’s Guide.

1.3.4. METplus Verification Suite

The Model Evaluation Tools (MET) package is a set of statistical verification tools developed by the Developmental Testbed Center (DTC) for use by the NWP community to help them assess and evaluate the performance of numerical weather predictions. MET is the core component of the enhanced METplus verification framework. The suite also includes the associated database and display systems called METviewer and METexpress. METplus spans a wide range of temporal and spatial scales. It is intended to be extensible through additional capabilities developed by the community. More details about METplus can be found in Chapter 5.4 and on the METplus website.

1.3.5. Visualization Example

The SRW Application v2.0.0 release provides Python scripts to create basic visualizations of the model output. Chapter 12 contains usage information and instructions; instructions also appear at the top of the scripts.

1.3.6. Build System and Workflow

The SRW Application has a portable CMake-based build system that packages together all the components required to build the SRW Application. Once built, users can generate a Rocoto-based workflow that will run each task in the proper sequence (see the Rocoto documentation for more on workflow management). Individual workflow tasks can also be run in a stand-alone, command line fashion.
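
Under the assumption that the prerequisite libraries are already in place, the CMake-based build follows the usual out-of-source pattern. This is a minimal sketch with an assumed clone location; Chapter 4 has the authoritative build steps:

```shell
# Sketch of an out-of-source CMake build; the clone location is an
# assumption, and Chapter 4 has the authoritative instructions.
SRW_DIR=$HOME/ufs-srweather-app
BUILD_DIR=$SRW_DIR/build
mkdir -p "$BUILD_DIR"
echo "cd $BUILD_DIR && cmake .. -DCMAKE_INSTALL_PREFIX=.. && make -j4"
# Drop the echo to run the build; executables are installed under
# $SRW_DIR/bin once it completes.
```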

The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite used for the simulation. More information on how to do this is available in the workflow configuration sections of this User’s Guide.

The SRW Application v2.0.0 release has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g., Hera, Orion), cloud environments, and generic Linux and MacOS systems. Four levels of support have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries available in a central location (via HPC-Stack). The SRW Application is expected to build and run out-of-the-box on these systems, and users can download the SRW App code without first installing prerequisites. On other platforms (Levels 2-4), the SRW App can be run within a container that includes the HPC-Stack, or the required libraries will need to be installed as part of the SRW Application build process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.

1.4. Code Repositories and Directory Structure

1.4.1. Hierarchical Repository Structure

The umbrella repository for the SRW Application is named ufs-srweather-app and is available on GitHub at https://github.com/ufs-community/ufs-srweather-app. An umbrella repository is a repository that houses external code, called “externals,” from additional repositories. The SRW Application includes the manage_externals tool and a configuration file called Externals.cfg, which tags the appropriate versions of the external repositories associated with the SRW App (see Table 1.1).
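
A typical way to obtain the App and its externals is to clone the umbrella repository and run the manage_externals tool. The release branch name below is an assumption, so confirm it on the GitHub releases page before cloning; the commands are printed rather than executed:

```shell
# Print the commands that fetch the umbrella repository and its
# externals; the release branch name is an assumption.
repo_url=https://github.com/ufs-community/ufs-srweather-app.git
echo "git clone -b release/public-v2 $repo_url"
echo "cd ufs-srweather-app && ./manage_externals/checkout_externals"
# checkout_externals reads Externals.cfg and clones each external
# repository at the version tagged for this release.
```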

Table 1.1 List of top-level repositories that comprise the UFS SRW Application

ufs-srweather-app
    Umbrella repository for the UFS Short-Range Weather Application
    https://github.com/ufs-community/ufs-srweather-app

ufs-weather-model
    Repository for the UFS Weather Model
    https://github.com/ufs-community/ufs-weather-model

regional_workflow
    Repository for the regional workflow
    https://github.com/ufs-community/regional_workflow

UFS_UTILS
    Repository for UFS utilities, including pre-processing, chgres_cube, and more
    https://github.com/ufs-community/UFS_UTILS

UPP
    Repository for the Unified Post Processor (UPP)
    https://github.com/NOAA-EMC/UPP

The UFS Weather Model contains a number of sub-repositories, which are documented here.


The prerequisite libraries (including NCEP Libraries and external libraries) are not included in the UFS SRW Application repository. The HPC-Stack repository assembles these prerequisite libraries. The HPC-Stack has already been built on preconfigured (Level 1) platforms. However, it must be built on other systems. Chapter 6 contains details on installing the HPC-Stack.

1.4.2. Directory Structure

The ufs-srweather-app umbrella repository structure is determined by the local_path settings contained within the Externals.cfg file. After manage_externals/checkout_externals is run (see Section 4.3), the specific GitHub repositories described in Table 1.1 are cloned into the target subdirectories shown below. Directories that will be created as part of the build process appear in parentheses and will not be visible until after the build is complete. Some directories have been removed for brevity.

├── (bin)
├── (build)
├── docs
│     └── UsersGuide
├── etc
├── (include)
├── (lib)
├── manage_externals
├── modulefiles
├── regional_workflow
│     ├── docs
│     │     └── UsersGuide
│     ├── (fix)
│     ├── jobs
│     ├── modulefiles
│     ├── scripts
│     ├── tests
│     │     └── baseline_configs
│     └── ush
│          ├── Python
│          ├── rocoto
│          ├── templates
│          └── wrappers
├── (share)
├── src
│    ├── UPP
│    │     ├── parm
│    │     └── sorc
│    │          └── ncep_post.fd
│    ├── UFS_UTILS
│    │     ├── sorc
│    │     │    ├── chgres_cube.fd
│    │     │    ├── fre-nctools.fd
│    │     │    ├── grid_tools.fd
│    │     │    ├── orog_mask_tools.fd
│    │     │    └── sfc_climo_gen.fd
│    │     └── ush
│    └── ufs-weather-model
│         └── FV3
│              ├── atmos_cubed_sphere
│              └── ccpp
└── test

Regional Workflow Sub-Directories

A number of sub-directories are created under the regional_workflow directory when the regional workflow is cloned (see directory diagram above). Table 1.2 describes the contents of these sub-directories.

Table 1.2 Sub-directories of the regional workflow

jobs
    J-job scripts launched by Rocoto

modulefiles
    Files used to load modules needed for building and running the workflow

scripts
    Run scripts launched by the J-jobs

tests
    Baseline experiment configurations

ush
    Utility scripts used by the workflow

1.4.3. Experiment Directory Structure

When the user generates an experiment using the generate_FV3LAM_wflow.sh script (Section 4.7.3), a user-defined experimental directory ($EXPTDIR) is created based on information specified in the config.sh file. Table 1.3 shows the contents of the experiment directory before running the experiment workflow.

Table 1.3 Files and sub-directory initially created in the experimental directory

config.sh
    User-specified configuration file

data_table
    Cycle-independent input file (empty)

field_table
    Tracers in the forecast model

FV3LAM_wflow.xml
    Rocoto XML file to run the workflow

input.nml
    Namelist for the UFS Weather Model

launch_FV3LAM_wflow.sh
    Symlink to the shell script ufs-srweather-app/regional_workflow/ush/launch_FV3LAM_wflow.sh, which can be used to (re)launch the Rocoto workflow. Each time this script is called, it appends to a log file named log.launch_FV3LAM_wflow.

log.generate_FV3LAM_wflow
    Log of the output from the experiment generation script (generate_FV3LAM_wflow.sh)

nems.configure
    NEMS (NOAA Environmental Modeling System) configuration file

suite_{CCPP}.xml
    CCPP suite definition file used by the forecast model

var_defns.sh
    Shell script defining the experiment parameters. It contains all of the primary parameters specified in the default and user-specified configuration files plus many secondary parameters that are derived from the primary ones by the experiment generation script. This file is sourced by various other scripts in order to make all the experiment variables available to these scripts.

YYYYMMDDHH
    Cycle directory (empty)
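
The generation step described above can be sketched as follows. The sample configuration name (config.community.sh) is an assumption drawn from the community-mode discussion in this guide, and the commands are printed rather than executed:

```shell
# Print the steps that generate an experiment directory; the sample
# config name (config.community.sh) is an assumption.
ush_dir=ufs-srweather-app/regional_workflow/ush
echo "cd $ush_dir"
echo "cp config.community.sh config.sh"    # start from a sample config
echo "vi config.sh"                        # set dates, grid, physics suite
echo "./generate_FV3LAM_wflow.sh"          # creates the experiment directory
```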

In addition, running the SRW App in community mode creates the fix_am and fix_lam directories (see Table 1.4) in $EXPTDIR. The fix_lam directory is initially empty but will contain some fix (time-independent) files after the grid, orography, and/or surface climatology generation tasks run.

Table 1.4 Description of the fix directories

fix_am
    Directory containing the global fix (time-independent) data files. The experiment generation script copies these files from a machine-dependent system directory.

fix_lam
    Directory containing the regional fix (time-independent) data files that describe the regional grid, orography, and various surface climatology fields, as well as symlinks to pre-generated files.

Once the Rocoto workflow is launched, several files and directories are generated. A log file named log.launch_FV3LAM_wflow will be created (unless it already exists) in $EXPTDIR. The first several workflow tasks (i.e., make_grid, make_orog, make_sfc_climo, get_extrn_ics, and get_extrn_lbcs) are preprocessing tasks, and these tasks also result in the creation of new files and sub-directories, described in Table 1.5.

Table 1.5 New directories and files created when the workflow is launched

YYYYMMDDHH
    This is a “cycle directory” that is updated when the first cycle-specific workflow tasks (get_extrn_ics and get_extrn_lbcs) are run. These tasks are launched simultaneously for each cycle in the experiment. Cycle directories are created to contain cycle-specific files for each cycle that the experiment runs. If DATE_FIRST_CYCL and DATE_LAST_CYCL are different, and/or if CYCL_HRS contains more than one element in the config.sh file, more than one cycle directory will be created under the experiment directory.

grid
    Directory generated by the make_grid task to store grid files for the experiment

log
    Contains log files generated by the overall workflow and by its various tasks. Look in these files to determine why a task may have failed.

orog
    Directory generated by the make_orog task containing the orography files for the experiment

sfc_climo
    Directory generated by the make_sfc_climo task containing the surface climatology files for the experiment

FV3LAM_wflow.db, FV3LAM_wflow_lock.db
    Database files that are generated when Rocoto is called (by the launch script) to launch the workflow.

log.launch_FV3LAM_wflow
    The launch_FV3LAM_wflow.sh script appends its output to this log file each time it is called. Take a look at the last 30–50 lines of this file to check the status of the workflow.

The output files for an experiment are described in Section 7.2. The workflow tasks are described in Section 4.7.4.
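
Putting the pieces above together, (re)launching and monitoring the workflow from the experiment directory can be sketched as follows. The experiment path is a placeholder, while the script and log names come from Tables 1.3 and 1.5; the commands are printed rather than executed:

```shell
# Print the launch-and-monitor steps; the experiment path is a
# placeholder, and the script/log names come from Tables 1.3 and 1.5.
EXPTDIR=${EXPTDIR:-$HOME/expt_dirs/my_experiment}
echo "cd $EXPTDIR"
echo "./launch_FV3LAM_wflow.sh"             # (re)launches the Rocoto workflow
echo "tail -n 40 log.launch_FV3LAM_wflow"   # last lines show workflow status
```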

1.5. User Support, Documentation, and Contributions to Development

A forum-based, online support system organized by topic provides a centralized location for UFS users and developers to post questions and exchange information.

A list of available documentation is shown in Table 1.6.

Table 1.6 Centralized list of documentation

  • UFS SRW Application User’s Guide

  • UFS_UTILS Technical Documentation

  • UFS_UTILS Scientific Documentation

  • UFS Weather Model User’s Guide

  • HPC-Stack Documentation

  • FV3 Scientific Documentation

  • FV3 Technical Documentation

  • CCPP Scientific Documentation

  • CCPP Technical Documentation

  • Stochastic Physics Documentation

  • ESMF manual

  • Unified Post Processor User’s Guide


The UFS community is encouraged to contribute to the development effort of all related utilities, model code, and infrastructure. Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted in the authoritative repositories, users must follow the code management rules of each UFS component repository, which are outlined in the respective User’s Guides listed in Table 1.6. Contributions to the ufs-srweather-app repository or the regional_workflow repository should follow the guidelines contained in the SRW App Contributor’s Guide.

1.6. Future Direction

Users can expect to see incremental improvements and additional capabilities in upcoming releases of the SRW Application to enhance research opportunities and support operational forecast implementations. Planned enhancements include:

  • A more extensive set of supported developmental physics suites.

  • A larger number of pre-defined domains/resolutions and a fully supported capability to create a user-defined domain.

  • The addition of user-defined vertical levels (number and distribution).

  • Inclusion of data assimilation and forecast restart/cycling capabilities.


[BAB+21] T.L. Black, J.A. Abeles, B.T. Blake, D. Jovic, E. Rogers, X. Zhang, E.A. Aligo, L.C. Dawson, Y. Lin, E. Strobach, P.C. Shafran, and J.R. Carley. A limited area modeling capability for the Finite-Volume Cubed-Sphere (FV3) dynamical core. Journal of Advances in Modeling Earth Systems, 2021.

[PJK+20] R.J. Purser, D. Jovic, G. Ketefian, T. Black, J. Beck, J. Dong, and J. Carley. The Extended Schmidt Gnomonic grid for regional applications. Unified Forecast System (UFS) Users’ Workshop, 2020. URL: https://dtcenter.org/sites/default/files/events/2020/2-purser-james.pdf.