2.6. Tutorials
This chapter walks users through experiment configuration options for various severe weather events. It assumes that users have already built the SRW App successfully.
Users can run through the entire set of tutorials or jump to the one that interests them most. The first tutorial is recommended for users who have never run the SRW App before. The tutorials address different skills:
Severe Weather Over Indianapolis: Change physics suites and compare graphics plots.
Cold Air Damming: Coming soon!
Southern Plains Winter Weather Event: Coming soon!
Halloween Storm: Change IC/LBC sources and compare results.
Hurricane Barry: Coming soon!
Choose your own adventure!: Create a forecast of your choice on a custom grid using publicly available data.
Each section provides a summary of the weather event and instructions for configuring an experiment.
2.6.1. Sample Forecast #1: Severe Weather Over Indianapolis
Objective: Modify physics options and compare forecast outputs for similar experiments using the graphics plotting task.
2.6.1.1. Weather Summary
A surface boundary associated with a vorticity maximum over the northern Great Plains moved into an unstable environment over Indianapolis, which led to the development of isolated severe thunderstorms before they congealed into a convective line. The moist air remained over the southern half of the area on the following day. The combination of moist air with daytime surface heating resulted in isolated thunderstorms that produced small hail.
Weather Phenomena: Numerous tornado and wind reports (6/15) and hail reports (6/16)

Fig. 2.5 Severe Weather Over Indianapolis Starting at 18z
2.6.1.2. Data
On Level 1 systems, users can find data for the Indianapolis Severe Weather Forecast in the usual input model data locations (see Section 2.4 for a list). The data can also be downloaded from the UFS SRW Application Data Bucket.
FV3GFS data for the first forecast (control) and HRRR and RAP data for the second forecast (test_expt) are staged under the input_model_data directory; the exact paths appear in the configuration excerpts below.
2.6.1.3. Load the Workflow
To load the workflow environment, source the lmod-setup file and load the workflow conda environment by running:
source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<platform>
where <platform> is a valid, lowercased machine name (see MACHINE in Section 3.1.1 for valid values), and /path/to/ is replaced by the actual path to the ufs-srweather-app directory.
After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run conda activate srw_app. For example, a user on Hera with permissions on the nems project may issue the following commands to load the workflow (replacing User.Name with their actual username):
source /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/etc/lmod-setup.sh hera
module use /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/modulefiles
module load wflow_hera
conda activate srw_app
2.6.1.4. Configuration
Navigate to the ufs-srweather-app/ush directory. The default (or "control") configuration for this experiment is based on the config.community.yaml file in that directory. Users may copy this file into config.yaml if they have not already done so:
cd /path/to/ufs-srweather-app/ush
cp config.community.yaml config.yaml
Users can save the location of the ush directory in an environment variable ($USH). This makes it easier to navigate between directories later. For example:
export USH=/path/to/ufs-srweather-app/ush
Users should substitute /path/to/ufs-srweather-app/ush with the actual path on their system. As long as a user remains logged into their system, they can run cd $USH, and it will take them to the ush directory. The variable will need to be reset for each login session.
2.6.1.4.1. Experiment 1: Control
Edit the configuration file (config.yaml) to include the variables and values in the sample configuration excerpts below.
Hint
To open the configuration file in the command line, users may run the command:
vi config.yaml
To modify the file, hit the i key and then make any changes required. To close and save, hit the esc key and type :wq to write the changes and quit the file. Users may opt to use their preferred code editor instead.
Start in the user: section and change the MACHINE and ACCOUNT variables. For example, when running on a personal macOS device, users might set:
user:
RUN_ENVIR: community
MACHINE: macos
ACCOUNT: none
For a detailed description of these variables, see Section 3.1.1.
Users do not need to change the platform: section of the configuration file for this tutorial. The default parameters in that section pertain to METplus verification, which is not addressed here. For more information on verification, see Section 2.7.
In the workflow: section of config.yaml, update EXPT_SUBDIR and PREDEF_GRID_NAME.
workflow:
USE_CRON_TO_RELAUNCH: false
EXPT_SUBDIR: control
CCPP_PHYS_SUITE: FV3_GFS_v16
PREDEF_GRID_NAME: SUBCONUS_Ind_3km
DATE_FIRST_CYCL: '2019061518'
DATE_LAST_CYCL: '2019061518'
FCST_LEN_HRS: 12
PREEXISTING_DIR_METHOD: rename
VERBOSE: true
COMPILER: intel
Note
Users may also want to set USE_CRON_TO_RELAUNCH: true and add CRON_RELAUNCH_INTVL_MNTS: 3. This will automate submission of workflow tasks when running the experiment. However, not all systems have cron.
EXPT_SUBDIR:
This variable can be changed to any name the user wants, from "gfsv16_physics_fcst" to "forecast1" to "askdfj" (but note that whitespace and some punctuation characters are not allowed). However, the best names indicate useful information about the experiment. This tutorial uses control to establish a baseline, or "control", forecast. Since this tutorial helps users compare the output from two different forecasts (one using the FV3_GFS_v16 physics suite and one using the FV3_RRFS_v1beta physics suite), "gfsv16_physics_fcst" could be a good alternative directory name.
PREDEF_GRID_NAME:
This experiment uses the SUBCONUS_Ind_3km grid, rather than the default RRFS_CONUS_25km grid. The SUBCONUS_Ind_3km grid is a high-resolution grid (with grid cell size of approximately 3km) that covers a small area of the U.S. centered over Indianapolis, IN. For more information on this grid, see Section 3.3.
For a detailed description of other workflow: variables, see Section 3.1.3.
To turn on plotting for the experiment, the plotting task's YAML entry (parm/wflow/plot.yaml) must be included in the rocoto:tasks:taskgroups: section, like this:
rocoto:
tasks:
metatask_run_ensemble:
task_run_fcst_mem#mem#:
walltime: 02:00:00
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}'
For more information on how to turn on/off tasks in the workflow, please see Section 2.4.3.2.2.2.
In the task_get_extrn_ics: section, add USE_USER_STAGED_EXTRN_FILES and EXTRN_MDL_SOURCE_BASEDIR_ICS. Users will need to adjust the file path to reflect the location of data on their system (see Section 2.4.1 for locations on Level 1 systems).
task_get_extrn_ics:
EXTRN_MDL_NAME_ICS: FV3GFS
FV3GFS_FILE_FMT_ICS: grib2
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh}
For a detailed description of the task_get_extrn_ics: variables, see Section 3.1.8.
Similarly, in the task_get_extrn_lbcs: section, add USE_USER_STAGED_EXTRN_FILES and EXTRN_MDL_SOURCE_BASEDIR_LBCS. Users will need to adjust the file path to reflect the location of data on their system (see Section 2.4.1 for locations on Level 1 systems).
task_get_extrn_lbcs:
EXTRN_MDL_NAME_LBCS: FV3GFS
LBC_SPEC_INTVL_HRS: 6
FV3GFS_FILE_FMT_LBCS: grib2
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/develop/input_model_data/FV3GFS/grib2/${yyyymmddhh}
For a detailed description of the task_get_extrn_lbcs: variables, see Section 3.1.9.
Users do not need to modify the task_run_fcst: section for this tutorial.
Lastly, in the task_plot_allvars: section, add PLOT_FCST_INC: 6 and PLOT_DOMAINS: ["regional"]. Users may also want to add PLOT_FCST_START: 0 and PLOT_FCST_END: 12 explicitly, but these can be omitted since the default values are the same as the forecast start and end times, respectively.
task_plot_allvars:
COMOUT_REF: ""
PLOT_FCST_INC: 6
PLOT_DOMAINS: ["regional"]
PLOT_FCST_INC:
This variable indicates the forecast hour increment for the plotting task. By setting the value to 6, the task will generate a .png file for every 6th forecast hour, starting from 18z on June 15, 2019 (the 0th forecast hour) through the 12th forecast hour (June 16, 2019, at 06z).
PLOT_DOMAINS:
The plotting scripts are designed to generate plots over the entire CONUS by default, but by setting this variable to [“regional”], the experiment will generate plots for the smaller SUBCONUS_Ind_3km regional domain instead.
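The plotting-hour arithmetic described above can be sketched in a few lines of Python (an illustrative helper for checking which hours get plots, not code from the SRW App itself):

```python
def plotted_hours(start, end, inc):
    """Forecast hours for which the plotting task produces .png files."""
    return list(range(start, end + 1, inc))

# With PLOT_FCST_START: 0, PLOT_FCST_END: 12, and PLOT_FCST_INC: 6,
# plots are generated at forecast hours 0, 6, and 12.
print(plotted_hours(0, 12, 6))  # [0, 6, 12]
```

Setting PLOT_FCST_INC: 1 instead would yield a plot for every forecast hour from 0 through 12.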
After configuring the forecast, users can generate the forecast by running:
./generate_FV3LAM_wflow.py
To see experiment progress, users should navigate to their experiment directory. Then, use the rocotorun command to launch new workflow tasks and rocotostat to check on experiment progress.
cd /path/to/expt_dirs/control
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Users will need to rerun the rocotorun and rocotostat commands above regularly and repeatedly to continue submitting workflow tasks and receiving progress updates.
Note
When using cron to automate the workflow submission (as described above), users can omit the rocotorun command and simply use rocotostat to check on progress periodically.
Users can save the location of the control directory in an environment variable ($CONTROL). This makes it easier to navigate between directories later. For example:
export CONTROL=/path/to/expt_dirs/control
Users should substitute /path/to/expt_dirs/control with the actual path on their system. As long as a user remains logged into their system, they can run cd $CONTROL, and it will take them to the control experiment directory. The variable will need to be reset for each login session.
2.6.1.4.2. Experiment 2: Test
Once the control case is running, users can return to the config.yaml file (in $USH) and adjust the parameters for a new forecast. Most of the variables will remain the same. However, users will need to adjust EXPT_SUBDIR and CCPP_PHYS_SUITE in the workflow: section as follows:
workflow:
EXPT_SUBDIR: test_expt
CCPP_PHYS_SUITE: FV3_RRFS_v1beta
EXPT_SUBDIR:
This name must be different than the EXPT_SUBDIR name used in the previous forecast experiment. Otherwise, the first forecast experiment will be renamed, and the new experiment will take its place (see Section 3.1.3.13 for details). To avoid this issue, this tutorial uses test_expt as the second experiment's name, but users may select a different name if desired.
CCPP_PHYS_SUITE:
The FV3_RRFS_v1beta physics suite was specifically created for convection-allowing scales and is the precursor to the operational physics suite that will be used in the Rapid Refresh Forecast System (RRFS).
Hint
Later, users may want to conduct additional experiments using the FV3_HRRR and FV3_WoFS_v0 physics suites. Like FV3_RRFS_v1beta, these physics suites were designed for use with high-resolution grids for storm-scale predictions.
Next, users will need to modify the data parameters in the task_get_extrn_ics: and task_get_extrn_lbcs: sections to use HRRR and RAP data rather than FV3GFS data. Users will need to change the following lines in each section:
task_get_extrn_ics:
EXTRN_MDL_NAME_ICS: HRRR
EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/UFS_SRW_App/develop/input_model_data/HRRR/${yyyymmddhh}
task_get_extrn_lbcs:
EXTRN_MDL_NAME_LBCS: RAP
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/UFS_SRW_App/develop/input_model_data/RAP/${yyyymmddhh}
EXTRN_MDL_LBCS_OFFSET_HRS: '-0'
HRRR and RAP data are better than FV3GFS data for use with the FV3_RRFS_v1beta physics scheme because these datasets use the same physics parameterizations that are in the FV3_RRFS_v1beta suite. They focus on small-scale weather phenomena involved in storm development, so forecasts tend to be more accurate when HRRR/RAP data are paired with FV3_RRFS_v1beta and a high-resolution (e.g., 3-km) grid. Using HRRR/RAP data with FV3_RRFS_v1beta also limits the “spin-up adjustment” that takes place when initializing with model data coming from different physics.
EXTRN_MDL_LBCS_OFFSET_HRS:
This variable allows users to use lateral boundary conditions (LBCs) from a previous forecast run that was started earlier than the start time of the forecast being configured in this experiment. This variable is set to 0 by default except when using RAP data; with RAP data, the default value is 3, so the forecast will look for LBCs from a run started 3 hours earlier (i.e., at 2019061515, or 15z, instead of 2019061518). To avoid this, users must set EXTRN_MDL_LBCS_OFFSET_HRS explicitly.
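The offset arithmetic can be checked with a short Python snippet (illustrative only; the workflow performs the equivalent calculation internally):

```python
from datetime import datetime, timedelta

cycle = datetime(2019, 6, 15, 18)        # DATE_FIRST_CYCL: '2019061518'
offset_hrs = 3                           # default EXTRN_MDL_LBCS_OFFSET_HRS for RAP
lbc_cycle = cycle - timedelta(hours=offset_hrs)
print(lbc_cycle.strftime("%Y%m%d%H"))    # 2019061515: LBCs come from the 15z run
```

With EXTRN_MDL_LBCS_OFFSET_HRS set to '-0' as in the excerpt above, the LBC source cycle matches the forecast cycle itself.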
Under rocoto:tasks:, add a section to increase the maximum wall time for the post-processing tasks. The walltime is the maximum length of time a task is allowed to run. On some systems, the default of 15 minutes may be enough, but on others (e.g., NOAA Cloud), the post-processing time exceeds 15 minutes, so the tasks fail.
rocoto:
tasks:
metatask_run_ensemble:
task_run_fcst_mem#mem#:
walltime: 02:00:00
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}'
metatask_run_ens_post:
metatask_run_post_mem#mem#_all_fhrs:
task_run_post_mem#mem#_f#fhr#:
walltime: 00:20:00
Lastly, users must set the COMOUT_REF variable in the task_plot_allvars: section to create difference plots that compare output from the two experiments. COMOUT_REF is a template variable, so it references other workflow variables within it (see Section 3.5 for details on template variables). COMOUT_REF should provide the path to the control experiment forecast output using single quotes as shown below:
task_plot_allvars:
COMOUT_REF: '${EXPT_BASEDIR}/control/${PDY}${cyc}/postprd'
Here, $EXPT_BASEDIR is the path to the main experiment directory (named expt_dirs by default), $PDY refers to the cycle date in YYYYMMDD format, and $cyc refers to the starting hour of the cycle. The postprd directory contains the post-processed data from the experiment. Therefore, COMOUT_REF will resolve to control/2019061518/postprd, and the plots there will be compared against the ones in test_expt/2019061518/postprd.
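How the template resolves can be illustrated with Python's string.Template (the EXPT_BASEDIR value below is the documentation's placeholder path; the workflow substitutes the real values at run time):

```python
from string import Template

# COMOUT_REF as set in config.yaml; single quotes keep it unexpanded until runtime.
comout_ref = Template('${EXPT_BASEDIR}/control/${PDY}${cyc}/postprd')

# Placeholder values for illustration; the workflow fills these in itself.
path = comout_ref.substitute(
    EXPT_BASEDIR='/path/to/expt_dirs',
    PDY='20190615',   # cycle date (YYYYMMDD)
    cyc='18',         # cycle starting hour
)
print(path)  # /path/to/expt_dirs/control/2019061518/postprd
```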
After configuring the forecast, users can generate the second forecast by running:
./generate_FV3LAM_wflow.py
To see experiment progress, users should navigate to their experiment directory. As in the first forecast, the following commands allow users to launch new workflow tasks and check on experiment progress.
cd /path/to/expt_dirs/test_expt
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Note
When using cron to automate the workflow submission (as described above), users can omit the rocotorun command and simply use rocotostat to check on progress periodically.
Note
If users have not automated their workflow using cron, they will need to continue issuing rocotorun commands to launch all of the tasks in each experiment. While it is possible to switch between experiment directories and run rocotorun and rocotostat in both, it may be easier to finish the control experiment's tasks before starting on test_expt.
As with the control experiment, users can save the location of the test_expt directory in an environment variable (e.g., $TEST). This makes it easier to navigate between directories later. For example:
export TEST=/path/to/expt_dirs/test_expt
Users should substitute /path/to/expt_dirs/test_expt with the actual path on their system.
2.6.1.5. Compare and Analyze Results
Navigate to test_expt/2019061518/postprd. This directory contains the post-processed data generated by the UPP from the test_expt forecast. After the plot_allvars task completes, this directory will contain .png images for several forecast variables, including 2-m temperature, 2-m dew point temperature, 10-m winds, accumulated precipitation, composite reflectivity, and surface-based CAPE/CIN. Plots with a _diff label in the file name compare the control forecast and the test_expt forecast.
2.6.1.5.1. Copy .png Files onto Local System
Users who are working on the cloud or on an HPC cluster may want to copy the .png files onto their local system to view in their preferred image viewer. Detailed instructions are available in the Introduction to SSH & Data Transfer.
In summary, users can run the scp command in a new terminal/command prompt window to securely copy files from a remote system to their local system, provided that an SSH tunnel is already established between the two systems. Users can adjust one of the following commands for their system:
scp username@your-IP-address:/path/to/source_file_or_directory /path/to/destination_file_or_directory
# OR
scp -P 12345 username@localhost:/path/to/source_file_or_directory /path/to/destination_file_or_directory
Users would need to modify username, your-IP-address, -P 12345, and the file paths to reflect their systems' information. See the Introduction to SSH & Data Transfer for example commands.
2.6.1.5.2. Compare Images
The plots generated by the experiment cover a variety of variables. After downloading the .png plots, users can open and view them on their local system in their preferred image viewer. Table 2.14 lists the available plots (hhh corresponds to the three-digit forecast hour):
Field | File Name
---|---
2-meter dew point temperature | 2mdew_diff_regional_fhhh.png
2-meter temperature | 2mt_diff_regional_fhhh.png
10-meter winds | 10mwind_diff_regional_fhhh.png
250-hPa winds | 250wind_diff_regional_fhhh.png
Accumulated precipitation | qpf_diff_regional_fhhh.png
Composite reflectivity | refc_diff_regional_fhhh.png
Surface-based CAPE/CIN | sfcape_diff_regional_fhhh.png
Sea level pressure | slp_diff_regional_fhhh.png
Max/Min 2-5 km updraft helicity | uh25_diff_regional_fhhh.png
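For scripting purposes, the file name pattern in the table can be reproduced with a small Python helper (hypothetical; the plotting task builds these names itself):

```python
def diff_plot_filename(field_prefix, fhr):
    """File name pattern from the table above; fhr is zero-padded to three digits."""
    return f"{field_prefix}_diff_regional_f{fhr:03d}.png"

print(diff_plot_filename("refc", 6))     # refc_diff_regional_f006.png
print(diff_plot_filename("sfcape", 12))  # sfcape_diff_regional_f012.png
```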
Each difference plot .png file contains three subplots. The plot for the second experiment (test_expt) appears in the top left corner, the plot for the first experiment (control) appears in the top right corner, and the difference plot comparing the two experiments appears at the bottom. White areas signify no difference between the plots. Therefore, if the forecast output from both experiments is exactly the same, the difference plot will be entirely white (see Sea Level Pressure as an example). If the forecast output from the two experiments is very different, the plot will show large areas of color.
In general, the results for test_expt (using FV3_RRFS_v1beta physics and HRRR/RAP data) are expected to be more accurate than the results for control (using FV3_GFS_v16 physics and FV3GFS data) because the physics in test_expt is designed for high-resolution, storm-scale prediction over a short period of time. The control experiment physics is better suited to predicting the evolution of larger-scale weather phenomena, like jet stream movement and cyclone development, since the cumulus physics in the FV3_GFS_v16 suite is not configured to run at 3-km resolution.
2.6.1.5.3. Analysis
2.6.1.5.3.1. Sea Level Pressure
In the Sea Level Pressure (SLP) plots, the control and test_expt plots are nearly identical at forecast hour f000, so the difference plot is entirely white.

Fig. 2.6 Difference Plot for Sea Level Pressure at f000
As the forecast continues, the results begin to diverge, as evidenced by the spattering of light blue dispersed across the f006 SLP difference plot.

Fig. 2.7 Difference Plot for Sea Level Pressure at f006
The predictions diverge further by f012, where a solid section of light blue in the top left corner of the difference plot indicates that, to the northwest of Indianapolis, the SLP predictions for the control forecast were slightly lower than those for the test_expt forecast.

Fig. 2.8 Difference Plot for Sea Level Pressure at f012
2.6.1.5.3.2. Composite Reflectivity
Reflectivity images visually represent the weather based on the energy (measured in decibels [dBZ]) reflected back from radar. Composite reflectivity generates an image based on reflectivity scans at multiple elevation angles, or “tilts”, of the antenna. See https://www.noaa.gov/jetstream/reflectivity for a more detailed explanation of composite reflectivity.
At f000, the test_expt plot (top left) shows more severe weather than the control plot (top right). The test_expt plot shows a vast swath of the Indianapolis region covered in yellow with spots of orange, corresponding to composite reflectivity values of 35+ dBZ. The control plot's radar image covers a smaller area of the grid, and with the exception of a few yellow spots, composite reflectivity values are below 35 dBZ. The difference plot (bottom) shows areas where the test_expt plot (red) and the control plot (blue) have reflectivity values greater than 20 dBZ. The test_expt plot has significantly more areas with high composite reflectivity values.

Fig. 2.9 Composite Reflectivity at f000
As the forecast progresses, the radar images resemble each other more (see Figure 2.10). Both the test_expt and control plots show the storm gaining energy (with more orange and red areas), rotating counterclockwise, and moving east. Thus, both forecasts do a good job of picking up on the convection. However, the test_expt forecast still indicates a higher-energy storm with more areas of dark red. It appears that the test_expt case was able to resolve more discrete storms over northwest Indiana and in the squall line. The control plot has less definition and depicts widespread storms concentrated over the center of the state.

Fig. 2.10 Composite reflectivity at f006 shows storm gathering strength
At forecast hour 12, the plots for each forecast show a similar evolution of the storm, with both resolving a squall line. The test_expt plot shows a more intense squall line with discrete cells (areas of high composite reflectivity in dark red), which could lead to severe weather. The control plot shows an overall decrease in composite reflectivity values compared to f006. It also orients the squall line more northward with less intensity, possibly due to convection from the previous forecast hours cooling the atmosphere. In short, test_expt suggests that the storm will still be going strong at 06z on June 16, 2019, whereas control suggests that the storm will begin to let up.

Fig. 2.11 Composite Reflectivity at f012
2.6.1.5.3.3. Surface-Based CAPE/CIN
2.6.1.5.3.3.1. Background
The National Weather Service (NWS) defines Surface-Based Convective Available Potential Energy (CAPE) as "the amount of fuel available to a developing thunderstorm." According to NWS, CAPE "describes the instability of the atmosphere and provides an approximation of updraft strength within a thunderstorm. A higher value of CAPE means the atmosphere is more unstable and would therefore produce a stronger updraft" (see NWS: What is CAPE? for further explanation).
According to the NWS Storm Prediction Center, Convective Inhibition (CIN) “represents the ‘negative’ area on a sounding that must be overcome for storm initiation.” In effect, it measures negative buoyancy (-B) — the opposite of CAPE, which measures positive buoyancy (B or B+) of an air parcel.
2.6.1.5.3.3.2. Interpreting the Plots
CAPE measures are represented on the plots using color. They range in value from 100-5000 Joules per kilogram (J/kg). Lower values are represented by cool colors and higher values are represented by warm colors. In general, values of approximately 1000+ J/kg can lead to severe thunderstorms, although this is also dependent on season and location.
CIN measures are displayed on the plots using hatch marks:
* means CIN <= -500 J/kg
+ means -500 < CIN <= -250 J/kg
/ means -250 < CIN <= -100 J/kg
. means -100 < CIN <= -25 J/kg
In general, the higher the CIN values are (i.e., the closer they are to zero), the lower the convective inhibition and the greater the likelihood that a storm will develop. Low CIN values (corresponding to high convective inhibition) make it unlikely that a storm will develop even in the presence of high CAPE.
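The hatch thresholds above can be expressed as a small Python function, which may help when reading the plots programmatically (a sketch for interpretation only, not code from the App):

```python
def cin_hatch(cin_jkg):
    """Return the hatch symbol used on the plots for a given CIN value (J/kg)."""
    if cin_jkg <= -500:
        return "*"
    if cin_jkg <= -250:
        return "+"
    if cin_jkg <= -100:
        return "/"
    if cin_jkg <= -25:
        return "."
    return ""  # no hatching for CIN above -25 J/kg

print(cin_hatch(-300))  # +
```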
At the 0th forecast hour, the test_expt plot (below, left) shows lower values of CAPE and higher values of CIN than the control plot (below, right). This means that test_expt is projecting lower potential energy available for a storm but also lower inhibition, so less energy would be required for a storm to develop. The difference between the two plots is particularly evident in the southwest corner of the difference plot, which shows a 1000+ J/kg difference between the two forecasts.

Fig. 2.12 CAPE/CIN Difference Plot at f000
At the 6th forecast hour, both the test_expt and control plots forecast higher CAPE values overall. Both plots also predict higher CAPE values to the southwest of Indianapolis than to the northeast. This makes sense because the storm was passing from west to east. However, the difference plot shows that the control forecast predicts higher CAPE values primarily to the southwest of Indianapolis, whereas test_expt projects a rise in CAPE values throughout the region. The blue region of the difference plot indicates where the test_expt predictions are higher than the control predictions; the red/orange region shows places where control predicts significantly higher CAPE values than test_expt does.

Fig. 2.13 CAPE/CIN Difference Plot at f006
At the 12th forecast hour, the control plot indicates that CAPE may be decreasing overall. test_expt, however, shows that areas of high CAPE remain and continue to grow, particularly to the east. The blue areas of the difference plot indicate that test_expt is predicting higher CAPE than control everywhere but in the center of the plot.

Fig. 2.14 CAPE/CIN Difference Plot at f012
2.6.1.6. Try It!
2.6.1.6.1. Option 1: Adjust frequency of forecast plots.
For a simple extension of this tutorial, users can adjust PLOT_FCST_INC to output plots more frequently. For example, setting PLOT_FCST_INC: 1 produces plots for every hour of the forecast, allowing a more fine-grained visual comparison of how each forecast evolved.
2.6.1.6.2. Option 2: Compare output from additional physics suites.
Users are encouraged to conduct additional experiments using the FV3_HRRR and FV3_WoFS_v0 physics suites. Like FV3_RRFS_v1beta, these physics suites were designed for use with high-resolution grids for storm-scale predictions. Compare them to each other or to the control!
Users may find the difference plots for updraft helicity particularly informative. The FV3_GFS_v16 physics suite does not include updraft helicity output in its diag_table files, so the difference plot generated in this tutorial is empty. High values of updraft helicity indicate the presence of a rotating updraft, often the result of a supercell thunderstorm capable of severe weather, including tornadoes. Comparing the results from two physics suites that output this parameter can therefore prove insightful.
2.6.2. Sample Forecast #2: Cold Air Damming
2.6.2.1. Weather Summary
Cold air damming occurs when cold dense air is topographically trapped along the leeward (downwind) side of a mountain. Starting on February 3, 2020, weather conditions leading to cold air damming began to develop east of the Appalachian mountains. By February 6-7, 2020, this cold air damming caused high winds, flash flood advisories, and wintery conditions.
Weather Phenomena: Cold air damming

Fig. 2.15 Precipitation Resulting from Cold Air Damming East of the Appalachian Mountains
2.6.2.2. Tutorial Content
Coming Soon!
2.6.3. Sample Forecast #3: Southern Plains Winter Weather Event
2.6.3.1. Weather Summary
A polar vortex brought arctic air to much of the U.S. and Mexico. A series of cold fronts and vorticity disturbances helped keep this cold air in place for an extended period of time, resulting in record-breaking cold temperatures for many southern states and Mexico. This particular case captures two winter weather disturbances between February 14, 2021 at 06z and February 17, 2021 at 06z that brought several inches of snow to Oklahoma City. A lull on February 16, 2021 resulted in record daily low temperatures.
Weather Phenomena: Snow and record-breaking cold temperatures

Fig. 2.16 Southern Plains Winter Weather Event Over Oklahoma City
2.6.3.2. Tutorial Content
Coming Soon!
2.6.4. Sample Forecast #4: Halloween Storm
- Objectives:
Compare forecast outputs for similar experiments that use different IC/LBC sources.
Use verification tools to assess forecast quality.
2.6.4.1. Weather Summary
A line of severe storms brought strong winds, flash flooding, and tornadoes to the eastern half of the US.
Weather Phenomena: Flooding and high winds

Fig. 2.17 Halloween Storm 2019
2.6.4.2. Data
Data for the Halloween Storm is publicly available in S3 data buckets. The Rapid Refresh (RAP) data can be downloaded from the SRW App data bucket using wget. Make sure to issue the command from the directory where you want to place the data.
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/halloween_rap.tgz
tar -xzf halloween_rap.tgz
This will untar the halloween_rap.tgz data into a directory named RAP.
The SRW App can pull HRRR data directly from the HRRR data bucket. Users do not need to download the data separately.
2.6.4.3. Load the Workflow
To load the workflow environment, source the lmod-setup file and load the workflow conda environment by running:
source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<platform>
where <platform> is a valid, lowercased machine name (see MACHINE in Section 3.1.1 for valid values), and /path/to/ is replaced by the actual path to the ufs-srweather-app directory.
After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run conda activate srw_app. For example, a user on Hera with permissions on the nems project may issue the following commands to load the workflow (replacing User.Name with their actual username):
source /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/etc/lmod-setup.sh hera
module use /scratch1/NCEPDEV/nems/User.Name/ufs-srweather-app/modulefiles
module load wflow_hera
conda activate srw_app
2.6.4.4. Configuration
Navigate to the ufs-srweather-app/ush directory. The default (or "control") configuration for this experiment is based on the config.community.yaml file in that directory. Users may copy this file into config.yaml if they have not already done so:
cd /path/to/ufs-srweather-app/ush
cp config.community.yaml config.yaml
Users can save the location of the ush directory in an environment variable ($USH). This makes it easier to navigate between directories later. For example:
export USH=/path/to/ufs-srweather-app/ush
Users should substitute /path/to/ufs-srweather-app/ush with the actual path on their system. As long as a user remains logged into their system, they can run cd $USH, and it will take them to the ush directory. The variable will need to be reset for each login session.
2.6.4.4.1. Experiment 1: RAP Data
Edit the configuration file (config.yaml) to include the variables and values in the sample configuration excerpts below.
Hint
To open the configuration file in the command line, users may run the command:
vi config.yaml
To modify the file, hit the i key and then make any changes required. To close and save, hit the esc key and type :wq to write the changes and quit the file. Users may opt to use their preferred code editor instead.
Start in the user:
section and change the MACHINE
and ACCOUNT
variables. For example, when running on a personal MacOS device, users might set:
user:
RUN_ENVIR: community
MACHINE: macos
ACCOUNT: none
For a detailed description of these variables, see Section 3.1.1.
Users do not need to change the platform:
section of the configuration file for this tutorial.
In the workflow:
section of config.yaml
, update EXPT_SUBDIR
, CCPP_PHYS_SUITE
, PREDEF_GRID_NAME
, DATE_FIRST_CYCL
, DATE_LAST_CYCL
, and FCST_LEN_HRS
.
workflow:
USE_CRON_TO_RELAUNCH: false
EXPT_SUBDIR: halloweenRAP
CCPP_PHYS_SUITE: FV3_RAP
PREDEF_GRID_NAME: RRFS_CONUS_13km
DATE_FIRST_CYCL: '2019103012'
DATE_LAST_CYCL: '2019103012'
FCST_LEN_HRS: 36
PREEXISTING_DIR_METHOD: rename
VERBOSE: true
COMPILER: intel
Note
Users may also want to set USE_CRON_TO_RELAUNCH: true
and add CRON_RELAUNCH_INTVL_MNTS: 3
. This will automate submission of workflow tasks when running the experiment. However, not all systems have cron.
EXPT_SUBDIR:
This variable can be changed to any name the user wants from “halloweenRAP” to “HalloweenStorm1” to “askdfj” (but note that whitespace and some punctuation characters are not allowed). However, the best names will indicate useful information about the experiment. Since this tutorial helps users to compare the output from RAP and HRRR forecast input data, this tutorial will use halloweenRAP
for the Halloween Storm experiment that uses RAP forecast data.
PREDEF_GRID_NAME:
This experiment uses the RRFS_CONUS_13km grid rather than the default RRFS_CONUS_25km grid. This 13-km resolution is used in the NOAA operational Rapid Refresh (RAP) model and is the resolution envisioned for the initial operational implementation of the Rapid Refresh Forecast System (RRFS). For more information on this grid, see Section 3.4.
CCPP_PHYS_SUITE:
The FV3_RAP physics suite contains the evolving parameterizations used operationally in the NOAA Rapid Refresh (RAP) model; the suite is also a prime candidate under consideration for initial RRFS implementation and has been well-tested at the 13-km resolution. It is therefore an appropriate physics choice when using the RRFS_CONUS_13km grid.
DATE_FIRST_CYCL
, DATE_LAST_CYCL
, and FCST_LEN_HRS
set parameters related to the date and duration of the forecast. Because this is a one-cycle experiment that does not use cycling or data assimilation, the date of the first cycle and last cycle are the same.
For a detailed description of other workflow:
variables, see Section 3.1.3.
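As a sanity check on these settings, the valid time at the end of the forecast can be computed from DATE_FIRST_CYCL and FCST_LEN_HRS. A minimal sketch in Python (the variable names simply mirror the configuration keys above):

```python
from datetime import datetime, timedelta

# Values from the workflow: section above
DATE_FIRST_CYCL = "2019103012"   # YYYYMMDDHH
FCST_LEN_HRS = 36

cycle_start = datetime.strptime(DATE_FIRST_CYCL, "%Y%m%d%H")
forecast_end = cycle_start + timedelta(hours=FCST_LEN_HRS)

print(forecast_end.strftime("%Y-%m-%d %Hz"))  # 2019-11-01 00z
```

A 36-hour forecast initialized at 12z on October 30, 2019 therefore runs through 0z on November 1, 2019.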
In the task_get_extrn_ics:
section, add USE_USER_STAGED_EXTRN_FILES
and EXTRN_MDL_SOURCE_BASEDIR_ICS
. Users will need to adjust the file path to point to the location of the data on their system.
task_get_extrn_ics:
EXTRN_MDL_NAME_ICS: RAP
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/RAP/for_ICS
For a detailed description of the task_get_extrn_ics:
variables, see Section 3.1.8.
Similarly, in the task_get_extrn_lbcs:
section, add USE_USER_STAGED_EXTRN_FILES
and EXTRN_MDL_SOURCE_BASEDIR_LBCS
. Users will need to adjust the file path to point to the location of the data on their system.
task_get_extrn_lbcs:
EXTRN_MDL_NAME_LBCS: RAP
LBC_SPEC_INTVL_HRS: 3
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/RAP/for_LBCS
For a detailed description of the task_get_extrn_lbcs:
variables, see Section 3.1.9.
Users do not need to modify the task_run_fcst:
section for this tutorial.
In the rocoto:tasks:
section, increase the walltime for the data-related tasks and metatasks. Then include the YAML configuration file containing the plotting task in the rocoto:tasks:taskgroups:
section, like this:
rocoto:
tasks:
task_get_extrn_ics:
walltime: 06:00:00
task_get_extrn_lbcs:
walltime: 06:00:00
metatask_run_ensemble:
task_make_lbcs_mem#mem#:
walltime: 06:00:00
task_run_fcst_mem#mem#:
walltime: 06:00:00
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}'
Note
Rocoto tasks are run once each. A Rocoto metatask expands into one or more similar tasks by replacing the values between #
symbols with the values under the var:
key. See the Rocoto documentation for more information.
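The substitution Rocoto performs can be illustrated with a few lines of Python. This is only a sketch of the expansion, not Rocoto's actual implementation, and the member values shown are hypothetical:

```python
# Sketch of Rocoto metatask expansion: each value listed under the var:
# key replaces the #mem# placeholder, producing one task per member.
task_template = "task_run_fcst_mem#mem#"
members = ["000", "001", "002"]  # hypothetical var: mem values

tasks = [task_template.replace("#mem#", m) for m in members]
print(tasks)
# ['task_run_fcst_mem000', 'task_run_fcst_mem001', 'task_run_fcst_mem002']
```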
For more information on how to turn on/off tasks in the workflow, please see Section 2.4.3.2.2.2.
In the task_plot_allvars:
section, add PLOT_FCST_INC: 6
. Users may also want to add PLOT_FCST_START: 0
and PLOT_FCST_END: 36
explicitly, but these can be omitted since the default values are the same as the forecast start and end time respectively.
task_plot_allvars:
COMOUT_REF: ""
PLOT_FCST_INC: 6
PLOT_FCST_INC:
This variable indicates the forecast hour increment for the plotting task. By setting the value to 6
, the task will generate a .png
file for every 6th forecast hour starting from 12z on October 30, 2019 (the 0th forecast hour) through the 36th forecast hour (November 1, 2019 at 0z).
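The set of plotted forecast hours and their valid times follows directly from these settings; a quick Python sketch using this tutorial's cycle:

```python
from datetime import datetime, timedelta

cycle = datetime.strptime("2019103012", "%Y%m%d%H")
PLOT_FCST_START, PLOT_FCST_END, PLOT_FCST_INC = 0, 36, 6

plotted_hours = list(range(PLOT_FCST_START, PLOT_FCST_END + 1, PLOT_FCST_INC))
for fhr in plotted_hours:
    valid = cycle + timedelta(hours=fhr)
    print(f"f{fhr:03d} valid at {valid:%Y-%m-%d %H}z")
# f000 valid at 2019-10-30 12z ... f036 valid at 2019-11-01 00z
```

Seven sets of plots (f000 through f036 in 6-hour steps) are produced in total.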
After configuring the forecast, users can generate the forecast by running:
./generate_FV3LAM_wflow.py
To see experiment progress, users should navigate to their experiment directory. Then, use the rocotorun
command to launch new workflow tasks and rocotostat
to check on experiment progress.
cd /path/to/expt_dirs/halloweenRAP
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Users will need to rerun the rocotorun
and rocotostat
commands above regularly and repeatedly to continue submitting workflow tasks and receiving progress updates.
Note
When using cron to automate the workflow submission (as described above), users can omit the rocotorun
command and simply use rocotostat
to check on progress periodically.
Users can save the location of the halloweenRAP
directory in an environment variable (e.g., $HRAP
). This makes it easier to navigate between directories later. For example:
export HRAP=/path/to/expt_dirs/halloweenRAP
Users should substitute /path/to/expt_dirs/halloweenRAP
with the actual path to the experiment directory on their system. As long as a user remains logged into their system, they can run cd $HRAP
, and it will take them to the halloweenRAP
experiment directory. The variable will need to be reset for each login session.
2.6.4.4.2. Experiment 2: Changing the Forecast Input
Once the halloweenRAP
case is running, users can return to the config.yaml
file (in $USH
) and adjust the parameters for a new forecast. In this forecast, users will change the forecast input to use HRRR
data and alter a few associated parameters.
In the workflow:
section of config.yaml
, update EXPT_SUBDIR
and PREDEF_GRID_NAME
. Other parameters should remain the same.
workflow:
EXPT_SUBDIR: halloweenHRRR
PREDEF_GRID_NAME: RRFS_CONUScompact_13km
Note
Relative to the original CONUS domain, the “compact” CONUS domains are slightly smaller. The original CONUS domains were a bit too large to run with LBCs from HRRR, so the “compact” domains were created to be just small enough to work with HRRR data.
In the task_get_extrn_ics:
section, update the values for EXTRN_MDL_NAME_ICS
and USE_USER_STAGED_EXTRN_FILES
and add EXTRN_MDL_FILES_ICS
. Users may choose to comment out or remove EXTRN_MDL_SOURCE_BASEDIR_ICS
, but this is not necessary.
task_get_extrn_ics:
EXTRN_MDL_NAME_ICS: HRRR
USE_USER_STAGED_EXTRN_FILES: false
EXTRN_MDL_FILES_ICS:
- '{yy}{jjj}{hh}00{fcst_hr:02d}00'
For a detailed description of the task_get_extrn_ics:
variables, see Section 3.1.8.
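The filename template above combines a two-digit year ({yy}), the three-digit day of year ({jjj}), the cycle hour ({hh}), and a two-digit forecast hour. How it expands for this case's 2019-10-30 12z cycle can be sketched in Python (the template is taken from the excerpt above; the expansion shown is for illustration only):

```python
from datetime import datetime

cycle = datetime(2019, 10, 30, 12)
yy = cycle.strftime("%y")   # "19"
jjj = cycle.strftime("%j")  # "303" (day of year)
hh = cycle.strftime("%H")   # "12"
fcst_hr = 0                 # the IC file is the 0-hour forecast

filename = f"{yy}{jjj}{hh}00{fcst_hr:02d}00"
print(filename)  # 1930312000000
```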
Update the same values in the task_get_extrn_lbcs:
section:
task_get_extrn_lbcs:
EXTRN_MDL_NAME_LBCS: HRRR
LBC_SPEC_INTVL_HRS: 3
USE_USER_STAGED_EXTRN_FILES: false
EXTRN_MDL_FILES_LBCS:
- '{yy}{jjj}{hh}00{fcst_hr:02d}00'
For a detailed description of the task_get_extrn_lbcs:
variables, see Section 3.1.9.
After configuring the forecast, users can generate the second forecast by running:
./generate_FV3LAM_wflow.py
To see experiment progress, users should navigate to their experiment directory. As in the first forecast, the following commands allow users to launch new workflow tasks and check on experiment progress.
cd /path/to/expt_dirs/halloweenHRRR
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Note
When using cron to automate the workflow submission (as described above), users can omit the rocotorun
command and simply use rocotostat
to check on progress periodically.
Note
If users have not automated their workflow using cron, they will need to ensure that they continue issuing rocotorun
commands to launch all of the tasks in each experiment. While switching between experiment directories to run rocotorun
and rocotostat
commands in both directories is possible, it may be easier to finish the halloweenRAP
experiment’s tasks before starting on halloweenHRRR
.
As with the halloweenRAP
experiment, users can save the location of the halloweenHRRR
directory in an environment variable (e.g., $HHRRR
). This makes it easier to navigate between directories later. For example:
export HHRRR=/path/to/expt_dirs/halloweenHRRR
Users should substitute /path/to/expt_dirs/halloweenHRRR
with the actual path on their system.
2.6.4.5. How to Analyze Results
Navigate to halloweenHRRR/2019103012/postprd
and/or halloweenRAP/2019103012/postprd
. These directories contain the post-processed data generated by the UPP from the Halloween Storm forecasts. After the plot_allvars
task completes, this directory will contain .png
images for several forecast variables.
2.6.4.5.1. Copy .png
Files onto Local System
Users who are working on the cloud or on an HPC cluster may want to copy the .png
files onto their local system to view in their preferred image viewer. Detailed instructions are available in the Introduction to SSH & Data Transfer.
In summary, users can run the scp
command in a new terminal/command prompt window to securely copy files from a remote system to their local system if an SSH tunnel is already established between the local system and the remote system. Users can adjust one of the following commands for their system:
scp username@your-IP-address:/path/to/source_file_or_directory /path/to/destination_file_or_directory
# OR
scp -P 12345 username@localhost:/path/to/source_file_or_directory /path/to/destination_file_or_directory
Users would need to modify username
, your-IP-address
, -P 12345
, and the file paths to reflect their systems’ information. See the Introduction to SSH & Data Transfer for example commands.
2.6.4.5.2. Examining Forecast Plots at Peak Intensity
This section examines plots from the HRRR and RAP forecasts while the Halloween Storm is at or approaching peak intensity.
2.6.4.5.2.1. 250mb Wind
An effective weather forecast begins with analyzing a 250mb wind chart. By using this wind plot, forecasters can identify key features such as jet stream placement, jet maxima, troughs, ridges, and more. This analysis also helps pinpoint areas with the potential for the strongest severe weather.
In the 250mb wind plots below, the halloweenHRRR
and halloweenRAP
plots are nearly identical at forecast hour f036, showing strong model agreement. Analyzing this chart, we can see multiple ingredients signaling a significant severe weather event over the eastern CONUS. The first thing to notice is the placement of the jet streak along with the trough approaching the eastern US. Also notice an extreme 150-kt jet max over southern Ohio, further fueling severe weather. The last thing to notice is the divergence aloft over the eastern CONUS; divergence present all the way up to 250mb indicates a strong system.

Fig. 2.18 RAP Plot for 250mb Wind

Fig. 2.19 HRRR Plot for 250mb Wind
2.6.4.5.2.2. 10m Wind
The 10m wind plot allows forecasters to pick up on patterns closer to the surface. It shows features such as convergence and pressure areas.
In the 10m wind plots below, the halloweenHRRR
and halloweenRAP
are once again very similar, which makes sense given that the 250mb wind plots are also so similar. We can see a few key features on this chart. The most important is the area of convergence over the East Coast, which is driving the line of severe storms.

Fig. 2.20 RAP Plot for 10m Winds

Fig. 2.21 HRRR Plot for 10m Winds
2.6.4.5.2.3. Composite Reflectivity
Reflectivity images visually represent the weather based on the energy (measured in decibels [dBZ]) reflected back from radar. Composite reflectivity generates an image based on reflectivity scans at multiple elevation angles, or “tilts”, of the antenna. See https://www.noaa.gov/jetstream/reflectivity for a more detailed explanation of composite reflectivity.
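The decibel scale mentioned above is logarithmic: dBZ = 10 · log10(Z), where Z is the reflectivity factor. A small sketch of the conversion (the example values are illustrative):

```python
import math

def z_to_dbz(z):
    """Convert reflectivity factor Z (mm^6/m^3) to decibels (dBZ)."""
    return 10.0 * math.log10(z)

def dbz_to_z(dbz):
    """Invert the conversion: decibels back to reflectivity factor."""
    return 10.0 ** (dbz / 10.0)

print(z_to_dbz(1.0))   # 0.0 dBZ
print(dbz_to_z(50.0))  # 100000.0 -- a 50 dBZ echo, typical of strong storms
```

Each 10 dBZ step therefore corresponds to a tenfold increase in reflectivity factor.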
In the composite reflectivity plots below, the halloweenHRRR
and halloweenRAP
models remain quite similar, as expected. Utilizing the reflectivity plots provides the final piece of the puzzle. From the previous analyses, we already had a good understanding of where the storms were likely to occur. Composite reflectivity serves as an additional tool, allowing us to visualize where the models predict storm placement. In this case, the strongest storms are indicated by higher dBZ values and appear to be concentrated in the NC/VA region.

Fig. 2.22 RAP Plot for Composite Reflectivity

Fig. 2.23 HRRR Plot for Composite Reflectivity
2.6.4.5.3. Experiment 3: Performing METplus Verification
In this experiment, we will use the METplus verification framework to evaluate the accuracy of the HRRR forecasts for the Halloween Storm Case. The METplus tools provide a robust and customizable way to assess forecast skill by comparing model output against observational data. This section will guide you through the steps to perform METplus verification for this case.
Note
For tutorial purposes we will only run this test using HRRR data, but users should feel free to do the same experiment with RAP data.
2.6.4.6. Set Up Verification
Follow the instructions below to reproduce a forecast for this event using your own model setup! Make sure to install and build the latest release of the SRW Application (v2.2.0); the develop
branch code is constantly changing, so it does not provide a consistent baseline for comparison.
On Level 1 systems, users can find data for the Indianapolis Severe Weather Forecast in the usual data locations (see Section 2.4 for a list).
On other systems, users need to download the HalloweenStormData.tar.gz
file using any of the following methods:
Download directly from the S3 bucket using a browser. The data is available at https://noaa-ufs-srw-pds.s3.amazonaws.com/HalloweenStormData/HalloweenStormData.tar.gz
Download from a terminal using the AWS command line interface (CLI), if installed:
aws s3 cp s3://noaa-ufs-srw-pds/HalloweenStormData/HalloweenStormData.tar.gz HalloweenStormData.tar.gz
Download from a terminal using
wget
:wget https://noaa-ufs-srw-pds.s3.amazonaws.com/HalloweenStormData/HalloweenStormData.tar.gz
This tar file contains IC/LBC files, observation data, model/forecast output, and MET verification output for the sample forecast. Users who have never run the SRW App on their system before will also need to download (1) the fix files required for SRW App forecasts and (2) the NaturalEarth shapefiles required for plotting. Users can download the fix file data from a browser at https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz or visit Section 3.2.3.1 for instructions on how to download the data with wget
. NaturalEarth files are available at https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/NaturalEarth.tar.gz. See Section 2.4.3.2.3 for more information on plotting.
After downloading HalloweenStormData.tar.gz
using one of the three methods above, untar the downloaded compressed archive file:
tar xvfz HalloweenStormData.tar.gz
Save the path to this file in the HalloweenDATA
environment variable:
cd HalloweenStormData
export HalloweenDATA=$PWD
Note
Users can untar the fix files and Natural Earth files by substituting those file names in the commands above.
2.6.4.6.1. Load the Workflow
To load the workflow environment, run:
source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<platform>
where <platform>
is a valid, lowercased machine name (see MACHINE
in Section 3.1.1 for valid values), and /path/to/
is replaced by the actual path to the ufs-srweather-app
.
Users running a csh/tcsh shell would run source /path/to/etc/lmod-setup.csh <platform>
in place of the first command above.
After loading the workflow, users should follow the instructions printed to the console. Usually, the instructions will tell the user to run conda activate srw_app
.
2.6.4.6.2. Configure the Verification Sample Case
Once the workflow environment is loaded, copy the out-of-the-box configuration:
cd /path/to/ufs-srweather-app/ush
cp config.community.yaml config.yaml
where /path/to/ufs-srweather-app/ush
is replaced by the actual path to the ufs-srweather-app/ush
directory on the user’s system.
Then, edit the configuration file (config.yaml
) to include the variables and values in the sample configuration excerpt below (variables not listed below do not need to be changed or removed). Users must be sure to substitute values in <>
with values appropriate to their system.
user:
MACHINE: <your_machine_name>
ACCOUNT: <my_account>
platform:
# Add EXTRN_MDL_DATA_STORES variable to config.yaml
EXTRN_MDL_DATA_STORES: aws
workflow:
USE_CRON_TO_RELAUNCH: true
EXPT_SUBDIR: halloweenstormHRRRMETPLUS
CCPP_PHYS_SUITE: FV3_RAP
PREDEF_GRID_NAME: RRFS_CONUScompact_13km
DATE_FIRST_CYCL: '2019103012'
DATE_LAST_CYCL: '2019103012'
FCST_LEN_HRS: 24
PREEXISTING_DIR_METHOD: rename
VERBOSE: true
# Change to gnu if using a gnu compiler; otherwise, no change
COMPILER: intel
task_get_extrn_ics:
# Add EXTRN_MDL_NAME_ICS and EXTRN_MDL_FILES_ICS variables to config.yaml
EXTRN_MDL_NAME_ICS: HRRR
USE_USER_STAGED_EXTRN_FILES: false
EXTRN_MDL_FILES_ICS:
- '{yy}{jjj}{hh}00{fcst_hr:02d}00'
task_get_extrn_lbcs:
# Add EXTRN_MDL_NAME_LBCS, USE_USER_STAGED_EXTRN_FILES, and EXTRN_MDL_FILES_LBCS variables to config.yaml
EXTRN_MDL_NAME_LBCS: HRRR
LBC_SPEC_INTVL_HRS: 3
USE_USER_STAGED_EXTRN_FILES: false
EXTRN_MDL_FILES_LBCS:
- '{yy}{jjj}{hh}00{fcst_hr:02d}00'
task_plot_allvars:
COMOUT_REF: ""
PLOT_FCST_INC: 6
verification:
VX_FCST_MODEL_NAME: FV3_GFS_v16_CONUS_25km
CCPA_OBS_DIR: /path/to/HalloweenStormData/obs_data/ccpa/proc
MRMS_OBS_DIR: /path/to/HalloweenStormData/obs_data/mrms/proc
NDAS_OBS_DIR: /path/to/HalloweenStormData/obs_data/ndas/proc
rocoto:
tasks:
task_get_extrn_ics:
walltime: 06:00:00
task_get_extrn_lbcs:
walltime: 06:00:00
metatask_run_ensemble:
task_make_lbcs_mem#mem#:
walltime: 06:00:00
task_run_fcst_mem#mem#:
walltime: 06:00:00
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/verify_pre.yaml", "parm/wflow/verify_det.yaml", "parm/wflow/test.yaml"]|include }}'
Hint
To open the configuration file in the command line, users may run the command:
vi config.yaml
To modify the file, hit the i
key and then make any changes required. To close and save, hit the esc
key and type :wq
. Users may opt to use their preferred code editor instead.
For additional configuration guidance, refer to the v2.2.0 release documentation on configuring the SRW App.
2.6.4.6.3. Generate the Experiment
Generate the experiment by running this command from the ush
directory:
./generate_FV3LAM_wflow.py
2.6.4.6.4. Run the Experiment
Navigate (cd
) to the experiment directory ($EXPTDIR
) and run the launch script:
./launch_FV3LAM_wflow.sh
To see experiment progress, users should navigate to their experiment directory. The following commands allow users to launch new workflow tasks and check on experiment progress.
cd /path/to/expt_dirs/halloweenstormHRRRMETPLUS
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Users who prefer to automate the workflow via crontab or who need guidance for running without the Rocoto workflow manager should refer to Section 2.4.4 for these options.
If a problem occurs and a task goes DEAD, view the task log files in $EXPTDIR/log
to determine the problem. Then refer to Section 4.2.3.1 to restart a DEAD task once the problem has been resolved. For troubleshooting assistance, users are encouraged to post questions on the new SRW App GitHub Discussions Q&A page.
2.6.4.7. Compare
Once the experiment has completed (i.e., all tasks have “SUCCEEDED” and the end of the log.launch_FV3LAM_wflow
file lists “Workflow status: SUCCESS”), users can see how the forecast verified. From the expt_dirs
directory, users should navigate to their experiment (e.g., halloweenstormHRRRMETPLUS
). Then, navigate to the 2019103012
subdirectory, and finally, enter the metprd
directory to view your experiment’s results.
2.6.4.7.1. METplus File Types
For information on different file types found in the metprd
directory users should reference Section 2.19.
2.6.4.7.2. Analyzing HRRR Results
This section analyzes how different variables verified in the HRRR forecast. To do this, users will examine the RMSE and MBIAS scores for temperature and dew point (DPT) at the surface using the point_stat
file type.
Interpretation:
A lower RMSE indicates that the model forecast value is closer to the observed value.
If MBIAS > 1, then the value for a given forecast variable is too high on average by (MBIAS - 1) × 100%. If MBIAS < 1, then the forecasted value is too low on average by (1 - MBIAS) × 100%.
Find MBIAS statistics in columns 66-68 and RMSE statistics in columns 78-80.
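The MBIAS interpretation can be written out explicitly; a short sketch using the MBIAS values reported later in this section:

```python
def bias_percent(mbias):
    """Convert a multiplicative bias (MBIAS) into a signed percent error.
    Positive values mean the forecast is too high on average."""
    return (mbias - 1.0) * 100.0

print(f"Temperature: {bias_percent(1.7212):+.2f}%")  # +72.12% (too high)
print(f"Dew point:   {bias_percent(1.5978):+.2f}%")  # +59.78% (too high)
```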
2.6.4.8. Temperature
To begin examining the temperature variable, navigate to /path/to/2019103012/metprd
. Use vi
to open PointStat/
, where a menu of files will appear. Navigate using the arrow and enter keys on your keyboard. Select this file:
point_stat_FV3_GFS_v16_CONUS_25km_mem000_ADPSFC_NDAS_240000L_20191031_120000V.stat
Press Enter
to open this file. You will now be able to see verification data for different types of variables.
We can see that at the 24-hour lead time (valid October 31 at 12z), temperature received an MBIAS score of 1.7212 and an RMSE score of 2.38512. Looking at these scores, we can conclude that for this forecast hour, the temperature was overforecast and was greater than the observed temperature.
2.6.4.9. Dew Point
Good news! The DPT variable can be examined within the same point_stat file, so there is no need to switch files. DPT received an MBIAS score of 1.5978 and an RMSE score of 2.51836. Looking at these scores, we can see that, similar to temperature, the DPT was overforecast and greater than the observed dew point.
2.6.5. Sample Forecast #5: Hurricane Barry
2.6.5.1. Weather Summary
Hurricane Barry made landfall in Louisiana on July 11, 2019 as a Category 1 hurricane. It produced widespread flooding in the region and had a peak wind speed of 72 mph and a minimum pressure of 992 hPa.
Weather Phenomena: Flooding, wind, and tornado reports

Fig. 2.24 Hurricane Barry Making Landfall
2.6.5.2. Tutorial Content
Coming Soon!
2.6.6. Sample Forecast #6: Choose Your Own Adventure!
Objective: Create a forecast of your choice on a custom grid using publicly available data.
2.6.6.1. Weather Summary
Weather will vary depending on the case the user chooses. In this example, we use the Gulf Coast Blizzard from January 21, 2025 (2025-01-21), with a custom regional grid centered over New Orleans, LA.
In this case, the polar vortex stretched far south bringing cold, dry air to the Gulf Coast, where it met the comparatively warm, moist air from the Gulf, leading to unprecedented snowfall followed by record low temperatures.
Weather Phenomena: Record-breaking snow and cold temperatures throughout the region.

Fig. 2.25 Incoming Blizzard Over New Orleans
2.6.6.2. Preliminary Steps
It is recommended that users create a directory on their system where they will store data and run their forecast. In this example, we make a directory called blizzard
and navigate into it:
mkdir blizzard
cd blizzard
Users can save the location of this directory in an environment variable such as $GCB
(for Gulf Coast Blizzard). This makes it easier to navigate between directories later. For example:
export GCB=/path/to/blizzard
where /path/to/
is replaced with the full path to the blizzard
directory on the user’s system.
2.6.6.3. Data
Users can set up a forecast for a weather case of their choice using data downloaded from a publicly available source. The SRW App requires:
Fix files – Static datasets containing climatological information, terrain, and land use data
Initial and boundary conditions files
Natural Earth Shapefiles for use in optional plotting tasks
2.6.6.3.1. Fix Files
Fix files are publicly available in the SRW App Data Bucket. Users can download the full set of fix files and untar it:
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/experiment-user-cases/release-public-v2.2.0/out-of-the-box/fix_data.tgz
tar -xzf fix_data.tgz
2.6.6.3.2. Initial and Boundary Conditions Files
Initial conditions (ICs) files provide information on the state of the atmosphere at the start of the forecast. Because the SRW App uses regional grids, it needs to update the forecast periodically with information about the state of the atmosphere at the edges of the grid. These are called the lateral boundary conditions (LBCs).
IC and LBC data must be in NetCDF
, grib2
, or nemsio
format. There are several potential sources of publicly available data listed in Section 3.2.3.6. Many of these sources only include data for the last 1-30 days, so it is recommended that users download the data they think they will need and store it somewhere it will remain accessible even after it is no longer available on these sites.
When choosing data, users should consider the type of weather they are trying to model, the size of the domain they plan to use, and the compute resources available to them. This will help to determine what grid size and resolution is most appropriate. There is more information on regional (or “limited area model”) grids available in Section 3.3.
When modeling severe weather, it is preferable to use a high-resolution (~1-4km) grid along with physics suites and data designed for these convection-allowing scales. Therefore, High-Resolution Rapid Refresh (HRRR) data were downloaded for this case. According to NOAA, “HRRR is a NOAA real-time 3-km resolution, hourly updated, cloud-resolving, convection-allowing atmospheric model, initialized by 3km grids with 3km radar assimilation.”
To download data for this case, users can create a directory named data
(in $GCB
or an equivalent directory) and place the get_data.py
script in there:
mkdir data
cd data
Users can download the get_data.py
utility script with wget
or copy-paste the contents of the script into a file on their system to download recent HRRR data from the Planetary Computer site.
wget https://github.com/ufs-community/ufs-srweather-app/wiki/Tutorial/get_data.py
Note
Users will need at least Python 3.10 and the requests
library to run the script as-is.
Call the script with the -h
option to see usage information:
$ python3 get_data.py -h
usage: get_data.py [-h] --date DATE --product
{wrfsfcf,wrfprsf,wrfnatf,wrfsubhf} [--fhour [FH ...]]
Utility for retrieving HRRR data
options:
-h, --help show this help message and exit
--date DATE, -d DATE Date of requested data in YYYYMMDDHH format
--product {wrfsfcf,wrfprsf,wrfnatf,wrfsubhf}, -p {wrfsfcf,wrfprsf,wrfnatf,wrfsubhf}
Product type (e.g., "wrfsfcf", "wrfprsf")
--fhour [FH ...], -f [FH ...]
Forecast hours
Users are welcome to adapt the script to work for other data sources, such as RAP or GFS data.
When creating this experiment, the following command was used to download the data:
python3 get_data.py -p wrfprsf -d 2025012100 -f 0 3 6 9 12 15 18 21 24 27 30 33 36
Users will need to adjust this command for the specific date(s) and forecast hours they need for their experiment. Although the Gulf Coast Blizzard data is no longer available via Planetary Computer, it has been added to the SRW Data Bucket and placed on Level 1 systems in the usual input model data locations (see Section 2.4 for a list). Users can retrieve this data from the SRW App data bucket by running:
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/gulf_coast_blizzard.tar.gz
However, users are encouraged to experiment with downloading their own data for a case of interest!
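The forecast hours passed to get_data.py with -f above simply step from 0 to the forecast length at the LBC update interval (LBC_SPEC_INTVL_HRS: 3 in this tutorial's configurations). A sketch generating that list:

```python
FCST_LEN_HRS = 36
LBC_SPEC_INTVL_HRS = 3

# One file is needed per LBC update time; the 0-hour file also supplies the ICs.
hours = list(range(0, FCST_LEN_HRS + 1, LBC_SPEC_INTVL_HRS))
print(" ".join(str(h) for h in hours))  # 0 3 6 9 12 15 18 21 24 27 30 33 36
```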
2.6.6.3.3. Natural Earth Shapefiles
The small set of Natural Earth shapefiles required for SRW App plotting tasks is publicly available in the SRW App Data Bucket. Users can download and untar the files:
wget https://noaa-ufs-srw-pds.s3.amazonaws.com/develop-20240618/NaturalEarth.tar.gz
tar xvfz NaturalEarth.tar.gz
The full set of Natural Earth shapefiles can also be downloaded from the Natural Earth website.
2.6.6.4. Download the SRW App
To start an experiment, navigate to your working directory. For example:
cd $GCB
Users who do not already have the SRW App can download the code by running:
git clone -b develop https://github.com/ufs-community/ufs-srweather-app.git
Then check out the external repositories:
cd /path/to/ufs-srweather-app/
./manage_externals/checkout_externals
Then build the SRW App:
./devbuild.sh --platform=<machine_name>
where <machine_name>
is replaced with the name of the platform the user is working on. See Section 3.1.1 for all valid MACHINE
options.
This process will likely take 20-30 min. Users who do not have access to a Level 1 system may need to use the CMake Approach to build the App instead.
2.6.6.5. Create a Regional Grid
The SRW App already contains several predefined grids, which are well-tested with the supported physics and data sources. While these are a great start for new users, they cover a narrow set of possible grid configurations. Users may be interested in creating their own custom grid instead. Section 3.3.2 explains how to create a custom Extended Schmidt Gnomonic grid (ESGgrid), which is the supported grid type for the SRW App. For this example, we created a grid centered over New Orleans, LA (NOLA), which is located at 29.9509º N, 90.0758º W (29.9509, -90.0758). We thank our colleagues at the Developmental Testbed Center (DTC) for their guidance on this process.
For a custom ESGgrid, users must define three sets of parameters:
ESG grid parameters
Computational parameters
Write-component grid parameters
Note
Users can define the custom grid directly in config.yaml
, but in this tutorial, we demonstrate how to add the grid to the set of predefined grids so that it can be reused across multiple experiments.
First, users must choose a name for the grid. In this example, the grid is named SUBCONUS_NOLA_3km
, which describes the domain as being smaller than the continental United States (CONUS); centered over New Orleans, LA; and at a 3-km resolution. Descriptive names are encouraged but not required.
Users must:
Add the grid name to the list of valid grid names in
ufs-srweather-app/ush/valid_param_vals.yaml
. For example:valid_vals_PREDEF_GRID_NAME: [ "RRFS_CONUS_25km", "RRFS_CONUS_13km", "RRFS_CONUS_3km", ... "SUBCONUS_Ind_3km", "WoFS_3km", "SUBCONUS_CO_3km", "SUBCONUS_CO_1km", "SUBCONUS_NOLA_3km" ]
Add a stanza describing the parameters for the new grid in
ufs-srweather-app/ush/predef_grid_params.yaml
. For example:"SUBCONUS_NOLA_3km": GRID_GEN_METHOD: "ESGgrid" ESGgrid_LON_CTR: -90.0758 ESGgrid_LAT_CTR: 29.9509 ESGgrid_DELX: 3000.0 ESGgrid_DELY: 3000.0 ESGgrid_NX: 200 ESGgrid_NY: 200 ESGgrid_PAZI: 0.0 ESGgrid_WIDE_HALO_WIDTH: 6 DT_ATMOS: 36 LAYOUT_X: 5 LAYOUT_Y: 5 BLOCKSIZE: 40 QUILTING: WRTCMP_write_groups: 1 WRTCMP_write_tasks_per_group: 5 WRTCMP_output_grid: "lambert_conformal" WRTCMP_cen_lon: -90.0758 WRTCMP_cen_lat: 29.9509 WRTCMP_stdlat1: 29.9509 WRTCMP_stdlat2: 29.9509 WRTCMP_nx: 197 WRTCMP_ny: 197 WRTCMP_lon_lwr_left: -93.59904184257404 WRTCMP_lat_lwr_left: 27.29340465451015 WRTCMP_dx: 3000.0 WRTCMP_dy: 3000.0
ESGgrid Parameters
GRID_GEN_METHOD: "ESGgrid": This will generate a regional version of the Extended Schmidt Gnomonic (ESG) grid, which is the supported grid type for the SRW App.

ESGgrid_LON_CTR and ESGgrid_LAT_CTR: The longitude and latitude for the center of the grid.

ESGgrid_DELX and ESGgrid_DELY: Grid cell size (in meters) in the x- (west-to-east) and y- (south-to-north) directions. For a 3-km grid, this value is always 3000 (i.e., 3 km).

ESGgrid_NX and ESGgrid_NY: The number of grid cells in the x and y directions. This setting will be based on how much area the grid should cover. For example, a grid that is approximately 1200 km wide by 1200 km tall would have ESGgrid_NX and ESGgrid_NY values of approximately 400 (1200 km / 3 km per cell).

ESGgrid_PAZI: The rotation of the grid from true north (in degrees). This can generally be set to zero.

ESGgrid_WIDE_HALO_WIDTH: Number of halo points around the grid with a “wide” halo. This parameter can always be set to 6 regardless of the other grid parameters.
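The sizing arithmetic above can be sketched in a few lines of Python. This is our own illustration, not part of the SRW App; the function name is hypothetical:

```python
import math

def esg_grid_cells(extent_km: float, resolution_km: float) -> int:
    """Number of grid cells along one dimension needed to cover
    a domain of the given extent at the given resolution."""
    return math.ceil(extent_km / resolution_km)

# A ~1200 km x 1200 km domain at 3-km resolution needs ~400 cells per side.
print(esg_grid_cells(1200, 3))  # 400

# The NOLA grid in this tutorial covers roughly 600 km x 600 km,
# which at 3-km resolution gives the 200 x 200 cells in the stanza above.
print(esg_grid_cells(600, 3))   # 200
```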
Computational Parameters
DT_ATMOS: Physics time step (in seconds) to use in the forecast model with this grid. This is the largest time step in the model; it is the time step on which the physics parameterizations are called. In general, DT_ATMOS must be smaller for higher horizontal grid resolutions to avoid numerical instabilities (specifically, violations of the Courant-Friedrichs-Lewy [CFL] condition). Users can examine predefined grids of resolution similar to their custom grid for suggested values.

LAYOUT_X and LAYOUT_Y: MPI layout; the number of MPI processes into which to decompose the grid in each direction.

BLOCKSIZE: Machine-dependent parameter that does not have a default value.
Quilting Parameters
Note
The UPP (called by the run_post_* tasks) cannot process output on the native grid types (“GFDLgrid” and “ESGgrid”), so output fields are interpolated to a write-component grid before being written to an output file. The output files written by the UFS Weather Model use an Earth System Modeling Framework (ESMF) component, referred to as the write component. This model component is configured with settings in the model_configure file, as described in Section 4.2.3 of the UFS Weather Model documentation.
Certain quilting parameters will be the same as the ESGgrid parameters. For example:
Set WRTCMP_cen_lon and WRTCMP_cen_lat to the longitude and latitude of the center of the grid (same as ESGgrid_LON_CTR and ESGgrid_LAT_CTR).

Set WRTCMP_dx and WRTCMP_dy to the grid cell size (in meters) in the x- (west-to-east) and y- (south-to-north) directions (same as ESGgrid_DELX and ESGgrid_DELY).
Additionally, adjust the following parameters:
WRTCMP_write_groups: Each write group consists of a set of dedicated MPI processes that writes the fields on the write-component grid to a file on disk while the forecast continues to run on a separate set of processes. This value can usually be set to 1 but may need to be increased for grids with more grid points (larger domain or higher resolution).

WRTCMP_write_tasks_per_group: The number of MPI processes per write group. For the Gulf Coast Blizzard case, this can be set to 5, but it may need to be increased for larger grids with more grid points.

WRTCMP_output_grid: The type of write-component grid. This can generally be set to "lambert_conformal".

WRTCMP_stdlat1 and WRTCMP_stdlat2: The first and second standard latitudes associated with a Lambert conformal coordinate system. For simplicity, use the latitude of the center of the ESG grid for both. Using off-center standard latitude values will skew the write-component grid to the east or west, somewhat similar to ESGgrid_PAZI.

WRTCMP_nx and WRTCMP_ny: To ensure that the write-component grid lies completely within the ESG grid, set these to roughly 95% of ESGgrid_NX and ESGgrid_NY, respectively.
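These correspondences can be checked programmatically. The sketch below is our own illustration (not an SRW App utility) and verifies the SUBCONUS_NOLA_3km stanza values against the rules just described:

```python
# Grid parameters copied from the SUBCONUS_NOLA_3km stanza shown earlier.
esg = {"LON_CTR": -90.0758, "LAT_CTR": 29.9509,
       "DELX": 3000.0, "DELY": 3000.0, "NX": 200, "NY": 200}
wrtcmp = {"cen_lon": -90.0758, "cen_lat": 29.9509,
          "dx": 3000.0, "dy": 3000.0, "nx": 197, "ny": 197}

# The write-component grid center and cell size must match the ESG grid.
assert wrtcmp["cen_lon"] == esg["LON_CTR"]
assert wrtcmp["cen_lat"] == esg["LAT_CTR"]
assert wrtcmp["dx"] == esg["DELX"] and wrtcmp["dy"] == esg["DELY"]

# The write-component grid must be smaller than the ESG grid in each
# direction so that it lies entirely within it.
assert wrtcmp["nx"] < esg["NX"] and wrtcmp["ny"] < esg["NY"]
print("quilting parameters consistent with the ESG grid")
```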
Then, calculate the values of WRTCMP_lon_lwr_left and WRTCMP_lat_lwr_left. According to DTC guidance, approximate values for these quantities can be obtained from a linearization (which is more accurate the smaller the horizontal extent of the grid) using the formulas:

\(WRTCMP\_lon\_lwr\_left = WRTCMP\_cen\_lon - (degs\_per\_meter/(2*cos\_phi\_ctr))*WRTCMP\_nx*WRTCMP\_dx\)

\(WRTCMP\_lat\_lwr\_left = WRTCMP\_cen\_lat - (degs\_per\_meter/2)*WRTCMP\_ny*WRTCMP\_dy\)

where:

\(pi\_geom ≈ 3.14\)

\(radius\_Earth ≈ 6371000 m\)

\(cos\_phi\_ctr = cos((pi\_geom/180)*WRTCMP\_cen\_lat)\)

\(degs\_per\_meter = 180/(pi\_geom*radius\_Earth)\)
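The linearization can be evaluated with a few lines of Python. This is our own sketch (not SRW App code), using the standard degrees-to-radians conversion for cos_phi_ctr; note that because the formulas are approximate, the longitude it produces differs somewhat from the more precisely computed value stored in the tutorial stanza:

```python
import math

# Inputs from the SUBCONUS_NOLA_3km write-component stanza.
cen_lon, cen_lat = -90.0758, 29.9509
nx, ny = 197, 197
dx, dy = 3000.0, 3000.0

radius_earth = 6371000.0                        # meters
degs_per_meter = 180.0 / (math.pi * radius_earth)
cos_phi_ctr = math.cos(math.radians(cen_lat))   # cosine of the center latitude

lon_lwr_left = cen_lon - (degs_per_meter / (2 * cos_phi_ctr)) * nx * dx
lat_lwr_left = cen_lat - (degs_per_meter / 2) * ny * dy

# The latitude reproduces the tutorial value (27.2934...) almost exactly;
# the longitude (about -93.14) only approximates the stored -93.599.
print(lon_lwr_left, lat_lwr_left)
```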
For the NOLA grid, this comes out to:

WRTCMP_lon_lwr_left: -93.59904184257404
WRTCMP_lat_lwr_left: 27.29340465451015
For supplemental descriptions of the variables that need to be set, see Sections 3.1.5.2: ESGgrid Settings and 3.1.12: Forecast Configuration Parameters.
2.6.6.6. Load the Workflow
To load the workflow environment, source the lmod-setup file and load the workflow conda environment by running:
source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<platform>
where <platform> is a valid, lowercased machine name (see MACHINE in Section 3.1.1 for valid values), and /path/to/ is replaced by the actual path to the ufs-srweather-app directory.
Then follow the instructions printed to the console and run conda activate srw_app. For example, a user on Hercules may issue the following commands to load the workflow:
source $GCB/ufs-srweather-app/etc/lmod-setup.sh hercules
module use $GCB/ufs-srweather-app/modulefiles
module load wflow_hercules
conda activate srw_app
2.6.6.7. Configuration
Navigate to the ufs-srweather-app/ush directory. The default (or “control”) configuration for this experiment is based on the config.community.yaml file in that directory. Users may copy this file into config.yaml if they have not already done so:
cd /path/to/ufs-srweather-app/ush
cp config.community.yaml config.yaml
Users can save the location of the ush directory in an environment variable ($USH). This makes it easier to navigate between directories later. For example:

export USH=/path/to/ufs-srweather-app/ush

Users should substitute /path/to/ufs-srweather-app/ush with the actual path on their system. As long as a user remains logged into their system, they can run cd $USH, and it will take them to the ush directory. The variable will need to be reset for each login session.
Edit the configuration file (config.yaml) to include the variables and values in the sample configuration excerpts below:
Hint
To open the configuration file in the command line, users may run the command:
vi config.yaml
To modify the file, hit the i key and then make any changes required. To close and save, hit the esc key and type :wq to write the changes to the file and exit/quit. Users may opt to use their preferred code editor instead.
Start in the user: section of config.yaml and change the MACHINE and ACCOUNT variables. For example, when running on a personal Linux device, users might set:
user:
RUN_ENVIR: community
MACHINE: linux
ACCOUNT: none
For a detailed description of these variables, see Section 3.1.1.
In the workflow: section of config.yaml, update EXPT_SUBDIR, CCPP_PHYS_SUITE, PREDEF_GRID_NAME, DATE_FIRST_CYCL, DATE_LAST_CYCL, and FCST_LEN_HRS.
workflow:
USE_CRON_TO_RELAUNCH: true
CRON_RELAUNCH_INTVL_MNTS: 3
EXPT_SUBDIR: hrrr_nola36
CCPP_PHYS_SUITE: FV3_HRRR
PREDEF_GRID_NAME: SUBCONUS_NOLA_3km
DATE_FIRST_CYCL: '2025012100'
DATE_LAST_CYCL: '2025012100'
FCST_LEN_HRS: 36
PREEXISTING_DIR_METHOD: rename
VERBOSE: true
COMPILER: intel
Note
Users may also want to set USE_CRON_TO_RELAUNCH: true and add CRON_RELAUNCH_INTVL_MNTS: 3. This will automate submission of workflow tasks when running the experiment. However, not all systems have cron.
EXPT_SUBDIR: This variable can be set to any name the user wants, from “HRRR_physics_fcst” to “forecast1” to “askdfj” (note that whitespace and some punctuation characters are not allowed). However, the best names indicate useful information about the experiment. This tutorial uses hrrr_nola36 to indicate a 36-hour forecast over NOLA using HRRR data.

CCPP_PHYS_SUITE: The FV3_HRRR physics suite was specifically created for convection-allowing scales. Users are welcome to try other physics suites designed for high-resolution grids and storm-scale predictions (e.g., FV3_WoFS_v0).

PREDEF_GRID_NAME: Replace the default value with the name of the grid you created. For example, we used SUBCONUS_NOLA_3km rather than the default RRFS_CONUS_25km grid.

DATE_FIRST_CYCL, DATE_LAST_CYCL, and FCST_LEN_HRS set parameters related to the date and duration of the forecast. Because this is a one-cycle experiment that does not use cycling or data assimilation, the dates of the first and last cycles are the same. The SRW App performs best for forecast durations of 48 hours or less, but users can choose longer durations if desired.
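The cycle dates use a YYYYMMDDHH format. As a quick sketch (our own illustration, not SRW App code), the forecast's valid end time follows from DATE_FIRST_CYCL and FCST_LEN_HRS:

```python
from datetime import datetime, timedelta

date_first_cycl = "2025012100"   # YYYYMMDDHH, from config.yaml
fcst_len_hrs = 36

start = datetime.strptime(date_first_cycl, "%Y%m%d%H")
end = start + timedelta(hours=fcst_len_hrs)

print(start)  # 2025-01-21 00:00:00 (0z on January 21)
print(end)    # 2025-01-22 12:00:00 (12z on January 22)
```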
For a detailed description of other workflow: variables, see Section 3.1.3.

In the task_get_extrn_ics: section, add USE_USER_STAGED_EXTRN_FILES and EXTRN_MDL_SOURCE_BASEDIR_ICS. Users will need to adjust the file path to reflect the location of data on their system (see Section 2.4.1 for locations on Level 1 systems).
task_get_extrn_ics:
EXTRN_MDL_NAME_ICS: HRRR
FV3GFS_FILE_FMT_ICS: grib2
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_ICS: /path/to/blizzard/data/HRRR
For a detailed description of the task_get_extrn_ics: variables, see Section 3.1.8.

Similarly, in the task_get_extrn_lbcs: section, add USE_USER_STAGED_EXTRN_FILES and EXTRN_MDL_SOURCE_BASEDIR_LBCS. Users will need to adjust the file path to reflect the location of data on their system (see Section 2.4.1 for locations on Level 1 systems).
task_get_extrn_lbcs:
EXTRN_MDL_NAME_LBCS: HRRR
LBC_SPEC_INTVL_HRS: 3
FV3GFS_FILE_FMT_LBCS: grib2
USE_USER_STAGED_EXTRN_FILES: true
EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path/to/blizzard/data/HRRR
For a detailed description of the task_get_extrn_lbcs: variables, see Section 3.1.9.

Users do not need to modify the task_run_fcst: section for this tutorial.

Lastly, in the task_plot_allvars: section, add PLOT_FCST_INC: 6 and PLOT_DOMAINS: ["regional"]. Users may also want to add PLOT_FCST_START: 0 and PLOT_FCST_END: 36 explicitly, but these can be omitted because the default values match the forecast start and end times, respectively.
task_plot_allvars:
COMOUT_REF: ""
PLOT_FCST_INC: 6
PLOT_DOMAINS: ["regional"]
PLOT_FCST_INC: This variable indicates the forecast hour increment for the plotting task. By setting the value to 6, the task will generate .png files for every 6th forecast hour between the forecast start (0z on January 21, 2025, the 0th forecast hour) and the 36th forecast hour (12z on January 22, 2025).

PLOT_DOMAINS: The plotting scripts generate plots over the entire CONUS by default, but setting this variable to ["regional"] makes the experiment generate plots for the smaller custom regional domain (e.g., SUBCONUS_NOLA_3km) instead.
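The effect of these settings can be sketched as follows; this is our own illustration of the increment logic, not the actual plotting script:

```python
# Forecast hours at which the plotting task generates .png files,
# given the PLOT_FCST_START/PLOT_FCST_END/PLOT_FCST_INC settings.
plot_fcst_start, plot_fcst_end, plot_fcst_inc = 0, 36, 6

plot_hours = list(range(plot_fcst_start, plot_fcst_end + 1, plot_fcst_inc))
print(plot_hours)  # [0, 6, 12, 18, 24, 30, 36]
```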
To turn on the plotting for the experiment, the plotting YAML file should be included in the rocoto:tasks:taskgroups: section, like this:
rocoto:
tasks:
taskgroups: '{{ ["parm/wflow/prep.yaml", "parm/wflow/coldstart.yaml", "parm/wflow/post.yaml", "parm/wflow/plot.yaml"]|include }}'
metatask_run_ensemble:
task_run_fcst_mem#mem#:
walltime: 02:00:00
For more information on how to turn on/off tasks in the workflow, please see Section 2.4.3.2.2.2.
After configuring the experiment, users can generate the workflow by running:
./generate_FV3LAM_wflow.py
To see experiment progress, users should navigate to their experiment directory. Then, use the rocotorun command to launch new workflow tasks and rocotostat to check on experiment progress.
cd $GCB/expt_dirs/hrrr_nola36
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
Users will need to rerun the rocotorun and rocotostat commands above regularly and repeatedly to continue submitting workflow tasks and receiving progress updates.
Note
When using cron to automate the workflow submission (as described above), users can omit the rocotorun command and simply use rocotostat to check on progress periodically.
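Under the automated approach, the workflow is relaunched on a schedule set by CRON_RELAUNCH_INTVL_MNTS. Conceptually, the behavior resembles the crontab fragment below; this is an illustrative sketch with a placeholder path, not the actual entry, which the SRW App writes for you:

```shell
# Illustrative crontab entry (the SRW App manages the real one):
# every 3 minutes, change to the experiment directory and resubmit
# any workflow tasks that are ready to run.
*/3 * * * * cd /path/to/expt_dirs/hrrr_nola36 && rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db
```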
Users can save the location of the hrrr_nola36 directory in an environment variable ($NOLA). This makes it easier to navigate between directories later. For example:

export NOLA=$GCB/expt_dirs/hrrr_nola36

As long as a user remains logged into their system, they can run cd $NOLA, and it will take them to the hrrr_nola36 experiment directory. The variable will need to be reset for each login session.
The workflow run is complete when all tasks have “SUCCEEDED”. If everything goes smoothly, users will eventually see a workflow status table similar to the following:
CYCLE TASK JOBID STATE EXIT TRIES DURATION
STATUS
=================================================================================
202501210000 make_grid 4348919 SUCCEEDED 0 1 24.0
202501210000 make_orog 4348937 SUCCEEDED 0 1 48.0
202501210000 make_sfc_climo 4349065 SUCCEEDED 0 1 41.0
202501210000 get_extrn_ics 4348920 SUCCEEDED 0 1 29.0
202501210000 get_extrn_lbcs 4348921 SUCCEEDED 0 1 62.0
202501210000 make_ics_mem000 4349103 SUCCEEDED 0 1 46.0
202501210000 make_lbcs_mem000 4349104 SUCCEEDED 0 1 26.0
202501210000 run_fcst_mem000 4349747 SUCCEEDED 0 1 46.0
202501210000 run_post_mem000_f000 4349918 SUCCEEDED 0 1 38.0
202501210000 run_post_mem000_f001 4349919 SUCCEEDED 0 1 25.0
202501210000 run_post_mem000_f002 4349920 SUCCEEDED 0 1 28.0
202501210000 run_post_mem000_f003 4350024 SUCCEEDED 0 1 29.0
...
202501210000 run_post_mem000_f033 4350362 SUCCEEDED 0 1 28.0
202501210000 run_post_mem000_f034 4350363 SUCCEEDED 0 1 29.0
202501210000 run_post_mem000_f035 4350382 SUCCEEDED 0 1 28.0
202501210000 run_post_mem000_f036 4350383 SUCCEEDED 0 1 28.0
202501210000 plot_allvars_mem000_f000 4350460 SUCCEEDED 0 1 09.0
202501210000 plot_allvars_mem000_f001 4350461 SUCCEEDED 0 1 87.0
202501210000 plot_allvars_mem000_f002 4350462 SUCCEEDED 0 1 89.0
202501210000 plot_allvars_mem000_f003 4350463 SUCCEEDED 0 1 77.0
...
202501210000 plot_allvars_mem000_f033 4350493 SUCCEEDED 0 1 81.0
202501210000 plot_allvars_mem000_f034 4350494 SUCCEEDED 0 1 80.0
202501210000 plot_allvars_mem000_f035 4350495 SUCCEEDED 0 1 79.0
202501210000 plot_allvars_mem000_f036 4350496 SUCCEEDED 0 1 77.0
If tasks fail, examine the log files in the experiment directory (e.g., $GCB/expt_dirs/hrrr_nola36/log).

To rerun failed tasks, see the information on useful Rocoto workflow commands.
2.6.6.8. Analyze Results
Navigate to the postprd directory for the date of your forecast (e.g., $GCB/expt_dirs/hrrr_nola36/2025012100/postprd). This directory contains the post-processed data generated by the UPP from the forecast. After the plot_allvars task completes, this directory will contain .png images for several forecast variables. The most appropriate variables to track will differ for each case. Given the predictions for blizzard conditions and extreme cold, this tutorial examines the progression of the storm via predictions of 2-m temperature, 10-m winds, composite reflectivity, and accumulated precipitation. In layman’s terms, 2-m temperature is the temperature close to (2 meters from) the Earth’s surface; 10-m wind is the wind velocity at the lowest atmospheric level; reflectivity measures the energy (in decibels [dBZ]) reflected back from radar, and composite reflectivity generates an image based on reflectivity scans at multiple elevation angles, or “tilts”, of the antenna. Accumulated precipitation measures how much precipitation is predicted to fall.
Note
There are many options for viewing plots, and instructions for this are highly machine-dependent. Users can view the data transfer documentation for their system to secure-copy files from a remote system (such as RDHPCS) to their local system.
Another option is to download Xming (for Windows) or XQuartz (for Mac), use the -X option when connecting to a remote system via SSH, and run:

module load imagemagick
display file_name.png

where file_name.png is the name of the file to display/view. Depending on the system, users may need to install ImageMagick and/or adjust other settings (e.g., for X11 forwarding). Users should contact their machine administrator with any questions.
2.6.6.8.1. Forecast Hour 000
At forecast hour zero (f000, 6pm local time), the 2-m temperature plot shows that the air over land is significantly colder than the more moist air over the water, creating prime conditions for precipitation along the front. At f000, winds were also predicted to pick up over the Gulf coast, bringing winds of up to 25 knots (~29 mph). For reference, the National Weather Service (NWS) describes these as windy conditions; however, wind damage is unlikely to result (https://www.weather.gov/mlb/seasonal_wind_threat). The composite reflectivity plot shows that areas of precipitation are building along the southern and western edges of the map. At f000, there can be no accumulated precipitation.
[Figure panels at f000: 2-m Temperature, 10-m Wind, and Composite Reflectivity]
Note
Users may wish to adjust the plotting script to limit the temperature range for this experiment, since this is a cold-weather case where temperatures are unlikely to reach the upper extreme of the default scale (128+ ºF). This change would show the contrast between temperatures over land and water more clearly.
At this point, NWS was warning of potential blizzard conditions on a variety of platforms, including social media:

Fig. 2.26 NWS Lake Charles, LA on Facebook
2.6.6.8.2. Forecast Hour 012
At forecast hour 12, folks in New Orleans normally would have been preparing for their morning commute to work. However, on Jan. 21, 2025 at 6am CT, snow was falling in a place that rarely sees snow, and winds had picked up. By 8am, meteorologist A.J. Mastrangelo noted that the weather “[a]lmost looks like a hurricane… only with snow as the precip type!”

Fig. 2.27 The morning commute in New Orleans, LA
The SRW App plots indicate a drop in temperature during the night and an increase in winds and precipitation. The predicted drop in 2-m temperature over land would increase the temperature differential between the relatively warm air over the waters and the colder air over land. This, in turn, would be likely to cause more severe weather. By f012, the wind was also predicted to pick up, and the patch of red in the southeast corner of the plot shows incoming winds reaching 40 knots (~46 mph) off the coast. The composite reflectivity image shows significant precipitation, with a few high-energy spots popping up in red-orange. Precipitation was predicted to begin accumulating. Driving would be dangerous for people unaccustomed to these conditions, but the bigger danger was getting home: conditions were deteriorating, and the worst snowfall was expected in the 11 AM – 6 PM window. That said, a simple comparison of Mastrangelo’s post and the predicted conditions shows that the SRW App was likely underpredicting precipitation and wind over land at this point in the storm.
[Figure panels at f012: 2-m Temperature, 10-m Wind, Composite Reflectivity, and Accumulated Precipitation]
2.6.6.8.3. Forecast Hour 024
By f024, or 6pm CT on January 21, most people had enjoyed a snow day — possibly their first, complete with Mardi Gras snowmen. Check out more photos here!

Fig. 2.28 Mardi Gras snowman (by @davidnola on Instagram)
The SRW App plots show that by f024 (6 PM local time), temperatures were predicted to warm slightly over the course of the day before taking a plunge overnight. Winds were predicted to move eastward and calm somewhat. Composite reflectivity plots show that the snow was anticipated to move eastward and slacken over the New Orleans area, with fewer high-energy orange-red areas. At this point, the SRW App forecast shows less than two inches of accumulated precipitation: a lot for the New Orleans area, but still significantly less than the observed precipitation.
[Figure panels at f024: 2-m Temperature, 10-m Wind, Composite Reflectivity, and Accumulated Precipitation]
However, NWS was predicting a severe drop in temperatures overnight.

Fig. 2.29 Dangerous drop in temperatures overnight
2.6.6.8.4. Forecast Hour 036
Like the NWS, the SRW App predicted a drop in temperatures overnight. Plots show the winds dissipating, precipitation trailing off, and a total accumulated precipitation of less than 2 inches.
[Figure panels at f036: 2-m Temperature, 10-m Wind, Composite Reflectivity, and Accumulated Precipitation]
While this generally corresponds with how the storm unfolded, this forecast significantly underpredicted snow accumulation.

Fig. 2.30 Official snowfall for New Orleans, LA
2.6.6.8.5. The Aftermath
Overall, New Orleans handled the snow reasonably well under the circumstances, but the storm still caused significant challenges, including:
A boil water advisory for parts of New Orleans after a brief, cold-related power loss at Carrollton Water Treatment Plant
3,500 power outages statewide; 200+ in Orleans Parish
Shelters reached capacity: New Orleans officials opened Treme Recreation Community Center to accommodate more people.
Weather-related deaths across the south; 2 in Louisiana
2.6.6.9. How Did We Do?
Overall, this configuration of the SRW App significantly underpredicted snowfall. Plots show a maximum prediction of roughly 1.5 inches, but New Orleans received 8 inches of snow. Due to a lack of publicly available verification data, it is difficult to determine whether wind predictions over land were accurate. For example, both NOMADS and Planetary Computer include MRMS verification data for the previous 24 hours only, so users would need to plan ahead to download this data before it becomes unavailable for their experiment. Publicly available CCPA or NDAS data is also limited. Verification data is key to determining precisely how accurate or inaccurate forecasts are, and in what ways they excel or fall short. With concrete skill measurements, scientists can investigate potential improvements to current forecasting systems and quantify those improvements once implemented.
2.6.6.10. Improving the Forecast
There are several options available to improve this forecast. In general, ensemble forecasts provide more accurate predictions than any single deterministic forecast. Users can run an ensemble forecast by setting DO_ENSEMBLE: true and selecting a value for NUM_ENS_MEMBERS, which indicates how many ensemble members will be included in the forecast. Verification capabilities also exist within the SRW App; users who have access to the requisite data can visit Section 3.1.18 to explore the relevant variables and perform a quantitative analysis of the forecast’s shortcomings so that researchers and developers can work toward improvements. Lastly, the addition of new capabilities, such as data assimilation via JEDI, has the potential to vastly improve prediction capabilities. Users and developers are encouraged to explore the improvements most interesting and relevant to their work.