Add instructions for the gdas_init utility to readthedocs #920

Merged Mar 18, 2024 (17 commits)
54 changes: 54 additions & 0 deletions docs/source/ufs_utils.rst
@@ -665,3 +665,57 @@ Run script
----------

To run, use the machine-dependent script under ./util/weight_gen

***************************************************
UFS_UTILS utilities
***************************************************

gdas_init
=========

Introduction
------------

The gdas_init utility creates coldstart initial conditions for global cycled experiments using the chgres_cube program. It has two components: one that pulls the input data required by chgres_cube from HPSS, and one that runs chgres_cube. Because the data pull requires HPSS access, the utility is supported only on the following machines:

* Hera
* Jet
* WCOSS2
* S4 (Only the chgres_cube step is supported, not the data pull step.)

Location
--------

Find it here: ./util/gdas_init

Build UFS_UTILS and set 'fixed' directories
-------------------------------------------

* Invoke the build script from the root directory: ``./build_all.sh``
* Set the 'fixed' directories using the script in the 'fix' subdirectory: ``./link_fixdirs.sh emc $MACHINE`` (where MACHINE is 'hera', 'jet', 'wcoss2', or 's4'). Both steps are sketched below.
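
As a minimal sketch, assuming the repository was cloned to a hypothetical $UFS_DIR and the target machine is Hera, the two steps might look like this:

.. code-block:: bash

   # Build all UFS_UTILS programs from the repository root.
   cd $UFS_DIR
   ./build_all.sh

   # Link the 'fixed' directories for 'emc'; substitute your machine name for 'hera'.
   cd fix
   ./link_fixdirs.sh emc hera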

Configure for your experiment
-----------------------------

Edit the variables in the 'config' file for your experiment; an example 'config' is sketched after this list:

* **EXTRACT_DIR** - Directory where data extracted from HPSS is stored.
* **EXTRACT_DATA** - Set to 'yes' to extract data from HPSS. If the data has already been extracted and is located in EXTRACT_DIR, set to 'no'. On 's4' this step cannot be run; instead, pull the data on another machine and transfer it to 's4'.
* **RUN_CHGRES** - To run chgres, set to 'yes'. To extract data only, set to 'no'.
* **yy/mm/dd/hh** - The year/month/day/hour of your desired experiment. Pre-ENKF GFS data (prior to 2012 May 21 00z) is not currently supported. Use two digits.
* **LEVS** - Number of hybrid levels plus 1. To run with 64 levels, set LEVS to 65.
* **CRES_HIRES** - Resolution of the hires component of your experiment. Example: C768.
* **CRES_ENKF** - Resolution of the enkf component of your experiment.
* **UFS_DIR** - Location of your cloned UFS_UTILS repository.
* **OUTDIR** - Directory where the coldstart data output from chgres is stored.
* **CDUMP** - When 'gdas', the gdas and enkf members are processed. When 'gfs', only the gfs member is processed, for running a free forecast.
* **use_v16retro** - When 'yes', use v16 retrospective parallel data. Because the retrospective parallel tarballs can be missing or incomplete, this option may not always work; contact a UFS_UTILS repository manager if you encounter problems.
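
A minimal sketch of a possible 'config' follows. The variable names are those listed above; all values, dates, and paths are illustrative placeholders for a hypothetical C768/C384 experiment, not recommendations.

.. code-block:: bash

   # Illustrative placeholder values only.
   EXTRACT_DIR=/scratch/$USER/gdas_init/extract   # where HPSS data is staged
   EXTRACT_DATA=yes                               # pull data from HPSS
   RUN_CHGRES=yes                                 # also run chgres_cube
   yy=2021                                        # experiment year
   mm=03                                          # experiment month
   dd=23                                          # experiment day
   hh=00                                          # experiment cycle hour
   LEVS=65                                        # 64 hybrid levels + 1
   CRES_HIRES=C768                                # resolution of the hires component
   CRES_ENKF=C384                                 # resolution of the enkf component
   UFS_DIR=/path/to/UFS_UTILS                     # cloned repository
   OUTDIR=/scratch/$USER/gdas_init/output         # coldstart output location
   CDUMP=gdas                                     # process gdas and enkf members
   use_v16retro=no                                # do not use v16 retro parallel data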

Kick off the utility
--------------------

Submit the driver script for your machine: ``./driver.$MACHINE.sh``, where MACHINE is 'hera', 'jet', 'wcoss2', or 's4'. A hypothetical invocation is sketched below.
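
For example, on Hera (run from ./util/gdas_init, per the 'Location' section above):

.. code-block:: bash

   # Kick off the utility with the Hera driver; substitute your machine name.
   ./driver.hera.sh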

The standard output will be placed in log files in the current directory.

The converted output will be found in $OUTDIR, including the required abias and radstat initial condition files (when CDUMP=gdas). The files are arranged in the directory structure expected by the global-workflow system, so the contents of $OUTDIR can be moved directly into $ROTDIR. A sketch of that final step is shown below.
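
As a sketch, assuming $ROTDIR already points at your global-workflow rotating directory (both paths are placeholders):

.. code-block:: bash

   # Stage the coldstart files where global-workflow expects them.
   mkdir -p $ROTDIR
   cp -r $OUTDIR/* $ROTDIR/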