
Start+count exceeds dimension bound error message from ufs-weather-model while using fractional grid update in SRW App #961

MichaelLueken opened this issue Jun 12, 2024 · 12 comments

@MichaelLueken

While attempting to run the current UFS_UTILS develop HEAD in the SRW App (the same issue has also been seen after updating the UFS_UTILS hash to 7addff5), one of the fundamental Workflow End-to-End (WE2E) tests, grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR, fails while attempting to run the forecast, with the following error message:

NetCDF: Start+count exceeds dimension bound: netcdf_read_data_3d: file:INPUT/sfc_data.nc- variable:tiice
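A quick way to confirm which axis the error refers to, assuming Python with the netCDF4 package is available and that the path below is adjusted to the failing run's directory, is to list the dimensions of tiice directly:

```python
# Sketch: list the dimensions of 'tiice' in the surface cold-start file to see
# which axis the "Start+count exceeds dimension bound" error refers to.
# Assumes the netCDF4 package; the path is illustrative.
from netCDF4 import Dataset

with Dataset("INPUT/sfc_data.nc") as nc:
    tiice = nc.variables["tiice"]
    for name, size in zip(tiice.dimensions, tiice.shape):
        print(f"{name}: {size}")
```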

The version of the ufs-weather-model currently being used for this testing is 1c6b4d4, from May 16, 2024.

More information regarding the WE2E test that failed:

The CCPP physics suite used is FV3_HRRR. The predefined grid is RRFS_CONUScompact_25km. Both the ICs and LBCs were derived from the HRRR. The test is a 24-hour forecast beginning at 2020081000.

Has anyone encountered this behavior before? What additional changes should be made to the workflow to correct this error?

@MichaelLueken
Author

Some additional details:

The comprehensive test suite was run, and there were 21 failures in total, all with the same error noted above. The list of failed tests includes:

custom_ESGgrid - FV3_HRRR
custom_ESGgrid_Great_Lakes_snow_8km - FV3_RAP
custom_ESGgrid_NewZealand_3km - FV3_HRRR
custom_ESGgrid_Peru_12km - FV3_RAP
get_from_AWS_ics_GEFS_lbcs_GEFS_fmt_grib2_2022040400_ensemble_2mems - FV3_HRRR
get_from_HPSS_ics_GDAS_lbcs_GDAS_fmt_netcdf_2022040400_ensemble_2mems - FV3_HRRR
get_from_HPSS_ics_HRRR_lbcs_RAP - FV3_HRRR
get_from_HPSS_ics_RAP_lbcs_RAP - FV3_HRRR
grid_RRFS_AK_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR - FV3_HRRR
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP - FV3_RAP
grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR - FV3_HRRR
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR - FV3_HRRR
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP - FV3_RAP
grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_RAP_suite_RAP - FV3_RAP
grid_RRFS_CONUS_3km_ics_FV3GFS_lbcs_FV3GFS_suite_HRRR - FV3_HRRR
grid_RRFS_CONUScompact_13km_ics_HRRR_lbcs_RAP_suite_HRRR - FV3_HRRR
grid_RRFS_CONUScompact_25km_ics_HRRR_lbcs_HRRR_suite_HRRR - FV3_HRRR
grid_RRFS_CONUScompact_3km_ics_HRRR_lbcs_RAP_suite_HRRR - FV3_HRRR
grid_RRFS_NA_13km_ics_FV3GFS_lbcs_FV3GFS_suite_RAP - FV3_RAP
grid_SUBCONUS_Ind_3km_ics_HRRR_lbcs_HRRR_suite_HRRR - FV3_HRRR
long_fcst - FV3_RAP

In the above list, FV3_RAP and FV3_HRRR indicate the CCPP physics suite used by each test. Of note, all tests using FV3_RAP or FV3_HRRR failed, while tests using all other physics suites passed.

The failing file, INPUT/sfc_data.nc, is generated by the exregional_make_ics script. Searching that script for these two physics suites shows that the suite name is only used to select the varmap table, either GSDphys_var_map.txt or GFSphys_var_map.txt.
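For illustration, the suite-dependent part of that selection boils down to a mapping along these lines (a Python sketch of the logic only; the actual script is shell, and additional RUC-LSM-based suites may also map to the GSD table):

```python
# Sketch of the varmap selection described above (illustrative only; the real
# exregional_make_ics script implements this in shell).
def select_varmap(ccpp_suite: str) -> str:
    # RAP/HRRR (RUC-LSM-based) suites use the GSD varmap table.
    if ccpp_suite in ("FV3_RAP", "FV3_HRRR"):
        return "GSDphys_var_map.txt"
    # All other suites fall back to the GFS varmap table.
    return "GFSphys_var_map.txt"

print(select_varmap("FV3_HRRR"))  # -> GSDphys_var_map.txt
```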

@MichaelLueken
Author

Thank you very much, @GeorgeGayno-NOAA, for the email correspondence and for checking the consistency of the files in ./orog and ./sfc_climo, confirming that points with some land have valid surface data.

It turns out that the root cause is that both the RAP and HRRR SDFs use the RUC LSM. Unfortunately, Model%kice is 9 for the RUC LSM, but tiice in the initial conditions has only two vertical layers, which leads to the read error reported here. It's not clear to me how best to address this, since tiice has two vertical layers while Model%kice is required to be 9 for the RUC LSM. Would it be possible to add v1 sfc file generation to chgres_cube, so that v1 sfc data files can be used for the RAP and HRRR physics suites, while v2 sfc data is used for the remaining, non-RUC-LSM-based physics suites?
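As a quick illustration of the mismatch, a check along these lines (a sketch assuming netCDF4, and assuming the vertical axis of tiice is the dimension whose name starts with "zaxis", as is typical for FV3 surface restart files) shows why the read fails:

```python
# Sketch: compare the ice-layer count in the cold-start file with the 9 layers
# (Model%kice) that the RUC LSM expects the model to read.
from netCDF4 import Dataset

KICE_RUC = 9  # Model%kice for the RUC LSM, per the discussion above

with Dataset("INPUT/sfc_data.nc") as nc:
    tiice = nc.variables["tiice"]
    # Assumption: the vertical axis is the dimension named "zaxis*".
    zdim = next(d for d in tiice.dimensions if d.startswith("zaxis"))
    nlev = len(nc.dimensions[zdim])
    if nlev < KICE_RUC:
        print(f"tiice has {nlev} layers, but the model requests {KICE_RUC}: "
              "this is the start+count overrun seen in the forecast log.")
```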

Thank you very much for the assistance with this issue!

@GeorgeGayno-NOAA
Collaborator

v1 of the surface coldstart file is being deprecated. At some point, only v2 files will be used.

@MichaelLueken
Author

Thank you, @GeorgeGayno-NOAA! I'll try reaching out to the FV3ATM team and see if they might have a strategy to deal with tiice for RAP and HRRR.

@JeffBeck-NOAA
Collaborator

@GeorgeGayno-NOAA, since this update has unfortunately broken RUC-LSM-based physics schemes across all UFS applications, what would be your recommendation on how we can fix this? Should we make a change in the ufs-weather-model repository that somehow tells RUC-LSM not to use the tiice field? Thanks.

@GeorgeGayno-NOAA
Collaborator

Is the 'tiice' record used by RUC-LSM? If so, does it need a 9-layer 'tiice' record?

@JeffBeck-NOAA
Collaborator

@tanyasmirnova, are you able to comment on George's question above? Thank you!

@tanyasmirnova

The RUC ice model has 9 levels, similar to soil. It is ridiculous to stick with 2/4 forever; we have to make it flexible.
In the 7addff5 hash, I saw a comment for soil that interpolation is applied if there are more than 4 levels. The same could be done for ice as a temporary fix until the number of levels in ice and soil is flexible.
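As an illustration of that temporary fix, a minimal sketch of interpolating a 2-layer tiice profile onto 9 ice levels with numpy could look like the following; the actual level placement used by chgres_cube or the model may differ:

```python
# Sketch: expand a 2-layer internal ice temperature (tiice) profile to 9 layers
# by linear interpolation. Level positions are placeholders, not the values
# actually used by the RUC LSM or chgres_cube.
import numpy as np

tiice_2lev = np.array([271.5, 272.0])   # input: 2-layer profile (K), example values
z_in = np.array([0.25, 0.75])           # assumed normalized layer centers of the input
z_out = (np.arange(9) + 0.5) / 9.0      # 9 evenly spaced target layer centers

# np.interp holds the end values constant outside the input range.
tiice_9lev = np.interp(z_out, z_in, tiice_2lev)
print(tiice_9lev)
```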

@GeorgeGayno-NOAA
Collaborator

Thanks for your reply. We can certainly update chgres to output 9 ice layers if necessary.

@GeorgeGayno-NOAA
Collaborator

@JeffBeck-NOAA and @MichaelLueken - do you want me to work on this? I won't be able to get to it until later next week.

@MichaelLueken
Author

@GeorgeGayno-NOAA - Updating chgres to output 9 ice layers for tiice would certainly allow the RAP and HRRR physics suites to work with fractional grids. If tiice is required to be 2 layers, is that a requirement coming from the weather model? That behavior would need to change to allow RUC LSM physics packages to work with the new fractional grid in chgres.

@LarissaReames-NOAA
Collaborator

If this would involve interpolation from 2 to 9 layers, shouldn't it just be done in the weather model, where this sort of interpolation is already done for soil moisture and temperature? It would seem odd to duplicate that sort of code here.
