Added cubed_sphere
target grid test to utility/combine_topo
#861
Conversation
…ine_topo, nearing complete workflow for pull request
…nd cubed_sphere remapping cases, and successfully ran for 0.1 deg and ne30 target grids
@bmooremaley, this is excellent work! I'm very happy with the way you have reorganized things and the clean-up you've done along the way.
I have some suggested changes that we can discuss today.
# Build combined filename
combined_filename = '_'.join([
    antarctic_filename.strip('.nc'), global_filename.strip('.nc'),
    self.resolution_name, datetime.now().strftime('%Y%m%d.nc'),
])
This is a nice improvement compared to the config option we had before.
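One caveat worth flagging on the filename snippet above: `str.strip('.nc')` removes any of the characters `.`, `n`, `c` from both ends of the string, not the literal `.nc` suffix, so a name ending in those characters gets over-trimmed. A minimal illustration (the variable name here is just for demonstration):

```python
name = 'ocean.nc'
# str.strip treats its argument as a set of characters to remove
# from both ends, so the trailing 'n' of 'ocean' is stripped too:
print(name.strip('.nc'))         # 'ocea'
# str.removesuffix (Python 3.9+) removes only the exact suffix:
print(name.removesuffix('.nc'))  # 'ocean'
```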
masks = {
    'ice_mask': 0.,
    'grounded_mask': 0.,
    'ocean_mask': ocean_mask,
}
for field in masks:
    combined[field] = bedmachine[field].where(
        bedmachine[field].notnull(), masks[field],
    )
I think this is missing some logic for putting in fill values for ice_draft and thickness that was in the original code.
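For illustration, here is a hedged, numpy-only sketch of what that fill logic might look like; the actual fill values for ice_draft and thickness in the original combine_topo code may differ, and 0.0 is used here only as a placeholder:

```python
import numpy as np

# Placeholder arrays standing in for the BedMachine fields,
# with NaN marking missing data
ice_draft = np.array([np.nan, -10.0])
thickness = np.array([np.nan, 50.0])

# Replace missing values with a per-field fill value
# (0.0 is a placeholder, not necessarily the value the PR should use)
fill_values = {'ice_draft': 0.0, 'thickness': 0.0}
fields = {'ice_draft': ice_draft, 'thickness': thickness}
combined = {
    name: np.where(np.isnan(data), fill_values[name], data)
    for name, data in fields.items()
}
```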
# Remove PETxxx.RegridWeightGen.Log files
for f in glob('*.RegridWeightGen.Log'):
    os.remove(f)
I like this. The log files will be there if the code crashes but get removed if it ran successfully. That should help with debugging.
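A self-contained demonstration of that cleanup pattern, using a temporary directory and dummy log files so it can run anywhere (the filenames mimic the ESMF PETxxx naming but are created here only for the demo):

```python
import os
import tempfile
from glob import glob

# Create dummy PETxxx.RegridWeightGen.Log files in a temp directory
tmpdir = tempfile.mkdtemp()
for i in range(3):
    path = os.path.join(tmpdir, f'PET{i:03d}.RegridWeightGen.Log')
    open(path, 'w').close()

# Remove them all with a glob pattern, as in the PR
for f in glob(os.path.join(tmpdir, '*.RegridWeightGen.Log')):
    os.remove(f)

remaining = glob(os.path.join(tmpdir, '*.RegridWeightGen.Log'))
print(len(remaining))  # 0
```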
    gebco.bathymetry.where(gebco.bathymetry < 0, 0.) - \
    bedmachine.bathymetry
diff.to_netcdf('gebco_minus_bedmachine.nc')
logger.info('  Done.')
We're missing a newline here at the end of the file.
# the number of tiles in lat and lon for GEBCO remapping
lat_tiles = 3
lon_tiles = 6
Same here, we're missing a newline at the end of the file.
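As a rough sketch of what tiling a global lat/lon grid with these settings implies (the actual tiling code in combine_topo may differ), splitting the full -90..90 by -180..180 range into 3 x 6 tiles gives 18 tiles of 60 degrees on each side:

```python
lat_tiles = 3
lon_tiles = 6

def tile_bounds(lat_tiles, lon_tiles):
    """Return (lat_min, lat_max, lon_min, lon_max) for each tile of a
    global lat/lon grid split into lat_tiles x lon_tiles pieces."""
    dlat = 180.0 / lat_tiles
    dlon = 360.0 / lon_tiles
    bounds = []
    for j in range(lat_tiles):
        for i in range(lon_tiles):
            bounds.append((
                -90.0 + j * dlat, -90.0 + (j + 1) * dlat,
                -180.0 + i * dlon, -180.0 + (i + 1) * dlon,
            ))
    return bounds

bounds = tile_bounds(lat_tiles, lon_tiles)
print(len(bounds))  # 18
```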
grounded_mask = np.logical_or(np.logical_or(mask == 1, mask == 2),
                              mask == 4).astype(float)
    mask == 4).astype(float)
I think the correct indentation according to the PEP 8 style guide is as it was before, since logical_or() is a method.
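For reference, a runnable sketch of the hanging alignment the comment recommends, with the continuation line lined up under the opening parenthesis of the outer logical_or() call (the mask values here are placeholders):

```python
import numpy as np

# Placeholder mask values: 1, 2, and 4 are treated as grounded
mask = np.array([0, 1, 2, 3, 4])

# Continuation line aligned with the opening delimiter, per PEP 8
grounded_mask = np.logical_or(np.logical_or(mask == 1, mask == 2),
                              mask == 4).astype(float)
print(grounded_mask)  # [0. 1. 1. 0. 1.]
```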
    '--file', f'{self.resolution_name}.g',
]
logger.info(f'  Running: {" ".join(args)}')
subprocess.run(args, check=True)
In compass, there's a strong preference to call mpas_tools.logging.check_call() and pass self.logger. If you do this, the output from these calls goes to the log file, not to the "terminal" or the log for the job script. We try to keep that output really minimal so it's obvious if the task succeeded and how long it took. The same goes for all the subprocess.run() calls.
However, I know you were seeing hanging, so we will have to see if that recurs with calls to check_call().
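To illustrate the idea, not the actual mpas_tools API (whose check_call handles streaming, stderr, and errors more thoroughly), here is a stdlib-only approximation of routing subprocess output to a logger instead of the terminal:

```python
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('step')

def check_call_like(args, logger):
    # Run the command, capture its output, and send each line to the
    # logger instead of the terminal; raises CalledProcessError on
    # nonzero exit, like subprocess.run(..., check=True)
    proc = subprocess.run(args, capture_output=True, text=True, check=True)
    for line in proc.stdout.splitlines():
        logger.info(line)
    return proc

proc = check_call_like(
    [sys.executable, '-c', 'print("hello from subprocess")'], logger)
```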
# Generate weights file
args = [
    'srun', '-n', f'{self.ntasks}',
We want to let compass take care of this, rather than hard-coding srun and its flags. The preferred way to do this is to call compass.parallel.run_command(), something like the way it's done for the model:
Line 60 in 2e06f2e
run_command(args=args, cpus_per_task=cpus_per_task, ntasks=ntasks,
Presumably, you'll need to do a commit to make the linter happy. The easiest way to do that is probably to add an extra newline somewhere (maybe at the end of the file) and then fix whatever it complains about. Make sure you do that with the conda environment loaded so pre-commit can run.
The ocean/utility/combine_topo test has been split into two separate tests: one for a lat_lon target grid and one for a cubed_sphere target grid. The cubed_sphere test allows the migration to mbtempest for remapping bathymetry to MPAS grids by removing the pole singularities. The workflow for the existing lat_lon target grid has also been updated to use remapping tools more consistently between the GEBCO and BedMachineAntarctica data sets, and between the lat_lon and cubed_sphere target mesh workflows. Specifically, the GEBCO dataset is separated into tiles prior to remapping, ESMF_RegridWeightGen is used for building weights files, and ncremap is used for the final remapping step.
Checklist
Testing
in this PR) any testing that was used to verify the changes