Upgrade Earlinet reader and implement vertical profiles #855
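This PR upgrades the EARLINET reader and adds vertical-profile colocation plus per-layer JSON output for AeroVal. For orientation only, a minimal sketch of exercising the upgraded reader is shown below; the data directory is hypothetical and the call pattern assumes the standard pyaerocom ungridded-reader interface (ReadEarlinet and the ec355aer variable are referenced in the commits that follow):

    # minimal sketch, not part of this PR's diff; data_dir is an assumed local path
    from pyaerocom.io import ReadEarlinet

    reader = ReadEarlinet(data_dir="/path/to/earlinet")
    profiles = reader.read(vars_to_retrieve=["ec355aer"])  # aerosol extinction coefficient at 355 nm
    print(profiles)  # UngriddedData holding the vertical profiles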

Merged
166 commits merged on Oct 3, 2023

Commits (166)
a740c97
checking initial functionality wip
lewisblake Apr 24, 2023
a33518f
read_file should be good now. test zdust though
lewisblake Apr 25, 2023
ae401fe
typing and fix assert barrier
lewisblake Apr 25, 2023
ead803e
close to first draft. minor fixes for tomorrow morning
lewisblake Apr 25, 2023
a55653b
change altitude in data_obj to be station lat. data alt already defined
lewisblake Apr 26, 2023
297b496
add ec355aer
lewisblake Apr 26, 2023
df9278d
working on tests
lewisblake Apr 26, 2023
9b3cff9
use pandas.Timestamp().to_numpy()
lewisblake Apr 27, 2023
4375d7d
remove breakpoint()
lewisblake Apr 27, 2023
cba5945
squeeze err
lewisblake Apr 27, 2023
432e6a7
working on tests but need to think about directory structure
lewisblake Apr 27, 2023
06141a1
update paths.ini with temp path for testing
lewisblake May 3, 2023
1a55019
fix up get_file_list
lewisblake May 3, 2023
bf6be38
fix up get_file_list tests
lewisblake May 3, 2023
b2db8d2
working on tests: figure out converting to station data
lewisblake May 3, 2023
5614ee4
dtime needed .astype("datetime64[s]")
lewisblake May 4, 2023
527bb94
test_ReadEarlinet_read working
lewisblake May 4, 2023
39d7f98
adjust number of files since onefile.txt updated
lewisblake May 4, 2023
8cd6446
update paths
lewisblake May 4, 2023
fbddece
update __version__
lewisblake May 4, 2023
c57e6e7
add ec355aer to variables.ini
lewisblake May 4, 2023
0171348
working on pushing through to aeroval
lewisblake May 5, 2023
b806b49
working on getting through to aeroval with altitude
lewisblake May 5, 2023
c7a648c
seeing where ungriddeddata fails WIP
lewisblake May 11, 2023
c33d7e7
start a colocation 3d file
lewisblake May 12, 2023
944d364
obs_is_3d
lewisblake May 12, 2023
8be7b64
skip stuff if var is altitude. may remove later
lewisblake May 12, 2023
18e3a78
change earlinet ts_type to hourly
lewisblake May 12, 2023
623c61d
working on is_vertical_profile property
lewisblake May 12, 2023
5098d5e
linters
lewisblake May 12, 2023
f530641
figuring out why is_vertical_profile not being set
lewisblake May 12, 2023
755b0c1
Merge branch 'main-dev' into earlinet
lewisblake May 15, 2023
84e5cd7
figured a way to pass vertical profile from reader to ungriddeddata
lewisblake May 19, 2023
83ebeb8
attempt at obs_is_vertical_profile
lewisblake May 19, 2023
3b10ff9
setter and get obs_is_vertical_profile
lewisblake May 19, 2023
add0108
breakpoint in new colocator
lewisblake May 19, 2023
157c1ad
resolve_var_name no longer private
lewisblake May 19, 2023
3104cf5
more private methods to public to import other places, probs shoulda …
lewisblake May 19, 2023
1ad4255
verticalprofile colocator WIP
lewisblake May 19, 2023
97d4ef2
WIP
lewisblake May 19, 2023
073a442
colocation_layer_limits passed to new colocator
lewisblake May 23, 2023
42d3baa
remove dead code
lewisblake May 23, 2023
f193ab3
WIP
lewisblake May 30, 2023
6f61ee2
merge main-dev to keep current
lewisblake Jul 10, 2023
d276c6a
start and end plus checker in function
lewisblake Jul 10, 2023
2bbc15b
filter obs_data by altitudes in vertical layer
lewisblake Jul 10, 2023
75d80b6
add station altitude to altitudes
lewisblake Jul 11, 2023
efad233
comment out colocation and profile layer limits from model entry
lewisblake Jul 11, 2023
c557370
profile layer limits also included
lewisblake Jul 11, 2023
0be506e
can't use model_level_number
lewisblake Jul 11, 2023
99beb3d
4D model data in colocator. requires preprocessing
lewisblake Jul 12, 2023
067bb2a
strategy is to create 2D layer time_series in loop
lewisblake Jul 13, 2023
de6d8cc
prepared arguments for colocation helpers
lewisblake Jul 14, 2023
13d8247
got through for loop, check colocation nans
lewisblake Jul 17, 2023
3eb70a1
major refactor into a helper function
lewisblake Jul 17, 2023
2a8b203
output a namedtuple
lewisblake Jul 17, 2023
52054d0
named tuple output and type hints
lewisblake Jul 18, 2023
aba1d78
can run through w/o crashing but modify output
lewisblake Jul 18, 2023
b12a3e3
figuring out why no data
lewisblake Jul 18, 2023
4186561
use correct statistics terminology
lewisblake Jul 19, 2023
35be719
add vertical_layer to coldata meta
lewisblake Jul 19, 2023
6cb4302
Finding out why no data WIP
lewisblake Jul 19, 2023
cf9b70f
Finding out why no data in output WIP
lewisblake Jul 19, 2023
015a2ea
got to the point where we need json output examples
lewisblake Jul 19, 2023
df7461a
from __future__ import annotations
lewisblake Jul 19, 2023
5f358c9
change vertical_layer for output
lewisblake Jul 20, 2023
06cc5b8
working on profile json output WIP
lewisblake Jul 20, 2023
2e67107
write profile json 1st draft
lewisblake Jul 21, 2023
0f0b541
add profile JSON_SUBDIRS
lewisblake Jul 22, 2023
7fb61b0
correct profile output
lewisblake Jul 22, 2023
51be00e
check bug in profile output
lewisblake Jul 22, 2023
ef88b59
clean up
lewisblake Jul 24, 2023
e923c82
skip earlinet tests that need new data
lewisblake Jul 24, 2023
7179747
skips tests that need new earlinet data
lewisblake Jul 24, 2023
f0e0cac
linters
lewisblake Jul 24, 2023
a67752a
output profiles not profile
lewisblake Jul 24, 2023
98593e1
separate profiles by station_name
lewisblake Jul 24, 2023
fe896b0
fix layer colocation bug
lewisblake Jul 24, 2023
0d83e8b
clean up and testing
lewisblake Jul 25, 2023
87a236f
change unit on metadata to m
lewisblake Jul 25, 2023
e6837ab
_colocate_vertical_profile_gridded
lewisblake Aug 7, 2023
7e2da85
ColocatedDataLists class
lewisblake Aug 7, 2023
df098fc
profiles visualized. need to change alt units
lewisblake Sep 5, 2023
abbddcf
figured out altitude units
lewisblake Sep 6, 2023
7275030
add some meta from coldata
lewisblake Sep 6, 2023
3eaf9d0
change units on extinction vars
lewisblake Sep 7, 2023
7effbc2
formatting and units
lewisblake Sep 7, 2023
4b99cad
use hasattr()
lewisblake Sep 7, 2023
9274c5c
unit to km in tests
lewisblake Sep 7, 2023
67b4406
isort
lewisblake Sep 7, 2023
07aeafb
aeronet tests
lewisblake Sep 7, 2023
1ec8823
13
lewisblake Sep 7, 2023
753967f
modify CI
lewisblake Sep 7, 2023
d61c271
black
lewisblake Sep 7, 2023
14e62b3
remove some notes to self
lewisblake Sep 7, 2023
bc2d244
remove dev notes
lewisblake Sep 8, 2023
1b3a39f
save coldata objs w/ vertical layers in km
lewisblake Sep 8, 2023
52ed06f
remove notes to self. convert to TODOs
lewisblake Sep 8, 2023
094d283
remove notes to self
lewisblake Sep 8, 2023
b2e0af5
linters
lewisblake Sep 8, 2023
dcecccd
remove dead breakpoints
lewisblake Sep 8, 2023
a1065b3
make empty test_colocation_3d file
lewisblake Sep 8, 2023
9aa5504
Merge branch 'main-dev' into earlinet
lewisblake Sep 8, 2023
f44cd88
cleanup not needed imports
lewisblake Sep 8, 2023
e717739
getting tests ready for local testing
lewisblake Sep 11, 2023
f6abfc9
test_read_earlinet passing local CI
lewisblake Sep 11, 2023
769558d
need to include model and obs in testdata-minimal
lewisblake Sep 11, 2023
9df3fd0
testing WIP
lewisblake Sep 13, 2023
a0aac58
testing WIP
lewisblake Sep 13, 2023
3d142e7
altitude for stations back in meters
lewisblake Sep 13, 2023
8677aad
remove dead code
lewisblake Sep 13, 2023
f30f65f
prepare creation of fake data
lewisblake Sep 13, 2023
9897412
extinction & backscatter colorbars
lewisblake Sep 14, 2023
2af47e0
testing WIP
lewisblake Sep 14, 2023
38898e0
working on fixtures that need fixtures
lewisblake Sep 15, 2023
8945011
colocation_3d tests work
lewisblake Sep 19, 2023
e858e0c
update testdata-minimal file
lewisblake Sep 19, 2023
6e432d9
linters and remove dead code
lewisblake Sep 19, 2023
60b83a7
linters
lewisblake Sep 19, 2023
1a6ef18
remove dead code
lewisblake Sep 19, 2023
667a0fc
use hasattr()
lewisblake Sep 19, 2023
9df89e3
vert_code
lewisblake Sep 19, 2023
00f3b54
clean up
lewisblake Sep 19, 2023
7a32631
clean up
lewisblake Sep 19, 2023
d052544
test_load_berlin has 4 files
lewisblake Sep 19, 2023
4c73e8a
typo fix
lewisblake Sep 19, 2023
faa2e17
clean up
lewisblake Sep 19, 2023
2b6e2c9
test_get_profilename
lewisblake Sep 19, 2023
aabc7d4
no cover profile exts of funs w/o current tests
lewisblake Sep 19, 2023
85787df
isort
lewisblake Sep 19, 2023
feeec58
remove old earlinet test
lewisblake Sep 19, 2023
df94c6d
remove old earlinet test parameterizations
lewisblake Sep 19, 2023
5c1d922
clean up
lewisblake Sep 20, 2023
bcf1e14
reintroduce test_stationdata
lewisblake Sep 20, 2023
03afe5d
Alvaro feedback round 1
lewisblake Sep 21, 2023
d1d224c
remove dead code
lewisblake Sep 21, 2023
1e6f91f
Merge branch 'main-dev' into earlinet
lewisblake Sep 25, 2023
fb56720
Merge branch 'main-dev' into earlinet
lewisblake Sep 28, 2023
0444c5a
adding comments about methods and typing
lewisblake Sep 28, 2023
195b08f
remove kwargs
lewisblake Sep 28, 2023
8e311a7
type hints
lewisblake Sep 28, 2023
152a754
use numpy array instead of list of nans
lewisblake Sep 28, 2023
dbaa506
ALLOWED_VERT_CORD_TYPES
lewisblake Sep 28, 2023
a1ff070
Merge branch 'main-dev' into earlinet
lewisblake Sep 28, 2023
09fd9e5
Merge branch 'main-dev' into earlinet
lewisblake Sep 28, 2023
8fc2b98
Merge branch 'main-dev' into earlinet
lewisblake Sep 29, 2023
e6c7851
improve colocation_3d tests
avaldebe Sep 29, 2023
e196d5b
_aerocom_savename
lewisblake Sep 29, 2023
f4e010d
no blind except
lewisblake Sep 29, 2023
a2e8d26
_process_profile_data_for_vizualization
lewisblake Sep 29, 2023
6910130
from __future__ import annotations
lewisblake Sep 29, 2023
0759a9b
clean up vert_code
lewisblake Sep 29, 2023
36cecd9
_process_stats_timeseries_for_all_regions
lewisblake Oct 2, 2023
8875aaa
if hasattr(coldata.data, "altitude_units")
lewisblake Oct 2, 2023
789e236
_get_vert_code(self, ...)
lewisblake Oct 2, 2023
df34437
remove duplicated dependency
lewisblake Oct 2, 2023
7419abe
clean up and check start and end
lewisblake Oct 2, 2023
b151988
remove pd.to_timseries() on obs
lewisblake Oct 2, 2023
0c7eabf
clean up
lewisblake Oct 2, 2023
2827fff
go back to old stats[0] in test_stationdata
lewisblake Oct 2, 2023
844ed6a
info -> debug in stationdata
lewisblake Oct 2, 2023
1a96016
bring back sigfigs in test_read_earlinet
lewisblake Oct 2, 2023
f945978
ALLOWED_VERT_CORD_TYPES again
lewisblake Oct 2, 2023
9ea9501
colocation_3d # pragma: no covers
lewisblake Oct 2, 2023
6016053
swap order any checking of DataUnitError
lewisblake Oct 2, 2023
b6daae7
self.ALLOWED_VERT_COORD_TYPES
lewisblake Oct 2, 2023
340 changes: 240 additions & 100 deletions pyaerocom/aeroval/coldatatojson_engine.py
@@ -1,7 +1,10 @@
import logging
import os
import shutil
from time import time

from cf_units import Unit

from pyaerocom import ColocatedData, TsType
from pyaerocom._lowlevel_helpers import write_json
from pyaerocom.aeroval._processing_base import ProcessingEngine
@@ -18,10 +21,14 @@
_process_statistics_timeseries,
_write_site_data,
_write_stationdata_json,
add_profile_entry_json,
get_heatmap_filename,
get_json_mapname,
get_profile_filename,
get_timeseries_file_name,
init_regions_web,
process_profile_data_for_regions,
process_profile_data_for_stations,
update_regions_json,
)
from pyaerocom.exceptions import AeroValConfigError, TemporalResolutionError
@@ -46,6 +53,11 @@
list of files that have been converted.

"""
out_dirs = self.cfg.path_manager.get_json_output_dirs(True)
for idir in out_dirs:
if os.path.exists(out_dirs[idir]):
shutil.rmtree(out_dirs[idir])

converted = []
for file in files:
logger.info(f"Processing: {file}")
@@ -93,7 +105,15 @@

stats_min_num = self.cfg.statistics_opts.MIN_NUM

vert_code = coldata.get_meta_item("vert_code")
if hasattr(coldata.data, "altitude_units"):
if Unit(coldata.data.attrs["altitude_units"]) != Unit(

"km"
): # put everything in terms of km for viz
# convert start and end for file naming
self._convert_coldata_altitude_units_to_km(coldata)


vert_code = self._get_vert_code(coldata)

diurnal_only = coldata.get_meta_item("diurnal_only")

add_trends = self.cfg.statistics_opts.add_trends
@@ -109,7 +129,7 @@

# this will need to be figured out as soon as there is altitude
elif "altitude" in coldata.data.dims:
raise NotImplementedError("Cannot yet handle profile data")
raise ValueError("Altitude should have been dealt with already in the colocation")


elif not isinstance(coldata, ColocatedData):
raise ValueError(f"Need ColocatedData object, got {type(coldata)}")
@@ -165,114 +185,234 @@
if annual_stats_constrained:
data = _apply_annual_constraint(data)

if not diurnal_only:
logger.info("Processing statistics timeseries for all regions")
input_freq = self.cfg.statistics_opts.stats_tseries_base_freq

for reg in regnames:
try:
stats_ts = _process_statistics_timeseries(
data=data,
freq=main_freq,
region_ids={reg: regnames[reg]},
use_weights=use_weights,
use_country=use_country,
data_freq=input_freq,
)

except TemporalResolutionError:
stats_ts = {}

fname = get_timeseries_file_name(regnames[reg], obs_name, var_name_web, vert_code)
ts_file = os.path.join(out_dirs["hm/ts"], fname)
_add_heatmap_entry_json(
ts_file, stats_ts, obs_name, var_name_web, vert_code, model_name, model_var
if coldata.data.attrs.get("just_for_viz", True): # make the regular json output
if not diurnal_only:
logger.info("Processing statistics timeseries for all regions")

self._process_stats_timeseries_for_all_regions(
data=data,
coldata=coldata,
main_freq=main_freq,
regnames=regnames,
use_weights=use_weights,
use_country=use_country,
obs_name=obs_name,
obs_var=obs_var,
var_name_web=var_name_web,
out_dirs=out_dirs,
vert_code=vert_code,
model_name=model_name,
model_var=model_var,
meta_glob=meta_glob,
periods=periods,
seasons=seasons,
add_trends=add_trends,
trends_min_yrs=trends_min_yrs,
regions_how=regions_how,
regs=regs,
stats_min_num=stats_min_num,
use_fairmode=use_fairmode,
)

logger.info("Processing heatmap data for all regions")
hm_all = _process_heatmap_data(
data,
regnames,
use_weights,
use_country,
meta_glob,
periods,
seasons,
add_trends,
trends_min_yrs,
else:
logger.info("Processing profile data for vizualization")


self._process_profile_data_for_vizualization(

data=data,
use_country=use_country,
region_names=regnames,
station_names=coldata.data.station_name.values,
periods=periods,
seasons=seasons,
var_name_web=var_name_web,
out_dirs=out_dirs,
)

for freq, hm_data in hm_all.items():
fname = get_heatmap_filename(freq)
logger.info(
f"Finished computing json files for {model_name} ({model_var}) vs. "
f"{obs_name} ({obs_var})"
)

hm_file = os.path.join(out_dirs["hm"], fname)
dt = time() - t00
logger.info(f"Time expired: {dt:.2f} s")

_add_heatmap_entry_json(
hm_file, hm_data, obs_name, var_name_web, vert_code, model_name, model_var
)
def _convert_coldata_altitude_units_to_km(self, coldata: ColocatedData = None):
alt_units = coldata.data.attrs["altitude_units"]


logger.info("Processing regional timeseries for all regions")
ts_objs_regional = _process_regional_timeseries(data, regnames, regions_how, meta_glob)

_write_site_data(ts_objs_regional, out_dirs["ts"])
if coldata.has_latlon_dims:
for cd in data.values():
if cd is not None:
cd.data = cd.flatten_latlondim_station_name().data

logger.info("Processing individual site timeseries data")
(ts_objs, map_meta, site_indices) = _process_sites(data, regs, regions_how, meta_glob)

_write_site_data(ts_objs, out_dirs["ts"])

scatter_freq = min(TsType(fq) for fq in self.cfg.time_cfg.freqs)
scatter_freq = min(scatter_freq, main_freq)

logger.info("Processing map and scat data by period")
for period in periods:
# compute map_data and scat_data just for this period
map_data, scat_data = _process_map_and_scat(
data,
map_meta,
site_indices,
[period],
str(scatter_freq),
stats_min_num,
seasons,
add_trends,
trends_min_yrs,
use_fairmode,
obs_var,
)
coldata.data.attrs["vertical_layer"]["start"] = str(

Unit(alt_units).convert(coldata.data.attrs["vertical_layer"]["start"], other="km")
)

# the files in /map and /scat will be split up according to their time period as well
map_name = get_json_mapname(
obs_name, var_name_web, model_name, model_var, vert_code, period
)
outfile_map = os.path.join(out_dirs["map"], map_name)
write_json(map_data, outfile_map, ignore_nan=True)
coldata.data.attrs["vertical_layer"]["end"] = str(

Unit(alt_units).convert(coldata.data.attrs["vertical_layer"]["end"], other="km")
)

outfile_scat = os.path.join(out_dirs["scat"], map_name)
write_json(scat_data, outfile_scat, ignore_nan=True)
def _get_vert_code(self, coldata: ColocatedData = None):
if hasattr(coldata.data, "vertical_layer"):
# start and end for vertical layers (display on web and name jsons)
start = float(coldata.data.attrs["vertical_layer"]["start"])
end = float(coldata.data.attrs["vertical_layer"]["end"])

# format correctly (e.g., 1, 1.5, 2, 2.5, etc.)
start = f"{round(float(start), 1):g}"
end = f"{round(float(end), 1):g}"
vert_code = f"{start}-{end}km"

else:
vert_code = coldata.get_meta_item("vert_code")
return vert_code
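
The unit handling in the two helpers above reduces to a cf_units conversion plus string formatting. A minimal standalone sketch of that pattern (the layer bounds are made-up values, not taken from this diff):

    from cf_units import Unit

    # hypothetical layer bounds in metres, as they might appear in
    # coldata.data.attrs["vertical_layer"]
    vertical_layer = {"start": 500.0, "end": 2000.0}
    alt_units = "m"

    # convert both bounds to km, mirroring _convert_coldata_altitude_units_to_km
    start_km = Unit(alt_units).convert(vertical_layer["start"], other="km")
    end_km = Unit(alt_units).convert(vertical_layer["end"], other="km")

    # format as in _get_vert_code, e.g. "0.5-2km"
    vert_code = f"{round(float(start_km), 1):g}-{round(float(end_km), 1):g}km"
    print(vert_code)  # -> 0.5-2km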

def _process_profile_data_for_vizualization(
self,
data: dict[str, ColocatedData] = None,
use_country: bool = False,
region_names=None,
station_names=None,
periods: list[str] = None,
seasons: list[str] = None,
obs_name: str = None,
var_name_web: str = None,
out_dirs: dict = None,
):
assert (

region_names != None and station_names != None
), f"Both region_id and station_name can not both be None"

# Loop through regions
for regid in region_names:
profile_viz = process_profile_data_for_regions(

data=data,
region_id=regid,
use_country=use_country,
periods=periods,
seasons=seasons,
)

if coldata.ts_type == "hourly" and use_diurnal:
logger.info("Processing diurnal profiles")
(ts_objs_weekly, ts_objs_weekly_reg) = _process_sites_weekly_ts(
coldata, regions_how, regnames, meta_glob
fname = get_profile_filename(region_names[regid], obs_name, var_name_web)


outfile_profile = os.path.join(out_dirs["profiles"], fname)
add_profile_entry_json(outfile_profile, data, profile_viz, periods, seasons)

# Loop through stations
for station_name in station_names:
profile_viz = process_profile_data_for_stations(

data=data,
station_name=station_name,
use_country=use_country,
periods=periods,
seasons=seasons,
)
outdir = os.path.join(out_dirs["ts/diurnal"])
for ts_data_weekly in ts_objs_weekly:
# writes json file
_write_stationdata_json(ts_data_weekly, outdir)
if ts_objs_weekly_reg != None:
for ts_data_weekly_reg in ts_objs_weekly_reg:
# writes json file
_write_stationdata_json(ts_data_weekly_reg, outdir)

logger.info(
f"Finished computing json files for {model_name} ({model_var}) vs. "
f"{obs_name} ({obs_var})"
fname = get_profile_filename(station_name, obs_name, var_name_web)


outfile_profile = os.path.join(out_dirs["profiles"], fname)
add_profile_entry_json(outfile_profile, data, profile_viz, periods, seasons)


def _process_stats_timeseries_for_all_regions(
self,
data: dict[str, ColocatedData] = None,
coldata: ColocatedData = None,
main_freq: str = None,
regnames=None,
use_weights: bool = True,
use_country: bool = False,
obs_name: str = None,
obs_var: str = None,
var_name_web: str = None,
out_dirs: dict = None,
vert_code: str = None,
model_name: str = None,
model_var: str = None,
meta_glob: dict = None,
periods: list[str] = None,
seasons: list[str] = None,
add_trends: bool = False,
trends_min_yrs: int = 7,
regions_how: str = "default",
regs: dict = None,
stats_min_num: int = 1,
use_fairmode: bool = False,
):
input_freq = self.cfg.statistics_opts.stats_tseries_base_freq
for reg in regnames:
try:
stats_ts = _process_statistics_timeseries(
data=data,
freq=main_freq,
region_ids={reg: regnames[reg]},
use_weights=use_weights,
use_country=use_country,
data_freq=input_freq,
)

except TemporalResolutionError:
stats_ts = {}

fname = get_timeseries_file_name(regnames[reg], obs_name, var_name_web, vert_code)
ts_file = os.path.join(out_dirs["hm/ts"], fname)
_add_heatmap_entry_json(
ts_file, stats_ts, obs_name, var_name_web, vert_code, model_name, model_var
)

logger.info("Processing heatmap data for all regions")

hm_all = _process_heatmap_data(
data,
regnames,
use_weights,
use_country,
meta_glob,
periods,
seasons,
add_trends,
trends_min_yrs,
)

dt = time() - t00
logger.info(f"Time expired (TOTAL): {dt:.2f} s")
for freq, hm_data in hm_all.items():
fname = get_heatmap_filename(freq)

hm_file = os.path.join(out_dirs["hm"], fname)

_add_heatmap_entry_json(
hm_file, hm_data, obs_name, var_name_web, vert_code, model_name, model_var
)

logger.info("Processing regional timeseries for all regions")
ts_objs_regional = _process_regional_timeseries(data, regnames, regions_how, meta_glob)

_write_site_data(ts_objs_regional, out_dirs["ts"])
if coldata.has_latlon_dims:
for cd in data.values():
if cd is not None:
cd.data = cd.flatten_latlondim_station_name().data


logger.info("Processing individual site timeseries data")
(ts_objs, map_meta, site_indices) = _process_sites(data, regs, regions_how, meta_glob)

_write_site_data(ts_objs, out_dirs["ts"])

scatter_freq = min(TsType(fq) for fq in self.cfg.time_cfg.freqs)
scatter_freq = min(scatter_freq, main_freq)

logger.info("Processing map and scat data by period")

for period in periods:
# compute map_data and scat_data just for this period
map_data, scat_data = _process_map_and_scat(
data,
map_meta,
site_indices,
[period],
str(scatter_freq),
stats_min_num,
seasons,
add_trends,
trends_min_yrs,
use_fairmode,
obs_var,
)

# the files in /map and /scat will be split up according to their time period as well
map_name = get_json_mapname(
obs_name, var_name_web, model_name, model_var, vert_code, period
)
outfile_map = os.path.join(out_dirs["map"], map_name)
write_json(map_data, outfile_map, ignore_nan=True)

outfile_scat = os.path.join(out_dirs["scat"], map_name)
write_json(scat_data, outfile_scat, ignore_nan=True)
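
Taken together, the engine keys off a few attributes that the 3D colocation step is expected to attach to ColocatedData.data: altitude_units, vertical_layer (a dict with start and end bounds) and just_for_viz (per the branch above, a falsy value routes the object to the profile-JSON output instead of the regular heatmap/map/scat output). A minimal illustration of that expected structure, using a bare xarray DataArray in place of a full ColocatedData (attribute names are taken from the diff; shape, dimension names and values are purely illustrative):

    import numpy as np
    import xarray as xr

    # stand-in for ColocatedData.data of a single vertical layer (illustrative only)
    arr = xr.DataArray(
        np.random.rand(2, 3, 4),
        dims=("data_source", "time", "station_name"),
        attrs={
            "altitude_units": "m",                          # triggers the km conversion for viz
            "vertical_layer": {"start": 500, "end": 2000},  # bounds of this layer
            "just_for_viz": False,                          # falsy value -> profile JSON branch
        },
    )

    # xarray exposes attrs as attributes, which is what the hasattr() checks rely on
    assert hasattr(arr, "altitude_units") and hasattr(arr, "vertical_layer")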