Implementation of data-processing-level attributes #1001
Conversation
Codecov Report
@@ Coverage Diff @@
##              dev    #1001      +/-   ##
===========================================
- Coverage   79.98%   49.05%   -30.93%
===========================================
  Files          66       47       -19
  Lines        5870     4813     -1057
===========================================
- Hits         4695     2361     -2334
- Misses       1175     2452     +1277

Flags with carried forward coverage won't be shown.

... and 55 files with indirect coverage changes
…ods. Decorated echodata.update_platform
for more information, see https://pre-commit.ci
I got the decorator working on the
Thanks @emiliom, I did a quick scan through the code -- it's exciting to see these functionalities implemented! My only comment at this moment is having to call
Thanks! Yeah, isn't it? I was thrilled to also be able to reuse the same decorator function with class methods.
Definitely, having to explicitly call that function within the decorated function is not ideal (though not awful, either). The challenge with folding this into the decorator function is having access to the "input" xr.Dataset. The decorator function does have access to all inputs, but it doesn't know which function argument is which. I suppose that if there's a common pattern for the position of the input dataset (or the use of a keyword argument), the decorator function could be made to work. For example, in

Either way, these implementation decisions are not user-facing, so in principle we could go with one scheme for 0.7.0 and change it later if a better option is identified.
…ed level or sublevel propagation forms like L*A or L2*; other cleanups to the function
…e_MVBS_index_binning
In the latest commits:
I've cleaned up the code in

Regarding the folding of the
I just remembered that I haven't added any tests 😞. I'll start working on that.
for more information, see https://pre-commit.ci
…ultiple processing steps
Added tests (in a new, single module) that include two test files, a hake survey EK60 raw file and an AZFP raw file. The tests check for the correct addition of processing level attributes (or their expected absence) at multiple processing steps along a sequence:
@leewujung This PR is ready for review! The only final change I expect to make, other than those based on your input, is on the final url for
@emiliom : Thanks for the PR -- My comments are small and IMO this is ready to go!

BTW, I like the test scheme with `_presence_test` and `_absense_test` :)

Are you thinking the addition of `processing_level_url` will be another PR or in this one?
I think that'll be the cleanest. That way we can get this PR into dev ASAP. In the meantime, I'll go through your comments now.
Co-authored-by: Wu-Jung Lee <[email protected]>
Alright, the

Then I'll submit another PR updating
BTW, I have no idea what the windows build test failure is. I've never looked at that GH action.

Yeah, no idea why that particular windows test failed. The other windows test passed...!
Draft, rough, partial implementation of data-processing-level attribute insertion, addressing #980. I've got a "representative" core working now, that handles:

- `open_raw`, but only if Platform lat-lon data are present.
- `echodata.update_platform` on an echodata object that previously did not have lat-lon data and hence had not been assigned processing level attributes.
- `consolidate.add_location`.
- `commongrid.compute_MVBS`. It looks for an existing `processing_level` attribute in the input `Sv` dataset. If it's not present, nothing is added. Otherwise, L3A or L3B are assigned based on the input `Sv` being L2A or L2B, respectively.

Two attributes are inserted: `processing_level` and `processing_level_url`. The `processing_level` value currently looks like this: "Level 1A". For an EchoData object (Level 1A), the attributes are inserted into the Top-level group dataset.

The functionality is implemented as a decorator function, `add_processing_level`, with ~~two~~ one optional argument in addition to the required processing level code. There is also a complementary function, `insert_input_processing_level`, that must be called in the decorated function to enable sublevel "propagation" -- eg, L2A > L3A, L2B > L3B. I placed all the new functionality in the existing module `utils/prov.py`; alternatively, there are good arguments for creating a new module, say, `utils/processinglevels.py`.
See https://github.com/uw-echospace/data-processing-levels/blob/main/data_processing_levels.md for the draft description of data processing levels (and discussions about it here), and #817 (comment) for a listing of data processing functions that will ultimately receive this feature.
The code is full of comments for myself at this point, though mostly outside actual functions. I'll set it as draft. As is, it should give you a good idea of how it works and what it covers.

No tests yet, though I've run it locally, manually, on a couple of specific cases. Tomorrow.