
Fix failing tests due to auto_combine deprecation #324

Merged: 12 commits into master from fix-warnings, Sep 25, 2019
Conversation

spencerahill (Owner):

Closes #323

The failures stem from xarray's deprecation of auto_combine, which has been superseded by the combine_by_coords and combine_nested top-level functions (and the associated combine='by_coords' and combine='nested' options for open_mfdataset).
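
For illustration, a minimal sketch of the migration (the dataset contents and variable name below are hypothetical, not aospy's): where code previously called xr.auto_combine on a list of datasets, xr.combine_by_coords infers the concatenation order from the coordinate values; the file-opening equivalent is open_mfdataset(paths, combine='by_coords').

```python
import numpy as np
import xarray as xr

# Two datasets covering adjacent slices of a shared "time" coordinate.
ds1 = xr.Dataset({"t_surf": ("time", np.zeros(2))}, coords={"time": [0, 1]})
ds2 = xr.Dataset({"t_surf": ("time", np.ones(2))}, coords={"time": [2, 3]})

# Old (deprecated): combined = xr.auto_combine([ds1, ds2])
# New: the ordering is inferred from the coordinates themselves.
combined = xr.combine_by_coords([ds1, ds2])
# combined.sizes["time"] == 4
```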

Took the liberty of cleaning up a few style things unrelated to the actual meat of the issue while I was at it; apologies for muddying the diff in the process.

More substantively, I refactored _prep_time_data out of data_loader and into utils.times, since that seems more appropriate from an organizational standpoint. But its tests rely on some of the fixtures defined in test_data_loader, so for now I just hard-copied those into test_utils_times. Should we move our various fixtures somewhere central, e.g. test/__init__.py, to avoid this problem in the future? (This could be a can of worms, so I'm also not opposed to walking this back for now, since it wasn't in the scope of the original issue in the first place.)

Also, I removed the logic in the test of ensure_time_as_index that checks that the input dataset is unmodified. The only place we call it, we overwrite the input anyway: ds = ensure_time_as_index(ds), so this check seemed superfluous.

@spencerkclark (been a while!) would you mind sharing your $0.02 on this when you get a chance? TIA.

spencerkclark (Collaborator) left a comment:

Sure! Happy to take a look. Thanks for kicking the dust off aospy; indeed it's been a while!

> More substantively, I refactored _prep_time_data out of data_loader and into utils.times, since that seems more appropriate from an organizational standpoint.

I agree it makes sense to move _prep_time_data to utils.times given the number of methods from utils.times _prep_time_data depends on. Regarding duplication of test fixtures, is there anything stopping you from importing them directly from test_data_loader.py within test_utils_times.py?

> Also, I removed the logic in the test of ensure_time_as_index that checks that the input dataset is unmodified. The only place we call it, we overwrite the input anyway: ds = ensure_time_as_index(ds), so this check seemed superfluous.

This sounds good. I think this was a check to make sure things were not impacted by an xarray bug, pydata/xarray#2856, which has now been fixed. See #322 and discussion in #320.

Regarding testing, I'm not sure what's going wrong in the Appveyor builds. I'm fine putting off fixing that until later. At some point, like xarray has done, we may consider moving our CI completely to Azure pipelines, so that we can have all platforms tested under one umbrella.
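
To make the idea concrete, here is a hypothetical azure-pipelines.yml sketch (image names, versions, and steps are assumptions for illustration, not an actual aospy config) showing how a single Azure Pipelines matrix can cover Linux, macOS, and Windows under one umbrella:

```yaml
jobs:
- job: test
  strategy:
    matrix:
      linux:
        imageName: 'ubuntu-latest'
      mac:
        imageName: 'macOS-latest'
      windows:
        imageName: 'windows-latest'
  pool:
    vmImage: $(imageName)
  steps:
  - script: python -m pip install -e . && python -m pytest
    displayName: 'Run test suite'
```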

aospy/data_loader.py (review comment, resolved)
spencerahill (Owner, Author):

> At some point, like xarray has done, we may consider moving our CI completely to Azure pipelines, so that we can have all platforms tested under one umbrella.

Fully agree. I'll open an issue to track it.

> Regarding duplication of test fixtures, is there anything stopping you from importing them directly from test_data_loader.py within test_utils_times.py?

This led me down the right track. It turns out that importing fixtures from another module isn't ideal, because you also have to import every fixture that the given fixture calls, recursively. And the linter thinks the imports aren't used. But some googling led me to pytest's docs, which recommend putting shared fixtures in a conftest.py file, which I have now created.
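
For anyone unfamiliar with the mechanism, a minimal runnable sketch (fixture and test names here are made up, not aospy's actual fixtures): pytest automatically discovers fixtures defined in a conftest.py in the test directory, so test modules can request them by argument name without any import.

```python
import pathlib
import subprocess
import sys
import tempfile
import textwrap

tmpdir = pathlib.Path(tempfile.mkdtemp())

# Shared fixtures live in conftest.py; pytest discovers them automatically.
(tmpdir / "conftest.py").write_text(textwrap.dedent("""
    import pytest

    @pytest.fixture
    def sample_times():
        return [0, 1, 2]
"""))

# A test module requests the fixture by name alone; no import needed.
(tmpdir / "test_utils_times.py").write_text(textwrap.dedent("""
    def test_sample_times(sample_times):
        assert len(sample_times) == 3
"""))

result = subprocess.run(
    [sys.executable, "-m", "pytest", "-q", str(tmpdir)],
    capture_output=True,
)
```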

Moving forward I think it makes sense to put fixtures there as much as possible, and it might also be useful to do a refactor of our existing tests. (I suspect we're defining sample Datasets numerous places, when we could just have one fixture to do that.) But that's probably low priority...if it ain't broke, as they say...

Still haven't finished this PR, as there are other warnings remaining to fix.

spencerahill (Owner, Author):

@spencerkclark this is ready for a final check over if you don't mind. Test suite now runs with zero warnings. TIA.

spencerkclark (Collaborator) left a comment:

@spencerahill I had a look this morning and everything seems good to me!

Thanks for cleaning up the remaining Python 2 compatibility cruft and for improving the formatting / variable names used in several places. On the topic of formatting, it seems a number of your changes are taking things in the direction that black would go. In the spirit of following xarray, we might consider applying that to the codebase at some point as well. See pydata/xarray#3092 and pydata/xarray#3142 for more details.

spencerahill changed the title from "WIP Fix failing tests due to auto_combine deprecation" to "Fix failing tests due to auto_combine deprecation" on Sep 25, 2019
spencerahill (Owner, Author) commented Sep 25, 2019:

Thanks, @spencerkclark!

> On the topic of formatting, it seems a number of your changes are taking things in the direction that black would go.

I agree. I have played around with black, not in aospy but applied to my own scientific code, and I've found that there it's too restrictive: e.g., in expressions of physical equations, it obliterates line breaks that carry physical meaning.

But that's not an issue for aospy, and, as usual, if xarray does it we probably should too.

@spencerahill spencerahill merged commit f17d91e into master Sep 25, 2019
@spencerahill spencerahill deleted the fix-warnings branch September 25, 2019 14:27
Successfully merging this pull request may close these issues.

Need to fix warnings due to recent updates to dependencies
3 participants