Wrap grdgradient #1269
Could you add an extra test, using e.g. `grdgradient(..., radiance=[135, 45])`, that produces an outgrid? Similar in style to your `test_grdclip_outgrid` example (pygmt/tests/test_grdclip.py, lines 21 to 35 in ce774d4).
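A rough sketch of what such a test could look like, loosely following the `test_grdclip_outgrid` pattern (the `load_earth_relief` resolution/region and the final checks here are illustrative assumptions, not the exact test added in this PR):

```python
import os

import xarray as xr
from pygmt import grdgradient
from pygmt.datasets import load_earth_relief
from pygmt.helpers import GMTTempFile


def test_grdgradient_outgrid():
    """
    Test grdgradient with radiance set and an output grid file.
    """
    grid = load_earth_relief(resolution="01d", region=[-5, 5, -5, 5])
    with GMTTempFile(suffix=".nc") as tmpfile:
        result = grdgradient(grid=grid, outgrid=tmpfile.name, radiance=[135, 45])
        assert result is None  # return value is None when outgrid is set
        assert os.path.exists(path=tmpfile.name)  # output grid file was written
        temp_grid = xr.open_dataarray(tmpfile.name)
        assert isinstance(temp_grid, xr.DataArray)
```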
For reasons I don't understand, both `test_grdgradient_outgrid()` and `test_grdgradient_no_outgrid()` fail when I run them on my computer, but they pass on the GitHub Actions run. For both of them, I get an `AssertionError` telling me that the `assert_allclose` comparison is not equal within tolerance. Any idea why this would be a problem when running `test_grdgradient.py` on some machines but not others?
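For context, `numpy.testing.assert_allclose` raises an `AssertionError` whenever the element-wise difference exceeds `atol + rtol * abs(desired)` (defaults `rtol=1e-7`, `atol=0`), so even tiny floating-point differences between machines are enough to trip it. A minimal illustration with made-up numbers:

```python
import numpy as np
import numpy.testing as npt

desired = np.array([0.10000, 0.20000])
actual = np.array([0.10001, 0.20000])  # a tiny per-machine difference

# With the default rtol=1e-7, atol=0 this raises:
# "AssertionError: Not equal to tolerance rtol=1e-07, atol=0"
try:
    npt.assert_allclose(actual, desired)
except AssertionError as err:
    print(err)

# Loosening the relative tolerance lets the comparison pass.
npt.assert_allclose(actual, desired, rtol=1e-3)
```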
Can you report the full error message? I'm not surprised that this did not work; it might be due to flakiness (#1242) and/or something with `grdgradient` itself. The `shading` parameter in `grdimage`/`grdview` (which uses `grdgradient` behind the scenes) has always been a bit sensitive to crashes and bugs, but we haven't managed to track down the true cause.
Here is the failure for `test_grdgradient_outgrid()`:
Here is the failure for `test_grdgradient_no_outgrid()`:
@weiji14 What I'm a little confused about with the issue you mention above is that this test fails when I run `test_grdgradient.py` by itself as well as part of `make test` on my computer, but it consistently passes when the entire test suite is run on GitHub Actions. I would think that issue wouldn't affect a single-file run, since the problem there is that all of the testing is run under one GMT instance.
Could you provide the output of `pygmt.show_versions()` please? You're right that it's funny that it breaks the other way around. I'll try to see if I can reproduce it on my end tomorrow.
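For reference, `pygmt.show_versions()` can be run from any Python prompt and prints system information together with the installed versions:

```python
import pygmt

# Prints system information plus the versions of PyGMT, GMT, and
# key dependencies (numpy, pandas, xarray, netCDF4, ghostscript, ...).
pygmt.show_versions()
```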
I'm not too smart about using conda environments in PyCharm, but I changed the default conda environment to a different environment and changed it back, and now my tests appear to pass both when run alone and in `make test`. Not sure if it was a PyCharm error (unlikely) or human error (very likely), but the problem seems to be fixed.
I can't seem to reproduce your error above when running `pytest pygmt/tests/test_grdgradient.py`, so maybe you're right that there isn't any problem. My `pygmt.show_versions()` output is below for reference; I just did a fresh install yesterday. I only see a difference in the `xarray` and `ghostscript` versions, but those shouldn't matter much, I think.