14 unit tests fail with "ConnectionError" (to Aquarius?) #245
Comments
My leading theory here is memory issues + Docker limits. Investigating.
I remember getting these at one point with tox. The solution then was to prune and restart the volumes, which worked. It's possible some issues crept in between v2.2.4 and v2.2.6, going through v2.2.5, which supported ES queries but still had some wrong assertions, causing failures. My local system cannot run provider2 (Docker has memory leaks on Mac, and that's apparently the straw that breaks the camel's back), so I always get one failure from the compute flow. I've run the tests a few times now, and it seems to reproduce randomly, only for one test (market flow). Clues:
Since this takes a lot of precious time (11 minutes per full test run, and it only reproduces in some full runs), I'll give it another 2-3 runs. But if I can't reproduce it, I'll continue working with ocean.py as usual and see if it pops up again.
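One way to spend less of that time per attempt (a suggestion, not something done in this thread) is to rerun only the suspect test in a loop and stop at the first failure. The test path below is a placeholder, not necessarily the real file name.

```python
# Minimal sketch: rerun a single suspected-flaky test until it fails.
# The test path is hypothetical; point it at the actual market flow test.
import subprocess
import sys

TEST_PATH = "tests/test_market_flow.py"  # placeholder path

for attempt in range(1, 21):
    result = subprocess.run(["pytest", "-x", TEST_PATH])
    if result.returncode != 0:
        print(f"Reproduced on attempt {attempt}")
        sys.exit(result.returncode)
print("No failure in 20 attempts")
```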
Finally found the most likely issue. On the market_flow error I got, I traced it back to:
I checked usages of S3 in ocean.py, and they coincide with the usage of the metadata() fixture, market_flow, and compute_flow (a bigger file for the last case). So even though I can't fully reproduce it, I've reached the following most likely conclusion: it happens when my Internet connection is slow or S3 is slow, and the sample files on Amazon can't be fetched quickly enough (there's a max timeout set for connections). I tried reproducing it by turning off WiFi during testing, but that doesn't work, since the data provider first checks the connection itself. The issue is not NO connection but a SLOW connection, and it depends both on my Internet speed and on S3 itself. I don't know how I can simulate that. And since it only happens in rare cases, I don't know how much more time I should keep spending on this one.
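To illustrate the failure mode (an illustration only, not code from ocean.py): with a client-side timeout, a URL that does respond, just too slowly, fails in the same requests exception family as a dead connection, which is why a slow S3 fetch can surface as a connection-type error in the test output. httpbin's delay endpoint is used here purely as a stand-in for a slow S3 object.

```python
# Illustration of the "slow, not absent" failure mode.
# A tight read timeout makes an otherwise reachable URL fail the same
# way a slow S3 response would.
import requests

SLOW_URL = "https://httpbin.org/delay/5"  # responds after ~5 seconds

try:
    # timeout=(connect_timeout, read_timeout): the connection succeeds,
    # but the body takes longer than 1 second, so requests gives up.
    requests.get(SLOW_URL, timeout=(3, 1))
except requests.exceptions.ReadTimeout as err:
    print("Slow response, not a dead connection:", err)
except requests.exceptions.ConnectionError as err:
    print("No connection at all:", err)
```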
When I saw the error, I did not have Internet issues. It happened repeatedly. I wish I could say "sure, ignore it", but right now I don't think we can.
Could that have been an issue on S3's side? Even if your Internet was fine, if S3 was slow you'd get the same failures. If it happened repeatedly and then stopped, we might be able to check whether S3 was slow that day and/or whether something was fixed.
It is possible that I did not clean my volumes. I thought I did, but you never know. So I just did everything fresh again. It worked. :) That is, the errors reported above went away. So I'm fine if you close the ticket. We can reopen it if the issue re-emerges. (Or go ahead and keep chasing leads if you have other things you want to investigate.)
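For reference, the "everything fresh" step is usually just the docker CLI (pruning stopped containers and dangling volumes before restarting barge). If the docker Python SDK happens to be installed, the equivalent looks roughly like the sketch below; the SDK usage is an assumption, not part of barge or ocean.py tooling.

```python
# Hypothetical helper, assuming the `docker` Python SDK is installed
# (pip install docker). The usual route is simply `docker container prune`
# followed by `docker volume prune` on the command line.
import docker

client = docker.from_env()

# Remove stopped containers first so their volumes become unreferenced,
# then prune all dangling volumes.
client.containers.prune()
result = client.volumes.prune()
print("Space reclaimed:", result.get("SpaceReclaimed", 0))
```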
Let's reopen it if it arises again. I'll keep the S3 issue in mind in case the pruning was just a coincidence.
Note: not all tests passed. There were two errors that I hadn't seen before. I reported them in #255, since the scope looks different from this ticket's, though it does appear related to Aquarius too.
Describe the bug
14 unit tests fail due to requests.exceptions.ConnectionError. This is probably an issue with Aquarius, and maybe Provider; I'm filing it in ocean.py because this is where I discovered it and where the bug can be reproduced.
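A quick way to separate "Aquarius/Provider unreachable" from a genuine test bug is to hit the services directly before running pytest. The sketch below assumes the usual barge ports (Aquarius on 5000, Provider on 8030); treat those as assumptions and adjust to your setup.

```python
# Quick pre-flight check, independent of the test suite. Ports are the
# barge defaults as I recall them (assumptions; adjust to your setup).
import requests

SERVICES = {
    "Aquarius": "http://localhost:5000",
    "Provider": "http://localhost:8030",
}

for name, url in SERVICES.items():
    try:
        response = requests.get(url, timeout=5)
        print(f"{name}: reachable (HTTP {response.status_code})")
    except requests.exceptions.ConnectionError as err:
        print(f"{name}: NOT reachable -> {err}")
```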
To Reproduce
Steps to reproduce the behavior:
Run pytest.
pytest will return 14 failed.
Expected behavior
All unit tests pass.
Logs
Here's a snippet from the log of the first failing unit test.
Full log in my console (to the extent that my console captured it):
log.txt
Note: the barge console logs didn't have any WARNING or ERROR messages.
High priority because it makes unit tests fail.
Maybe related, just reported today too:
Maybe related, reported >1 week ago, issue is closed: