
Fix the ability to run network and flaky tests #3070

Merged · 17 commits into pydata:master · Jul 4, 2019

Conversation

shoyer (Member) commented Jul 2, 2019

The old setup didn't seem to work on CI, even when we explicitly passed the relevant flags.
pep8speaks commented Jul 2, 2019

Hello @shoyer! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2019-07-04 20:02:40 UTC

codecov bot commented Jul 2, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@681ec0e).
The diff coverage is n/a.

@@           Coverage Diff            @@
##             master   #3070   +/-   ##
========================================
  Coverage          ?   94.8%           
========================================
  Files             ?      67           
  Lines             ?   12876           
  Branches          ?       0           
========================================
  Hits              ?   12207           
  Misses            ?     669           
  Partials          ?       0

max-sixty (Collaborator)

Hmmm, this is my code; checking what I missed. We use similar code internally and it works (I think...).

Do we have any flaky tests? I only see one that's marked, and that's skipped anyway (maybe because of this problem):

rg flaky      

azure-pipelines.yml
24:        pytest_extra_flags: --run-flaky --run-network-tests

setup.cfg
13:    flaky: flaky tests

conftest.py
8:    parser.addoption("--run-flaky", action="store_true",
9:                     help="runs flaky tests")
16:    if not config.getoption("--run-flaky"):
17:        skip_flaky = pytest.mark.skip(
18:            reason="set --run-flaky option to run flaky tests")
20:            if "flaky" in item.keywords:
21:                item.add_marker(skip_flaky)

doc/whats-new.rst
1847:- Enhanced tests suite by use of ``@slow`` and ``@flaky`` decorators, which are
1848:  controlled via ``--run-flaky`` and ``--skip-slow`` command line arguments

xarray/tests/__init__.py
112:flaky = pytest.mark.flaky

xarray/tests/test_plot.py
30:@pytest.mark.flaky
31:@pytest.mark.skip(reason='maybe flaky')
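
For context, the test-side half of this machinery is just a marker. A minimal sketch (the flaky alias comes from the grep hits above; the test itself is hypothetical):

import pytest

flaky = pytest.mark.flaky  # alias as defined in xarray/tests/__init__.py


@flaky
def test_sometimes_fails():
    # Collected but skipped unless pytest is invoked with --run-flaky;
    # the skip marker is added by the conftest.py hook shown above.
    assert True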

shoyer (Member, Author) commented Jul 2, 2019

Hmm... maybe it does work, at least in this case. But we definitely aren't running the network tests, even with these changes (e.g., look at the TestPydapOnline tests). I'm not quite sure why.

max-sixty (Collaborator)

--run-network works fine

 I  test  ~/w/xarray   *  pytest --run-network xarray/tests/test_tutorial.py                                                                                               192ms  Tue Jul  2 02:45:19 2019
Test session starts (platform: darwin, Python 3.7.3, pytest 4.6.3, pytest-sugar 0.9.2)
rootdir: /Users/maximilian/workspace/xarray, inifile: setup.cfg
plugins: xdist-1.29.0, forked-1.0.2, sugar-0.9.2, regtest-1.4.1, testmon-0.9.16, pycharm-0.5.0, celery-4.3.0
collecting ...
 xarray/tests/test_tutorial.py ✓✓                                                                                                                                                                100% ██████████
=============================================================================================== warnings summary ===============================================================================================
xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github
  /usr/local/Cellar/python/3.7.3/Frameworks/Python.framework/Versions/3.7/lib/python3.7/importlib/_bootstrap.py:219: RuntimeWarning: numpy.ufunc size changed, may indicate binary incompatibility. Expected 192 from C header, got 216 from PyObject
    return f(*args, **kwds)

xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github
xarray/tests/test_tutorial.py::TestLoadDataset::test_download_from_github_load_without_cache
  /usr/local/lib/python3.7/site-packages/_pytest/python.py:174: RuntimeWarning: deallocating CachingFileManager(<function _open_scipy_netcdf at 0x114e29840>, '/Users/maximilian/.xarray_tutorial_data/tiny.nc', mode='r', kwargs={'mmap': None, 'version': 2}), but file is not already closed. This may indicate a bug.
    testfunction(**testargs)

-- Docs: https://docs.pytest.org/en/latest/warnings.html

Results (15.92s):
       2 passed  # <-- runs
 N  test  ~/w/xarray   *  pytest  xarray/tests/test_tutorial.py                                                                                                            17.6s  Tue Jul  2 02:49:10 2019
Test session starts (platform: darwin, Python 3.7.3, pytest 4.6.3, pytest-sugar 0.9.2)
rootdir: /Users/maximilian/workspace/xarray, inifile: setup.cfg
plugins: xdist-1.29.0, forked-1.0.2, sugar-0.9.2, regtest-1.4.1, testmon-0.9.16, pycharm-0.5.0, celery-4.3.0
collecting ...
 xarray/tests/test_tutorial.py ss                                                                                                                                                                100% ██████████

Results (3.06s):
       2 skipped  # <-- skips

max-sixty (Collaborator)

Let's definitely change to the most reliable & standard approach here. I think I originally got this from https://docs.pytest.org/en/latest/example/simple.html#control-skipping-of-tests-according-to-command-line-option, but if this is a minority approach, let's make the switch.

What makes us think the existing approach wasn't working? (Not saying that it did work in CI! It does seem to work locally, though.)
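
For reference, here's the pattern from that pytest docs page, adapted to the --run-flaky option. The lines rg elided from xarray's conftest.py presumably look like this, but treat it as a reconstruction rather than a verbatim copy:

import pytest


def pytest_addoption(parser):
    parser.addoption("--run-flaky", action="store_true",
                     help="runs flaky tests")


def pytest_collection_modifyitems(config, items):
    if not config.getoption("--run-flaky"):
        skip_flaky = pytest.mark.skip(
            reason="set --run-flaky option to run flaky tests")
        # Apply the skip marker to every collected test marked as flaky.
        for item in items:
            if "flaky" in item.keywords:
                item.add_marker(skip_flaky)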

shoyer (Member, Author) commented Jul 2, 2019

Pytest reports the exact same number of tests passed, both with and without --run-network-tests. That seems suspicious to me.

max-sixty (Collaborator)

That does sound suspicious! I'm on vacation with bad WiFi but let me confirm in the next few days, if that's ok

max-sixty (Collaborator)

Somewhat weirdly, I get different results; could you confirm yours?

 pytest --run-network-tests

---

Results (175.99s):
    7266 passed
       5 xpassed
      19 xfailed
    1092 skipped

pytest

---

Results (165.58s):
    7264 passed
       5 xpassed
      19 xfailed
    1094 skipped

Also of note: --run-network also works, as does --run-n (presumably because pytest's argparse-based option parsing accepts unambiguous abbreviations of long options).
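
(A minimal standalone sketch, not from the PR, of why those abbreviations resolve: argparse, which pytest's option parser is built on, accepts any unambiguous prefix of a long option by default.)

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--run-network-tests", action="store_true")

# allow_abbrev=True is the argparse default, so unambiguous prefixes work:
print(parser.parse_args(["--run-network"]))  # Namespace(run_network_tests=True)
print(parser.parse_args(["--run-n"]))        # Namespace(run_network_tests=True)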

shoyer (Member, Author) commented Jul 3, 2019

I'm referring to the tests running on CI, e.g., "Linux py36-flakey" on Azure. If you look at the full logs from this PR (which I made verbose) and scroll down, you see:

xarray/tests/test_backends.py::TestPydapOnline::test_cmp_local_file SKIPPED [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_compatible_to_netcdf SKIPPED [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_dask SKIPPED        [ 13%]
xarray/tests/test_backends.py::TestPydapOnline::test_session SKIPPED     [ 13%]

So for some reason it's not running the flaky tests, even though in theory I'm passing the right flags for it:

--cov-report=html $EXTRA_FLAGS

Actually, I definitely screwed this up. I should be using $PYTEST_EXTRA_FLAGS, not $EXTRA_FLAGS. Let me try fixing that...
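
In other words, the pipeline's pytest invocation presumably needs to change from

--cov-report=html $EXTRA_FLAGS

to

--cov-report=html $PYTEST_EXTRA_FLAGS

so that the pytest_extra_flags value set in azure-pipelines.yml (--run-flaky --run-network-tests) actually reaches pytest.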

shoyer merged commit 1a5b630 into pydata:master on Jul 4, 2019