Skip tests that require a fixture #367

Closed
pytestbot opened this issue Oct 10, 2013 · 10 comments
Labels
type: enhancement new feature or API change, should be merged into features branch

Comments

@pytestbot
Contributor

Originally reported by: Wolfgang Schnerring (BitBucket: wosc, GitHub: wosc)


I recently wrote some tests that need an external resource (an LDAP server in the concrete case), which I encapsulated as a fixture. Since that resource might not always be available, I then also created a decorator to skip these tests in that case.

Since I now need to do two things to each test that requires this resource (fixture and skip-decorator), this got me thinking: it would be nice if, when tests require a fixture and that fixture signals it is not available, those tests were skipped automatically.

I don't know enough about py.test internals to say whether the structure would make this easy, hard, or impossible; whether this could be expressed as a plugin; or whether this pattern is common enough to warrant support at all. But it sounded interesting enough that I wanted to throw it out there.


@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


If i understand it correctly, there already is a canonical way to achieve this. Use pytest.skip or pytest.xfail within the fixture function body early on. This will cause all tests that depend on this fixture to be skipped/xfailed appropriately. Works for you?
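A minimal sketch of that pattern, applied to the LDAP example from the original report. The availability probe, host/port, and fixture names here are assumptions made up for illustration, not taken from the thread:

```python
import socket

import pytest


def ldap_available(host="localhost", port=389, timeout=0.5):
    # Hypothetical probe: treat the resource as available if the port accepts
    # a TCP connection within the timeout.
    s = socket.socket()
    s.settimeout(timeout)
    try:
        s.connect((host, port))
    except OSError:
        return False
    else:
        return True
    finally:
        s.close()


@pytest.fixture
def ldap_server():
    # Skipping here skips every test that requests this fixture.
    if not ldap_available():
        pytest.skip("LDAP server not reachable")
    yield object()  # stand-in for the real connection object


def test_search(ldap_server):
    # Only runs when the LDAP server could be reached.
    assert ldap_server is not None
```

With this in place, no per-test decorator is needed: any test that takes `ldap_server` as an argument is skipped whenever the probe fails.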

@pytestbot
Contributor Author

Original comment by Chris Williamson (BitBucket: cwilliamson, GitHub: cwilliamson):


I have a slight variation of Wolfgang's request. I have a TestClass that has a number of test_cases(bl) that all use a bl fixture. I want to decorate the class with skipif(bl.hasSomeAttribute()), but bl is not defined.

To me this seems like a reasonable use case.

Thanks,
Chris

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


@ChrisWills123 so you want to express a skip condition for a test function that depends on a fixture. I had a discussion around PyconDE2013 regarding a similar issue with @ctheune i think, but am not sure about the best API to support it yet. It is somewhat unrelated to the issue @wosc brought up here and should probably live in a separate issue. Regarding the latter, i intend to close this one here, because using imperative pytest.xfail/skip calls in a fixture function makes it easy to skip/xfail all tests depending on a resource that cannot be created.

@pytestbot
Contributor Author

Original comment by Christian Theune (BitBucket: ctheune, GitHub: ctheune):


I'm actually confused which discussion you're referring to. For @wosc I think the skip should work fine.

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


@ctheune maybe i misremember - someone asked me how to skip a test with a condition that depends on the fixture values of the test.

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


You can use the imperative pytest.skip or pytest.xfail from within a fixture function and thus skip/xfail all tests.

@pytestbot
Contributor Author

Original comment by Chris Williamson (BitBucket: cwilliamson, GitHub: cwilliamson):


The case I needed was to use the skipif decorator for a whole class or module using a fixture as a parameter. I don't want to have to add code to each test case in a class to implement pytest.skip.

What are your suggestions for skipping a whole class or module?

@pytestbot
Contributor Author

Original comment by holger krekel (BitBucket: hpk42, GitHub: hpk42):


Chris, sorry if i was unclear. I think a skipping decorator that can work with fixture values makes some sense but it does not fit into this issue. So it would warrant a new one. I don't think it should be stuffed into the existing pytest.mark.skipif decorator.

Most importantly, however, we should be clear about the exact behaviour. Note that you can write an autouse-fixture like this:

@pytest.fixture(autouse=True)
def skip_something(request):
    if "arg" in request.fixturenames:
        arg = request.getfuncargvalue("arg")
        if some_condition_regarding(arg):
            pytest.skip("...")

This code would be invoked for each test and can skip depending on the existence and values of one or more fixtures. Maybe that's already enough to cover your use case? Feel free to suggest a marker anyway, but please try to be as precise as possible about the intended behaviour. thanks, holger
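For illustration, a self-contained sketch of that autouse pattern. The backend fixture and its supports_feature flag are invented for this example; request.getfixturevalue is the later spelling of the getfuncargvalue call used above:

```python
import pytest


@pytest.fixture
def backend():
    # Hypothetical fixture whose value drives the skip decision.
    return {"supports_feature": False}


@pytest.fixture(autouse=True)
def skip_unsupported(request):
    # Only act when the test actually uses the "backend" fixture.
    if "backend" in request.fixturenames:
        value = request.getfixturevalue("backend")
        if not value["supports_feature"]:
            pytest.skip("backend does not support this feature")


def test_feature(backend):
    # Skipped by skip_unsupported whenever supports_feature is False.
    assert backend["supports_feature"]
```

Because the autouse fixture runs for every test in its scope, the skip condition lives in one place instead of on each test function.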

@pytestbot
Contributor Author

Original comment by Chris Williamson (BitBucket: cwilliamson, GitHub: cwilliamson):


Thanks Holger. I will try what you suggest above and if I cannot make it work for my usecase, then I will start another thread. Still fairly new to pytest and I really appreciate everyone's input.
BR, Chris

@wosc
Contributor

wosc commented Feb 21, 2019

For future reference: for me, an imperative pytest.skip() inside the fixture was not quite what I wanted / needed. I'm instead using this approach in my conftest.py to automatically pytest.mark all tests that use a fixture. Then I can skip those via bin/pytest -m 'not myfixture'

def pytest_collection_modifyitems(items):
    for item in items:
        try:
            # Not every collected item has fixturenames (e.g. doctest items).
            fixtures = item.fixturenames
        except Exception:
            continue
        if 'myfixture' in fixtures:
            # Tag the test so it can be deselected with -m 'not myfixture'.
            item.add_marker(pytest.mark.myfixture)
