
Message queue not empty at test start up #116

Closed
ctmbl opened this issue Jun 9, 2023 · 5 comments · Fixed by #122

Comments

@ctmbl
Contributor

ctmbl commented Jun 9, 2023

I don't know if this is a bug or a feature, but without it being explicitly documented, the message queue is at "module" scope (or even more global), which is really annoying when using pytest, for example.

I'll illustrate it with the example from the dpytest docs (only slightly modified: I renamed the bot fixture to config_bot, a weird habit of mine of always renaming fixture objects in tests; feels like function calls I guess 🤷‍♂️)

import discord
import discord.ext.commands as commands
from discord.ext.commands import Cog, command
import pytest
import pytest_asyncio
import discord.ext.test as dpytest


class Misc(Cog):
    @command()
    async def ping(self, ctx):
        await ctx.send("Pong !")

    @command()
    async def echo(self, ctx, text: str):
        await ctx.send(text)

@pytest_asyncio.fixture
async def config_bot():
    intents = discord.Intents.default()
    intents.members = True
    intents.message_content = True
    b = commands.Bot(command_prefix="!",
                     intents=intents)
    await b._async_setup_hook()  # setup the loop
    await b.add_cog(Misc())

    dpytest.configure(b)
    return b

@pytest.mark.asyncio
async def test_ping(config_bot):
    bot = config_bot
    await dpytest.message("!ping")
    assert dpytest.verify().message().content("Pong !")


@pytest.mark.asyncio
async def test_echo(config_bot):
    bot = config_bot
    await dpytest.message("!echo Hello world")
    assert dpytest.verify().message().contains().content("Hello")

As expected, this example passes:

$ pytest
======================================= test session starts =======================================
platform linux -- Python 3.10.10, pytest-7.2.2, pluggy-1.0.0
rootdir: /tmp/dpytest
plugins: asyncio-0.21.0
asyncio: mode=strict
collected 2 items

test_dpytest_bug.py .. 

======================================== 2 passed in 0.24s ========================================

However, just modifying test_ping a bit (sending the command twice) makes test_echo fail, which seems weird: a test shouldn't influence another one:

@pytest.mark.asyncio
async def test_ping(config_bot):
    bot = config_bot
    await dpytest.message("!ping")
    await dpytest.message("!ping")
    assert dpytest.verify().message().content("Pong !")
$ pytest
======================================= test session starts =======================================
[blabla]

test_dpytest_bug.py .F                                                                      [100%]

============================================ FAILURES =============================================
____________________________________________ test_echo ____________________________________________

config_bot = <discord.ext.commands.bot.Bot object at 0x7f9244d70490>

    @pytest.mark.asyncio
    async def test_echo(config_bot):
          [blablabla]

test_dpytest_bug.py:43: AssertionError
===================================== short test summary info =====================================
FAILED test_dpytest_bug.py::test_echo - AssertionError: assert <discord.ext.test.verify.VerifyMessage object at 0x7f9244d72200>
=================================== 1 failed, 1 passed in 0.34s ===================================

Because the AssertionError dump isn't really clear (related to #115), I'll use dpytest.get_message() to debug test_echo (the test that failed):

@pytest.mark.asyncio
async def test_echo(config_bot):
    bot = config_bot
    await dpytest.message("!echo Hello world")
    print(dpytest.get_message().content)
    assert False

and we get:

$ pytest
======================================= test session starts =======================================
[blabla]

test_dpytest_bug.py .F                                                                      [100%]

============================================ FAILURES =============================================
____________________________________________ test_echo ____________________________________________
[blabla]
-------------------------------------- Captured stdout call ---------------------------------------
Pong !
===================================== short test summary info =====================================
FAILED test_dpytest_bug.py::test_echo - assert False
=================================== 1 failed, 1 passed in 0.34s ===================================

Here we are:
in test_echo, which never calls !ping, there is a "Pong !" response in the message queue.

I'm aware of dpytest.empty_queue to clear the message queue, but I'd expect the configuration from dpytest to be function scoped.
Ideally I'd like to call dpytest.empty_queue in the test teardown: for example after yielding b in the config_bot fixture, but for some reason an async fixture can't yield and then run the teardown properly, at least I didn't manage to make it work 😕

I'd really like your thoughts and help on that subject!

ctmbl added a commit to ctmbl/RootPythia that referenced this issue Jun 9, 2023
see iScsc#14 on RootPythia's repo
as well as
iScsc@17b39a0,
the commit that introduced this disabled test

an issue questioning this problem has been opened on the dpytest repo:
CraftSpider/dpytest#116
@Sergeileduc
Collaborator

You are entirely right.

We should automatically empty the queue between tests, but I don't have time right now to investigate how to write a teardown.

@Sergeileduc Sergeileduc pinned this issue Jun 16, 2023
@Sergeileduc
Collaborator

@ctmbl actually, it works, with our conftest.py ! 😁

we use a cleanup fixture

@pytest_asyncio.fixture(autouse=True)
async def cleanup():
    yield
    await dpytest.empty_queue()

Actually, the yield is not necessary:

@pytest_asyncio.fixture(autouse=True)
async def cleanup():
    # yield
    await dpytest.empty_queue()

works just fine.

I also tried putting the empty_queue call in the bot fixture:

    yield b
    await dpytest.empty_queue()

and it works!
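To show the mechanics, here is a stdlib-only sketch (hypothetical stand-in names, no discord or pytest required) of why everything after the yield in a fixture acts as a teardown hook; pytest drives generator fixtures in essentially this way:

```python
# A leftover message from a previous test, sitting in a module-level queue.
message_queue = ["Pong !"]

def empty_queue():
    message_queue.clear()

def config_bot():
    bot = object()   # stand-in for the configured Bot
    yield bot        # the test body runs while the fixture is suspended here
    empty_queue()    # teardown: runs after the test finishes

# Emulate what pytest does with a generator fixture:
fixture = config_bot()
bot = next(fixture)      # setup: advance to the yield
# ... the test body would run here ...
try:
    next(fixture)        # teardown: resume past the yield
except StopIteration:
    pass

assert message_queue == []  # the leftover "Pong !" was cleaned up
```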

So, closing, I guess?

Feel free to reopen if you can't make it work.

@Sergeileduc Sergeileduc unpinned this issue Jun 16, 2023
@ctmbl
Contributor Author

ctmbl commented Jun 22, 2023

@Sergeileduc

@ctmbl actually, it works, with our conftest.py !

Oh my bad, I didn't see this one, and actually this is how I fixed it: by adding await dpytest.empty_queue() in the bot fixture.
And sure, the yield is not necessary, because emptying the queue at the setup of test B results in the same behavior as emptying it at the teardown of test A.
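To make that equivalence concrete, here is a minimal stdlib-only sketch (hypothetical names, nothing discord-specific) of how a leftover message leaks from one test into the next unless the queue is cleared between them:

```python
# A module-level queue shared by all tests, as in dpytest.
message_queue = []

def send(msg):
    message_queue.append(msg)

def get_message():
    return message_queue.pop(0)

def empty_queue():
    message_queue.clear()

# "test_ping" sends the command twice but only consumes one reply...
send("Pong !")
send("Pong !")
assert get_message() == "Pong !"
assert message_queue == ["Pong !"]  # one reply is left behind

# ...so without cleanup, "test_echo" would read "Pong !" instead of its
# own reply. Emptying the queue (at teardown of A or setup of B) fixes it:
empty_queue()
send("Hello")
assert get_message() == "Hello"
```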

However, using an autouse fixture is considered bad practice because "explicit is better than implicit". And for a user new to dpytest, this kind of mistake results in test crashes that are overly difficult to understand.

A good solution to me would be to update the documentation to move the empty_queue call after the yield in the bot fixture; this makes sense as it is actually a cleanup after the test. Also, scoping the bot fixture to scopes other than function would still result in the expected behavior (because the cleaning is done in the bot fixture).

Another really good option would be to implement this by default in dpytest, in the bot configuration for example, but I don't know how easily doable that is.

I could propose a PR for either solution if you're interested!

@Sergeileduc
Collaborator

Sure, you can open a PR with the bot fixture solution:

    ....
    yield b
    await dpytest.empty_queue()  # add a comment to explain why we do that

both in conftest.py and in the docs.

You are probably right: the cleanup fixture is a lot of code just for one line that can be written in the bot fixture.

So yeah

@ctmbl
Contributor Author

ctmbl commented Jun 23, 2023

@Sergeileduc could you reopen this if I open a PR that fixes it?
