
0.6.0 version causing test failures with: There is no current event loop in thread 'MainThread' #54


Closed
hyzhak opened this issue May 31, 2017 · 6 comments

Comments


hyzhak commented May 31, 2017

Got this error:

RuntimeError: There is no current event loop in thread 'MainThread'.

For example:
https://travis-ci.org/botstory/botstory/builds/236994191#L594

Source of the unit test:
https://github.com/botstory/botstory/blob/develop/botstory/chat_test.py#L38

@pytest.mark.asyncio
async def test_should_say(mock_interface):
    with answer.Talk() as talk:
        story = talk.story
        story.use(mock_interface)

        @story.on('hi there!')
        def one_story():
            @story.part()
            async def then(ctx):
                await story.say('Nice to see you!', ctx['user'])

        await talk.pure_text('hi there!')

        mock_interface.send_text_message.assert_called_once_with(
            recipient=talk.user,
            text='Nice to see you!',
            options=None,
        )

I get the same error for many other async tests marked with @pytest.mark.asyncio, both with and without fixtures.
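
For reference, the message itself comes from asyncio when no current event loop is set for the thread, e.g. (a minimal sketch, independent of botstory):

import asyncio

# explicitly unset the current loop, then ask for it again;
# get_event_loop() then raises: RuntimeError: There is no current event loop in thread 'MainThread'.
asyncio.set_event_loop(None)
asyncio.get_event_loop()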

Deps are: https://github.com/botstory/botstory/blob/develop/requirements.txt

aiohttp==2.1.0
motor==1.1
pytest==3.1.0
pytest-aiohttp==0.1.3
pytest-asyncio==0.5.0
pytest-catchlog==1.2.2
pytest-cov==2.5.1
pytest-flakes==2.0.0
pytest-mock==1.6.0
yarl==0.10.2

The previous version works fine, apart from a deprecation warning.


Insoleet commented Jun 1, 2017

Same problem here, using the quamash event loop.

Tinche (Member) commented Jun 1, 2017

I'll take a look when I can, but getting me a minimal example would speed this up a lot.

hyzhak (Author) commented Jun 1, 2017

@Tinche you can use my example (noted in this issue) if you would like. It is open source.

Tinche (Member) commented Jun 1, 2017

The problem is pytest-aiohttp.

Pytest-aiohttp treats every coroutine as a test function, so when you have this:

import pytest

@pytest.mark.asyncio
async def test_blabla():
    pass

both pytest-asyncio and pytest-aiohttp will try to run it, and that's where the conflict will be.

I'm opposed to treating any coroutine (i.e. without a marker) as a pytest-asyncio test, because that precludes using other Python async frameworks like Curio or Trio, so I won't be changing this behavior and I consider the pytest-aiohttp approach wrong.

In the short term, just remove the pytest.mark.asyncio marker and let pytest-aiohttp run your test.
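
For example (a minimal sketch, assuming pytest-aiohttp stays installed):

# no pytest.mark.asyncio marker here; pytest-aiohttp collects and runs the coroutine by itself
async def test_blabla():
    pass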

Tinche closed this as completed Jun 1, 2017

hyzhak (Author) commented Jun 2, 2017

@Tinche thanks for the feedback! I will contact @asvetlov to get additional information about this case.


achimnol commented Aug 3, 2017

As linked above, I have found a similar regression without pytest-aiohttp.
After adding new test cases that do not use pytest-asyncio but create & close new event loops on their own (a rough sketch is below), existing test cases that depend on pytest-asyncio began to break. Downgrading to pytest-asyncio 0.5.0 makes them work again.
The interesting part is that not all pytest-asyncio test cases fail; only even-ordered ones fail. (You can check the Travis CI log linked in the above issue.)
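
The tests in question look roughly like this (a sketch, not the actual code):

import asyncio

def test_with_private_loop():
    # create and close a private loop instead of using the pytest-asyncio fixture;
    # after tests like this, the pytest-asyncio-based tests started failing
    loop = asyncio.new_event_loop()
    try:
        loop.run_until_complete(asyncio.sleep(0))
    finally:
        loop.close()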
