I cannot use an aiohttp session if the previous request timed out

I have a simple script that creates asynchronous tasks to load several pages. The first request fails with a TimeoutError, and this causes the following requests to fail as well, even though the second one has a much longer timeout and should succeed.

Is it possible to keep the other requests from failing?

import aiohttp
import asyncio
import logging


logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)s %(levelname)s %(message)s')


async def main():
    asyncio.ensure_future(
        load_page('https://www.netflix.com:5000'))

    asyncio.ensure_future(
        load_page('http://bash.im/', 10))

    asyncio.ensure_future(
        load_page('https://myshows.me/'))

    asyncio.ensure_future(
        load_page('http://www.lostfilm.tv/'))


async def load_page(url, timeout=3):
    try:
        async with session.get(url, timeout=timeout) as response:
            text = await response.text()
            print(len(text))

    except Exception as e:
        logging.warning(type(e))


if __name__ == '__main__':
    loop = asyncio.get_event_loop()

    conn = aiohttp.TCPConnector(limit=1)
    session = aiohttp.ClientSession(connector=conn, loop=loop)

    asyncio.ensure_future(main())
    loop.run_forever()


Log:

2017-06-26 13:57:37,869 asyncio DEBUG Using selector: EpollSelector
2017-06-26 13:57:41,780 root WARNING <class 'concurrent.futures._base.TimeoutError'>
2017-06-26 13:57:41,780 root WARNING <class 'concurrent.futures._base.TimeoutError'>
2017-06-26 13:57:41,780 root WARNING <class 'concurrent.futures._base.TimeoutError'>
2017-06-26 13:57:48,780 root WARNING <class 'concurrent.futures._base.TimeoutError'>




1 answer


Yes, it is possible. I rewrote your code using the following ideas to run your requests:

  • I used itertools.starmap to build one load_page call per URL from a single list of argument tuples, so each call can take a different set of arguments.

  • I used asyncio.gather to bundle the tasks together, with return_exceptions=True so that a request that fails returns its exception as a result instead of raising it and affecting the other tasks (see the short sketch right after this list).

  • I changed main from an async def to a plain def; it now returns the gathered tasks.

  • I closed the loop at the end.
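
As a minimal, self-contained sketch of what return_exceptions=True does (the ok/boom coroutines below are made up for illustration and are not part of the original code):

import asyncio

async def ok():
    return 'ok'

async def boom():
    # Simulate a request that timed out.
    raise asyncio.TimeoutError()

async def demo():
    # With return_exceptions=True, gather() does not re-raise: the
    # TimeoutError is returned as an item of the result list, and the
    # other coroutine still completes normally.
    results = await asyncio.gather(ok(), boom(), return_exceptions=True)
    print(results)  # ['ok', TimeoutError()]

asyncio.get_event_loop().run_until_complete(demo())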

Code:

import aiohttp
import asyncio
import logging
import itertools

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(name)s %(levelname)s %(message)s')

def main(session):
    args = [
        ('https://www.netflix.com:5000', session,),
        ('http://bash.im/', session, 10),
        ('https://myshows.me/', session,),
        ('http://www.lostfilm.tv/', session,),
    ]
    tasks = itertools.starmap(load_page, args)
    futures = map(asyncio.ensure_future, tasks)

    return asyncio.gather(*futures, return_exceptions=True)


async def load_page(url, session, timeout=3):
    try:
        async with session.get(url, timeout=timeout) as response:
            text = await response.text()
            print(len(text))
    except Exception as e:
        logging.warning(type(e))


if __name__ == '__main__':
    loop = asyncio.get_event_loop()

    conn = aiohttp.TCPConnector(limit=1)
    session = aiohttp.ClientSession(connector=conn, loop=loop)
    loop.run_until_complete(main(session))
    loop.close()
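
If the itertools.starmap plus map(asyncio.ensure_future, ...) chain feels opaque, here is a small, self-contained example of the same pattern; greet and its arguments are placeholders invented for illustration, not part of the answer above:

import asyncio
import itertools

async def greet(name, punctuation='!'):
    return 'Hello, ' + name + punctuation

args = [('world',), ('asyncio', '?')]

# starmap unpacks each tuple into a separate call: greet('world'),
# greet('asyncio', '?') -- handy when some calls need an extra argument,
# like the per-URL timeout in the answer above.
coros = itertools.starmap(greet, args)

loop = asyncio.get_event_loop()
print(loop.run_until_complete(asyncio.gather(*coros)))
# ['Hello, world!', 'Hello, asyncio?']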




More on: asyncio.gather.

More on: itertools.starmap.

Enjoy!









