
ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection #3651

Closed
MtzwOsk opened this issue Mar 15, 2019 · 2 comments



MtzwOsk commented Mar 15, 2019

Long story short

When many requests are made concurrently as a client, connections are aborted by the SSL handshake timeout.

Actual behaviour

Traceback (most recent call last):
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 924, in _wrap_create_connection
    await self._loop.create_connection(*args, **kwargs))
  File "/usr/local/lib/python3.7/asyncio/base_events.py", line 970, in create_connection
    ssl_handshake_timeout=ssl_handshake_timeout)
  File "/usr/local/lib/python3.7/asyncio/base_events.py", line 998, in _create_connection_transport
    await waiter
ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mtzw/programs/pycharm-2018.2.3/helpers/pydev/pydevd.py", line 1664, in <module>
    main()
  File "/home/mtzw/programs/pycharm-2018.2.3/helpers/pydev/pydevd.py", line 1658, in main
    globals = debugger.run(setup['file'], None, None, is_module)
  File "/home/mtzw/programs/pycharm-2018.2.3/helpers/pydev/pydevd.py", line 1068, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/home/mtzw/programs/pycharm-2018.2.3/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/home/mtzw/projects/get_book/backend/main.py", line 125, in <module>
    results = asyncio.run(main(get_book_titles_list()))
  File "/usr/local/lib/python3.7/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.7/asyncio/base_events.py", line 568, in run_until_complete
    return future.result()
  File "/home/mtzw/projects/get_book/backend/main.py", line 121, in main
    return await asyncio.gather(*title_request)
  File "/home/mtzw/projects/get_book/backend/main.py", line 105, in get_books
    html = await fetch(session, BASE_URL, params=query_params)
  File "/home/mtzw/projects/get_book/backend/main.py", line 57, in fetch
    async with session.get(url, params=params) as response:
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/client.py", line 1005, in __aenter__
    self._resp = await self._coro
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/client.py", line 476, in _request
    timeout=real_timeout
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 522, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 854, in _create_connection
    req, traces, timeout)
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 992, in _create_direct_connection
    raise last_exc
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 974, in _create_direct_connection
    req=req, client_error=client_error)
  File "/home/mtzw/projects/get_book/env/lib/python3.7/site-packages/aiohttp/connector.py", line 931, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host aleph.koszykowa.pl:443 ssl:None [None]

Steps to reproduce


import aiohttp
import asyncio
import math

from bs4 import BeautifulSoup as bs
from yarl import URL

from backend.lubimy import get_book_titles_list

BEGIN_COUNTER: int = 11
TABLE_PAGINATION: int = 10

BASE_URL = 'https://aleph.koszykowa.pl/F'


def has_valign(tag):
    return tag.get('valign', '') == 'baseline'


def extend_session(session, soup):
    # the server sets cookies via <meta http-equiv="Set-Cookie"> tags instead of headers
    meta_tags = soup.find_all('meta')
    for meta in meta_tags:
        if meta.attrs.get('http-equiv') == 'Set-Cookie':
            key, value, *_ = meta.attrs.get('content').replace(' = ', ';').split(';')
            session.cookie_jar.update_cookies(
                {key: value},
                response_url=URL('https://aleph.koszykowa.pl'))


def get_library_name(soup, title):
    table = soup.find(id='short_table')
    if table:
        libraries = []
        rows = table.find_all(has_valign)
        for row in rows:
            library_name_row = row.find_all('td')[-1]
            libraries_a_tags = library_name_row.find_all('a')
            libraries.extend(library_name.text for library_name in libraries_a_tags)
        
        for library in libraries:
            name, availability = library.split('(')
            number, lend = availability.replace(' ', '').replace(')', '').split('/')
            print(f'{title}, {name}, availability {number != lend}')
            
        return True
    else:
        try:
            print(f'No results for {title}:', soup.find('div', class_='title').text)
        except AttributeError:
            pass
        return False


async def fetch(session, url, params=None):
    async with session.get(url, params=params) as response:
        # force the content type so response.text() decodes the page as HTML
        response._content_type = 'text/html; charset=UTF-8'
        response_text = await response.text()
    soup = bs(response_text, 'html.parser')
    # pick up the session cookies embedded in the returned page
    extend_session(session, soup)
    return response_text


async def fetch_table(session, url, title: str, params: dict = None):
    async with session.get(url, params=params) as response:
        soup = create_soup(await response.text())
    get_library_name(soup, title)


def create_soup(html):
    return bs(html, 'lxml')


def get_number_of_tables(soup):
    try:
        # take the 7th whitespace-separated token of the 'text3' element as the total hit count
        books_number = int(list(filter(None, soup.find(class_='text3').text.split(' ')))[6])
        if books_number > 10:
            return math.ceil(books_number / TABLE_PAGINATION)
        return 0
    except (IndexError, AttributeError) as error:
        print(f'Not more tables. {error}')
        return 0


async def get_books(title: str):
    async with aiohttp.ClientSession(
            timeout=aiohttp.ClientTimeout(total=900, connect=900, sock_connect=900, sock_read=900),
            connector=aiohttp.TCPConnector(force_close=True)) as session:
        query_params = {
            'func': 'find-b',
            'find_code': 'WTI',
            'adjacent': 'Y',
            'local_base': 'SROBK',
            # 'find_code': 'WAU',
            # 'request': 'Henryk Sienkiewicz',
        }
        query_params['request'] = title
        html = await fetch(session, BASE_URL, params=query_params)
        soup = create_soup(html)
        # get number from first page

        if get_library_name(soup, title):
            tables_number = get_number_of_tables(soup)
            if tables_number >= 1:
                jumps = [{'func': 'short-jump', 'jump': BEGIN_COUNTER + 10 * number} for number in range(tables_number)]
                tables_request = [asyncio.ensure_future(fetch_table(session, BASE_URL, title, params=jump)) for jump in jumps]
                results = await asyncio.gather(*tables_request)
                return results
        return None


async def main(titles: list = None):
    title_request = [asyncio.ensure_future(get_books(title)) for title in titles]
    return await asyncio.gather(*title_request)


if __name__ == '__main__':
    results = asyncio.run(main(get_book_titles_list())) 
# If the title list is too long, this fails with ConnectionAbortedError: SSL handshake is taking longer than 60.0 seconds: aborting the connection
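
A detail worth noting in the script above: every get_books() call opens its own ClientSession (each with its own TCPConnector) and passes force_close=True, so every request pays a fresh TLS handshake and no per-session connection limit can cap the total number of handshakes in flight. A minimal sketch of the alternative, one shared session with a capped connector (the limit of 10 is an illustrative value, not a recommendation):

import asyncio

import aiohttp


async def fetch_text(session, url):
    async with session.get(url) as response:
        return await response.text()


async def main(urls):
    # limit=10 caps simultaneous connections for this session (aiohttp's
    # default is 100), which also bounds concurrent TLS handshakes
    connector = aiohttp.TCPConnector(limit=10)
    async with aiohttp.ClientSession(connector=connector) as session:
        return await asyncio.gather(*(fetch_text(session, url) for url in urls))

# e.g. asyncio.run(main(['https://aleph.koszykowa.pl/F'] * 50))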


Your environment

Python 3.7
aiohttp 3.5.4
Ubuntu 16.04

@aio-libs-bot

GitMate.io thinks possibly related issues are #3052 (SSL with closed connections), #523 (Not all connections are closed (pending: 0)), #3477 ("Application data after close notify" regression in 3.5.0 with SSL connection), #1799 (Unclosed connection), and #1584 ((Broken) HTTP/1.0 server workaround for 'Connection: closed'?).

@asvetlov
Member

If the SSL handshake takes longer than a minute, something is going wrong in your code.
Try reducing the number of parallel handshakes and see if that helps.
Nothing to do on the aiohttp side, sorry :(
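
One way to reduce the number of parallel handshakes is to gate all requests behind a shared asyncio.Semaphore; a rough sketch against the reporter's endpoint (the bound of 20 is an arbitrary placeholder to tune, and fetch_limited is a hypothetical helper, not part of the original script):

import asyncio

import aiohttp

CONCURRENCY = 20  # arbitrary placeholder; tune to what the server tolerates


async def fetch_limited(semaphore, session, url, params=None):
    # at most CONCURRENCY coroutines hold the semaphore at once, so at most
    # CONCURRENCY connections (and TLS handshakes) are ever in flight
    async with semaphore:
        async with session.get(url, params=params) as response:
            return await response.text()


async def main(titles):
    semaphore = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(
            *(fetch_limited(semaphore, session, 'https://aleph.koszykowa.pl/F',
                            params={'func': 'find-b', 'request': title})
              for title in titles))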

lock bot added the outdated label Jun 24, 2020
lock bot locked as resolved and limited conversation to collaborators Jun 24, 2020