AttributeError: 'NoneType' object has no attribute 'headers' #17

Closed
glaslos opened this issue Jun 7, 2016 · 8 comments


glaslos commented Jun 7, 2016

Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 365, in log
    [message, environ, response, transport, time]))
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in _format_line
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in <genexpr>
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 301, in _format_i
    return args[0].headers.get(multidict.upstr(key), '-')
AttributeError: 'NoneType' object has no attribute 'headers'
glaslos added the bug label Jun 7, 2016

afeena commented Jun 10, 2016

I reproduced this bug

Full exception:

Error in logging
Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/aiohttp/server.py", line 247, in start
    message = yield from httpstream.read()
  File "/usr/lib/python3.5/site-packages/aiohttp/streams.py", line 591, in read
    result = yield from super().read()
  File "/usr/lib/python3.5/site-packages/aiohttp/streams.py", line 446, in read
    yield from self._waiter
  File "/usr/lib/python3.5/asyncio/futures.py", line 358, in __iter__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.5/asyncio/tasks.py", line 290, in _wakeup
    future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
    raise self._exception
  File "/usr/lib/python3.5/site-packages/aiohttp/parsers.py", line 139, in feed_data
    self._parser.send(data)
  File "/usr/lib/python3.5/site-packages/aiohttp/protocol.py", line 201, in __call__
    headers, raw_headers, close, compression = self.parse_headers(lines)
  File "/usr/lib/python3.5/site-packages/aiohttp/protocol.py", line 114, in parse_headers
    'limit request headers fields size')
aiohttp.errors.LineTooLong: 400, message='got more than Unknown bytes when reading limit request headers fields size'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 365, in log
    [message, environ, response, transport, time]))
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in _format_line
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in <genexpr>
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 301, in _format_i
    return args[0].headers.get(multidict.upstr(key), '-')
AttributeError: 'NoneType' object has no attribute 'headers'

It occurs in aiohttp/aiohttp/protocol.py if a single header field is longer than 8190 bytes (or the total size of all headers exceeds 32768 bytes).

class HttpParser:

    def __init__(self, max_line_size=8190, max_headers=32768,
                 max_field_size=8190):
        <...>

    def parse_headers(self, lines):
        <...>
            if continuation:
                bvalue = [bvalue]
                while continuation:
                    header_length += len(line)
                    if header_length > self.max_field_size:
                        raise errors.LineTooLong(
                            'limit request headers fields size')
                    bvalue.append(line)

                    # next line
                    lines_idx += 1
                    line = lines[lines_idx]
                    continuation = line[0] in (32, 9)  # (' ', '\t')
                bvalue = b'\r\n'.join(bvalue)
            else:
                if header_length > self.max_field_size:
                    raise errors.LineTooLong(
                        'limit request headers fields size')

I tested with:

import urllib.parse
import urllib.request
import random
import string

url = 'http://caughtyou.net:80'
# 8192-byte User-Agent header, just above aiohttp's 8190-byte field limit
user_agent = ''.join(random.choice(string.ascii_uppercase) for i in range(8192))
values = {'name': 'test',
          'language': 'Python'}
headers = {'User-Agent': user_agent}

data = urllib.parse.urlencode(values)
data = data.encode('ascii')
req = urllib.request.Request(url, data, headers)
with urllib.request.urlopen(req) as resp:
    the_page = resp.read()

In SNARE, this exception is raised inside the super().__init__() call:

super().__init__(debug=debug, keep_alive=keep_alive, **kwargs)

How should we handle these requests?


glaslos commented Jun 10, 2016

Maybe a first step could be looking at the header content.
I assume we should return a 413 Entity Too Large to the client. Oversized headers could be an attempt to identify the web server, since HTTP has no specification for a maximum header size but most web servers impose some upper limit. Can you catch the exception, return the 413 to the client, but still submit the request to TANNER so we can have a look at the headers?


afeena commented Jun 10, 2016

We can't catch this exception, but we can override the handle_error function: http://aiohttp.readthedocs.io/en/stable/server.html#aiohttp.server.ServerHttpProtocol.handle_error
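A rough sketch of what such an override could look like (the handle_error signature is assumed from the linked aiohttp 0.21.x docs and may differ between versions; report_to_tanner() is a hypothetical placeholder, not an existing SNARE method):

from aiohttp import server

class SnareRequestHandler(server.ServerHttpProtocol):

    def handle_error(self, status=500, message=None,
                     payload=None, exc=None, headers=None, **kwargs):
        # message is None when the request could not be parsed at all
        # (oversized header fields, raw telnet input, ...), so nothing
        # here may touch message.headers.
        if exc is not None:
            self.report_to_tanner(exc)  # hypothetical hook, not a real method
        return super().handle_error(status, message, payload,
                                    exc, headers, **kwargs)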


afeena commented Jun 13, 2016

This bug also occurs if the connection is made via telnet.

afeena@afeena-ubuntu16:/etc$ telnet caughtyou.net 80
Trying 163.172.157.231...
Connected to caughtyou.net.
Escape character is '^]'.
slkhfck.gdf.
HTTP/1.1 400 Bad Request
CONTENT-TYPE: text/html; charset=utf-8
CONTENT-LENGTH: 162
CONNECTION: close
DATE: Mon, 13 Jun 2016 10:44:07 GMT
SERVER: Python/3.5 aiohttp/0.21.6


<html>
  <head>
    <title>400 Bad Request</title>
  </head>
  <body>
    <h1>400 Bad Request</h1>
    Bad request syntax or unsupported method
  </body>
</html>Connection closed by foreign host.

SNARE logs:

Error in logging
Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/aiohttp/server.py", line 234, in start
    yield from prefix.read()
  File "/usr/lib/python3.5/site-packages/aiohttp/streams.py", line 591, in read
    result = yield from super().read()
  File "/usr/lib/python3.5/site-packages/aiohttp/streams.py", line 446, in read
    yield from self._waiter
  File "/usr/lib/python3.5/asyncio/futures.py", line 358, in __iter__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.5/asyncio/tasks.py", line 290, in _wakeup
    future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 274, in result
    raise self._exception
  File "/usr/lib/python3.5/site-packages/aiohttp/parsers.py", line 139, in feed_data
    self._parser.send(data)
  File "/usr/lib/python3.5/site-packages/aiohttp/protocol.py", line 146, in __call__
    raw_data = yield from buf.waituntil(b' ', 12)
  File "/usr/lib/python3.5/site-packages/aiohttp/parsers.py", line 454, in waituntil
    'Line is too long. %s' % bytes(self._data), limit)
aiohttp.errors.LineLimitExceededParserError: Line is too long. b'slkhfck.gdf.\r\n'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 365, in log
    [message, environ, response, transport, time]))
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in _format_line
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 352, in <genexpr>
    return tuple(m(args) for m in self._methods)
  File "/usr/lib/python3.5/site-packages/aiohttp/helpers.py", line 301, in _format_i
    return args[0].headers.get(multidict.upstr(key), '-')
AttributeError: 'NoneType' object has no attribute 'headers'

This exception is caught here:
https://github.com/KeepSafe/aiohttp/blob/master/aiohttp/server.py#L279

And the exception occurs here:
https://github.com/KeepSafe/aiohttp/blob/master/aiohttp/server.py#L366


afeena commented Jun 13, 2016

By the way: aio-libs/aiohttp#889
But our error occurs in the _format_i function.

When the connection comes in via telnet there is no HTTP prefix, and we get an error at this place:
https://github.com/KeepSafe/aiohttp/blob/master/aiohttp/server.py#L234
This function raises the error, the server catches it, and at this stage message is None.

So: message is None because the telnet input doesn't have an HTTP prefix. The server catches this exception and tries to write the access log:
https://github.com/KeepSafe/aiohttp/blob/master/aiohttp/helpers.py#L390
and gets the exception here:

@staticmethod
def _format_i(key, args):
    return args[0].headers.get(multidict.upstr(key), '-')

Because args[0] is the message and we try to access its headers, but it is None!

We can only try to pass the information that we hit an error on to TANNER.


afeena commented Jun 13, 2016

Possible solutions:

Change the logging in the SNARE server (http://aiohttp.readthedocs.io/en/stable/logging.html):

  • Change the format string (don't use %{FOO}i)
  • Disable access logs (access_log=None)

Or compare message with None in the _format_i function (in aiohttp).
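The aiohttp-side option would be a simple None check, roughly like this (a hypothetical guard in aiohttp's helpers.py, not what the library currently ships):

@staticmethod
def _format_i(key, args):
    message = args[0]
    # The request message is None when parsing failed, so there are no
    # headers to look up; fall back to '-' as for a missing header.
    if message is None:
        return '-'
    return message.headers.get(multidict.upstr(key), '-')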


glaslos commented Jun 14, 2016

Go for changing the logging in SNARE; we don't really want to mess with aiohttp.
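For example, something along these lines when constructing the request handler (keyword arguments as described in the aiohttp logging docs linked above; exact names can vary between aiohttp versions):

from aiohttp import server

# Option 1: keep access logging, but drop the %{FOO}i directives that
# dereference message.headers (message is None for unparseable requests).
handler = server.ServerHttpProtocol(
    access_log_format='%a %l %u %t "%r" %s %b')

# Option 2: disable access logging entirely.
handler = server.ServerHttpProtocol(access_log=None)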


glaslos commented Jun 15, 2016

Mitigated with 8136a52

glaslos closed this as completed Jun 15, 2016
glaslos added this to the gsoc_sprint_4 milestone Jun 15, 2016