Fix read bytes count #463
Conversation
`read` attempts to read *up to* `chunk_size` bytes, so the real chunk length can be shorter than `chunk_size`.
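To illustrate why that matters, here is a minimal sketch of the read loop a caller needs, written in the same pre-3.5 generator-coroutine style the PR uses; `read_exactly`, `stream`, and `chunk_size` are illustrative names, not the project's actual API:

```python
import asyncio


@asyncio.coroutine
def read_exactly(stream, chunk_size):
    """Keep reading until chunk_size bytes are collected or the stream
    ends; a single read() call may return fewer bytes than requested."""
    chunks = []
    remaining = chunk_size
    while remaining > 0:
        chunk = yield from stream.read(remaining)
        if not chunk:              # EOF before chunk_size bytes arrived
            break
        chunks.append(chunk)
        remaining -= len(chunk)
    return b''.join(chunks)
```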
@@ -67,6 +68,8 @@ def __init__(self, content):

```python
    @asyncio.coroutine
    def read(self, size=None):
        if size is not None:
            size = random.randint(size // 2, size)
```
I dislike the test.
Let's assume we break the code again.
Looking at random test failures, I'll have no clue what's broken.
Agreed. Better to have a separate test function that reproduces that behaviour.
@kxepal would you like to see this PR in the 0.17.2 bugfix release?
Would be nice to have.
```python
    if not chunk:
        break
    chunks.append(chunk)
result = b''.join(chunks)
```
You can use a bytearray instead of a list with a subsequent join.
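For illustration, the bytearray variant of that loop could look like this (a sketch only; `stream` and `chunk_size` are assumed names, not the project's API):

```python
import asyncio


@asyncio.coroutine
def read_all(stream, chunk_size=8192):
    """Accumulate the whole stream into a bytearray rather than a
    list of chunks joined at the end."""
    buf = bytearray()
    while True:
        chunk = yield from stream.read(chunk_size)
        if not chunk:        # EOF
            break
        buf += chunk         # grows in place, no intermediate list
    return bytes(buf)
```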
Not sure about the Travis failure reason. Locally the test passes with Python 3.4.3. I'll rewrite the stream object without inheritance later.
Fixed in the 0.17 branch.
```python
    if size is not None and self._first:
        self._first = False
        size = size // 2
    return super().read(size)
```
Should be `return (yield from super().read(size))`.
I've fixed it already.
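Put together, the corrected test stream would read roughly as below; the base `Stream` class here is a minimal stand-in, not the project's real one:

```python
import asyncio


class Stream:
    """Minimal stand-in for the base stream used by the test."""

    def __init__(self, content):
        self._content = content

    @asyncio.coroutine
    def read(self, size=None):
        if size is None:
            size = len(self._content)
        chunk, self._content = self._content[:size], self._content[size:]
        return chunk


class ShortFirstReadStream(Stream):
    """Returns a deliberately short first chunk. Note the yield from:
    without it, read() would return a coroutine object, not bytes."""

    def __init__(self, content):
        super().__init__(content)
        self._first = True

    @asyncio.coroutine
    def read(self, size=None):
        if size is not None and self._first:
            self._first = False
            size = size // 2
        return (yield from super().read(size))
```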