string parser (and possibly others internally using consume_while) force unnecessary stream reads #67
Referenced by:

- would-fix: m4rw3r#67 (but seems very ugly to me)
- afterthought: with some (minor?) api change and test breakage i think this specific instance could also be solved by making
- where the question would be what the |
- see m4rw3r#67 for the full story
problem

the `chomp::parsers::string` parser (and possibly others internally using `consume_while`) might force unnecessary stream `read`s. example code:

expected output: `Ok(<some bytes from the imap server welcome line>)`
actual output: `Err(Retry)`
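the symptom can be reproduced without chomp. the sketch below is an assumed, simplified model (the `Stream`/`match_string` names are illustrative, not chomp's API): the server has sent exactly `* OK` and nothing more is buffered, and a consume_while-style matcher reads a token *before* asking whether to keep it, so it demands a 5th token even though the 4-byte pattern has already fully matched:

```rust
/// Toy stream: `None` models a read that would block
/// (the analogue of chomp's `Err(Retry)`).
struct Stream<'a> {
    buffered: &'a [u8], // what has arrived from the server so far
    pos: usize,
}

impl<'a> Stream<'a> {
    fn next_token(&mut self) -> Option<u8> {
        let t = self.buffered.get(self.pos).copied();
        if t.is_some() {
            self.pos += 1;
        }
        t
    }
}

/// Matches `pattern` the consume_while way: read first, inspect second.
/// Returns Err("Retry") whenever a read would block.
fn match_string(s: &mut Stream, pattern: &[u8]) -> Result<usize, &'static str> {
    let mut matched = 0;
    loop {
        match s.next_token() {
            Some(t) => {
                if matched < pattern.len() && t == pattern[matched] {
                    matched += 1;
                } else {
                    return Ok(matched); // the decider said stop
                }
            }
            // we hit this even when matched == pattern.len(): the pattern
            // is complete, yet the loop still tried one more read
            None => return Err("Retry"),
        }
    }
}

fn main() {
    let mut s = Stream { buffered: b"* OK", pos: 0 };
    // All 4 pattern bytes are available, but the matcher still asks for
    // a 5th token and hits the would-block case.
    println!("{:?}", match_string(&mut s, b"* OK")); // prints Err("Retry")
}
```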
cause

the `string` parser (src/parsers.rs:378) uses `consume_while(f)`, which first reads the next token from the input stream, and only after that inspects it (using `f`) to decide whether to consume it or not. note this is not a bug in `consume_while`, but perfectly fine, expected behaviour. the problem with the way `string(s)` currently uses it is that after `len(s)` tokens have been consumed, we could return successfully, but `consume_while` waits for the next token to call its decider function on (which then determines that it has already read `len(s)` tokens and tells `consume_while` to quit). in some cases this forces a read on the underlying stream when the answer would actually already be clear.

solution
i wrote a (very hackish) fix for the `string` parser at https://github.com/dario23/chomp/tree/fix_string but (without having checked in depth) i'm expecting more parsers to be affected. probably a more exhaustive fix would include adding `consume_while_max_n(f, usize)`.

i'd be happy to propose changes and submit a PR, but only after hearing your opinion on the matter :-)
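to make the proposal concrete, here is a chomp-free sketch of what a `consume_while_max_n(f, n)` could look like (the name comes from the suggestion above; the `Stream` type and everything else are illustrative assumptions, not chomp's actual API). because it stops as soon as `n` tokens have been accepted, it never issues the extra, possibly blocking read past the end of the match:

```rust
/// Toy stream that counts how many reads it serves.
struct Stream<'a> {
    buffered: &'a [u8],
    pos: usize,
    reads: usize, // how many reads were issued against the stream
}

impl<'a> Stream<'a> {
    fn next_token(&mut self) -> Option<u8> {
        self.reads += 1;
        let t = self.buffered.get(self.pos).copied();
        if t.is_some() {
            self.pos += 1;
        }
        t
    }
}

/// Consume tokens while `f` accepts them, but at most `n` of them.
/// (A real implementation would also push a rejected token back into
/// the input; that detail is omitted here.)
fn consume_while_max_n<F: FnMut(u8) -> bool>(s: &mut Stream, mut f: F, n: usize) -> usize {
    let mut taken = 0;
    while taken < n {
        // stop *before* peeking past the n-th token
        match s.next_token() {
            Some(t) if f(t) => taken += 1,
            _ => break,
        }
    }
    taken
}

fn main() {
    // Only "* OK" is buffered; string(s) would set n = s.len().
    let mut s = Stream { buffered: b"* OK", pos: 0, reads: 0 };
    let pattern = b"* OK";
    let mut i = 0;
    let taken = consume_while_max_n(
        &mut s,
        |t| {
            let ok = t == pattern[i];
            i += 1;
            ok
        },
        pattern.len(),
    );
    // 4 tokens consumed with exactly 4 reads: no blocking 5th read.
    println!("taken={} reads={}", taken, s.reads);
}
```

with the cap in place, `string(s)` could pass `n = len(s)` and return as soon as the last byte matches, instead of waiting on a read that may never arrive.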