Added ability to set the timeout of an HTTP request #502
Conversation
Thank you @basicn86 for your contribution. Looks good; I will ask my developer to review it and release a version with it if he accepts the pull request. Best Regards, Jon
public int Timeout
{
    get { return _timeout; }
    set { if (value < 0) { _timeout = 0; } else { _timeout = value; } }
}
Why should an assignment like HtmlWeb.Timeout = -4567 be equivalent to HtmlWeb.Timeout = 0, and why and how would a HAP user possibly expect -4567 to be the same as 0 for timeout values?
In my opinion it would make more sense to throw an ArgumentOutOfRangeException for Timeout values less than or equal to zero, except Timeout.Infinite (Timeout.Infinite is the value -1, which indicates an infinite timeout). That would be equivalent to the behavior of HttpWebRequest.Timeout (rejecting negative values except Timeout.Infinite) and HttpClient.Timeout (rejecting TimeSpan values that are zero or less, except Timeout.InfiniteTimeSpan).
Good idea. I adjusted the setter to throw an ArgumentOutOfRangeException if the value is less than or equal to zero. Thanks!
Note that I edited my comment, as I originally did not account for Timeout.Infinite (which is the value -1). So, basically, the setter should throw if value <= 0 && value != Timeout.Infinite.
(My apologies for this oversight in my original comment.)
To be honest, I do not expect any users to be waiting infinitely for a document to load. The default value for HttpClient.Timeout and HttpWebRequest.Timeout is 100,000 ms, or 100 seconds. Adding an infinite timeout is a bit troublesome since HttpWebRequest requires an integer while HttpClient requires a TimeSpan object to be passed instead. I think throwing an exception if the value is zero or less is the more practical approach.
> Adding an infinite timeout is a bit troublesome since HttpWebRequest requires an integer while HttpClient requires a TimeSpan object to be passed instead.
No, it's actually not troublesome :-) because Timeout.InfiniteTimeSpan is by definition a TimeSpan value of -1 millisecond. Therefore TimeSpan.FromMilliseconds(-1) (i.e., TimeSpan.FromMilliseconds(Timeout.Infinite)) equals Timeout.InfiniteTimeSpan. No special treatment necessary here; just pass the integer timeout value to TimeSpan.FromMilliseconds and Bob's your uncle...
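The equivalence claimed above can be checked in a few lines. This is a standalone sketch, not part of the pull request:

```csharp
using System;
using System.Threading;

class InfiniteTimeoutDemo
{
    static void Main()
    {
        // Timeout.Infinite is the integer -1; Timeout.InfiniteTimeSpan is a
        // TimeSpan of -1 millisecond. Converting the integer sentinel with
        // TimeSpan.FromMilliseconds therefore yields the TimeSpan sentinel
        // directly, with no special-casing needed.
        TimeSpan converted = TimeSpan.FromMilliseconds(Timeout.Infinite);
        Console.WriteLine(converted == Timeout.InfiniteTimeSpan); // True
    }
}
```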
Oh really? I thought TimeSpan.FromMilliseconds(-1) would throw an exception that it's out of bounds. In that case, we can change the if statement to this: if (value <= 0 && value != -1) { throw new ArgumentOutOfRangeException(); }
There you go, works as expected.
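Putting the thread's conclusion together, the setter would look roughly like this. This is an illustrative sketch, not the actual HtmlWeb source; the _timeout field and its 100,000 ms default are assumptions mirroring the HttpWebRequest/HttpClient defaults mentioned above:

```csharp
using System;

public class HtmlWebSketch
{
    // Assumed default of 100,000 ms, matching HttpWebRequest.Timeout
    // and HttpClient.Timeout as discussed in the thread.
    private int _timeout = 100_000;

    public int Timeout
    {
        get { return _timeout; }
        set
        {
            // Reject zero and negative values, but allow the -1 sentinel
            // (System.Threading.Timeout.Infinite) for an infinite timeout.
            if (value <= 0 && value != System.Threading.Timeout.Infinite)
                throw new ArgumentOutOfRangeException(nameof(value));
            _timeout = value;
        }
    }
}
```

The value can then be passed as-is to HttpWebRequest.Timeout, or via TimeSpan.FromMilliseconds(value) to HttpClient.Timeout, since both APIs honor the same -1 sentinel.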
I've been developing a web scraper; however, it seems to get stuck on some random websites. Sometimes it tries to download a massive file; other times the resource does not exist and the request stalls. So I added the ability to set the request timeout to prevent the scraper from being stalled for too long.