Resolving raw.githubusercontent.com (raw.githubusercontent.com)... failed: Connection timed out. #2820
When I tried to ping the site, it was very slow.
So now I have a half-updated DietPi? What should I do now?
@Sopor Hmm, 13 ms but 13 seconds overall? Looks like it actually means 13 seconds for each ping, right? Yep, that is indeed very slow. Not sure if it's a temporary issue with the GitHub server. Our timeout is 5s by default. This can be changed since v6.23 🤣. Do the following:
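A hedged sketch of how such a timeout change could look; both the setting name CONFIG_G_CHECK_URL_TIMEOUT and the /boot/dietpi.txt location are assumptions and may differ on your DietPi version:

```sh
# Check whether the (assumed) setting exists in your dietpi.txt first
grep 'CONFIG_G_CHECK_URL_TIMEOUT' /boot/dietpi.txt

# Raise the URL connection check timeout from the 5s default to e.g. 20s
sed -i 's/^CONFIG_G_CHECK_URL_TIMEOUT=.*/CONFIG_G_CHECK_URL_TIMEOUT=20/' /boot/dietpi.txt
```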
But there were several errors, including some SSH address it couldn't connect to.
@Sopor Please do:
And if you want to re-apply the update, just to be failsafe: And you can adjust the timeout for our URL connection check via
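A minimal sketch of the re-apply step, assuming the dietpi-update command is available in PATH (the timeout adjustment itself is sketched above):

```sh
# Re-run the DietPi updater so any half-applied update steps are repeated from a clean state
dietpi-update
```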
If I run it with sudo I get:
@Sopor Since those G_* are global "functions" rather than "aliases", they are not available within the sudo environment without a shell being loaded.
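To illustrate the point, a hedged example; the globals path /DietPi/dietpi/func/dietpi-globals is an assumption based on the v6 layout, and G_CHECK_URL is just used as a representative global function:

```sh
# Fails: sudo spawns a plain environment where shell functions from dietpi-globals are not defined
sudo G_CHECK_URL https://raw.githubusercontent.com

# Works: start a shell as root and source the globals first, so the function exists there
sudo bash -c '. /DietPi/dietpi/func/dietpi-globals; G_CHECK_URL https://raw.githubusercontent.com'
```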
Same thing :( This is what I wrote:
@Sopor
I hope I did it right?
@Sopor
Same thing again
@Sopor However, forget about G_SUDO for now and open a sudo shell to apply:
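A minimal sketch of that approach, meant to be run line by line in an interactive session (globals path again assumed):

```sh
sudo -s                                          # open a root shell
. /DietPi/dietpi/func/dietpi-globals             # load the G_* functions into it (path assumed)
G_CHECK_URL https://raw.githubusercontent.com    # the global functions are now available as root
exit                                             # leave the root shell when done
```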
Ah, found a solution, although not beautiful due to doubled subshell creation:
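Purely as an illustration of the kind of wrapper described, a hedged sketch rather than the change that was actually committed (globals path assumed):

```sh
# Hypothetical G_SUDO wrapper: sudo launches an extra bash -c subshell that sources the
# globals before expanding "$*" into the command, hence the doubled subshell creation
G_SUDO(){ sudo bash -c ". /DietPi/dietpi/func/dietpi-globals; $*"; }
```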
Now you are confusing me ... What should I run?
@Sopor
And now I should be able to run
@Sopor
Too late :(
@Sopor Btw, just to test, I added the fix for G_SUDO. Reload the updated dietpi-globals into the current session once and try it out:
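A hedged example of reloading and testing; the globals path and the test command are assumptions:

```sh
. /DietPi/dietpi/func/dietpi-globals    # re-source the updated globals into the current shell (path assumed)
G_SUDO echo 'G_SUDO works'              # any harmless command, just to verify that G_SUDO now runs
```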
@Sopor
@MichaIng
@Sopor I was rethinking our connection checks and download handling yesterday. Actually it is not really required in most cases to check the URL first and do the download afterwards. Checking a URL involves the same connection + DNS + server response steps as the download itself, so this is simply doubled... It would make more sense to do the download only, using the same connection + DNS timeouts, and then read the exit code/error message to distinguish connection errors from those caused by e.g. download target disk space/permissions/corruption or such. Everything else is doubled effort + time.

Also, the retry attempts only apply to the connection/URL tests; the actual download attempt itself uses wget defaults. This does not make much sense: if something is wrong with the connection so that single attempts can fail, the connection check could pass while the download connection could fail. The retry option of the error prompts allows retrying as often as one wishes anyway.

I was thinking of switching from wget to curl for downloads, since in my tests curl was significantly faster to initiate (download speed etc. is of course the same). Sadly you cannot specify DNS + server response timeouts separately. However, perhaps the default is reasonable and okay (needs testing). So basically we could use curl without any specific timeouts for usual downloads (in dietpi-software and such). Those play no role in most cases anyway, since server downtimes/failures and wrong URLs result in a clear failure response, so curl/wget stops immediately and prints that failure/response code. So we can read the exit code and give reasonable hints in the error prompt. Without specifying timeouts:
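As a rough, hedged sketch of that idea (the URL and target path below are placeholders, not taken from the original comment):

```sh
# Plain curl download with no explicit timeouts: -s silent, -S show errors,
# -f fail on HTTP errors (e.g. 404), -L follow redirects
curl -sSfL -o /tmp/target_file 'https://raw.githubusercontent.com/<owner>/<repo>/<branch>/<file>'
echo "curl exit code: $?"   # 0 = OK, 6 = DNS failure, 7 = connection failure, 22 = HTTP error, 28 = timeout
```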
Last but not least, the wget --spider (as well as curl -I) URL checks fail in some cases even though the download would succeed. In these cases we currently skip the connection check or run it against the main server URL instead of the target file. Here too, only the final download will show a meaningful error message/exit code. ... So finally I would skip those connection checks for downloads completely and instead implement exit-code-based error handling as a new G_DOWNLOAD function. If curl can wait indefinitely on any step and this cannot be prevented, then we stay with wget. The URL check then only makes sense for verifying entered APT mirrors, DNS servers and NTP mirrors, for the general internet connection check within the network adapters menu, or for general debugging. cURL exit codes: https://ec.haxx.se/usingcurl-returns.html
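A minimal sketch of what such a G_DOWNLOAD function could look like, mapping curl's documented exit codes to error hints; the function body, messages and URL are illustrative assumptions, not the eventual DietPi implementation:

```sh
# Hypothetical G_DOWNLOAD: run the download directly and map curl exit codes to error hints
G_DOWNLOAD(){
    local url=$1 target=$2
    curl -sSfL -o "$target" "$url"
    local ec=$?
    case $ec in
        0)  return 0;;
        6)  echo "DNS resolution failed: $url";;
        7)  echo "Failed to connect to host: $url";;
        22) echo "Server returned an HTTP error (e.g. 404): $url";;
        23) echo "Write error, check target path/permissions/free space: $target";;
        28) echo "Operation timed out: $url";;
        *)  echo "curl failed with exit code $ec: $url";;
    esac
    return $ec
}

# Example usage; the caller can offer a retry prompt on a non-zero return code
G_DOWNLOAD 'https://raw.githubusercontent.com/<owner>/<repo>/<branch>/<file>' /tmp/target_file
```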
Details:
Steps to reproduce:
Expected behaviour:
Actual behaviour:
Extra details:
Additional logs: