Prevent the upload of empty (0kb) files #33
Comments
Is the file on your local machine empty?
The only thing I can imagine is that an earlier download through sftp (I set up VSCode about 2 weeks ago) created 0 kb files locally, and the directory was then synchronised to the remote server (I had syncMode full until last week, when I noticed the empty files on the server).
It happened again 5 days ago with the config provided above... 500 files were uploaded empty :/
Same thing happened here with some hundreds of files. I can confirm that it doesn't correctly download all files - lots of them have 0 length. And, if you use syncMode full, it probably uploads them back to the server.

Good news: I can reproduce it - it fails to download my whole directory structure every time. Bad news: no logs, even after activating the debug log :-(. When I first tried it after activating the debug log, it was generating a lot of EMFILE errors. I increased the soft limit for the max number of open files using ulimit -Sn 2000 (it was originally set to 1024), and now I have no more logs and it still creates lots of zero-length files :-( I wonder, though: why does it need to keep so many files open at the same time?

After increasing the soft limit to 2000 and trying again, it says "[debug]: conncet to remote" and that's the last thing I see in the output panel. The messages in the status bar show that it downloads some files (it gets from one directory to another pretty quickly), but then it gets stuck at a certain file (the same one every time, it seems - an interesting clue!) and that's all. I left it like that for more than 15 minutes but nothing else happens. No output, no error. I am using Ubuntu Linux 16.04.3 and vscode 1.18.0 (but it did the same thing in 1.17).
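For reference, checking and raising the per-shell open-file limits mentioned above looks roughly like this on Linux (a minimal sketch; the values are only examples):

```sh
# Show the current soft and hard limits on open file descriptors
ulimit -Sn
ulimit -Hn

# Raise the soft limit for the current shell session
# (it cannot exceed the hard limit without root privileges)
ulimit -Sn 2000
```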
Because all the operations are parallelized - that's why it feels so fast. @kataklys Thanks for your information. Could you provide your config? It would also be great if you could make a clone of your entire remote directory structure with the same filenames and formats (the file content is unnecessary). I promise this will be fixed.
Config file (I replaced confidential data with ******):
Could you try downloading each subdirectory separately and see if this happens?
By the way, I was partially wrong - it doesn't get stuck on the same file every time, but there are just a few files that it blocks on, and they tend to repeat. How can I choose just a subdirectory? Let me explain why I ask: I start with an empty local folder, so I don't have any directory structure yet, which means I can't click on a certain sub-folder to download it. I just click somewhere on the left panel (the explorer, I think it's called) and then SFTP download, which brings me the entire remote directory.
Did some more tests. I deleted all folders from my project except one, and it still doesn't download it entirely. The good news is I can send you that folder because it doesn't contain anything confidential: it's just the vendor directory of a Laravel installation!
@kataklys Bad news. The vendor.zip works fine for me.
...and other SFTP clients work fine for me when downloading the same folder from the same server. So it's hard to draw a conclusion from this; we have to investigate more. Can your extension be configured to generate more debugging data? As I told you, I activated sftp.printDebugLog, but it only says "connect to remote" when starting the download and that's all. It downloads a lot of files without saying a word and then fails silently (I even left it overnight! no additional messages). Is there any way to make it more verbose?
I will make it more verbose in the next update.
Thank you! I want to help you find this bug, but it's a little hard to do it blindfolded ;-)
Also reproduced the problem in 2 different ways:
I kept on testing in the second scenario. I enabled the transfer log on my SSH server and did the following:
Which means that, for some reason, vscode is not even trying to download that file!
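The exact steps aren't listed above, but on a stock OpenSSH setup a per-file transfer log can be enabled roughly like this (an illustrative sketch assuming the internal-sftp subsystem; the log level is an example):

```
# /etc/ssh/sshd_config: log SFTP file activity at INFO level (illustrative)
Subsystem sftp internal-sftp -l INFO
```

After restarting sshd, each file the client actually requests shows up in the server log, which is how a download that was never attempted can be spotted.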
@kataklys Thanks a lot. I've published a debug version with a verbose log on the debug-pacakge branch. Check it out!
Ok, let's rock! Here's what it says for one of the files that has size 0 after downloading:
It can only tell that there is a failure when trying to read from the remote. Please help me answer these:
I doubt it's the same set of affected files every time, because after every download I checked the size of the resulting folder, which should be 62MB. One time it was 44MB, another time 29MB, another time 9MB... it doesn't seem to follow a pattern. I'll keep doing the various experiments I have in mind and I'll get back with more info.
My guess is that it somehow triggers an OS limitation (memory, disk, max open files). I will try to limit the concurrency.
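For context, the concurrency cap mentioned here can be sketched as a small promise queue (a minimal illustration, not the extension's actual code; `downloadFile` is a hypothetical stand-in for the real transfer function):

```typescript
// Cap the number of tasks running at once; extra tasks wait in a queue.
function limitConcurrency(limit: number) {
  let active = 0;
  const waiting: Array<() => void> = [];

  return async function run<T>(task: () => Promise<T>): Promise<T> {
    // Wait for a free slot before starting.
    while (active >= limit) {
      await new Promise<void>((resolve) => {
        waiting.push(() => resolve());
      });
    }
    active++;
    try {
      return await task();
    } finally {
      active--;
      waiting.shift()?.(); // release one queued task, if any
    }
  };
}

declare function downloadFile(remotePath: string): Promise<void>;

// Usage: allow at most 512 simultaneous transfers, mirroring the fix discussed below.
async function downloadAll(paths: string[]): Promise<void> {
  const run = limitConcurrency(512);
  await Promise.all(paths.map(p => run(() => downloadFile(p))));
}
```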
The latest tests were done with a limit of 500000 for the number of open files. Maybe there's another OS limitation that I'm not aware of.
So the answer is no? Did you increase the open-file limit on your SSH server?
You mean specifically for the SSH server process? No. On the other hand, can you (as a client) ask a certain SSH server process to open a specific number of files? I assume that the server is written in such a way that it doesn't open them all at once, even if you ask for a large number.
Glad to see there is some progress. I'm currently on holiday, but it happened for me when I was downloading a whole directory with lots of files/directories in it. Here are the server limits our dev server is using for the user:
Ok, I think I got it. It's a matter of limits on the server too. Ubuntu has a default soft limit of 1024 open files and a hard limit of 4096 (and other distros have the same settings). I changed it to something huge (100000 or so) and I still had the problem. The catch is that, when I reproduced the problem locally using my own SSH server, I had only changed the limit for the client! After changing the limit globally, in /etc/security/limits.conf, it magically started downloading all the files.

Unfortunately, developers usually can't change that kind of setting on production servers, so I would suggest adding a configurable extension parameter that controls the number of threads, or the max number of open files, or whatever @liximomo thinks is best. And/or just lowering the default numbers if it doesn't affect performance.
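For reference, the global change described above would look roughly like this in /etc/security/limits.conf (a sketch; the domain and values are placeholders, not the poster's actual settings):

```
# /etc/security/limits.conf: raise per-user open-file limits (illustrative values)
*    soft    nofile    100000
*    hard    nofile    100000
```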
@kataklys I'm truly grateful for your help. I think I will limit the max number of concurrent transfers to 512. Please wait for the update.
You are welcome. I am looking forward to using your extension, now that the main problem is gone.
@kataklys @HeavyTuned I've made an update on the debug-pacakge branch that limits the max concurrent file transfers to 512. Could you help me test it?
Thank you! Installed it. I'm leaving the country tomorrow for one week, so you probably won't hear anything from me until the middle of next week. So far, no issues.
Good job @liximomo! It seems to be working fine now. I lowered the max file limit back to the default (1024) and downloaded a whole site (more than 11,000 files). There are no errors in the sftp logs. I also used an app to recursively compare the original dir to the downloaded one, and they are identical. Great! I have a question not related to this: I have a few things to say/fix/suggest about the example config file on the site. Should I create a new issue for that, or is there another way?
@kataklys PRs are very welcome.
Expected Behavior
Add an option to prevent uploading empty files.
Actual Behavior
I can't reproduce it on demand, but the plugin often overwrites files on our servers with empty 0 kb files.
I work on Windows 10.
Our config: