We maintain build servers (TeamCity) which run a large selection of builds with VS2015 SP2. They consume a variety of NuGet packages, and occasionally we upgrade one component or another. Some components are rather large, and some are only used in one project and not in others.
Problem: When deselecting a package in a project using the NuGet Package Manager, the package is deleted from disk even though it is consumed by other projects as well. Even when setting read-only attributes on the package folders, or creating the /packages/ structure with junctions, the packages are still deleted; the same happens when running nuget restore.
I urgently need a client-side modification which will allow me to retain my cache the same way you can with Gradle, Maven, etc. We are losing a lot of time repeatedly retrieving packages which have not been upgraded or otherwise changed.
I propose something like
nuget config -Set repositoryPath=D:\Build\Packages -Set shared=True -Set aging=6month
If NuGet finds these settings, the behaviour when changing dependencies should be to only remove the reference in packages.config for the individual Visual Studio project, while leaving the files on disk, improving performance for all other builds.
The aging parameter could be added and used by the restore command to purge any packages older than, say, 0, 3m, 6m, 9m, or never, where 0 is the current behaviour.
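As an illustration only, the settings above might map onto a NuGet.Config along these lines. repositoryPath is a setting NuGet already honours today; shared and aging are the hypothetical keys proposed here and are not recognised by current clients:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- existing setting: redirects the solution packages folder -->
    <add key="repositoryPath" value="D:\Build\Packages" />
    <!-- hypothetical keys from the proposal above; not supported by current NuGet clients -->
    <add key="shared" value="True" />
    <add key="aging" value="6month" />
  </config>
</configuration>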
We maintain about 700 builds on our build farm, and our overhead with NuGet is substantial, simply due to the extra time taken to keep downloading packages which we already have.
Thanks for the feedback. Understand the need.
Just did a quick repro. It looks like NuGet Package Manager will leave the package in the repository if another project in the same solution is using it. However, if the package is only used by projects in other solutions, NuGet doesn't see that and removes it anyway.
As far as I know, we don't have a solution or workaround today. Assuming I'm correct, we'll add this issue to our backlog. Not positive how we'll address it yet, but thanks for the suggestion. @harikmenon @yishaigalatzer
The solution here is to move to project.json (or its next incarnation directly inside MSBuild) and the global packages folder. There is no support for sharing packages folders across multiple solutions, and we don't plan on building that.
I suggest we resolve this issue as won't-fix and invest our efforts in the global packages folder design.
That's not to say we can't periodically do a cleanup on the global packages folder, similar to the idea above. I recall an issue asking for a similar fix; let's find it and link it here.
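For reference, the global packages folder mentioned above is, as far as I understand it, the per-user cache used by project.json-based restores (by default %userprofile%\.nuget\packages), and packages are not removed from it when a project drops a reference. If a build agent needs it on a different drive, it can be redirected via the NUGET_PACKAGES environment variable or a NuGet.Config entry; a minimal sketch, with the path purely as an example:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- example path; points the project.json / global packages cache at a shared drive -->
    <add key="globalPackagesFolder" value="D:\Build\GlobalPackages" />
  </config>
</configuration>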