ProjectReference alias support for multiple versions of the same package within 1 project #12400
This feature request (in my opinion, this isn't just a design change of an existing feature) is really about supporting multiple versions of a package at the same time. If that worked, I wouldn't be surprised if aliases "just work". However, there would need to be significant changes to how NuGet, MSBuild, and the .NET Project System in Visual Studio work. tl;dr: if/when this feature gets in, you should be able to use it, and this proposal would not be needed: dotnet/designs#242

Consider a package, let's say Newtonsoft.Json. It doesn't matter what version of the package you use, it always ships the same file, Newtonsoft.Json.dll. I only know enough about assemblies and MSIL to hurt myself, but I'm curious whether .NET assemblies even support having multiple references to the same assembly/module name at different versions. And this only works if the assembly is strong-name signed, which I think is only relevant to .NET Framework, not .NET Core (or maybe the .NET Core runtime just ignores the public key, but the strong name is still "burned in" at compile time).

Secondly, most developers don't want multiple versions of the same assembly. When packages with good backwards compatibility are used, or when a customer needs to upgrade a transitive package to get a bug fix, it's desirable to have the version overridden. So, syntax needs to be proposed for how developers signal that they want this multi-version feature. Using the earlier example of 1.0.0-2.3.4 all being compatible, and 3.0.0-3.4.5 being compatible, what's the syntax so NuGet knows when to "merge" dependencies and select the highest version, and when to keep a duplicate version?

Lastly, a special shout-out to the .NET project system in Visual Studio. It keeps all MSBuild items in a data structure that assumes each item has a unique identity.

In summary, the package shading feature linked above might help solve your problem.
Otherwise, if you (or your company) control the package in question (Api.Sdk in your example), you could consider adding the major version to the package ID, so that each breaking API change is a different package (with a corresponding change to the dll file name). Another option is to put each API version of your webapp in a different microservice and have a reverse proxy service deal with the public URL routing, but that's a significant architectural change in your web app. Sorry to be so negative, that's not my intention, but there are a lot of challenges to supporting multiple versions of an assembly or package.
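To make the "merge vs. keep a duplicate" question above concrete, a hypothetical project-file syntax might look like the sketch below. The `Aliases` and `KeepVersion` metadata on duplicate `PackageReference` items are invented names for illustration; neither is supported by NuGet today:

```xml
<ItemGroup>
  <!-- Hypothetical, unsupported syntax: two references to the same package ID.
       "KeepVersion" (invented) would opt out of version unification, and
       "Aliases" (invented here for duplicate PackageReference items) would
       name each copy for extern alias use in C#. -->
  <PackageReference Include="Api.Sdk" Version="1.0.0" Aliases="SdkV1" KeepVersion="true" />
  <PackageReference Include="Api.Sdk" Version="3.0.0" Aliases="SdkV3" KeepVersion="true" />
</ItemGroup>
```

Even with syntax like this, NuGet would still need rules for what happens when the transitive dependencies of the two copies overlap.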
For an API author wanting to provide a vended client, there are definitely choices to be made. For a client, it's a lot simpler. Clients intrinsically onboard to a specific version. When they update, it's an opt-in choice. It's very rare, if ever, that a client wants side-by-side support. Unless that client is trying to vend their own managed client, I'm trying to think of a single case where I've seen that happen in the last 20 years. A meta package is just a package that collates other packages and has no content of its own. xUnit is a good example of such a package. Before the concept of a Framework Reference, platforms like ASP.NET Core used meta packages too. Ultimately, this means that you evolve APIs and their clients independently as separate packages. If a consumer wants to be bound to a specific set, they reference the meta package. They can choose to update the meta package for the latest and greatest when they're ready to.
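For illustration, a meta package's nuspec is essentially just a dependency list with no files of its own; the IDs and versions below are made up for the Api.Sdk example (xUnit's real meta package collates `xunit.core` and `xunit.assert` in the same fashion):

```xml
<?xml version="1.0" encoding="utf-8"?>
<package>
  <metadata>
    <id>Api.Sdk.All</id>
    <version>2.0.0</version>
    <authors>example</authors>
    <description>Meta package that pins a known-good set of Api.Sdk packages.</description>
    <dependencies>
      <!-- Exact-version pins so the meta package defines one coherent set. -->
      <dependency id="Api.Sdk.Client" version="[2.0.0]" />
      <dependency id="Api.Sdk.Abstractions" version="[2.0.0]" />
    </dependencies>
  </metadata>
</package>
```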
@zivkan There's a lot to unpack here. The CLR does not care about file names whatsoever. While it's true most file systems will not allow multiple files with the same name in the same location (some file systems are case-sensitive), that is irrelevant. Assembly probing is a long and battle-hardened approach that largely works the same way today as it always has. Assembly binding redirects are very much alive and still happen. Modern tooling enables automatic binding redirects on your behalf. The ability to forward older TFMs to new ones has always been a little gray, especially in the .NET Framework era. Starting in .NET Core, assemblies have affinity to the TFM they were written against. While you might be able to get an assembly targeting an older TFM to load and run on a new TFM, there is no guarantee that will work. I have seen applications build successfully, only to fail spectacularly at runtime. It's worth noting that an assembly version is not a Semantic Version, and they have different rules. NuGet does know how to unify and warn about versions if there are explicit upper and lower bounds. Unfortunately, many packages err on the side of an open upper boundary. On one hand, this makes sense, especially to consumers of packages. On the other hand, it makes it very difficult to know with absolute confidence that the older package and version is compatible. Defining versions is also dependent upon human beings, which makes them intrinsically fallible. Not everyone knows, understands, or follows the expected rules and policies for binary backward compatibility. While it's definitely true there is wonkiness and differences between the CLI and Visual Studio, I'm not sure what you mean by:
Historically, VS has had its own flavor-specific copy of MSBuild versus the .NET SDK. While there is a very large overlap, I believe that is still true today. The assemblies live in different places too. Declaring an item in MSBuild is intrinsically unique based on metadata. Items with the same key but different metadata have to be merged. This used to be a PITA.

This is the first time I've seen the shading feature and, I have to say, I'm not sure I like it. It looks like the same high-level approach used to unify assemblies for .NET Standard. While there probably are some valid use cases, who is this actually for? It seems like a very uncommon scenario. I'd rather see #5556 land so library authors have the option to clip off the upper boundary when they publish multiple packages in a single solution that reference each other. It's a double-edged sword as to whether you allow flowing through to the next major version. It might just work. I've run into many cases where things fail in unexpected ways, and then people file bugs against an unofficially supported combination.
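For reference, the binding redirects mentioned earlier look like the following on .NET Framework; modern tooling generates these automatically via `AutoGenerateBindingRedirects`. The version numbers here are illustrative:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Redirect every older assembly version to the one actually deployed. -->
        <assemblyIdentity name="Newtonsoft.Json"
                          publicKeyToken="30ad4fe6b2a6aeed"
                          culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-13.0.0.0" newVersion="13.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```

Note this is the opposite of side-by-side loading: a redirect forces unification onto a single version, which is exactly why the two mechanisms can't be mixed for the same assembly.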
The feature request affects a lot of things😁
I agree with everything you wrote, but what you wrote doesn't cover any practical considerations about how to actually use any of those features, which is what I understand this feature request to be about. I tried to list a few of the things I expect the feature spec to cover before I would consider it ready for consideration. Normally, when a feature can be wholly implemented by one team, I'd say a feature spec only needs to discuss the customer-facing parts, and the team is responsible for the implementation details themselves. However, in this case there's a lot of impact across several teams' components. If I wanted to be sneaky, I could also argue that since MSBuild doesn't support scoped variables (private/internal/public), basically everything in MSBuild is customer facing for customers who write MSBuild scripts. Even if we considered all the MSBuild stuff "implementation details", these issues still need to be sorted out so that each team can work on their respective parts.

In case my previous message made it sound like these things are not technically possible, I'm sorry; that was not my intention. I was trying to communicate that creating a new project from the new project templates and using "easy to use" features does not get you a solution. It's only achievable today by people who are experts in MSBuild (to copy the relevant dlls to subdirectories while still making them available to the compiler), and they also need to know a significant amount about the .NET runtime to configure it (either through code, or app.config for .NET Framework projects) to make assembly probing work. These are all things I think this specific feature proposal needs to define in order for it to be considered for implementation.
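On .NET Framework, the app.config half of that expert-level workaround is the `<probing>` element, which tells the runtime to search additional subdirectories; the directory names below are illustrative:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Probe version-specific subdirectories (illustrative names) in
           addition to the application base directory. -->
      <probing privatePath="sdk-v1;sdk-v3" />
    </assemblyBinding>
  </runtime>
</configuration>
```

On .NET Core there is no app.config probing; the equivalent has to be done in code, for example with `AssemblyLoadContext` or an `AppDomain.AssemblyResolve`-style handler.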
What I wanted to say is that a lot of code in VS assumes that every MSBuild item has a unique identity, and has been working that way for multiple years, maybe even multiple decades. Much of that code will break if this assumption stops being true. It would be much easier to implement if we could make this feature work on the command line only, but I believe many people (customers & people working on .NET and VS) would not consider that acceptable.
You can give feedback about that feature in its design spec. The reason I mentioned shading is that its primary goal is to "solve" the problem where two packages in a graph both have a dependency on a third package (a diamond dependency), but depend on different, incompatible versions of it. In your comment you mentioned you're aware that customers sometimes get into these scenarios, even if package authors consider them "unsupported". Maven implemented package shading years ago, so it's a tested solution to this problem. Package shading isn't exactly what this feature request is about: a project wanting to reference multiple versions of a single package and call APIs on those different versions. However, if package shading were already implemented, the scenario this issue describes could be mitigated by an additional level of indirection. Each incompatible API SDK could have its own project/package with a unique assembly name (so no need for subdirectories and probing), each in a unique namespace (so no need for aliases), and each shading its dependencies (so no runtime failures, because there is only a single version of the SDK API dll).
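That indirection could be sketched as one wrapper project per incompatible SDK major version; all names in this fragment are illustrative:

```xml
<!-- Api.Sdk.V1.Wrapper.csproj (illustrative): a unique AssemblyName avoids
     file-name collisions and probing, and a unique RootNamespace avoids
     extern aliases. A sibling Api.Sdk.V3.Wrapper project would reference
     Version="3.0.0" instead. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <AssemblyName>Api.Sdk.V1.Wrapper</AssemblyName>
    <RootNamespace>Api.Sdk.V1</RootNamespace>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Api.Sdk" Version="1.0.0" />
  </ItemGroup>
</Project>
```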
That doesn't solve the diamond dependency problem that shading aims to solve. I deleted a bunch of other stuff I wrote, because this issue isn't the right place to discuss package shading, but the feedback we've gotten is that the downsides of package shading are worth the reduced engineering cost of dealing with the problem when it happens. In fact, it was the team dealing with the high engineering costs of resolving conflicts that proposed Maven's package shading to us, since they use it in their Java components to reduce engineering effort, despite the trade-offs. There's no perfect solution, and different people will have different preferences about which trade-offs are acceptable to unblock scenarios.
Hard to disagree that these points are all technically possible, but the methods to achieve them are far from trivial. I definitely agree a switch in the track very quickly goes from the casual happy path to the dangerous and complex trail of doom. 😆 At one point or another, I've hit a few of these issues myself or I wouldn't even bother commenting. Having this support would be useful, but I can certainly see how it would be deprioritized; the ROI doesn't seem to be there. That's not my call, and the community voice should be what drives it. I 💯 sympathize that if you do hit one of these pain points, it is infuriating to deal with. Fortunately, I'd reason that it's fairly uncommon. At some point, it should be expected that the level of knowledge and skill does have to rise for some capabilities. You clearly have some deep understanding of MSBuild. I'm always a bit surprised how little people are familiar with it after 18 years (I mean, Java people usually know Ant).

It's ultimately up to the NuGet team (weighted by community demand) to decide if this feature would be lit up. As devil's advocate, the resistance to supporting it is understanding the difference between when side-by-side loading will occur versus binding redirection. As I recall, once you end up down one of those paths, there's no switching to the other. This can lead to unexpected behavior at runtime. That doesn't invalidate the usefulness of the feature. Unfortunately, there will inevitably be developers who think they need it but don't, and/or don't truly understand what is happening under the hood. This leads to a spike in filed bugs, even when it's the expected behavior. If a user can do it, they will. We should be careful when giving people a loaded weapon. "That feature doesn't work the way that you think it does." has happened all too often. 😄 If there is a safe, straightforward way to enable this feature without violating POLA, I'm all about it. 👍🏽
I'm the guy from #6693. I don't need nice PackageReference syntax, package shading, binding redirects, assembly rewriting, resolving diamond dependencies, and whatnot. I want the consumer of my package to be able to reference a dependency of any version, and my sole package to be able to use it, no matter how incompatible the dependency's versions are. (3 key points in italics.) Choosing the version of the dependency is the consumer's worry. Being able to use it is mine. I want to "relax" binary compatibility requirements by implementing a proxy using extern aliases. That means fewer version conflicts, not more. If I understood correctly, PackageReferences with the same Include are problematic. I don't mind using some combination of PackageReferences and References just to make my projects compile. What I don't like is that the solution I came to in #6693 feels like a very fragile set of hacks upon hacks upon hacks. Does a clean solution to the simpler version of the OP's problem exist? As in, without multiple versions of an assembly at runtime, without a very specific PackageReference syntax, etc.? And if it doesn't exist, can it be implemented without breaking 20 years of MSBuild assumptions?
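For context, the extern alias mechanism referenced in #6693 is standard C#; the fragile part is getting two copies of the dll referenced in the first place. A sketch using raw `Reference` items, where the hint paths and type names are illustrative:

```xml
<ItemGroup>
  <!-- Two copies of the same assembly, each under a distinct compiler alias.
       Getting both dlls on disk (and loadable at runtime) is the hard part. -->
  <Reference Include="Api.Sdk.V1">
    <HintPath>libs\v1\Api.Sdk.dll</HintPath>
    <Aliases>SdkV1</Aliases>
  </Reference>
  <Reference Include="Api.Sdk.V3">
    <HintPath>libs\v3\Api.Sdk.dll</HintPath>
    <Aliases>SdkV3</Aliases>
  </Reference>
</ItemGroup>
```

```csharp
// The proxy can then disambiguate otherwise-identical type names
// (Api.Sdk.Client is a hypothetical type for this sketch):
extern alias SdkV1;
extern alias SdkV3;

using V1Client = SdkV1::Api.Sdk.Client;
using V3Client = SdkV3::Api.Sdk.Client;
```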
NuGet Product(s) Affected
NuGet.exe, MSBuild.exe
Current Behavior
Scenario A. Direct NuGet Dependency
Scenario B. Transitive Dependency through direct Project Reference
Desired Behavior
In Scenario B, there currently appears to be an assumed precedence, as 3.0.0 is the version automatically chosen over 1.0.0 and 2.0.0 without instruction.
Scenario A. This would compile, and I would continue to expect the NU1504 warning from the compiler indicating that duplicate 'PackageReference' items were found (good attention to detail). Or a compilation error for certain scenarios that make it impossible, for reasons beyond my current depth of understanding. At that point the maintainers may have to make a minor patch to update a transitive dependency, or whatever is necessary to bring the two major versions of the Api.Sdk package into parity so the two assemblies can work alongside each other. This seems like an extreme edge case, though, and if it is to be ignored, then default to the behavior today: not possible.
In the case of the transitive dependencies for each version of the same Api.Sdk package, shouldn't they continue to remain isolated, like the existing behavior when you have two NuGet references to separate packages that each have a transitive dependency on the same package at a different version? Isn't this one area where NuGet really shines today?
Additional Context
@nkolev92 @commonsensesoftware Continuation of the renewed discussion on a closed issue #4989
@commonsensesoftware mentioned some other strategies for API version affinity here
#4989 (comment)
There are several ways to mitigate breaking changes in API to facilitate a smooth transition. I was just thinking of what I felt would make life easier and reduce the need for any creativity or extra overhead on the API management and maintenance side of the coin when you need to support multiple versions of an API side by side on the same instance and code base.
API deprecation is a common use case, and when it involves breaking changes, maintaining N-2 or more is doable; it just doesn't feel the cleanest. I just thought perhaps this would give developers another option and tactic for implementing and managing breaking changes in their API code bases. I'd say this approach is closest to option 2) for the consumer of the Sdk, but for the maintainer of the API it is closer to option 1) without having to clone all the breaking-change classes or reference them in a unique way to maintain a version alongside, since a NuGet package's previous version could provide that already. Or duplicate and maintain additional infrastructure. Just another option.
I would like to know where I could learn more about option 3) and a "meta package".
Tear it apart :)