All packages in repo get version bump for every commit, even irrelevant ones #160
Comments
Good question. Yes, theoretically we could probably have a git height walk that is scoped to the path under the version.json file. So far I haven't done it, for this reason: files outside that path on your file system can still influence the packages. For example, if you have a Directory.Build.props file at the top level of your repo that spans your individual version.json files, and you change that top-level file, then all your individual packages should be rev'd. Yet there's no way a scoped height counter would capture that. You mention you don't know whether to update your project dependencies given the current system. Can you elaborate on that? Project references don't include version information, so what is there to update? I have several repos that build multiple packages, and whenever we have a build we want to release, we just publish all the packages together. Yes, some of them may not actually have changes, but I've never had a customer complain yet.
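To make the point above concrete, here is a minimal sketch of a repo-root Directory.Build.props (the properties shown are illustrative, not from the original discussion). MSBuild imports this file automatically into every project beneath it, so a commit that touches only this file logically changes every package, yet a height counter scoped to each version.json folder would never see it:

```xml
<!-- Directory.Build.props at the repo root: automatically imported by
     every project below it, so editing this file effectively changes
     all packages in the repo at once. -->
<Project>
  <PropertyGroup>
    <LangVersion>latest</LangVersion>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
</Project>
```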
Ah, yeah, that does make sense with regards to changes above the project affecting the project... Sorry, I mean that when consuming the updated NuGet packages in a different project (and repo), we now have updates for all of our packages even when only one package has actually been updated (based on code changes). It would be nice if there was some way of knowing when something actually changed.
It varies. Sometimes I'll do what you say, by way of defining an MSBuild property with the version number in it and using that as an MSBuild macro everywhere. But as I actually don't like referencing the "latest" version of a package, but rather the oldest one that meets my requirements, this isn't usually an issue. I'll only update a package reference when I need a fix that the update includes. In case you're interested, I avoid always using the 'latest' packages because that puts undue constraints on folks that use my packages. For instance, if my package references Newtonsoft.Json 10, then all my users have to use that too. But if I use Newtonsoft.Json 6, then my own users can use any version 6 or above, giving them more flexibility.
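The "oldest version that meets my requirements" policy above corresponds to NuGet's default minimum-version semantics for a plain version number; the package and version below are just the example from the comment:

```xml
<ItemGroup>
  <!-- A bare version number in NuGet means "this version or newer",
       so consumers of this package remain free to use Newtonsoft.Json
       6.x or any later version rather than being pinned to the latest. -->
  <PackageReference Include="Newtonsoft.Json" Version="6.0.0" />
</ItemGroup>
```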
@AArnott I've found a new case where having it based on project/per folder would be helpful. We're currently using a Service Fabric VSTS Task to update our Service Fabric manifest versions. What this does is compare the Code + Config packages with the previous build; if they differ, it will mark that package as having changed, update the version, and then update the places that reference that service with the new version. Can you think of a good way to improve this?
@AArnott any thoughts on this? It would be nice to have an option to do the git height on a per-project (per version.json) basis.
I suppose we may be able to come up with an algorithm that only counts commits where the git tree changed at or below the version.json file, specifically by passing a predicate into the commit walker.
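For what it's worth, git itself can already do a path-scoped commit count, which is essentially the scoped-height idea under discussion (with the caveat raised earlier in the thread that it misses changes outside the path, such as a repo-root Directory.Build.props). The directory name here is hypothetical:

```shell
# Count only the commits in HEAD's history that touched files under
# src/PackageA -- an approximation of a per-project git height.
git rev-list --count HEAD -- src/PackageA
```

This is a useful way to sanity-check what a scoped counter would report for a given folder, even though it cannot capture cross-cutting files.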
I could perhaps pick up this work in a few weeks. Or if you make this change with at least a starter set of unit tests to cover it, that could accelerate getting it completed.
We'd really appreciate this too. The way dependencies work in NuGet means that if a project version gets updated it will depend on the latest version of all the other projects in the solution, and as we version the projects independently that gets quite painful! Of course we'd like to be able to change that NuGet behaviour too, by being able to specify a version range in ProjectReferences, but that's another issue...
@AArnott was there an update on this? We're looking at implementing this to avoid our version-bumping hell, but we're generally opposed to this common bumping for all (60 - yikes) projects in our repositories. Moving the projects to 60+ repos is not an option :), so it'd be very awesome to have this feature implemented :)
Is there a quick way to get this out, for example creating a Directory.Build.targets file to override the default NBGV NuGet behavior and make NBGV generate new version info only if MSBuild decides that a project should be rebuilt?
I'm afraid not. The height calculation is deep inside the heart of nb.gv, and it just counts commits, only caring about the version.json file content at each step.
I'm trying out the NBGV CLI tool. I found (as you stated) that if I don't install the NuGet package, then MSBuild will not trigger a rebuild after a new commit, because there's no NuGet package to generate the version info for MSBuild to consider as a code change. So, what if I were to include something in an MSBuild pre-build event so that it runs the NBGV CLI tool instead?
@ShamsulAmry this seems off topic. Can you create a new issue for this question?
I'm not sure how it is off topic. Let's say I have a few projects: ProjectA, ProjectB and ProjectC. All of these projects have the NBGV NuGet package installed to generate their version info. One day, I make a commit for a change in ProjectA. I expected that MSBuild would compile and increase the version for ProjectA only, because only that project changed. However, after a commit, the NBGV NuGet package installed into those 3 projects will regenerate version info for all 3 projects, which in turn will cause a version bump for ProjectB and ProjectC. That is irrelevant because there's no actual code change to ProjectB and ProjectC. Doing what I proposed was just an idea to work around the version bumping of ProjectB and ProjectC triggered by the NuGet package, while waiting for a better solution to be built into the NuGet package.

p/s: I think my question falls under the discussion of the same issue, that is, to not get a version bump for unchanged projects. No?
Ah, I see. I didn't understand that you were trying to substitute the CLI tool for the package as a workaround for this. I don't think you'll be pleased with the result. While it is apparently true that the NuGet package influences the incremental build and invalidates the package so that it is rebuilt on a commit, avoiding that by using the CLI only skips the package build on the very same machine that just barely built the previous one. If you're on a different machine, or do a rebuild (or sneeze), MSBuild will build the package anyway and you'll get the new version.
Yes, I am aware that building on a separate machine will result in a new version. I configured another machine as a build server and made it build incrementally. I know it will result in inconsistent versioning (which I understand is what NBGV is trying to solve), but for my use case, it's okay to have inconsistent versioning.

I'm using the version increment to auto-trigger releases of new components from the build server. The releases then need to be manually approved before being deployed to our clients. With NBGV bumping the version for all components, I need to manually delete the unchanged component releases before handing them off to the other team for deployment approval. So I'm currently not so pleased that I need to manually delete the unchanged component releases in the otherwise already fully automated release creation process.

If the clients report any issue, I find it very easy to just refer to the commit hash being stamped almost everywhere by NBGV to troubleshoot accordingly. In this case, I don't really refer to the version numbers; just the commit hash will do. So the increasing version number helps with the auto release creation, and the commit hash is already helpful enough for troubleshooting.

The workaround should work (not yet tried), so that should solve the process of manually deleting unchanged component releases. And in the rare case that we need to trigger a rebuild which will bump all the versions, well, since it's rare, it's all good enough. For now, I will try to (mis)use the NBGV CLI to achieve my goal of build-to-release automation. Really nice! Looking forward to future versions. 👍
CALL FOR VOTE

Consider a situation of projects A and B, where A references B. B is changed. What should happen to A's version?

Please thumbs up this particular message if and only if A should experience a version bump. Please thumbs down this particular message if you do not expect A to experience a version bump.

IMO, if B is a packaged project of its own, I can see an argument where A needn't be incremented (although it means installing A will give you an older version of B). But if B is not a packaged project, it seems paramount that A's version is incremented, or else the change to B will never ship.
I'd say if you want A to change you'd need to specify the dependency change somewhere in A too (like bumping the lower version bound for B in a nuspec file), so only B should be bumped automatically.
@bencyoung Suppose you want to ship A again with a higher version, so that it packages up B. What would you change in A to induce the version bump, if it isn't implicit by changing B? If these are MSBuild projects with a P2P ref, nothing naturally needs to change in A, so whatever you change to bump the version feels unnatural to me.
I think it would be awesome if a change in B would increment the version of A if A was dependent on it. However, that seems like it may be pretty hard to do?
Are you talking about solution-level Project Dependencies? Or are you just talking loosely about all project references? Project references are certainly a Good Thing. Solution-level project dependencies... not so much.
It's not a model we use (packaging multiple assemblies in one NuGet). If the API is changed then we'd have to change A anyway to use the new interface. If the API doesn't change then A works with the old and new version of B, so it doesn't matter that the version doesn't change.

If we did pack together then I'd prefer to use something like Fody.Costura to pack into one assembly with one version, and we'd have to do something different anyway. Currently we manually version each assembly, so anything is an improvement.
I might just keep it simple then, and give the ability to only count commits in which a given directory actually changes (including any descendant). Supporting multiple top-level directories, or automatically figuring out those directories via P2P, could maybe come later given demand and a reasonable design to deliver on it.
We've been having a hard think about how to handle this better. As people say, it's non-trivial.

Current set-up
Naive transitive set-up, which filters the git height based on transitive refs at HEAD

Same steps 0, 1, 2. But when we get to step 3:
The solution to this is obviously to filter the history for relevant paths using the reference graph at each commit, instead of calculating it once for the current HEAD. This unfortunately means that you need to calculate the graph at every commit, which sounds pretty expensive to me. You can obviously cache some of this stuff, but it won't help you in clean continuous-build scenarios, etc.

There's also the issue of what to do if you move the library within the repo. Probably require people to fix it manually with their version.json files.

Despite the issues here, we're sufficiently keen for this to work that we're going to have a proper look at this and see how slow/feasible it might be. It'll need to work properly for...
Great points, @herebebeasties. Thanks for investigating this. We get a little negative feedback about the perf hit NB.GV has on the build already. Establishing the transitive closure of project references requires a design-time build, and yes, it is pretty expensive. Far too expensive to add (many!) thousands of iterations of it for the build of (just one) project. Some aggressive and clever caching might help, but with Azure Pipelines, GitHub Actions v2, etc. all offering cloud builds on clean agents, I'm afraid the caching won't help in those builds. Local dev builds would likely benefit from it.

If we can come up with a good, solid solution that doesn't compromise reliability (at all) or perf (too much), I'd entertain a PR. But to save you some potentially wasted effort, I'd love to review your plans and learnings as they come before you prepare a final PR.

One suggestion: if you can come up with a reasonable solution that impacts perf within acceptable bounds for some folks, and can offer a switch in version.json that will disable the feature and not incur any perf penalty, then that sounds like a good PR to take.
I think it should be opt-in, not opt-out, as otherwise it's bound to result in version numbers going backwards when added. Newer SDK-style projects probably save us a bit, in that at least they don't thrash their file contents so much.

Totally agree on the caching aspect, as I mentioned - it needs to be fast enough, full stop. To that end, I don't think doing a design-time build is the way to go. I'm aware of all the subtleties of not doing that (Directory.Build.props files that might themselves have imports driven by variables or whatever other clever people are using), but there are subtleties the other way around, too. For example, what if a project reference is conditionally included based on target framework? You could end up with a different (divergent) version number per framework, which doesn't sound like what you'd want. Better to just treat all transitive imports/refs as if they're unconditional, IMO.

My team has some experience here, so we'll look into it all a bit further and let you know.
I've opened #399 to solve this problem in a 'low tech' way. I've spent some time trying to figure out how this could be done by traversing commits and calculating the dependency graph at each commit, but it's very difficult and expensive. It is a relatively cheap operation (on the order of milliseconds) to calculate the dependency graph at HEAD using Microsoft.Build.Evaluation.Project. The difficulty comes when you try to do this for other commits - Microsoft.Build.Evaluation has no support for faking the file system, so you would have to check out the repo at each commit, which would obviously be extremely costly. An alternative would be to write your own project file evaluator to determine the dependency graph, but this would be quite an undertaking.
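For readers arriving later: the 'low tech' approach in #399 is what eventually shipped as the pathFilters setting in version.json, which scopes the git height to commits touching the listed paths. A sketch of such a file (the specific paths are illustrative assumptions, not from the PR):

```json
{
  "version": "1.0",
  "pathFilters": [
    ".",
    "/shared"
  ]
}
```

Paths are relative to the directory containing the version.json file; a leading "/" makes a path relative to the repo root, which is how a project can opt in to being rev'd by changes to shared files elsewhere in the repo.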
@saul I think what you have is a reasonable middle ground. And it actually is very similar to something we use internally at Microsoft to build Visual Studio and track versions of setup packages. It's certainly not perfect, but as we've discussed here, 100% correct results would be very expensive. In the interest of "not letting perfection be the enemy of good enough", I think #399 shows promise. I look forward to its development. Other thoughts?
Just to give you an idea of how I'm hoping to use the feature in #399: in my team at work we have a library for automatically bumping the version in version.json when we see the API surface has changed (bump major for removal, bump minor for additions). I plan to extend it such that we bump the minor version if the dependency graph got smaller, and also keep the path filters up to date. This would mean that GitVersioning would remain as hands-free as possible for us (even if it means occasionally bumping the minor version when it's not needed). At least then we wouldn't have to manually keep the path filters in sync.
Neat, @saul. Maybe you can share your tool or make it open source so folks can tweak it if necessary.
What if there was a build step that incremented the version when there are changes to the specific project, using git to compare the project?
@loligans Where would it increment the version? Are you talking about a source change? The whole point of this project is to automatically bump the version per-commit without a source change involved.
We have multiple projects inside a repo for all of our shared code that we use to build Nuget packages from (one Nuget package per project). I just started using this project for versioning all of our packages and really liking it so far - thanks for the awesome work!
The one thing I'd like to improve is how to get the Git Height working well for multiple projects.
Right now, if we make changes to one project (say 6 commits in one folder for the one project) then it will result in the version number being bumped up by 6 for all of our projects.
This makes it hard to know when to update our project dependencies as it's unclear whether it's actually a new package or not.
Of course, this makes sense based on how the version is calculated, but I was wondering if there is a way to have this be based on the folder/project without having to move to separate git repos for each project. We currently have a version.json file per project, so it would be neat if the git height could be calculated from the location of the version.json file down, or something like that.

Any suggestions on how to do this better?