FR: Support multiple scala language versions in one workspace #1290
@virusdave do you have any ideas how multiple-version support should look from the user side? I think having Scala targets that build for different versions is involved, but not hard to achieve (fix compiler bootstrapping and toolchains, and parametrize the rules); what is hard is how we configure and provide deps for different Scala versions. I see a few challenges related to deps: […]
@liucijus We want to work on supporting multiple Scala versions in one build. Inspired by how Python handles this, I was looking into […]. It has some utility macros which allow you to specify which versions you want to use and which version is the default, and it generates a repository for each version. The rules are wrapped in a transition that changes the Python version. Users can import […]

Initial approach

This approach would at least enable us to create separate, unrelated targets that build for different versions of Scala in one build. In this basic approach, I believe users could specify each library with the required version in […]

Assuming shared targets that can be built with multiple versions of Scala, this could be achieved with macros. For example […]

As for IDE support, I think that during import we could force using only targets with the default Scala version, and that should be good enough to work with existing tooling. I think that having separate targets with different Scala versions that are not cross-built could be supported in the IDE. Targets with shared sources could be messy, but I think we could somehow handle it.

More complex approach

A more involved idea would be to only do a target transition on _binary and _test targets; _library targets would then be transitioned through the deps field of the _binary/_test targets. The library targets would use […]

Backward compatibility

Also, a single Scala version in the repo would look as usual, using rules without transitions, so the simple use case would not break.

I'd be glad to hear what you think about these ideas and whether they make sense. If it is doable, could you provide some initial guidance on what needs to be fixed to enable the Scala rules to use transitions? We already know that the SCALA_VERSION-dependent code has to be refactored to read this information from the toolchain, but, for example, I don't know what the issue with compiler bootstrapping is.
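The transition idea described above might be sketched in Starlark roughly as follows. This is a hypothetical illustration, not actual rules_scala code: the flag label `//scala/config:version`, the `scala_version` attribute, and the version strings are all assumed names.

```starlark
# Assumed: //scala/config/BUILD declares a string_flag:
#   string_flag(name = "version", build_setting_default = "2.12.14")

def _scala_version_transition_impl(settings, attr):
    # Re-route the build of this target (and its deps) to the Scala
    # version requested on the rule (hypothetical `scala_version` attr).
    return {"//scala/config:version": attr.scala_version}

scala_version_transition = transition(
    implementation = _scala_version_transition_impl,
    inputs = [],
    outputs = ["//scala/config:version"],
)
```

A macro could then expand a single library definition into one target per requested version, each built under its own configuration.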
@lukaszwawrzyk thanks for looking into this!
I prepared a PoC that demonstrates a possible way to register multiple Scala toolchains.
The toolchains are then restricted to a specific Scala version using: […]
Please let me know what you think about this. If this approach is approved, in the next step I can modify […]
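For illustration, restricting each toolchain to one Scala version could look roughly like this; the version strings, target names, and flag label are assumptions rather than the actual PoC code, though `target_settings` and the rules_scala toolchain type are real Bazel concepts.

```starlark
# One config_setting per supported Scala version, keyed on an assumed
# //scala/config:version string flag.
config_setting(
    name = "scala_2_12",
    flag_values = {"//scala/config:version": "2.12.14"},
)

config_setting(
    name = "scala_2_13",
    flag_values = {"//scala/config:version": "2.13.6"},
)

# Toolchain resolution then picks the toolchain whose target_settings
# match the current value of the flag.
toolchain(
    name = "scala_2_12_toolchain",
    target_settings = [":scala_2_12"],
    toolchain = ":scala_2_12_toolchain_impl",
    toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type",
)

toolchain(
    name = "scala_2_13_toolchain",
    target_settings = [":scala_2_13"],
    toolchain = ":scala_2_13_toolchain_impl",
    toolchain_type = "@io_bazel_rules_scala//scala:toolchain_type",
)
```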
This is exciting. A couple of comments: […]
@mateuszkuta256 looks good and definitely worth trying out. @simuons what do you think?
I really hope we can do this without breaking anyone. Migrating monorepos isn't a fun task. Putting that tax on everyone to help what I believe to be a rare case isn't great. Anything that requires a change to existing rules (which is to say O(N) changes in your repo) I hope we can rule out. Making some O(1) change to the workspace setup I think isn't too burdensome, but if there is a speed bump to upgrade we are encouraging users to fork or stay on old versions.
This prepares us for changes in `rules_scala` that will allow customizing the Scala version for each target. In order to achieve that, we can no longer resolve the Scala SDK globally and need to use per-target info. We still use the mechanics of discovering the Scala SDK based on the compiler classpath. In particular, we look into one of the dep providers of the Scala toolchain, namely `scala_compile_classpath` – a canonical place to put all compile-related jars. This change is backward-compatible, as the mentioned data is already available. It is also forward-compatible with the anticipated cross-build feature of `rules_scala` (see: bazelbuild/rules_scala#1290). The aspect will produce additional data – namely a few compiler classpath jars per Scala target. For modules using different Scala versions (either directly or as dependencies), a mix of libraries and/or SDKs for different Scala versions will be returned.
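A hedged sketch of how an aspect might collect the per-target compiler classpath from the toolchain's `scala_compile_classpath` dep provider. The provider name comes from the description above, but the access path (`toolchain.dep_providers.scala_compile_classpath`), the `ScalaSdkInfo` provider, and the aspect name are assumptions, not the real implementation:

```starlark
# Hypothetical provider to surface the per-target SDK jars.
ScalaSdkInfo = provider(fields = ["compile_jars"])

def _scala_sdk_aspect_impl(target, ctx):
    # Assumed field layout: the Scala toolchain resolved for *this target's*
    # configuration exposes its dep providers, including the
    # scala_compile_classpath targets (compiler, library, reflect jars).
    toolchain = ctx.toolchains["@io_bazel_rules_scala//scala:toolchain_type"]
    jars = []
    for dep in toolchain.dep_providers.scala_compile_classpath:
        jars.extend(dep[JavaInfo].compile_jars.to_list())
    return [ScalaSdkInfo(compile_jars = jars)]

scala_sdk_aspect = aspect(
    implementation = _scala_sdk_aspect_impl,
    toolchains = ["@io_bazel_rules_scala//scala:toolchain_type"],
)
```

Because the toolchain is resolved per configuration, two targets cross-built for different Scala versions would report different jar sets, which is exactly the per-target info the change needs.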
This prepares us for changes in `rules_scala` that will allow customizing the Scala version for each target. In order to achieve that, we can no longer resolve the Scala SDK globally as the maximal version used; we need to use per-target info. The Scala SDK will still be discovered based on the compiler classpath, but now we will look into one of the dep providers of the Scala toolchain, namely `scala_compile_classpath` – a canonical place to put all compile-related jars. This change is backward-compatible, as the mentioned data is already available. It is also forward-compatible with the anticipated cross-build feature of `rules_scala` (see: bazelbuild/rules_scala#1290). The aspect will produce additional data – namely a few compiler classpath jars per Scala target.
This prepares us for changes in `rules_scala` that will allow customizing the Scala version for each target. In order to achieve that, we can no longer resolve the Scala SDK globally as the maximal version used; we need to use per-target info. The Scala SDK will still be discovered based on the compiler classpath, but now we will look into an implicit dependency of each target, the `_scalac`. This change is backward-compatible, as the mentioned data is already available. It is also forward-compatible with the anticipated cross-build feature of `rules_scala` (see: bazelbuild/rules_scala#1290). The aspect will produce additional data – namely a few compiler classpath jars per Scala target. Perhaps we could revisit this change in the future to normalize the data back. Currently that is not possible, as the data about alternative configurations (here: of scalac) is discarded in the server.
* Pre-change refactoring
* Resolve Scala SDK independently for each target

  This prepares us for changes in `rules_scala` that will allow customizing the Scala version for each target. In order to achieve that, we can no longer resolve the Scala SDK globally as the maximal version used; we need to use per-target info. The Scala SDK will still be discovered based on the compiler classpath, but now we will look into an implicit dependency of each target, the `_scalac`. This change is backward-compatible, as the mentioned data is already available. It is also forward-compatible with the anticipated cross-build feature of `rules_scala` (see: bazelbuild/rules_scala#1290). The aspect will produce additional data – namely a few compiler classpath jars per Scala target. Perhaps we could revisit this change in the future to normalize the data back. Currently that is not possible, as the data about alternative configurations (here: of scalac) is discarded in the server.
* Cleanup after
* Output "external-deps-resolve" again
* Code style fix
* Remove `_scalac` again from compile deps
We would like to be able to use this functionality. Do we have an idea of when this will be available in rules_scala mainline?
An update: I have no more commits planned for this feature. Most common cases should be covered now.
Not a new desire or request, of course. See this prior issue as an example.
We'd really like to be able to build different targets with different versions of Scala to make language version upgrades sane & realistic in large codebases. Atomically changing everything at once gets really difficult as codebases scale, especially with large external dependencies like Spark.