Google feedback on TS 4.1-beta #41369
Comments
Thank you for writing this up! 😀

+1, we really appreciate this level of feedback. Glad to hear it was a mostly positive experience!
I've tested
I also separately tried using the
The table below gives more accurate numbers acquired through a slightly different method.
Thanks for these write-ups! They have been a helpful reference for us because we upgrade a large monorepo at Airbnb as well. Out of curiosity, in addition to looking at the new errors/issues introduced by the upgrade, do you also do any performance measurements? For example, we've noticed slowdowns from the 3.7 to 3.8 and 3.9 to 4.0 upgrades. Have you noticed that as well, or have things been roughly the same given your build system setup? Also, curious if you've developed some automated way of detecting and isolating performance regressions across versions at Google. Thanks for the tips!
Do you mean slowdowns in compiler speed, or in application performance?

We have not noticed a substantial change in TS compiler performance (e.g. 50th/90th percentile compile times). Admittedly our signal is pretty noisy at the 90th percentile, but the 50th looks flat from 3.9 to 4.0. We do not have good broad data on IDE performance, but we would expect it to match compiler performance.

We have some command line tooling that spins up a compiler and type checks & auto-completes some code through the language service API. We use that to reproduce performance changes, but it's not particularly automated (the debugging process roughly follows the movie script of The Men Who Stare at Goats).
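For readers unfamiliar with that API, here is a minimal, hypothetical sketch of such a harness (the file contents, host wiring, and timing calls are invented for illustration; this is not Google's actual tooling):

```ts
import * as ts from "typescript";

// One in-memory file ending in `greeting.` so we can request
// completions at the trailing dot.
const fileName = "main.ts";
const fileText = `const greeting = "hello"; greeting.`;

const host: ts.LanguageServiceHost = {
  getCompilationSettings: () => ({ strict: true }),
  getScriptFileNames: () => [fileName],
  getScriptVersion: () => "1",
  getScriptSnapshot: (name) =>
    name === fileName ? ts.ScriptSnapshot.fromString(fileText) : undefined,
  getCurrentDirectory: () => process.cwd(),
  getDefaultLibFileName: (options) => ts.getDefaultLibFilePath(options),
  fileExists: ts.sys.fileExists,
  readFile: ts.sys.readFile,
};

const service = ts.createLanguageService(host);

// Time a type check of the file...
console.time("semantic diagnostics");
const diagnostics = service.getSemanticDiagnostics(fileName);
console.timeEnd("semantic diagnostics");

// ...and an auto-completion request at the position after the dot.
console.time("completions");
const completions = service.getCompletionsAtPosition(fileName, fileText.length, {});
console.timeEnd("completions");

console.log(`${diagnostics.length} diagnostics, ${completions?.entries.length ?? 0} completions`);
```

Running the same script against two `typescript` versions (e.g. `npm install typescript@3.9` vs `typescript@4.0`) gives a crude way to compare language service latency on identical input.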
Was meaning time to compilation completion (which, as you mentioned, does have an impact on editor performance). In CI, we build 8M lines of TypeScript. We cache our TS output directory between CI builds so that we can incrementally build from it on subsequent builds. We invalidate our cache whenever node_modules changes (#38648). So, when we do a TS upgrade we do a cold build. I was looking at these cold build timings to get a sense of the raw performance of the upgraded version, and we also want to generally keep these full rebuilds under a 15 min budget for developers who make changes to the package.json. We've seen these cold build times trending upwards over the last few upgrades.

In these tests, we use the same snapshot of the code, vary the TS version, and run a few trials to get an average. We did some timings when upgrading from 3.7 to 3.9. Since then, we've added/migrated more TS code, parallelized our build, and sharded it across five machines. The maximum time of one of the shards represents how long a developer needs to wait until the overall build is finished. On average, we found the longest shard to take 1m 14s more time.

Here is a replica of our repo setup: https://github.com/brieb/ts-monorepo

Q: Have you noticed a change in the full rebuild times across upgrades?

Q: Out of curiosity, how long do your 50th and 90th percentile builds take? How many lines of TS are there?

Q: Do you use a local build cache on developer machines so that you can make use of the cache locally as well?

Side note: We are in the process of switching to Bazel for our TypeScript builds, which should hopefully make things way faster. And perhaps there's a way we could encode our package dependencies explicitly so that we don't have to blow the entire cache away when something in package.json / node_modules changes. The better cache utilization should help us easily clear our 15 min budget.

Q: Have you thought about doing CI measurements on this at all? Or are the compilation times a reasonable enough proxy, as editor performance is faster when compilation is faster?
😂
I think our setup is unfortunately not really comparable. I hope what I write below might still be useful, or at least curious, to you.

We're using Bazel (or its Google-internal cousin). All builds are incremental for us; we don't have full rebuilds. We compile compilation units of (only) hundreds of lines at a time, with all their dependencies coming in as (cached) `.d.ts` files.

For cluster-run builds of individual targets whose compilation ended up contributing to the critical path of a build, P50 is at <1 second, P99 is at ~15 seconds, and we haven't seen them move substantially across recent upgrades. But again, those numbers are in no way comparable to what the typical open source toolchain would see.

Re local build cache: developer workstations pull compiled files from the compilation cluster (with precise hashing/cache invalidation, so they work across builds). It's essentially the same as caching the TS output directory.

Re CI measurements: we don't distinguish between CI and regular builds, so we just measure builds. The only minor difference is that we track incremental interactive builds separately, as those tend to run on local machines instead of the cluster (depending on the size of the build, capacity, etc.).
@DanielRosenwasser Do you have any thoughts on what might have caused the TS 3.9 -> TS 4.0 slowdown we observed here: #41369 (comment)? Any advice on changes we can make to our setup to avoid the perf regressions introduced in 4.0, so we could proceed with the upgrade? We haven't merged the upgrade yet, because we wanted to avoid slowing down our CI jobs.

Also, do you have any advice for catching this more proactively, in a way that would be beneficial to the TS team? Perhaps we could set up an automated build process that pulls down the latest TS snapshots to pinpoint when slowdowns happen, or come up with a representative example monorepo to add to your profiling suite?

EDIT: Also, realized that my comments are beyond the original purpose of this thread, so I'll file a separate issue for this. Filed here: #41570
No worries! I'll continue the convo on the other thread though.
I think this would be extremely valuable as long as it's not a burden on you. If we could pinpoint regressions with that much granularity, it would probably go a long way for everyone. Let us know if there's anything we can do to help there!
FTR, Google's upgrade to TS 4.1 is complete. |
This GitHub issue contains feedback on the TS 4.1-beta release from the team
that is responsible for keeping Google's internal software working with the
latest version of TypeScript.
Executive summary
- We do not expect to have significant difficulty in upgrading Google to TS 4.1.
- We found one change to the emitted JS that broke our `tsickle` tool's output in some cases. We will fix `tsickle` to match the new behavior.
- Some changes to our TypeScript code are required to make it compile with TS 4.1.
  - Most of the new errors reported by TS 4.1-beta are clearly related to announced changes.
  - The remaining new TS error messages are few and appear to report legitimate source code errors that were previously missed by TS 4.0.
- Detail sections below explain the changes to our code we expect to make to unblock the upgrade.
Impact summary

| Change | Announced |
| --- | --- |
| CommonJS exports initialization emit change | no |
| `resolve()` parameter required | yes |
| miscellaneous TS errors | no |
| `abstract async` disallowed | yes |
| `any`/`unknown` propagated in Falsy positions | yes |

The Announced column indicates whether we were able to connect the observed change with a section in the TS 4.1-beta announcement.
The following sections give more detailed explanations of the changes listed
above.
CommonJS exports initialization emit change

Our `tsickle` tool depends on CommonJS mode output from `tsc`, which it further converts into the `goog.module()` format used by `closure-compiler`.

In TS 3.9, `tsc` started emitting a statement to initialize exported symbols to `void 0` before the statements that give those symbols their intended values. This seems to be for reasons having to do with the CommonJS spec. (See #38552 (comment).)

During our upgrade to TS 3.9, we updated `tsickle` to recognize and drop this statement, because it violates `closure-compiler`'s input expectations.

In TS 4.1 the emit changes so that the initialization is broken into one or more statements containing no more than 50 variable assignments. This resulted in compile-time errors from `closure-compiler`, because `tsickle` only recognized and removed the first statement.

We will update `tsickle` to recognize and drop all of these statements. We have confirmed with a draft change that doing so fixes all cases.
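Schematically, the emit difference looks like this (hand-written, abbreviated CommonJS output; symbol names and counts are invented for illustration):

```js
// TS 3.9 through 4.0 initialized every exported symbol in a single
// statement, which tsickle learned to recognize and drop:
exports.symC = exports.symB = exports.symA = void 0;

// TS 4.1 splits that initialization into one or more statements of at
// most 50 assignments each; a module with 120 exports therefore yields
// three such statements. tsickle only removed the first one, and
// closure-compiler rejected the statements that were passed through.

// The actual exported values are assigned afterwards, as before:
exports.symA = 1;
exports.symB = 2;
exports.symC = 3;
```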
`resolve()` parameter required

This change was an announced part of TS 4.1-beta.

We believe fixing the type errors that arise from this change will improve the quality of our code.

Our team will likely unblock the TS upgrade by inserting a cast to `any` for each of these errors, along with a `TODO` comment containing the error message and a request to the team owning the code to investigate and make a more correct fix when they can.
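For illustration, here is a minimal sketch of the error and of the interim unblocking edit described above (names and `TODO` wording are invented):

```ts
// TS 4.1: `resolve` must be called with an argument unless the
// promise's type argument includes `void`.
const broken = new Promise<number>((resolve) => {
  // @ts-expect-error -- Expected 1 arguments, but got 0.
  resolve();
});

// Interim unblocking edit: cast to `any` plus a TODO for the owners.
// TODO(owning-team): "Expected 1 arguments, but got 0." -- determine
// what value this promise should actually resolve with.
const unblocked = new Promise<number>((resolve) => {
  (resolve as any)();
});

// A proper fix resolves with a value, or declares Promise<void>.
const fixed = new Promise<void>((resolve) => resolve());
```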
miscellaneous TS errors

There are several different error messages in this category, but all appear to be cases where TS 4.1-beta correctly identifies coding mistakes that were previously missed.

Exactly how we resolve these error messages will be determined later, but we will likely resolve most of them by inserting a cast to `any` for the error, along with a `TODO` comment containing the error message and a request to the team owning the code to investigate and make a more correct fix when they can.
`abstract async` disallowed

This change was an announced part of TS 4.1-beta.

This seems like a reasonable change to the language to us.

We expect to fix these errors by dropping the `async` and changing the current return type from `T` to `Promise<T>`.
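For example (a sketch with invented names):

```ts
abstract class DataSource {
  // Rejected by TS 4.1: the 'async' modifier cannot be used on an
  // abstract method:
  //   abstract async fetch(): Promise<string>;

  // Fix: drop `async` (and, where the declared return type was `T`,
  // change it to `Promise<T>`):
  abstract fetch(): Promise<string>;
}

class HttpDataSource extends DataSource {
  // Concrete implementations may still be `async`.
  async fetch(): Promise<string> {
    return "payload";
  }
}
```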
`any`/`unknown` propagated in Falsy positions

This change was an announced part of TS 4.1-beta.

We believe fixing the type errors that arise from this change improves the quality of our code.

Our team will likely fix these errors by inserting `!!` before the non-boolean expression that causes the error.
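For example (a sketch; the helper function is invented):

```ts
function isNonEmptyString(value: unknown): boolean {
  // Under TS 4.1 the `&&` chain below has type `unknown`, because
  // `unknown` in the first operand now propagates to the falsy result:
  //   return value && typeof value === "string" && value.length > 0;
  //   // error: Type 'unknown' is not assignable to type 'boolean'.

  // The fix described above: prefix the offending operand with `!!`.
  return !!value && typeof value === "string" && value.length > 0;
}
```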