Stop clearing all caches with `yarn kbn clean` #67718
Pinging @elastic/kibana-operations (Team:Operations)

@spalger I like the idea but why not just use

Mostly because I don't think that most people think of

I like it, updating the name to reflect the current state should help a lot.

@spalger I wasn't seeing it from that point of view. For me

How about

@spalger don't know how but you've got it again! 😄
Recent changes to packaging and the use of the Bazel remote cache mean this is no longer an issue we should continue to track.
The issues yarn has with updating `node_modules` to support massive dependency tree changes within our workspace (usually caused by switching between minor branches of Kibana) have led to a ton of people adopting the strategy of running `yarn kbn clean` before `yarn kbn bootstrap`. I've spent the last several months trying to combat this, but best I can tell it is still pervasive.

I think we should adapt to the state of things by moving the full cache-clearing powers of `yarn kbn clean` to a new command, maybe `yarn kbn destroy-caches`, and update `yarn kbn clean` to only delete the `node_modules/.yarn-integrity` file and then run `yarn kbn bootstrap` automatically.

We could also support a `--force` flag that recursively deletes `node_modules` directories before running `yarn kbn bootstrap`, but unless you ran `yarn kbn destroy-caches` you wouldn't delete the caches which actually make things better and shouldn't be removed constantly.

Additionally, the `--no-cache` flag could be passed to `yarn kbn clean` (just like we support with `yarn kbn bootstrap` today) to bypass the bootstrap cache when it is automatically run.
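The proposed flow can be sketched as a small shell script. This is purely illustrative, not the actual `kbn` implementation: the workspace layout is simulated with a temp directory, the `--force`/`--no-cache` flags are plain variables, and the automatic bootstrap step is stubbed with an `echo`.

```shell
#!/usr/bin/env sh
# Illustrative sketch of the proposed `yarn kbn clean` behavior.
set -e

# Stand-in for a Kibana checkout with a stale yarn integrity file.
workspace="$(mktemp -d)"
mkdir -p "$workspace/node_modules"
touch "$workspace/node_modules/.yarn-integrity"

force=false     # would come from a --force flag
no_cache=false  # would come from a --no-cache flag

if [ "$force" = true ]; then
  # --force: recursively delete node_modules before bootstrapping
  rm -rf "$workspace/node_modules"
else
  # default: remove only the integrity file, keeping the other caches
  # that actually make things better and shouldn't be removed constantly
  rm -f "$workspace/node_modules/.yarn-integrity"
fi

# bootstrap would then run automatically; --no-cache would bypass its cache
bootstrap_args=""
if [ "$no_cache" = true ]; then
  bootstrap_args="--no-cache"
fi
echo "would run: yarn kbn bootstrap $bootstrap_args"
```

The point of the default branch is that deleting `.yarn-integrity` is enough to force yarn to re-verify the install on the next bootstrap, without throwing away caches that are expensive to rebuild.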