Terraform plan wants to revert changes made by the autoscaler #462
Comments
One more thing: if we run `terraform apply` against the corresponding plan, this error appears:
Ok, there are two separate issues:
Ok, we've decided to opt in to autoscaling and rewrote the module so that autoscaling is not configurable and `topology.size` is ignored. The diff is more or less reasonable. However, the error still prevents us from applying changes.
@andrewnazarov, regarding this issue:
It looks like the problem is the one mentioned by @tobio in #401 (comment) - if you remove ... Also, the error message should be fixed in provider 0.4.1.
@andrewnazarov, I think you can try to use autoscaling with a configurable topology using something similar to the snippet below. The important part is to specify topology elements for all tiers and list them in alphabetical order (this is a limitation of the implementation - please see #336 (comment) for more details).
The snippet sets the initial size for ...
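For reference, a minimal sketch of what such a configuration could look like. The name, region, version, template id and size values below are placeholders (not taken from this issue), and only three tiers are shown; the remaining tiers that set `size` or have a non-zero default `max_size` would be added the same way, keeping alphabetical order of their `id`s (cold, coordinating, frozen, hot_content, master, ml, warm):

```hcl
resource "ec_deployment" "example" {
  name                   = "autoscaled-deployment" # placeholder
  region                 = "gcp-europe-west1"      # placeholder
  version                = "7.17.0"                # placeholder
  deployment_template_id = "gcp-io-optimized"      # placeholder

  elasticsearch {
    autoscale = "true"

    # Topology blocks listed in alphabetical order of their `id` fields.
    topology {
      id = "cold" # listed only to satisfy the ordering requirement
    }

    topology {
      id   = "hot_content"
      size = "8g" # initial size; the autoscaler may grow it up to max_size

      autoscaling {
        max_size = "64g" # placeholder
      }
    }

    topology {
      id = "warm" # listed only to satisfy the ordering requirement
    }
  }
}
```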
IMO this should be part of the autoscaling example within the provider docs.
If `autoscale` is set, all topology elements that
- either set `size` in the plan, or
- have a non-zero default `max_size` (read from the deployment template's `autoscaling_max` value)

have to be listed in alphabetical order of their `id` fields, even if their blocks don't specify any other fields besides `id`. Recommend use of the `lifecycle`'s `ignore_changes` meta-argument to ignore potential changes that can be made by the autoscaler. Addresses #462.
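A minimal sketch of that `lifecycle` setting on an `ec_deployment` resource (the resource arguments are placeholders, and the `elasticsearch[0].topology` path assumes a Terraform version that accepts indexed attribute paths in `ignore_changes`):

```hcl
resource "ec_deployment" "example" {
  name                   = "autoscaled-deployment" # placeholder
  region                 = "gcp-europe-west1"      # placeholder
  version                = "7.17.0"                # placeholder
  deployment_template_id = "gcp-io-optimized"      # placeholder

  elasticsearch {
    autoscale = "true"
    # topology blocks (alphabetically ordered by id) as in the snippet above
  }

  lifecycle {
    # Stop `terraform plan` from trying to revert size changes made by the autoscaler.
    # Note: this also hides intentional topology edits, so scope the path as narrowly as practical.
    ignore_changes = [
      elasticsearch[0].topology,
    ]
  }
}
```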
@andrewnazarov, I'm going to close the issue. Please feel free to reopen it or open a new one if the proposed workaround doesn't work for you.
## Readiness Checklist

## Expected Behavior
The possibility to apply changes without reverting the changes made by the autoscaler.

## Current Behavior
For our deployments we enabled autoscaling. At first it was just set via `autoscale = true`, and EC performed the procedure successfully. However, after some time we noticed that `terraform plan` sees a difference and wants to revert the changes made by the autoscaler. We tried to define the `autoscaling {}` block in the hope that it would stop those changes from being detected, but it didn't help. Note the "16g" -> "8g" change:

If we want to have configurable autoscaling, it's not obvious how to achieve it with `ignore_changes`.

## Terraform definition
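A minimal sketch of the kind of definition described above, with `autoscale` enabled and an explicit `autoscaling {}` block on the hot tier; the name, region, version, template id and sizes are placeholders, not the values from this issue:

```hcl
resource "ec_deployment" "main" {
  name                   = "our-deployment"   # placeholder
  region                 = "gcp-europe-west1" # placeholder
  version                = "7.17.0"           # placeholder
  deployment_template_id = "gcp-io-optimized" # placeholder

  elasticsearch {
    autoscale = "true"

    topology {
      id   = "hot_content"
      size = "8g" # the plan wants to set this back even after the autoscaler grew it (e.g. to "16g")

      autoscaling {
        max_size = "64g" # placeholder
      }
    }
  }
}
```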
## Steps to Reproduce

## Context
We would like autoscaling to be configurable, with the possibility to keep applying Terraform code without reverting the changes made by the autoscaler.

## Possible Solution

## Your Environment