What and Why

This is partly inspired by: https://aws.amazon.com/builders-library/automating-safe-hands-off-deployments

But also working towards a solution to Issue #250.

The proposal is that we add the concept of a "Wave" to each stage. In the default ADF pipeline, a wave would be the stage actions that share the same run order, and so happen in parallel.

Each "Wave" would run in sequential order, allowing more control over what a deployment looks like.

Whilst it doesn't solve #250 straight away (the sequential/parallel limit is still 50 actions per stage), combining this with #285 would allow for the creation of new pipeline types that could leverage integrations with CodePipeline.
How?
I'm proposing that we add a new property, `max_wave_size`, to the target schema definition. If not set, it would default to the current parallel-action limit of 50. (This ensures that there are no backwards-compatibility issues and that existing pipelines aren't changed unintentionally.)

This max wave size would then be set on the `TargetStructure` class and exposed as a property, `wave_size`.
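A minimal sketch of how this could look on `TargetStructure` (everything beyond the `max_wave_size` schema key and the `wave_size` property named above is a hypothetical illustration, not the actual ADF code):

```python
class TargetStructure:
    # Current CodePipeline parallel-action limit per stage; used as the
    # default so existing pipelines keep their behaviour unchanged.
    DEFAULT_WAVE_SIZE = 50

    def __init__(self, target_definition):
        # target_definition is the parsed target entry from the
        # deployment map; max_wave_size is the new, optional key.
        self._wave_size = target_definition.get(
            "max_wave_size", self.DEFAULT_WAVE_SIZE
        )
        self.account_list = []

    @property
    def wave_size(self):
        return self._wave_size
```

Leaving the default at 50 means a deployment map that never mentions `max_wave_size` produces exactly the pipeline it does today.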
Inside `generate_pipelines_input`, we would batch up `target_structure.account_list` based on the wave size and set this into the pipeline template dictionary.
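The batching step could be sketched as follows (the function name is hypothetical; only the chunking logic is the point):

```python
def generate_waves(account_list, wave_size):
    """Batch a flat account list into sequential waves of at most wave_size.

    Each inner list becomes one "Wave": its members run in parallel,
    and the waves themselves run one after another.
    """
    return [
        account_list[i:i + wave_size]
        for i in range(0, len(account_list), wave_size)
    ]
```

With the default wave size of 50 and fewer than 50 accounts, this yields a single wave, i.e. today's fully parallel behaviour.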
Inside the `cdk_stack` used for generating the pipeline, we would continue to process targets as normal, with one exception: instead of `targets` being a list of targets, it becomes a list of lists of targets, e.g. `[[12345678910], [12345678911]]` rather than the current `[12345678910, 12345678911]`.
The same assumption can be made that any properties defined for the first target in a wave also apply to the rest of the wave.
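As a rough illustration of processing the nested target lists (the dictionary shapes and the run-order mapping here are assumptions for the sketch, not the actual `cdk_stack` code):

```python
def process_targets(waves):
    """Flatten waves into stage actions.

    Actions within a wave share a run order (parallel); each wave gets
    the next run order (sequential). Properties defined on the first
    target in a wave are applied to every target in that wave.
    """
    actions = []
    for run_order, wave in enumerate(waves, start=1):
        wave_props = wave[0].get("properties", {})
        for target in wave:
            actions.append({
                "account_id": target["id"],
                "run_order": run_order,
                **wave_props,
            })
    return actions
```

This mirrors how CodePipeline already treats actions: equal `run_order` values run in parallel, increasing values run sequentially.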
TL;DR
A significant change to how we format pipeline targets from the deployment map definition: it will not impact the current default pipeline, but will allow for more customisable custom pipeline types going forward.
Thank you for your patience. I am happy to inform you that this feature has been released as part of v3.2.0 just now.
I'm hereby closing this issue. Please open a new issue if you are experiencing any issues related to this feature.