Load env variables in the given secretName in Spark dependencies #651
Conversation
LGTM. Perhaps @objectiser would like to take a look as well.
```diff
@@ -79,6 +79,17 @@ func TestCreate(t *testing.T) {
 	assert.NotNil(t, CreateSparkDependencies(&v1.Jaeger{Spec: v1.JaegerSpec{Storage: v1.JaegerStorageSpec{Type: "elasticsearch"}}}))
 }
 
+func TestSparkDependenciesSecretSecrets(t *testing.T) {
```
nit: `TestSparkDependenciesSecrets`? If you change it, can you also update the test name in the line below?
Good suggestion. It is changed now.
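For readers following the thread, here is a rough sketch of what the renamed test could look like, assuming the same package and imports as the test file in the diff. The `SecretName` field and the exact assertion path (a `batch/v1beta1` CronJob) are assumptions based on the operator's API, not a verbatim copy of the PR:

```go
func TestSparkDependenciesSecrets(t *testing.T) {
	jaeger := &v1.Jaeger{Spec: v1.JaegerSpec{Storage: v1.JaegerStorageSpec{
		Type:       "elasticsearch",
		SecretName: "test-secret", // assumed field backing storage.secretName
	}}}

	cjob := CreateSparkDependencies(jaeger)

	// Expect the secret to be attached to the job's container via EnvFrom.
	containers := cjob.Spec.JobTemplate.Spec.Template.Spec.Containers
	assert.Len(t, containers, 1)
	assert.Equal(t, "test-secret", containers[0].EnvFrom[0].SecretRef.LocalObjectReference.Name)
}
```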
Before merging, please rebase this PR; perhaps we'll get a clean CI run then. A couple of axes in the first run failed due to a networking issue, and the second is failing due to actions/checkout#23.
Signed-off-by: Gorka Maiztegi <[email protected]>
Everything seems fine now. Thank you for your comments, guys.
Thank you for your contribution!
I store the Elasticsearch credentials in a secret, which I then pass via the `storage.secretName` parameter. These credentials are correctly loaded by the collector, the query service, and the Elasticsearch index cleaner cronjob, but not by the Spark dependencies job.

I'm a novice Go programmer, so feel free to edit this piece of code as you see fit.
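To make the intended behavior concrete: in Kubernetes terms, the fix amounts to referencing the secret with an `EnvFrom` source on the Spark dependencies container, the same mechanism the other components use. A minimal sketch follows; the package and the helper name `getSecretEnvFrom` are assumptions for illustration, not necessarily what the PR's code uses:

```go
package cronjob // assumed package, matching the test file in the diff

import corev1 "k8s.io/api/core/v1"

// getSecretEnvFrom exposes every key of the named secret as an environment
// variable on a container. It returns nil when no secret is configured.
func getSecretEnvFrom(secretName string) []corev1.EnvFromSource {
	if secretName == "" {
		return nil
	}
	return []corev1.EnvFromSource{{
		SecretRef: &corev1.SecretEnvSource{
			LocalObjectReference: corev1.LocalObjectReference{Name: secretName},
		},
	}}
}
```

The Spark dependencies container would then set `EnvFrom: getSecretEnvFrom(jaeger.Spec.Storage.SecretName)` alongside its existing `Env` entries, so the Elasticsearch credentials become visible to the job.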