Spark dependencies as child of 'dependencies' #156
Follow-up from #140: the spark dependencies spec type is currently a child of the storage spec type. While it might be OK if we only ever have Spark to process the dependency graph, it would probably be better to have a 'dependency spec' type with a 'spark dependency spec' child. This way, we could easily add other dependency processors in the future.

cc @pavolloffay, @objectiser
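For illustration, a rough sketch of the two shapes under discussion; the field names below are assumptions, not the actual CRD fields:

```yaml
# Current shape (field names illustrative): the Spark dependencies
# spec hangs off the storage spec.
spec:
  storage:
    type: cassandra
    sparkDependencies:
      enabled: true
---
# Proposed shape: a generic 'dependencies' spec with per-processor
# children, leaving room for processors other than Spark later on.
spec:
  dependencies:
    spark:
      enabled: true
```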
Comments

+1. I don't like to expose ... However, I don't see the benefit of scoping it in a separate object under dependencies like ... The one I like most is ...
@jpkrohling @pavolloffay Sorry, I didn't pick this up on the original review, but the spark PR has elaborated all of the (additional) storage properties in the CRD model, which is different from how the storage options work. One suggestion: we could simply have something like the sketch below (or dependencies at the top level). Another possibility would be to just have the options included under the ...
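A rough sketch of what that suggestion might look like, assuming a free-form options map styled after the storage options (all keys illustrative):

```yaml
# Dependencies configured through an options map under storage,
# mirroring how storage options are passed today.
spec:
  storage:
    type: cassandra
    dependencies:
      options:
        enabled: true
        schedule: "55 23 * * *"
---
# The same map with dependencies moved to the top level instead.
spec:
  dependencies:
    options:
      enabled: true
      schedule: "55 23 * * *"
```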
The downside of using properties is documentation and perhaps type validation: with properties we will not be able to auto-generate docs.

Seems confusing to me.
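One way to see the validation concern: with typed spec fields the schema can reject unknown or mistyped fields, while a free-form map accepts anything (keys illustrative):

```yaml
spec:
  storage:
    dependencies:
      options:
        enabld: true   # typo: with a plain map, nothing catches this
```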
Agreed, but that is the approach currently being taken in the operator, so from a consistency standpoint it would be better.
I do not see that consistency. I think it's used only for the jaeger services (query, collector), where the props are passed as flags; these services support flags, which makes it easy to take a map and pass it along as flags. For instance, see the sketch below this comment.

Edit: Using a properties map makes sense if the binary supports flags (command-line parameters). This gives the flexibility to bump a component version (presumably with new flags) without changing the operator. Mapping env vars to a properties map loses docs and type safety.
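A sketch of that pattern as it applies to the collector, assuming the usual options-to-flags translation (flag name illustrative):

```yaml
spec:
  collector:
    options:
      # each entry is rendered as a command-line flag on the
      # jaeger-collector binary, e.g. --collector.num-workers=50
      collector.num-workers: 50
```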
Yes, I forgot about the Cassandra schema job.
Closing: we don't want to introduce a breaking change to the CR, and there are no requests to create a new dependencies implementation.