IBM FHIR Server currently has an "export to parquet" feature that is disabled by default.
The feature has seen limited usage, and the current implementation brings in much of Apache Spark, which leads to the following issues when enabled:
* greatly increases the size of the ibm-fhir-server image
* increases the attack surface area (e.g. the recent log4j kerfuffle)
Unless we can come up with a much better implementation (no small feat), I think we should remove this feature and replace it with documentation that clearly shows how to convert the exported NDJSON to Parquet using Spark (maybe a blog post?)
* Deprecate and remove export to parquet feature #3156
Signed-off-by: Paul Bastide <[email protected]>
* Update Parquet to Deprecated
Signed-off-by: Paul Bastide <[email protected]>
> I think we should remove this feature and replace it with documentation that clearly shows how to convert the exported NDJSON to Parquet using Spark (maybe a blog post?)
This piece of it is not done yet; I think we should do it either as part of this issue or in a new related task.
I split the doc task into its own issue. The Spark and Stocator dependencies have been removed, and references to "export to parquet" have been removed from the documentation.