I am currently implementing a Confluent Kafka stream on a managed cluster called Hopsworks. It provides an integrated Schema Registry that requires an API key for access. When using the Confluent Kafka client directly, without Spark or Abris, the setup looks like this:
Now I want to use Abris to make this work with Spark Structured Streaming, but I can't figure out how to add the required API key to the Abris config. Could you help me out here?
Kind Regards
There is no single Abris config option that would set the header for you, but there is a config option for supplying a custom registry client implementation, which makes this possible.
Confluent's CachedSchemaRegistryClient accepts httpHeaders as a constructor parameter.
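For reference, a minimal sketch of constructing that client with headers. The URL, capacity, and the exact header name your registry expects are placeholders/assumptions here, not facts from this thread:

```scala
import scala.jdk.CollectionConverters._
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient

// Hypothetical values: replace the URL and the header name with whatever
// your registry (e.g. Hopsworks) actually requires for API-key auth.
val client = new CachedSchemaRegistryClient(
  List("https://registry.example.com").asJava,        // base URLs
  32,                                                 // identity map capacity
  Map.empty[String, Any].asJava,                      // extra client configs
  Map("Authorization" -> "ApiKey <your-key>").asJava  // httpHeaders
)
```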
In Abris, look at SchemaManagerFactory: there is code that looks up REGISTRY_CLIENT_CLASS and instantiates it.
You will have to implement the AbrisRegistryClient trait and set your class name as the value of the REGISTRY_CLIENT_CLASS key in the config.
The implementation should be very similar to ConfluentRegistryClient, just with the headers added in the constructor.
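Putting the steps above together, here is one possible sketch. It assumes ConfluentRegistryClient exposes a constructor taking a SchemaRegistryClient, that Abris instantiates the custom class reflectively via a no-arg constructor, and that the config key string matches the REGISTRY_CLIENT_CLASS constant in SchemaManagerFactory — verify all three against your Abris version:

```scala
import scala.jdk.CollectionConverters._
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient
import za.co.absa.abris.avro.registry.ConfluentRegistryClient

// Hypothetical custom client: delegates to ConfluentRegistryClient, but
// builds the underlying CachedSchemaRegistryClient with httpHeaders so the
// API key is sent on every registry request. The no-arg constructor is
// needed if Abris instantiates the class reflectively (assumption).
class ApiKeyRegistryClient extends ConfluentRegistryClient(
  new CachedSchemaRegistryClient(
    List("https://registry.example.com").asJava,        // base URLs
    32,                                                 // identity map capacity
    Map.empty[String, Any].asJava,                      // extra client configs
    Map("Authorization" -> "ApiKey <your-key>").asJava  // httpHeaders
  )
)

// Then point Abris at the custom class. The key name below is an
// assumption — check the REGISTRY_CLIENT_CLASS constant for the real value.
val registryConfig = Map(
  "schema.registry.url"        -> "https://registry.example.com",
  "abris.registryClient.class" -> "com.example.ApiKeyRegistryClient"
)
```

This map would then be passed wherever you currently hand Abris its schema registry config (e.g. the usingSchemaRegistry step of AbrisConfig).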