Trouble getting this to work with BigQuery #89
Comments
Hello,
Thank you for reporting the issue. I will debug your sample and come back with a conclusion.
Regards,
That's a tough one. I've tried combinations of different serializers and schemas with no success. The data just seems not to match the schema, which I find hard to believe. Do you know if any codec was used during the serialization? Or, even better, could you provide the code snippet used to serialize this data?
Unfortunately the serialization happens in Google's cloud; this is how the data comes from the BigQuery Storage API, so I have no transparency into what's happening on the other end.
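For context, the bytes in question come out of a Storage API read stream roughly like the sketch below (a minimal illustration assuming the Google.Cloud.BigQuery.Storage.V1 client; the project, dataset, and table names are placeholders):

```csharp
using Google.Cloud.BigQuery.Storage.V1;

// Minimal sketch: pull Avro-encoded rows from the BigQuery Storage Read API.
// The project, dataset, and table identifiers below are placeholders.
var client = BigQueryReadClient.Create();

ReadSession session = client.CreateReadSession(
    parent: "projects/my-project",
    readSession: new ReadSession
    {
        Table = "projects/my-project/datasets/my_dataset/tables/meter_readings",
        DataFormat = DataFormat.Avro
    },
    maxStreamCount: 1);

// The writer schema BigQuery uses for every block of rows in this session.
string writerSchema = session.AvroSchema.Schema;

var stream = client.ReadRows(new ReadRowsRequest
{
    ReadStream = session.Streams[0].Name,
    Offset = 0
});

await foreach (ReadRowsResponse response in stream.GetResponseStream())
{
    // Raw Avro binary for a block of rows, without an Avro file header.
    byte[] serializedBinaryRows = response.AvroRows.SerializedBinaryRows.ToByteArray();
    // These are the bytes that get handed to AvroConvert in this issue.
}
```

The schema in `session.AvroSchema.Schema` describes every `SerializedBinaryRows` block returned for that session, which is why it is the natural candidate for headless deserialization.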
Hello,
There is no other way left for me to try deserializing the file with AvroConvert, so I've tried other libraries as well - still with no success. To proceed with this issue I need a readable version of the data (C# or JSON) and its representation in BigQuery Avro. Only then would I be able to debug the deserialization part. My feeling is that there is some additional part embedded in serializedBinaryRows (it could be an array start, a sync interval, or something similar) that would have to be excluded from deserialization.
Regards,
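One way to probe that hypothesis, sketched below under the assumption that DeserializeHeadless accepts the raw bytes plus a schema string (the exact AvroConvert overload may differ), is to deserialize a single response block against the writer schema from the read session and compare the result with the RowCount the Storage API reports:

```csharp
using System;
using System.Collections.Generic;
using SolTechnology.Avro; // AvroConvert namespace (assumed from the NuGet package)

// Hypothesis check (sketch): deserialize one block headlessly against the writer
// schema reported by the read session, then compare the row counts.
// `response` and `writerSchema` come from the ReadRows sketch above;
// MeterReadingEntry is the row class named in this issue, and the
// DeserializeHeadless overload (bytes + schema string) is an assumption.
byte[] block = response.AvroRows.SerializedBinaryRows.ToByteArray();

List<MeterReadingEntry> rows =
    AvroConvert.DeserializeHeadless<List<MeterReadingEntry>>(block, writerSchema);

Console.WriteLine($"Storage API reports {response.RowCount} rows, AvroConvert returned {rows.Count}.");
// A mismatch (e.g. 0 rows back) would point at framing or encoding inside the block
// that the headless reader does not expect.
```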
Closed due to inactivity.
What is the bug?
Unable to deserialize data from BigQuery Storage API
If I try to read the data using the OpenDeserializer method as such, I get the following exception.

Trying with the method mentioned in #69 as such does not throw any exceptions, but the AvroConvert.DeserializeHeadless<List<TResult>> method always returns an empty list of 0 rows, even though I can see that avroEntry.AvroRows.SerializedBinaryRows has a significant amount of content.

I have tried both of these methods with my handwritten row class (with and without the DataContractAttribute), as well as with the example class generated at the AvroConvert site using the BigQuery schema, and gotten the same results.

My row class:
Output of AvroConvert.GenerateSchema(typeof(MeterReadingEntry)):

Schema provided by BigQuery:
Here is some small sample data:
What is the expected behavior?
Rows are deserialized successfully
Thanks for any help
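Since both attempts behave as if the data does not match the schema, one cheap sanity check is to diff the field names of the class-derived schema against the schema BigQuery reports. The sketch below uses only the GenerateSchema call already mentioned in this issue plus System.Text.Json; the bigQuerySchemaJson source path is hypothetical and stands in for the schema string provided by BigQuery:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text.Json;
using SolTechnology.Avro; // AvroConvert namespace (assumed from the NuGet package)

// Sketch: compare field names between the schema AvroConvert derives from the row
// class and the schema BigQuery provides. Assumes both schemas have a record at the
// root, as BigQuery's Avro schemas do.
static string[] FieldNames(string schemaJson)
{
    using var doc = JsonDocument.Parse(schemaJson);
    return doc.RootElement.GetProperty("fields")
        .EnumerateArray()
        .Select(f => f.GetProperty("name").GetString()!)
        .ToArray();
}

// Hypothetical path holding the schema string BigQuery reports (not reproduced here).
string bigQuerySchemaJson = File.ReadAllText("bigquery_schema.json");
string classSchemaJson = AvroConvert.GenerateSchema(typeof(MeterReadingEntry));

string[] classFields = FieldNames(classSchemaJson);
string[] bigQueryFields = FieldNames(bigQuerySchemaJson);

Console.WriteLine("Only in class schema:    " + string.Join(", ", classFields.Except(bigQueryFields)));
Console.WriteLine("Only in BigQuery schema: " + string.Join(", ", bigQueryFields.Except(classFields)));
```

Any field that shows up on only one side (or differs in casing) would explain rows silently failing to map even though the binary payload is non-empty.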