
[Bug]: BigQueryIO Write does not work using Storage_write_api on DataflowRunner with unbounded data and autosharding #30058

Open
5 of 16 tasks
Polber opened this issue Jan 20, 2024 · 0 comments


What happened?

BigQueryIO has a bug where a Write cannot be performed using .withAutoSharding() or .withNumShards(n) where n = 0 on DataflowRunner with unbounded data.

Setting .withNumShards(n) where n > 1, which disables autosharding, works around the issue.

This issue affects ALL streaming Python pipelines run on Dataflow with the cross-language transform BigQueryStorageWriteApiSchemaTransform.
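A minimal Java sketch of the affected configuration and the workaround, assuming an unbounded PCollection of TableRows (e.g. from Pub/Sub). The table spec and stream count are illustrative placeholders; the issue text writes the shard setting as .withNumShards(n), while the corresponding method on BigQueryIO.Write in recent Beam releases is withNumStorageWriteApiStreams, so the exact name may vary by version:

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

public class StorageWriteExample {
  // rows: an unbounded PCollection<TableRow>, e.g. read from Pub/Sub.
  static void write(PCollection<TableRow> rows) {
    rows.apply("WriteToBigQuery",
        BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.my_table") // placeholder table spec
            .withCreateDisposition(
                BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)
            .withTriggeringFrequency(Duration.standardSeconds(10))
            // Triggers the bug on DataflowRunner with unbounded input:
            .withAutoSharding());
            // Workaround: replace .withAutoSharding() with a fixed
            // stream count > 1, e.g. .withNumStorageWriteApiStreams(4)
  }
}
```

Pinning an explicit stream count sidesteps the autosharding code path entirely, which matches the workaround adopted for the cross-language transform in the commit referenced below.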

Issue Priority

Priority: 2 (default / most bugs should be filed as P2)

Issue Components

  • Component: Python SDK
  • Component: Java SDK
  • Component: Go SDK
  • Component: Typescript SDK
  • Component: IO connector
  • Component: Beam YAML
  • Component: Beam examples
  • Component: Beam playground
  • Component: Beam katas
  • Component: Website
  • Component: Spark Runner
  • Component: Flink Runner
  • Component: Samza Runner
  • Component: Twister2 Runner
  • Component: Hazelcast Jet Runner
  • Component: Google Cloud Dataflow Runner
robertwb added a commit that referenced this issue Jan 24, 2024
…QueryStorageWrite

Require numStreams for unbounded BigQueryStorageWriteApiSchemaTransform xlang transform.

This is to work around #30058