sql: CREATE TABLE AS in an explicit txn does not flush batched rows #82951

Merged
1 commit merged into cockroachdb:master on Jun 21, 2022

Conversation

@fqazi (Collaborator) commented Jun 15, 2022

Fixes: #81554

Previously, when running CREATE TABLE AS in an explicit transaction, all rows were inserted in a single batch. This could lead to errors when that batch ran into the KV command size limit. To address this, this patch intermittently flushes batches to KV so that a single gigantic batch isn't used.

Release note (bug fix): Previously, CREATE TABLE AS in an explicit transaction would fail with an error if the size of the source table exceeded the raft command size limit. This is now fixed.
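
A minimal, self-contained Go sketch of the size-threshold flushing idea described above, assuming a simplified row buffer; the names rowBatcher, maxBatchBytes, and sendBatch are hypothetical and do not reflect the actual CockroachDB code:

// flushsketch illustrates the batching idea in this PR: instead of
// accumulating every CTAS row into one KV batch and sending it at commit
// time, flush the batch whenever it grows past a size threshold.
package main

import "fmt"

// rowBatcher accumulates encoded rows and flushes them in bounded chunks.
type rowBatcher struct {
	pending       [][]byte // rows buffered since the last flush
	pendingBytes  int      // approximate size of the buffered rows
	maxBatchBytes int      // flush threshold, kept well below the raft command size limit
	flushes       int      // number of batches sent so far
}

// add buffers one encoded row, flushing first if the buffer would grow too large.
func (b *rowBatcher) add(row []byte) {
	if b.pendingBytes+len(row) > b.maxBatchBytes && len(b.pending) > 0 {
		b.flush()
	}
	b.pending = append(b.pending, row)
	b.pendingBytes += len(row)
}

// flush sends the buffered rows as one batch and resets the buffer.
func (b *rowBatcher) flush() {
	if len(b.pending) == 0 {
		return
	}
	sendBatch(b.pending) // stand-in for issuing the KV writes for one batch
	b.flushes++
	b.pending = nil
	b.pendingBytes = 0
}

// sendBatch is a placeholder for writing one batch to the KV layer.
func sendBatch(rows [][]byte) {}

func main() {
	// With a 1 KiB threshold, 100 rows of 256 bytes are sent as several
	// small batches instead of one ~25 KiB batch.
	b := &rowBatcher{maxBatchBytes: 1 << 10}
	row := make([]byte, 256)
	for i := 0; i < 100; i++ {
		b.add(row)
	}
	b.flush() // final flush at the end of the statement / transaction
	fmt.Println("batches sent:", b.flushes)
}

In practice the threshold would presumably be tied to (and kept well below) kv.raft.command.max_size rather than hard-coded as in this sketch.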

@fqazi fqazi requested a review from a team June 15, 2022 19:03
@cockroach-teamcity (Member)

This change is Reviewable

@postamar (Contributor) left a comment

LGTM, modulo a small request. I'm amazed that this bug hasn't been reported before.

# explicit transactions
statement ok
BEGIN;
CREATE TABLE source_tbl_huge AS SELECT 1::CHAR(256) FROM generate_series(1, 500000);
@postamar (Contributor) commented Jun 20, 2022

I was curious and tried this on master and got

ERROR: command is too large: 126737600 bytes (max: 67108864)

Perhaps I'm too paranoid but those two values are too close for comfort. I'd be more comfortable if we explicitly set the kv.raft.command.max_size setting for this test to a specific value (like the floor value of 4MB) and explained why we're doing so in the comment for this test. This way we're not relying on the current default value for this setting. The test itself is great otherwise.
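
For illustration, pinning the setting in the logic test with the 4MB floor suggested above could look roughly like the following; the exact value and placement in the test file are assumptions, not the committed change:

statement ok
SET CLUSTER SETTING kv.raft.command.max_size = '4MiB'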

@fqazi (Collaborator, Author) left a comment

@postamar TFTR!

Reviewable status: :shipit: complete! 0 of 0 LGTMs obtained (waiting on @postamar)


pkg/sql/logictest/testdata/logic_test/create_as line 390 at r1 (raw file):

Previously, postamar (Marius Posta) wrote… (the comment above, asking to pin kv.raft.command.max_size for this test)

Done.

@fqazi (Collaborator, Author) commented Jun 21, 2022

bors r+

@craig (bot) commented Jun 21, 2022

Build succeeded.

@craig craig bot merged commit 377b113 into cockroachdb:master Jun 21, 2022