When we INSERT data into a transactional table, the rowId counter is reset for each input page. As a result, any INSERT that processes more than one page of rows ends up with duplicate rowIds in the created file.
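A simplified model of the counter behavior (an illustrative sketch only, not the actual Trino writer code): if the rowId counter restarts at 0 for every input page instead of continuing across pages, rows in different pages end up sharing rowIds.

```python
# Toy model of rowId assignment -- not Trino code, just the counter behavior.
# Input rows arrive split into pages; the buggy writer resets per page.

def assign_row_ids_buggy(pages):
    """Reset the rowId counter at the start of every page (the bug)."""
    ids = []
    for page in pages:
        row_id = 0  # counter reset per page -> duplicates across pages
        for _row in page:
            ids.append(row_id)
            row_id += 1
    return ids

def assign_row_ids_fixed(pages):
    """Keep one counter across the whole write, so rowIds stay unique."""
    ids = []
    row_id = 0
    for page in pages:
        for _row in page:
            ids.append(row_id)
            row_id += 1
    return ids

pages = [["a", "b"], ["c", "d"]]     # two input pages of two rows each
print(assign_row_ids_buggy(pages))   # [0, 1, 0, 1] -- duplicate rowIds
print(assign_row_ids_fixed(pages))   # [0, 1, 2, 3] -- unique rowIds
```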
trino:default> CREATE TABLE lineitem WITH(transactional=true) AS select * from tpch.sf1.lineitem limit 1;
CREATE TABLE: 1 row
-- insert significant amount of rows
trino:default> INSERT INTO lineitem select * from tpch.sf1.lineitem;
INSERT: 6001215 rows
-- verify total number of rows
trino:default> select count(*) from lineitem;
_col0
---------
6001216
(1 row)
-- select single row for deletion
trino:default> select count(*) from lineitem where orderkey=5942403 and partkey=80946 and suppkey=947 and linenumber=4;
_col0
-------
1
(1 row)
-- delete the row
trino:default> delete from lineitem where orderkey=5942403 and partkey=80946 and suppkey=947 and linenumber=4;
DELETE: 1 row
-- verify row count after delete
trino:default> select count(*) from lineitem;
_col0
---------
5999480
(1 row)
-- SHOULD BE 6001215 (6001216 rows minus the single deleted row)
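The row-count mismatch follows from how ACID-style deletes identify rows: a delete records the key of the row to remove, so when several data rows share a key (here, a duplicated rowId), one delete record suppresses all of them. A toy model of that matching (field names are illustrative, not the real ORC ACID layout):

```python
# Toy model of ACID-style delete matching; the key fields are illustrative.
# Each data row carries a (write_id, bucket, row_id) key; a delete delta
# lists keys to suppress. Duplicate keys make one delete hit many rows.

rows = [
    {"key": (1, 0, 0), "value": "row-a"},
    {"key": (1, 0, 1), "value": "row-b"},
    {"key": (1, 0, 0), "value": "row-c"},  # duplicate rowId from the bug
]

deleted_keys = {(1, 0, 0)}  # the user intended to delete exactly one row

surviving = [r for r in rows if r["key"] not in deleted_keys]
print(len(surviving))  # 1 -- two rows vanished although one delete was issued
```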
Noticed by @electrum here: #8268 (comment)