TiFlash consumes a lot of memory when restarting after delete many rows #1885

Closed
lidezhu opened this issue May 11, 2021 · 0 comments · Fixed by #1997

Reproduction steps:

  1. set flash.compact_log_min_period to 60 (an arbitrary value that is not close to 0; the default value is 120);
  2. load tpch_10.lineitem into TiFlash;
  3. execute the SQL "delete from lineitem limit 300000" repeatedly until all rows are deleted;
  4. wait for some time and restart TiFlash;

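For reference, the setting in step 1 lives in the TiFlash configuration file; a hypothetical fragment is sketched below (the section name and exact file layout are assumptions and may differ between deployments):

```toml
# Hypothetical tiflash.toml fragment; the section layout is assumed.
[flash]
compact_log_min_period = 60   # default is 120
```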

If we set flash.compact_log_min_period to 0 (the default value is 120) and redo steps 2 - 4 above, the problem does not occur.

flash.compact_log_min_period controls the minimum interval at which TiFlash may flush in-memory region data to disk; any compact log raft command received within that period is ignored.

TiKV by default sends a compact log raft command every 10s for every updated region. So if the config value is too large, too much region data may be left in memory, and the applied index can lag far behind.

Moreover, once we have ignored a region's compact log raft command, if the region receives no further updates we never get another command to flush its data and advance the applied index. So when TiFlash restarts, there are many raft commands to re-apply, which costs a lot of memory.
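The interaction above can be illustrated with a toy simulation (plain Python; all numbers and names are hypothetical, not TiFlash code). Compact requests arrive every 10s but are ignored within the min period of the last flush, and they stop arriving once updates stop, so the unflushed tail that must be re-applied on restart grows with the configured period:

```python
# Toy model: one raft command is applied per second. TiKV sends a
# compact (flush) request every `compact_interval` seconds, but we
# ignore requests arriving within `compact_log_min_period` seconds
# of the last flush. After the last update, no more requests arrive,
# so commands applied after the last flush must be re-applied
# from the raft log (in memory) on restart.

def pending_on_restart(total_seconds, compact_log_min_period,
                       compact_interval=10, last_update=None):
    """Return how many commands are left unflushed when we stop."""
    last_flush = 0
    flushed_upto = 0  # applied index persisted to disk
    for t in range(1, total_seconds + 1):
        if last_update is not None and t > last_update:
            break  # no more updates -> no more compact requests either
        if t % compact_interval == 0:
            if t - last_flush >= compact_log_min_period:
                last_flush = t
                flushed_upto = t  # flush honoured; tail cleared
            # else: compact request ignored, nothing persisted
    return (last_update or total_seconds) - flushed_upto

# With the period at 0, every compact request is honoured: short tail.
print(pending_on_restart(300, 0, last_update=295))
# With a large period, the last flush happens well before the updates
# stop, so a much longer tail must be re-applied on restart.
print(pending_on_restart(300, 60, last_update=295))
```

This is only a sketch of the reported mechanism, but it shows why a large flash.compact_log_min_period leaves a long raft-log tail behind the persisted applied index.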

But we cannot simply set flash.compact_log_min_period to 0 because of this issue: https://docs.google.com/document/d/1eg7CDiVkXIgHpN9GdCA-puhi0UBCvs28H44Kj7-QjS0/edit#bookmark=id.avai2iw8w9d3
