
NextQueueFinishedEarlyException #69

Closed

laudney opened this issue Mar 21, 2018 · 29 comments

@laudney commented Mar 21, 2018

0.00% done, Block 511859/514562terminate called after throwing an instance of 'NextQueueFinishedEarlyException'
what(): Next queue finished early
Aborted

@Averylamp

I am also having a similar exception when building it myself. During the initial parsing, I get through a lot of the blocks, then get the following:

25.77% done, Block 368504/514585Segmentation fault (core dumped)

Then if I try to rerun it, I get:

100.0% done fetching block headers
Starting with chain of 368511 blocks
Removing 0 blocks
Adding 146083 blocks
0.00% done, Block 368512/514594terminate called after throwing an instance of 'NextQueueFinishedEarlyException'
  what():  Next queue finished early
Aborted (core dumped)

@thedarkspy

Confirmed, same error when trying to restart the parser. Using v0.4.5.

@boshmaf commented Mar 22, 2018

Same problem. Using v0.4.5.

@Voelundr commented Mar 22, 2018

You cannot resume a parser that was canceled mid-run. See #32 (comment)

@hkalodner (Collaborator)

As @yiwesi said, after the parser is killed it is currently impossible to recover, which can manifest as a NextQueueFinishedEarlyException. The main issue is whatever caused the initial segfault. Would you mind telling me how much memory your machines have? One possible cause of crashes is running out of memory. Reducing the memory requirements of the parser is an ongoing effort.
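
For anyone curious how this surfaces: the exception message suggests a producer/consumer pipeline in which one stage expects more items than its input queue ever delivers, for example because a crashed run left partially written data behind. Below is a minimal sketch of that failure mode; the names and structure are hypothetical illustrations, not BlockSci's actual internals.

```cpp
#include <condition_variable>
#include <cstddef>
#include <iostream>
#include <mutex>
#include <optional>
#include <queue>
#include <stdexcept>
#include <thread>

// Hypothetical stand-in for the real exception type.
struct NextQueueFinishedEarlyException : std::runtime_error {
    NextQueueFinishedEarlyException()
        : std::runtime_error("Next queue finished early") {}
};

// A queue that a producer can close to signal "no more items".
template <typename T>
class ClosableQueue {
public:
    void push(T value) {
        { std::lock_guard<std::mutex> lock(m); items.push(std::move(value)); }
        cv.notify_one();
    }
    void close() {
        { std::lock_guard<std::mutex> lock(m); closed = true; }
        cv.notify_all();
    }
    // Blocks until an item is available; returns nullopt once closed and drained.
    std::optional<T> pop() {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return !items.empty() || closed; });
        if (items.empty()) return std::nullopt;
        T value = std::move(items.front());
        items.pop();
        return value;
    }

private:
    std::queue<T> items;
    std::mutex m;
    std::condition_variable cv;
    bool closed = false;
};

int main() {
    ClosableQueue<int> queue;
    const std::size_t expected = 10;  // the consumer was told to expect 10 items

    // The producer stops after 7 items, simulating a crashed or truncated run.
    std::thread producer([&] {
        for (int i = 0; i < 7; ++i) queue.push(i);
        queue.close();
    });

    try {
        for (std::size_t i = 0; i < expected; ++i) {
            auto item = queue.pop();
            if (!item) throw NextQueueFinishedEarlyException{};  // closed early
            std::cout << "processed block " << *item << "\n";
        }
    } catch (const NextQueueFinishedEarlyException& e) {
        std::cout << "what(): " << e.what() << "\n";
    }
    producer.join();
}
```

In this toy version the producer stops after 7 of the 10 expected items, which is analogous to resuming against incomplete files left behind by a crashed run.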

@Averylamp

I have been running the parser on two machines, with 32 GB and 24 GB of RAM, and both run into the same error. There is plenty of space on both hard drives though. I'll monitor RAM and rerun it to see if it runs out.

@hkalodner (Collaborator)

Hmm. I normally test it out on a 64 GB machine, but that doesn't appear to be the problem. I'm currently running the parser on a 32 GB AWS instance and it's made it up to block 407511 so far without any difficulties. I'm going to let it keep running and see whether it hits the crash you're seeing.

What operating system are you using? I've been mainly testing on Ubuntu so there might be compatibility issues with other OSes.

@Averylamp

I was able to get it to parse a full set of 380000 blocks, using approximately 14 GB at peak memory load. I realized that I also had another program running that was taking up more than half the RAM, so it likely ran out of memory earlier. For some reason, though, there was a parsing error that prevented it from seeing the full blockchain, so I'm now redownloading the full blockchain to try again. My other computer with 32 GB is now at Block 323869/514766. Both are running Ubuntu 16.04.

@hkalodner (Collaborator)

Hmm. I've seen someone else mention the

> prevented it from seeing the full blockchain

error before, but I haven't been able to pin down what's going wrong. That's a pretty weird one.

For the other segfault errors, there's a good chance that running out of memory was the culprit. My run of the parser on a 32 GB machine is continuing to function properly and has reached 450486 blocks without hitting any trouble yet. Memory usage is staying below 20 GB. Based on that, I think 24 GB may be on the tight side for running the parser.

@laudney (Author) commented Mar 23, 2018

I ran it on a machine with 128 GB of memory. It quit with the error I posted, without mentioning anything about a segmentation fault. The OS is Arch Linux.

@hkalodner (Collaborator)

Hmm. I haven't tested at all on Arch Linux, so I don't know if there are any platform-specific problems there. @laudney, that is the output I would expect if the parser had crashed on a previous run. It looks like the parser had previously parsed up to block 511859, since it says 0.00% done. Did previous runs of the parser complete successfully? It surprises me somewhat that it would crash that close to done. Most people who run into problems do so before that point, and 128 GB is certainly more than enough memory.

@laudney (Author) commented Mar 23, 2018

It was the very first run from scratch. I pasted the wrong error message; please see below:

88.29% done, Block 493817/514433
Back linking transactions
99.99% done
91.56% done, Block 498422/514433
Back linking transactions
99.96% done
94.83% done, Block 502955/514433
Back linking transactions
99.97% done
98.10% done, Block 509658/514433
Back linking transactions
99.99% done
98.94% done, Block 511790/514433terminate called after throwing an instance of 'NextQueueFinishedEarlyException'
what(): Next queue finished early
Aborted

@hkalodner (Collaborator)

Ah, interesting. OK, that's definitely a different error from the one other people are hitting. I'll spin up an Arch instance and see if I can recreate the crash. I'm going to assume that this is an Arch issue (or at least not an issue on Ubuntu) unless other people report the same problem. I think the other people in this thread were hitting the NextQueueFinishedEarlyException after resuming from a previous crash, unlike what happened to you.

@laudney (Author) commented Mar 23, 2018

Let me start the parser from scratch again and see what happens.

@Averylamp commented Mar 24, 2018

The logs for the "prevented it from seeing the full blockchain" error are as follows:
32.8% done fetching block headersFailed to read block header information from "/home/avery/.bitcoin/blocks/blk00399.dat" at offset 129895712: Tried to advance past end of file
if that is helpful. Other than that, the parsing of 389173 blocks is working fine for me now with <24 GB of RAM on the machine.

@laudney (Author) commented Mar 24, 2018

Success on the second attempt from scratch! It used up to 25 GB of memory.

99.99% done
100.00% done, Block 514857/514861
Back linking transactions
99.96% done

bloomNegativeCount: 518963242
multiCount: 469051988
dbCount: 40953381
bloomFPCount: 2151397
Updating hash index
Updating index with 306385833 txes
100.00% done
Updating address index
Updating index with 306385833 txes
0.57% done

@hkalodner (Collaborator)

@laudney, glad to hear that it worked on the second try, though I really wish I knew what caused the initial crash.

@Averylamp, thanks, that error message is useful. I'm not sure exactly what's causing it, but at least now I know where it's showing up. How's your current run of the parser progressing?

@Averylamp

After redownloading the blockchain, I am weirdly getting the exact same error, stopping at block 389173. My other computer, which has a hard drive, is currently at Block 405566/514766 after two days of running. It did manage to get past block 380000, where the other computer was having trouble, though. Sadly, it is unbearably slow since it is running on a hard drive (not sure if it should be faster?).

@hkalodner (Collaborator)

@Averylamp, if possible, could you send me your blk00399.dat file? I want to see if the issue is with BlockSci on your machine or if it's actually related to that file.

I think the slowness you're hitting is probably more an effect of memory pressure than HD speed. Using a 128 GB machine I can do the full parse in about 10-12 hours, and on a 64 GB machine it takes more like 16-18 hours. I imagine on a 32 GB machine it would be even worse.

@Voelundr

Maybe Offtopic:
Hello. Unfortunaly I have no supercomputer and try to parse with my home PC with only build 6GB of wich about 2GB are used by the OS. So the parser only has about 4 GB. The first run i did without the max-block option and had to cancel the parser manually because of time issues. So I discovered the #32 Thread. And after the discovery I used the max-block option to parse the first 200k Blocks and then step by step 10k blocks. I got a core dump at around 380k but I think this core dump was because I runned parallel a multimedia Video in Firefox. So i decided to rerun it from beginning with no other program than the parser (except the OS) and again after 200k Block step by step 10k Blocks. I was able to get to 460k blocks but annoyingly got again a core dump at/around Block 468803. (I needed about 4 hours to parse 10k blocks)
Question: If I parse fully the chain on another PC, can i copy over the data to another PC and work with the parsed data on that other PC and can even parse new blocks on that other PC? And if this is so that the parsed data can be imported, why is nobody hosting this data and others then can download and import.
Back to topic: Please implement the function to continue a core dump broken parser asap. Thank you.
Cheers and have a good time.
PS: No offense.

@Voelundr

Answering myself:
Please read: #2

@hkalodner (Collaborator)

@Averylamp, sorry it took me forever, but I finally pinned down the bug you were hitting. Apparently, as documented in bitcoin/bitcoin#8614, Bitcoin Core sometimes has some strangeness in how it stores serialized blocks. I've now added code to account for this.
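
For reference, the read failure in @Averylamp's log is the kind of thing a blk-file reader has to guard against: Bitcoin Core stores each block in blk*.dat as a [4-byte network magic][4-byte little-endian length][serialized block] record, and the files are preallocated, so they can contain zero padding; a reader that trusts the declared length without a bounds check can end up trying to advance past the end of the file. The sketch below shows one defensive approach; it is a hypothetical illustration, not the actual patch.

```cpp
#include <cstdint>
#include <cstring>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

// Mainnet network magic that prefixes every block record in blk*.dat.
constexpr unsigned char kMainnetMagic[4] = {0xf9, 0xbe, 0xb4, 0xd9};

int main(int argc, char** argv) {
    if (argc != 2) { std::cerr << "usage: readblk <blkNNNNN.dat>\n"; return 1; }
    std::ifstream file(argv[1], std::ios::binary);
    std::vector<char> data((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    std::size_t offset = 0, count = 0;
    while (offset + 8 <= data.size()) {
        // Scan forward to the next magic instead of assuming records are
        // packed back to back; this skips zero padding in preallocated files.
        if (std::memcmp(data.data() + offset, kMainnetMagic, 4) != 0) {
            ++offset;
            continue;
        }
        std::uint32_t length;
        std::memcpy(&length, data.data() + offset + 4, 4);  // assumes little-endian host
        // Bounds check: a declared length that runs past EOF means a truncated
        // or bogus record, so stop rather than advancing past end of file.
        if (length > data.size() - offset - 8) {
            std::cerr << "record at offset " << offset << " claims "
                      << length << " bytes past EOF; stopping\n";
            break;
        }
        // data[offset + 8, offset + 8 + length) is one serialized block.
        ++count;
        offset += 8 + static_cast<std::size_t>(length);
    }
    std::cout << "found " << count << " block records\n";
}
```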

@josiahgray12 commented Jun 26, 2018

I am still having this issue; the parser spits out

100.0% done fetching block headers
Starting with chain of 528113 blocks
Removing 0 blocks
Adding 0 blocks
Updating hash index
Updating address index
Updating index with scripts of type multisig_script

Even though the blockchain is currently past block 529000, the parser thinks it is up to date.

Is there something I have to do in order to access the code you added to fix the issue? @hkalodner

I originally downloaded v0.5.0
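
For context on the output above, one plausible reading (a hypothetical sketch, not confirmed against BlockSci's code) is that the parser diffs the chain recorded by the previous run against the headers it just fetched; if the fetched headers stop at the old tip, for example because the local node's block files were not actually updated, both counts come out zero and the parser considers itself up to date.

```cpp
// Hypothetical illustration of how output like "Removing X blocks /
// Adding Y blocks" can arise: compare the chain parsed last time with
// the freshly fetched headers, find the last common block, and derive
// the two counts. Names here are illustrative, not BlockSci's.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

int main() {
    // Block hashes stood in for by short strings.
    std::vector<std::string> oldChain = {"g", "a", "b", "c"};  // previous run
    std::vector<std::string> newChain = {"g", "a", "b", "c"};  // fetched headers

    // Find the fork point: the length of the longest common prefix.
    std::size_t fork = 0;
    while (fork < oldChain.size() && fork < newChain.size() &&
           oldChain[fork] == newChain[fork]) {
        ++fork;
    }

    std::cout << "Starting with chain of " << oldChain.size() << " blocks\n";
    std::cout << "Removing " << oldChain.size() - fork << " blocks\n";
    std::cout << "Adding " << newChain.size() - fork << " blocks\n";
}
```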

@Voelundr

Did you try the update on the same machine where you parsed the 528113 blocks?

@josiahgray12

Yes. I am running on a Microsoft Azure machine; I do not know if that should change anything.

@Voelundr

Shot in the dark: did you update your local blockchain? ;)

@josiahgray12

I think so; when I run "bitcoin-cli getblockcount" it gives me the most recent information.

@Voelundr

OK, now I have no further ideas (I'm not a BlockSci developer). I hope @hkalodner can help you.
Note: I think you are in the wrong issue thread, because as far as I can see you did not get the NextQueueFinishedEarlyException error.

@josiahgray12

Yeah, the thread I started brought me to the patch posted in this one. They are linked, so I figured I would just post here.

Thank you for taking some time to help me, @yiwesi.
