NextQueueFinishedEarlyException #69
I am also having a similar exception when building it myself. During the initial parse, I get through a lot of the blocks, then get the following:
Then if I try to rerun it, I get:
Confirmed, same error when trying to restart the parser. Using v0.4.5.
Same problem. Using v0.4.5.
You cannot resume a parser that was canceled mid-run. See #32 (comment)
As @yiwesi said, after the parser is killed it is currently impossible to recover, which can manifest as getting a
I have been running the parser on two machines, with 32 GB and 24 GB of RAM, and both run into the same error. There is plenty of space on both hard drives though. I'll monitor RAM and rerun it to see whether it runs out.
Hmm. I normally test it out on a 64 GB machine, but that doesn't appear to be the problem. I'm currently running the parser on a 32 GB AWS instance and it's made it up to block 407511 so far without any difficulties. I'm going to let it keep running and see whether it hits the crash you're seeing. What operating system are you using? I've been mainly testing on Ubuntu so there might be compatibility issues with other OSes.
I was able to get it to parse a full set of 380000 blocks, and at max memory load it used approx 14 GB. I realized that I also had another program running that was taking up more than half the RAM, so it likely ran out earlier. For some reason there was also a parsing error that prevented it from seeing the full blockchain, so I'm now redownloading the full blockchain to try again. My other computer with 32 GB is now at block 323869/514766. Both are running Ubuntu 16.04.
Hmm. I've seen someone else mention the
error before, but I haven't been able to pin down what's going wrong. That's a pretty weird one. For the other segfault errors, there's a good chance that running out of memory was the culprit. My run of the parser on a 32 GB machine is continuing to function properly; it has reached 450486 blocks and hasn't hit any trouble yet. Memory usage is staying below 20 GB. Based on that, I think 24 GB may be on the tight side to run the parser.
I ran it on a machine with 128 GB of memory. It quit with the error I posted without mentioning anything about a segmentation fault. The OS is Arch Linux.
Hmm. I haven't tested at all on Arch Linux, so I don't know if there are any platform-specific problems there. @laudney, that is the output I would expect if the parser had crashed on a previous run. It looks like the parser had previously parsed up to block 511859, since it says
It was the very first run from scratch. I pasted the wrong error message; please see below:
Ah, interesting. OK, that's definitely a different error than other people are hitting. I'll spin up an Arch instance and see if I can recreate the crash. I'm going to assume that this is an Arch issue (or at least not an issue on Ubuntu) unless other people report the same problem. I think the other people in this thread were hitting the
Let me start the parser from scratch again and see what happens.
The logs for the error that prevented it from seeing the full blockchain are as follows:
Success on the second attempt from scratch! It used up to 25 GB of memory.
@laudney, glad to hear that it worked on the second try, though I really wish I knew what caused the initial crash. @Averylamp, thanks, that error message is useful. I'm not sure exactly what's causing that error, but now I at least know where the error is showing up. How's your progress going on your current run of the parser?
After redownloading the blockchain, I am weirdly getting the exact same error, stopping at block 389173. On my other computer, which has a hard drive, it is currently at
@Averylamp, if possible could you send me your blk00399.dat file? I want to see whether the issue is with BlockSci on your machine or is actually related to that file. I think the slowness you're hitting is probably more an effect of memory pressure than HD speed. Using a 128 GB machine I can do the full parse in about 10-12 hours, and using a 64 GB machine it takes more like 16-18 hours. I can imagine on a 32 GB machine it would be even worse.
Maybe off-topic:
Answering myself:
@Averylamp, sorry it took me forever, but I finally pinned down the bug you were hitting. Apparently, as documented in bitcoin/bitcoin#8614, Bitcoin Core sometimes has some strangeness in how it stores serialized blocks. I've now added code to account for this.
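For illustration, the workaround described above can be sketched as follows. This is not BlockSci's actual code, just a minimal Python sketch of the idea: instead of assuming blocks are packed back-to-back in a blk*.dat file, scan for the network magic bytes so that zero padding or partial writes (the kind of strangeness discussed in bitcoin/bitcoin#8614) are skipped over. The `read_raw_blocks` helper name is hypothetical; the mainnet magic value is standard.

```python
import struct

# Bitcoin mainnet network magic (standard value); the parser sketch below
# is illustrative, not BlockSci's actual implementation.
MAGIC = b"\xf9\xbe\xb4\xd9"

def read_raw_blocks(data: bytes):
    """Yield raw serialized blocks from a blk*.dat-style byte string.

    Rather than assuming contiguous records, scan forward for the network
    magic so stretches of zero padding or truncated writes are skipped.
    """
    pos = 0
    while True:
        pos = data.find(MAGIC, pos)
        if pos == -1:
            return
        header_end = pos + len(MAGIC) + 4
        if header_end > len(data):
            return
        # 4-byte little-endian block size follows the magic.
        (size,) = struct.unpack_from("<I", data, pos + len(MAGIC))
        block_end = header_end + size
        if block_end > len(data):  # truncated trailing block
            return
        yield data[header_end:block_end]
        pos = block_end

# Tiny synthetic example: two "blocks" separated by zero padding,
# which a naive back-to-back reader would trip over.
blk = (MAGIC + struct.pack("<I", 3) + b"abc"
       + b"\x00" * 16
       + MAGIC + struct.pack("<I", 2) + b"xy")
print(list(read_raw_blocks(blk)))  # → [b'abc', b'xy']
```

A real implementation would also validate the payload (e.g. hash the block header) before trusting a magic match, since the magic bytes could in principle appear inside block data.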
I am still having this issue: the parser spits out "100.0% done fetching block headers". Even though the blockchain is currently past block 529000, the parser thinks it is up to date. Is there something I have to do in order to access the code you added to fix the issue? @hkalodner, I originally downloaded v0.5.0.
Did you try the update on the very same machine on which you parsed the 528113 blocks?
Yes. I am running on a Microsoft Azure machine, I do not know if that should change anything. |
Shot in the dark: did you update your local blockchain? ;)
I think so; when I run "bitcoin-cli getblockcount" it gives me the most recent information.
OK, now I have no further ideas (I'm not a BlockSci developer). I hope @hkalodner can help you.
Yeah, the thread I started in brought me to the patch posted in this one. They are linked, so I figured I would just post here. Thank you for taking the time to help me, @yiwesi.