Create a makefile target that checks number of races in e2e tests #710
Comments
One more idea about how to make it slightly less strict, so we don't have to fix all the data races before making this go green.
Totally agree! This is the way to have a good CI one day.
In TeamCity this is easy to implement: you can tell it to watch a build metric and fail the build if that metric increases. I was using that to watch the number of build warnings (the build would fail if a new warning appeared). The best thing about that system is that it allows the metric to go down, and the new number then becomes the limit, so you don't need to manage it.
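That ratcheting behaviour could in principle be reproduced outside TeamCity with a small helper script. Below is a minimal sketch, assuming the current race count is passed in by the caller and a state file (`race-limit.txt`, a hypothetical name) is kept between builds; none of these names exist in the project yet.

```sh
#!/bin/sh
# Sketch of the "ratchet" idea: fail if the metric grows, and tighten the
# limit whenever it shrinks. The state file name and the argument are
# illustrative assumptions, not an existing part of the project.
set -u

races="$1"                                              # current number of data races
limit=$(cat race-limit.txt 2>/dev/null || echo 999999)  # previously recorded limit

if [ "${races}" -gt "${limit}" ]; then
    echo "Data races went up: ${races} > ${limit}" >&2
    exit 1
fi

if [ "${races}" -lt "${limit}" ]; then
    # The metric went down, so the new, lower number becomes the limit.
    echo "${races}" > race-limit.txt
fi

echo "Data races: ${races} (limit: ${limit})"
```

The state file would have to be persisted between CI runs (e.g. as a build artifact) for the ratchet to work.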
@pombeirp exactly!
I wouldn't mind taking this on. I'm getting sick of hearing myself say "it's easy with TeamCity, not sure about Jenkins", so I might as well take advantage of something like this to get my hands dirty with Jenkins.
* Add `race-check` and `test-unit-race` target to Makefile. Closes #710
Problem
From time to time we find new data races in our code or in Ethereum. However, we don't have any process for cleaning up the code regularly.
We're migrating to Geth 1.8 and we don't know whether it is safe for us.
E.g. we found a race in `les.downloader`, and after it was fixed one more was raised in #709 (comment).

Implementation
Make a `race-check` script that runs tests with the `-race -v` flags and counts the number of discovered races. If that number is higher than the expected number of races, it should fail; if it is less than or equal, it should still succeed. (A sketch of such a script is given below, after the acceptance criteria.)

Acceptance Criteria
`make race-check` is added to the Makefile.
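A minimal sketch of what the `race-check` script could look like, assuming the "WARNING: DATA RACE" header printed by Go's race detector is what gets counted; the `EXPECTED_RACES` variable, the `race.log` file name, and the `./...` package pattern are illustrative assumptions rather than the actual implementation.

```sh
#!/bin/sh
# Sketch of a race-check script; EXPECTED_RACES, race.log and the package
# pattern are illustrative assumptions, not the project's actual values.
set -u

EXPECTED_RACES=${EXPECTED_RACES:-5}   # currently known number of data races

# Run the tests with the race detector. Test failures are tolerated here,
# because the known races already make some tests fail; we only count races.
go test -race -v ./... 2>&1 | tee race.log

# The race detector prints a "WARNING: DATA RACE" header for every race it finds.
races=$(grep -c "WARNING: DATA RACE" race.log || true)

echo "Found ${races} data race(s); at most ${EXPECTED_RACES} allowed."
if [ "${races}" -gt "${EXPECTED_RACES}" ]; then
    echo "race-check failed: new data races were introduced" >&2
    exit 1
fi
```

The `race-check` Makefile target would then simply invoke this script, and `EXPECTED_RACES` could be lowered by hand as races get fixed, or ratcheted down automatically as sketched in the comment above.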
Future steps
Make a nightly run on Jenkins for the race check.