Run adapter unit tests against latest versions #4
Maybe use a build matrix in Travis. Ideally, the tests against master should serve as a 'red flag' to the developer that the incompatibilities with master should be fixed, but not necessarily in this particular PR. From this perspective, I think that the build should 'soft fail'.
What does "soft fail" mean to you? If Travis reports green, the incentive to read its report is very low (as it should be, so as not to waste dev time). I see a path of reporting red if the test against master failed, but allowing the merge to proceed with an explanatory note on the PR. The alternative is to not test against master in Travis at all, and leave it to the developer to run locally, having changed the version dependency. @demmer thoughts?
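The "leave it to the developer locally" alternative could be sketched roughly like this (a hypothetical snippet, assuming the adapter depends on the `juttle` npm package; the `JUTTLE_VERSION` variable name is illustrative, not from this repo):

```shell
# Pick which juttle to install based on an (assumed) JUTTLE_VERSION env var.
# Unset -> latest published release; "master" -> install straight from GitHub.
juttle_dep="${JUTTLE_VERSION:-latest}"
if [ "$juttle_dep" = "master" ]; then
  juttle_dep="juttle/juttle#master"   # npm supports github-user/repo#branch specs
else
  juttle_dep="juttle@$juttle_dep"
fi
echo "npm install $juttle_dep"        # run this, then: npm test
```

With `JUTTLE_VERSION=master` this would install the adapter's dependency from juttle master before running the adapter's own test suite.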
@dmehra IIUC, Travis can distinguish between 'failing a build' and 'failing a single entry in the build matrix'. The tests would still be red but the build would pass, which is what I call 'soft fail'. (I assume the badge would be red in that case, which needs verification.) The feature is described here.
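For reference, the Travis feature being discussed is `allow_failures`, configured in `.travis.yml` roughly like this (a sketch; the `JUTTLE=...` env keys are illustrative, not from this repo):

```yaml
env:
  - JUTTLE=release   # test against the latest published juttle release
  - JUTTLE=master    # test against juttle master
matrix:
  allow_failures:
    - env: JUTTLE=master   # this matrix entry may fail without failing the build
```

Note that whether the badge stays green or turns red when an allowed failure occurs is exactly the detail flagged above as needing verification.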
Interesting, I didn't know about this feature. This post has a screenshot of what it will look like in Travis.
@rlgomes have you tried using …?
Nope, but in my entire career I have yet to find a situation where "green that is really red" or "yellow instead of red" led to anyone actually caring about it... so let's just live with red if it happens to be red, and do something about it at that point.
@rlgomes On the other hand, perpetual redness creates blindness, and "real" redness can end up being ignored because of it. If failed tests in …
Let's let red be red and green be green. When actual tests fail for reasons other than there being something to fix, we can look closely at why that is happening and change the way we're testing, rather than the meaning of passing vs. failing.
(This is a juttle-engine issue because it's cross-repo)
Set up a way in Travis to run the unit tests of each adapter against both the latest release and the latest master of juttle. The goal is to notice when an adapter and juttle have diverged.
(`npm cache clean`? at the expense of slowing down Travis)
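A minimal sketch of what such a per-adapter `.travis.yml` matrix could look like (names and steps are illustrative assumptions, not the repo's actual config):

```yaml
env:
  - JUTTLE=release   # run the adapter's tests against the latest juttle release
  - JUTTLE=master    # run them against juttle's current master branch
install:
  - npm install
  # For the master entry, overwrite the juttle dependency with GitHub master.
  - if [ "$JUTTLE" = "master" ]; then npm install juttle/juttle#master; fi
script:
  - npm test
```

Each env entry becomes a separate job, so a divergence from master shows up as one red matrix entry rather than a single opaque failure.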