Run adapter unit tests against latest versions #4

Open

dmehra opened this issue Feb 9, 2016 · 8 comments
@dmehra
Contributor

dmehra commented Feb 9, 2016

(This is a juttle-engine issue because it's cross-repo)

Set up a way in Travis to run each adapter's unit tests against both the latest release and the latest master of juttle. The goal is to notice when an adapter and juttle have diverged.

  • currently each adapter points at a hard-coded version of juttle
  • pointing at master can be tricky, as npm caching gets in the way (maybe possible with npm cache clean, at the expense of slowing down Travis; see the sketch below)
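
A minimal sketch of what that "point at master" install step could look like, assuming npm's GitHub shorthand resolves the adapter's juttle dependency (whether the cache clean is actually needed depends on the npm version in use):

```yaml
# .travis.yml (sketch) -- install juttle from master instead of the pinned release.
# Clearing the cache first avoids npm reusing a stale copy, at the cost of a
# slower build.
before_install:
  - npm cache clean
  - npm install juttle/juttle#master
```
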
@bkutil

bkutil commented Feb 10, 2016

Maybe the build matrix in travis.yml could be used to achieve this - the 'columns' being node versions, and the 'rows' juttle master and juttle x.y.z.

Ideally, the tests against master should serve as a 'red flag' to the developer that the incompatibilities w/ master should be fixed, but not necessarily in this particular PR.

From this perspective, I think that the build:

  1. should always fail if tests using the stable Juttle release fail - we don't want broken releases
  2. but should soft-fail, or stay green, if tests fail with Juttle master - we don't want to block merging the current PR over an incompatibility introduced in core (a sketch of such a matrix follows this list)
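
A sketch of what that matrix could look like, assuming a hypothetical JUTTLE environment variable selects the version (the node and juttle version numbers are illustrative, not the actual pins):

```yaml
# .travis.yml (sketch) -- 'columns' are node versions, 'rows' are juttle versions.
language: node_js
node_js:
  - "0.12"
  - "4"
env:
  - JUTTLE=juttle@0.5.0          # latest stable release (version illustrative)
  - JUTTLE=juttle/juttle#master  # latest master, via npm's GitHub shorthand
before_install:
  - npm install "$JUTTLE"
```

Selecting the juttle version through an env variable would keep the adapter's package.json untouched while letting Travis fan out one job per combination.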

@dmehra
Contributor Author

dmehra commented Feb 10, 2016

What does "soft fail" mean to you? If Travis reports green, your incentive to read its report is very low (as it should be, so as not to waste dev time).

One path I see is reporting red if the test against master fails, but allowing the merge to proceed with an explanatory note on the PR.

The alternative is not to test against master in Travis at all, and leave it to the developer to do locally, having changed the version dependency.

@demmer thoughts?

@bkutil

bkutil commented Feb 10, 2016

@dmehra IIUC, Travis can distinguish between 'failing a build' and 'failing a single test suite in the matrix'. The tests would still be red, but the build would pass - which is what I call a 'soft fail'. (I assume the badge would be red in that case - that needs verification.) The feature is described here; a sketch follows.
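
For reference, a sketch of that feature, continuing the hypothetical JUTTLE variable from the matrix above:

```yaml
# .travis.yml (sketch) -- allow the master job to fail without failing the build.
# The job still runs and shows red on the Travis detail page, but the overall
# build result (and thus the merge gate) stays green.
matrix:
  allow_failures:
    - env: JUTTLE=juttle/juttle#master
```
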

@dmehra
Contributor Author

dmehra commented Feb 10, 2016

Interesting, I didn't know about this feature. This post has a screenshot of what it will look like in Travis.
Keep in mind, however, that the GitHub badge on the PR will be green; you'll only see the allowed failures if you click through the details to the Travis page. So it still leaves the developer in the somewhat odd situation of needing to triage their green badge every time to see whether it's "all green". I don't know whether Travis would email you about the failures in this case.

@dmehra
Contributor Author

dmehra commented Feb 10, 2016

@rlgomes have you tried using the allow_failures Travis feature?

@rlgomes
Contributor

rlgomes commented Feb 10, 2016

Nope, but in all my career I have yet to find a situation where "green that is really red" or "yellow instead of red" leads to anyone actually caring about it... so let's just live with RED when it happens to be red and do something about it at that point.

@dmajda

dmajda commented Feb 11, 2016

@rlgomes On the other hand, perpetual redness creates blindness and “real” redness can end up being ignored because of that.

If failed tests in allow_failures send failure e-mails like normal tests do, I’d say that would be enough to alert the owner (which is the point of the whole exercise).

@rlgomes
Contributor

rlgomes commented Feb 11, 2016

Let's let red be red and green be green. When actual tests fail for reasons other than there being something to fix, we can look closely at why that is happening and change the way we're testing, rather than the meaning of passing vs. failing.

dmajda referenced this issue in juttle/juttle-influx-adapter Feb 16, 2016