More NTest prep work for eventual test harness #3353
Conversation
I would rather go another way, something like this (it's only a rough sketch which I didn't think too much about):

```json
{
  "outputhandler": "TAPoutput",
  "features": {
    "host": {},
    "bme280": {
      "sda": 5,
      "sdc": 6
    },
    "testboard": {
      "version": 1
    }
  }
}
```

BTW, shim would not be my preferred name, as it does not describe well what it is. `TAPoutputhandler` or `TAPreport` seems better to me.
I agree; I'm not opposed to that kind of configuration file per se, but I think that your proposal is conflating parameters to NTest (the output handler) with parameters to particular users of NTest (the `bme280` configuration).
I feel that the output handler is a TestEnv configuration just like the bme280 configuration. My idea was that the configuration would be passed to each test function as another parameter. In addition, the N.test calls would have a require field containing a list of required features; if they are not all present in the config, the test isn't even called. Beyond the features from the config file, all available C modules would be added automatically.
Hm. I am not (yet) convinced, but maybe.
Actually that... gives me an idea. The test harness can simply fling the following code at the DUT at startup:
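(The snippet isn't preserved in this excerpt; a hypothetical sketch of the preloading idea, in which everything but the `NTestOutShim` name is illustrative, might look like this:)

```lua
-- Hypothetical sketch only: preload a structured-output handler under the
-- name NTest will later require(); require() returns package.loaded entries
-- directly, so NTest picks this up without any code changes of its own.
package.loaded["NTestOutShim"] = function(event, ...)
  print("#NTestOut", event, ...)  -- machine-parseable line on the console
end
```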
It's kind of terrible, but it certainly would work and requires no modification to `NTest` itself.
Why not simply make the per-test configuration
I think this is better handled by
Here's a very basic start at
But why not modify NTest, as long as it is done in a compatible way (or even as a breaking change, given that we only have about 10 test files so far)?
That makes sense. So NTest would be available to the tests as an upvalue?
The preflight thing might be a good addition to check more functional conditions, but it can only decide for all tests in the file at once. Hmm, unless it changes the set of available features, which would then be checked against the requirements list of the individual tests.
Rereading the comment, a simple
Yes, the parser is quite flexible together with metatables.
Maybe. But using hasFeat at the beginning of the test might also work. I think a special "inconclusive" message type for tests that fail their requirements would make sense, though.
Well, I'm open to suggestion, but you didn't seem to approve of my very lightweight change to try
Well, I imagine that
now paired with
I think, in the interest of simplicity, there should be a way to indicate that the entire file should be skipped and we should keep files topicalized. I suspect almost all of our feature checks will be of this variety -- "Do we have a second DUT?" / "Do we have a mumbledefrotz attached to us?". For individual tests within a topic that might need to skip out, having a simple conditional that
So I found … and sketched … in the attached deltas.
I suppose varargs is fine, too, yes, but AIUI the way one works with varargs in Lua is
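(That comment is cut off here; for reference, the standard Lua varargs idioms it is alluding to look like this:)

```lua
-- Standard Lua 5.1 varargs handling, for reference (not from the patch):
local function demo(...)
  local n = select("#", ...)  -- argument count, including embedded nils
  local args = { ... }        -- collect into a table (trailing nils may be lost)
  return n, args
end
```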
Come again?
Concretely, I'd suggest we have ways of signaling what TAP calls
It seems a good idea to require the output handler. What I don't like is the preloading with
Sounds perfect to me, except for my suggestion from above to allow an external NTest to be given as a parameter, which would lead to code like
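(The code sample is elided above; a hypothetical sketch of the shape being suggested, with all names illustrative, might be:)

```lua
-- Hypothetical sketch: the test file takes the NTest constructor as a
-- parameter, so the harness can hand in a wrapped, pre-configured one.
return function(NTest)
  local N = NTest("bme280")
  N.test("sanity", function()
    -- ...
  end)
end
```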
That was what I was trying to suggest. So we need to define a new function
True, but especially in the case of only one requirement it is easier to type and read.
Sorry, but I don't understand this comment.
You are probably right on that one. In the case of … But reading the documentation …
Oh, very good. That's cute; I think I'm onboard. I'll add patches to our existing
OK. So … Can you expand on "should it break the test as `ok` can do and the return actually is only a tail call"?
That's fair.
Sorry, by "Come again?" I meant "I don't understand this comment". Sorry to have made the confusion mutual! XD
Here's a draft of some of the changes for consideration. The test harness will invoke tests by causing the DUT to run code like
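(The invocation snippet is elided in this excerpt; a hypothetical sketch, with an illustrative file name, might be:)

```lua
-- Hypothetical sketch of the harness kicking off one test file on the DUT:
require "NTestOutShim"      -- ensure the structured-output shim is loadable
dofile("NTest_bme280.lua")  -- run a single test file
```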
tests/NTest/NTestEnv.lua (Outdated)

```lua
return mstr:match("^"..m..",")  -- first module in the list
    or mstr:match(","..m.."$")  -- last module in the list
    or mstr:match(","..m..",")  -- somewhere else
    or error("Missing C module " .. m)
```
I would prefer a has* function to return a boolean and not fail.
Fair. If it returns a boolean, the convention can be that tests `assert` on its result. I think I was thinking they'd just call it, but that's pretty bogus.
It's a question of naming. Call it checkFeat or even assertFeat and it will be OK for it to fail.
At $work we once had a lengthy discussion about what assure and ensure should do; we decided that one should just check and fail, and the other should contain code to meet the expected requirement. But I can't remember the outcome.
tests/NTest/NTestEnv.lua (Outdated)

```lua
local cstr
repeat cstr = cfgf:read(); decoder:write(cstr) until #cstr == 0
cfgf:close()
local givenFeats = decoder:result().features
```
Maybe check for no features table at all
My initial hunch is to check by `assert`ing, but that does raise the question of ergonomics of `hasEnv` for preflight tests... we'd have to `pcall` them to route exceptions to `skip`. Can we change `NTest` just a little, in some way, to indicate that some `.test()` functions should treat exceptions as `Bail out!` and not `not ok`? I'm not sure what's maximally ergonomic here and so am quite open to suggestions.
It occurs to me that these `NTestEnv` functions are most likely to be used before the first `test` call in an NTest file. By way of example, the ADC-via-the-I2C-expander test will probably look like

```lua
local N = --[[ ... ]]
local E = require "NTestEnv"
assert(E.hasC("adc"))
assert(E.hasC("i2c"))
assert(E.hasL("mcp23017"))
local config = assert(E.getConfig("adc", "mcp23017", "i2c"))
N.test(--[[ ... ]])
-- ...
```
Given that `hasEnv` (or `hasFeat`) now returns a boolean, it is easier now:

```lua
if not hasEnv("myrequirement") then skip("some optional message") end
```

But what I was requesting is to handle the case where there is no `features` section in the config file at all, meaning to treat it as if it were empty.
Does the new `assert` for the `features` table not do the right thing?
> It occurs to me that these `NTestEnv` functions are most likely to be used before the first `test` call in an NTest file. By way of example, [...]
That would abort the run outside of NTest, which would make it harder to use TAP.
I understood so far that these functions would be called in the first test, which works as a preflight.
Something like this:

```lua
local N = --[[ ... ]]
local E = require "NTestEnv"
local config
N.test("preflight", function()
  if not E.hasConfig("adc", "i2c", "mcp23017") then bailOut() end
  config = E.getConfig("adc", "mcp23017", "i2c")
end)
```

Not really sure about the details yet, so if you have a better idea ...
> Does the new `assert` for the `features` table not do the right thing?

Yes, of course. Now I see it. It does even better by failing early with a clear error message.
tests/NTest/NTestEnv.lua (Outdated)

```lua
local res = {}
for k, v in ipairs(reqFeats) do
  res[v] = givenFeats[v] or error("Missing required feature " .. v)
```
No need to copy into another table here. If everything is there, just return reqFeats.
But `reqFeats` doesn't have the associated configuration values? Sorry, I must be missing something.
Sorry, I meant return `givenFeats`, which should be filtered already.
If you like, go ahead. I would prefer to configure it explicitly.
As stated in the review, I'm missing the else part.
Exactly.
I meant whether we should just check for a certain return code of the test, or throw an error to terminate the test case as the ok/nok methods do. But I have been thinking about it and came to the conclusion that we must handle it similarly to ok/nok failures, to allow async and coroutine tests to work as
Uh, what a mess. But now I got you :-) I meant that maybe only a … So

```json
{
  "featName": {}
}
```

But just thinking about the difference between having a module and having a configuration for it: maybe we could introduce a list of required modules in the feature configuration instead. So
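(The continuation is cut off above; a hypothetical shape for that idea, with the `requires` key being purely illustrative, might be:)

```json
{
  "features": {
    "bme280": {
      "requires": ["i2c", "bme280"],
      "sda": 5,
      "sdc": 6
    }
  }
}
```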
tests/NTest/NTestEnv.lua (Outdated)

```lua
end

function NTE.hasL(m)
  return pcall(require, m) or error("Missing Lua module " .. m)
```
I would rather check for the existence of a file, or of the module in LFS, than actually requiring it.
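(A sketch of that suggestion, assuming NodeMCU's `file.exists()` and `node.LFS.get()` APIs; the helper name is illustrative:)

```lua
-- Look for the module as a SPIFFS file or an LFS entry instead of
-- require-ing it; avoids running the module body just to probe for it.
local function hasLModule(m)
  return file.exists(m .. ".lua") or file.exists(m .. ".lc")
      or (node.LFS and node.LFS.get(m) ~= nil)
end
```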
I'll see if I can figure out how to drive `package.loaders` myself, then.
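(For reference, a minimal sketch of driving the searchers directly, assuming Lua 5.1's `package.loaders` as on NodeMCU, could look like this:)

```lua
-- Ask each searcher whether it can produce a loader for the module,
-- without actually executing the module body. (package.loaders is the
-- Lua 5.1 name; 5.3+ calls it package.searchers.)
local function canLoad(name)
  for _, searcher in ipairs(package.loaders) do
    if type(searcher(name)) == "function" then return true end
  end
  return false
end
```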
OK, I didn't expect other loaders to be in place. Just taking the two known ones into account seemed enough to me, but maybe that's not enough.
Yes, I think I agree, on second thought. The configuration is, in the present commits, handled as per #3353 (comment), which also answers your question above about how the test name strings get to the right place and also why we don't need an if/else.
I think I agree.
Given the rather different nature of modules from environment configuration, I think I'd prefer to keep them separated.
I think I disagree. This seems to suggest that it's the job of the configuration file, and not the test code, to know which modules are required for which hardware feature.
(force-pushed from d2d6232 to 6229982)
Also fine
On second thought I agree.
This is a decomposable proposal for the next release. It contains most, but not all, of the TCL goo I've been maintaining for a while now, but if that's seen as a bridge too far, the first six or fewer commits might still be usefully pulled to `dev`.
I should probably rope @pjsg in, as he was asking for test suite dox.
Some small changes, but then it looks good. Haven't had a look at the expect and TCL files, though.
Would it be possible to have several tests in one LFS.img? I also wonder what the reasoning is to write your own transfer routines and not use existing packages.
I don't quite understand the impetus for the question; have I mis-stated something? Or is it just that I didn't explicitly call out that … In answer to the question, though: yes, it's possible to have the entire test suite in one LFS image, and indeed the rest of the machinery I've prototyped does so; see, for example, https://github.com/nwf/nodemcu-firmware/blob/dev-active/tests/preflight-lfs.sh . Since the rest of the pipeline is in a bit more of a state of flux, I haven't included it in this PR, but feel free to suggest other bits of
I was dissatisfied with existing transfer mechanisms for one reason or another; perhaps writing my own was simply frustration.
Sorry, didn't read closely enough.
Ah, found it. https://github.com/AndiDittrich/NodeMCU-Tool seems to do what you need.
The test harness really wants something in-tree, and I don't think a NodeJS package fits the bill better than a small handful of TCL. I am not opposed to someone rewriting the xfer logic in Python (upon which we already depend) and making it available at the command line (or writing a separate TCL program, extracting the relevant bits from …). Maybe the ideal would be for
(force-pushed from e9326d4 to 1c80716)
I lost track a bit, but AFAICS it's ready to go. @nwf Nathaniel, it is still a draft. Do you intend to add more changes?
Use a metatable to provide defaults which can be shadowed by the calling code.
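(A minimal sketch of the metatable-defaults pattern that commit message describes; the values are illustrative:)

```lua
-- Defaults live in a separate table; lookups that miss the caller's table
-- fall through to the defaults via the __index metamethod.
local defaults = { baud = 115200, timeout = 5 }
local cfg = setmetatable({ timeout = 10 }, { __index = defaults })
print(cfg.baud, cfg.timeout)  --> 115200   10  (caller's value shadows the default)
```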
I think we have few enough tests that we can verify for ourselves that this alert isn't needed.
Allow the NTest constructor to be passed in to the test itself. The test harness can use this to provide a wrapper that will pre-configure NTest itself.
... if checked out under Windows and executed under Linux (say, Docker).
Rebasing... let's see how CI feels.
I guess if the author, both CI processes, and Gregor are happy, then this should land 😄
Continuing to pull things off the bottom of my `dev-active` branch... Here is a proposal to add "output shim" support to `NTest` so that the test harness can get more structured output when it wants it, without requiring that structure in all cases. Right now it's written by looking for a `require`-able module with a somewhat funky name (`NTestOutShim`), but I'm open to other suggestions for worming something into place. This is intended for the `dev` branch rather than for the `release` branch. Documentation is in `docs/*`.