Add initial take at a more formalized JEP process #14
Conversation
I'm sure this will evolve over time so I wanted to start with as lightweight a process as possible.
While refactoring the spec repo I changed the format of the JEP https://github.com/jmespath-community/jmespath.spec/blob/efbdbd9ee3292285b65f1f8b38fbcf56760087fa/TEMPLATE.md?plain=1#L8

I also significantly reworked the way tests are tracked in the spec repo. The tests were changed to YAML format to make them more readable and to leverage YAML features to de-duplicate data. The concept of a 'suite' in the original testing structure was done away with, as it was mostly just a means of de-duplicating data. The function tests were split out into separate YAML documents, one per function. Along with the specific tests, these documents contain a description of each function's signature in a structured format that lends itself to decoupling the presentation on the website and to meta-programming.

Ultimately the idea was that if an implementation wanted to be included in the list, it would need to provide a CLI that the compliance tests could be run against. I combined the tests repo, the JEPs, and the spec into one repo so that changes are tracked in the same commit history. The expectation was that a JEP would be accompanied by all of the required tests, grammar changes, and documentation in the same PR.
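For illustration, a per-function test document in this scheme might look roughly like the following. This is only a sketch: the field names (`description`, `signature`, `cases`) and the `abs()` example are assumptions, not the actual schema used in the spec repo.

```yaml
# Hypothetical per-function test document (field names are illustrative,
# not the actual spec-repo schema).
description: Returns the absolute value of the provided number.
signature:
  parameters:
    - name: value
      type: number
  returns: number
cases:
  - expression: abs(`-3`)
    given: {}
    result: 3
  - expression: abs(`5`)
    given: {}
    result: 5
```

Keeping the signature description in structured form like this is what would let the website rendering and any code generation be driven from the same document.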
In our effort with @innovate-invent we settled on a starting version number. Our plan was to go with semantic versioning from there onwards.
I like the idea, couple of questions:
It's an interesting idea that I'd like to look over more. One concern I have is that many of the existing implementations have test runners for the existing test suite format, and I'm hesitant to require them to rewrite their runners unless there's significant benefit in doing so (not that there isn't; I haven't had a chance to look them over yet).
To clarify, are you saying that the …
When we debated this, I was leaning towards using this style going forward, but we landed on the consensus that semantic versioning was better in the long run. Since no version …
I hadn't gotten that far. I was thinking of just kinda winging it and tagging a version any time an "important" JEP was merged.
The semver is incremented when a JEP is merged. It is occurring to me that most JEPs would be a MINOR change, so I am not too sure how to actually work this. I think the MAJOR would be incremented once per batch of breaking changes rather than per breaking JEP. Perhaps the same should be done for the MINOR as well.
We wanted to push for implementations to provide a CLI rather than their own test pipeline. That said, I can very easily provide a script that will convert the existing tests to the original format. They can clone the repo and run this script to generate the JSON on demand. I don't expect it to be too difficult for them to swap from a JSON parser to a YAML parser in their test suites, though.
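As a rough sketch of what such a conversion script could look like, assuming the hypothetical field names from the YAML sketch above and a `tests/functions` directory layout (both assumptions, not the actual repo structure):

```python
#!/usr/bin/env python3
"""Sketch: convert per-function YAML test documents back into the legacy
JSON compliance-test layout. The directory layout and field names here are
assumptions for illustration, not the actual spec-repo schema."""
import json
import pathlib

import yaml  # PyYAML


def convert(yaml_dir: str, out_file: str) -> None:
    entries = []
    for path in sorted(pathlib.Path(yaml_dir).glob("*.yml")):
        doc = yaml.safe_load(path.read_text())
        # Re-group each YAML case into the legacy "given"/"cases" shape.
        for case in doc.get("cases", []):
            entries.append({
                "given": case.get("given", {}),
                "cases": [{
                    "expression": case["expression"],
                    "result": case["result"],
                }],
            })
    pathlib.Path(out_file).write_text(json.dumps(entries, indent=2))


if __name__ == "__main__":
    convert("tests/functions", "functions.json")
```

An implementation could run something like this once after cloning and keep pointing its existing runner at the generated JSON.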
> I'm sure this will evolve over time so I wanted to start with as lightweight a process as possible.
I will also backfill the existing JEPs to this repo in a separate PR.
Unresolved Issues
There are still a few things we need to figure out in the overall process, but I don't think they need to block reviewing JEPs.
Interested in hearing other ideas people may have.