[Typescript-Node] Mark deprecated model attributes #19756
Conversation
Intent (adding @deprecated support) of the pull request LGTM
samples/client/petstore/typescript-node/default/.openapi-generator/FILES
public void testApisGeneration() throws IOException {
    File output = Files.createTempDirectory("typescriptnodeclient_").toFile();
    output.deleteOnExit();

    final CodegenConfigurator configurator = new CodegenConfigurator()
            .setGeneratorName("typescript-node")
            .setInputSpec("src/test/resources/3_0/typescript-node/SampleProject.yaml")
            .setOutputDir(output.getAbsolutePath().replace("\\", "/"));

    final ClientOptInput clientOptInput = configurator.toClientOptInput();
    DefaultGenerator generator = new DefaultGenerator();
    List<File> files = generator.opts(clientOptInput).generate();
    files.forEach(File::deleteOnExit);

    TestUtils.assertFileContains(Paths.get(output + "/api/basicApi.ts"),
            "* Sample project");
    TestUtils.assertFileContains(Paths.get(output + "/api/basicApi.ts"),
            "export class BasicApi {");
}

@Test
public void testModelsGeneration() throws IOException {
    File output = Files.createTempDirectory("typescriptnodeclient_").toFile();
    output.deleteOnExit();

    final CodegenConfigurator configurator = new CodegenConfigurator()
            .setGeneratorName("typescript-node")
            .setInputSpec("src/test/resources/3_0/typescript-node/SampleProject.yaml")
            .setOutputDir(output.getAbsolutePath().replace("\\", "/"));

    final ClientOptInput clientOptInput = configurator.toClientOptInput();
    DefaultGenerator generator = new DefaultGenerator();
    List<File> files = generator.opts(clientOptInput).generate();
    files.forEach(File::deleteOnExit);

    TestUtils.assertFileContains(Paths.get(output + "/model/group.ts"),
            "export class Group {");
}
I think if these were broken we'd notice it in other ways, too, so I don't think these tests cover anything new?
There was no test that generated the client code and tested the creation of models, so I have added one.
Sorry, what I meant was that there is a whole bunch of generated samples with files checked in. If code were broken in a way that removed these models, it would either show as a diff (removals) in a pull request or the build would fail.
files.forEach(File::deleteOnExit);

TestUtils.assertFileContains(Paths.get(output + "/model/group.ts"),
        "* @deprecated");
I see a few issues with this:
- The mustache template content is now bound to this particular unit test
- The match is quite imprecise: if I put the token @deprecated anywhere in any description of any variable of this model, the match here would be truthy

I think the sample output below should be sufficient?
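To illustrate the imprecision point, a tighter assertion could anchor the tag to the doc block of the specific property instead of matching it anywhere in the file. A minimal, self-contained sketch; the file content and the property name 'name' are hypothetical stand-ins for the generated model/group.ts:

```java
import java.util.regex.Pattern;

public class DeprecatedMarkerCheck {
    public static void main(String[] args) {
        // Illustrative generated model content (stand-in for model/group.ts).
        String generated =
            "export class Group {\n" +
            "    /**\n" +
            "    * @deprecated\n" +
            "    */\n" +
            "    'name'?: string;\n" +
            "}\n";

        // Require @deprecated to sit in the doc block that immediately
        // precedes the 'name' property, rather than anywhere in the file.
        Pattern precise = Pattern.compile(
                "@deprecated\\s*\\n\\s*\\*/\\s*\\n\\s*'name'\\?");
        System.out.println(precise.matcher(generated).find()); // prints true
    }
}
```

A pattern like this would stay truthy only while the marker is attached to the intended attribute, at the cost of coupling the test even more tightly to the template output.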
Indeed not the best approach, but this is similar in most generators, where tests search for patterns. The alternative is maybe what was done with the Go Server Generator: adding a workflow that tests the generated samples and checks the specific file.
What do you think?
Moving it into a workflow means that local development would not pick up on it anymore; the feedback loop would get much slower.
I am suggesting to remove these tests altogether and rely on the generated sample output and the assertion about them being up-to-date. See #19756 (comment)
"rely on the generated sample output"

Without a test? I am not sure I understand.
I think we need a test, either in TypeScriptNodeClientCodegenTest or in a workflow verifying the samples (those tests can be run locally, not only by the workflow).
@macjohnny can correct me if I am wrong, but our current setup is using the generated samples as a form of snapshot test.
A diff in them will either show up in the pull request or fail on CI.
Think of it like this:
- You introduce the new logic to add a deprecated marker to generated code
- In the future someone else makes a PR that accidentally changes that logic, so the markers get lost
- Their PR would fail CI
- They would either fix the logic or regenerate the samples
- If they regenerate the samples it would show as a diff
- A reviewer would decide whether the removal of the marker is intended, or not
- If not, they'll prompt the code to be updated to not remove the markers, possibly with a reference to this pull request
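The snapshot flow above can be exercised locally as well; a rough sketch, assuming the repository's generate-samples script and a typescript-node config under bin/configs/ (the exact config filename is illustrative):

```shell
# Regenerate the affected sample (config filename is an assumption).
./bin/generate-samples.sh bin/configs/typescript-node.yaml

# If a code change silently dropped the @deprecated markers, the
# regenerated sample now differs from the checked-in snapshot, so a
# clean diff means the samples are up to date:
git diff --exit-code samples/client/petstore/typescript-node/ \
  || echo "sample drift detected"
```

This is essentially what CI does: a non-empty diff fails the check until the PR includes the regenerated samples.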
@joscha @gcatanese often automated tests are implemented as integration tests that run the generated code and make certain assertions, like (line 70 in 90bc100):

it(`should add a pet`, waitForAsync(() => {

In some cases, unit tests like https://github.com/OpenAPITools/openapi-generator/blob/master/modules/openapi-generator/src/test/java/org/openapitools/codegen/typescript/fetch/TypeScriptFetchClientCodegenTest.java can make very specific assertions that are either not covered by the generated samples (one needs to balance the number of different samples vs. covering each variation in configuration) or are hard to catch, and thus merit a specific assertion in a unit test.
so usually adding a unit test is a good thing
TIL. Would it make sense to fail on any diff that is produced by generating from anything under config/?
You're saying the diff produced in the generated samples would not fail the CI checks if you didn't update them?
No, the CI will fail if the samples are outdated: if the changed code produces a diff in the samples and the PR does not include that diff, the CI fails. So when the diff is added to the PR, the reviewer needs to manually check whether it is fine. This does not mean the code would actually compile, and sometimes a reviewer might overlook a specific aspect (especially if the diffs are large), which is why additional assertions in unit tests can make sense.
Got it. I personally think the generated fixtures are sufficient. Unit-test string.contains() assertions seem quite fragile in comparison.
Thank you @joscha for the great feedback, and @macjohnny for following it up. I think the PR is good to go; if not (sorry, I got lost in the threads and comments), please let me know.
Add @deprecated tag when processing deprecated attributes in the OpenAPI file

PR checklist
Commit all changed files.
This is important, as CI jobs will verify all generator outputs of your HEAD commit as it would merge with master.
These must match the expectations made by your contribution.
You may regenerate an individual generator by passing the relevant config(s) as an argument to the script, for example ./bin/generate-samples.sh bin/configs/java*.
IMPORTANT: Do NOT purge/delete any folders/files (e.g. tests) when regenerating the samples, as manually written tests may be removed.
master (upcoming 7.x.0 minor release - breaking changes with fallbacks), 8.0.x (breaking changes without fallbacks)
@TiFu (2017/07) @taxpon (2017/07) @sebastianhaas (2017/07) @kenisteward (2017/07) @Vrolijkx (2017/09) @macjohnny (2018/01) @topce (2018/10) @akehir (2019/07) @petejohansonxo (2019/11) @amakhrov (2020/02) @davidgamero (2022/03) @mkusaka (2022/04) @joscha (2024/10)