
test(fixtures): add fixtures testing framework #221

Merged
merged 5 commits into from
Oct 20, 2018

Conversation

moorereason
Contributor

Create a basic framework for adding test fixtures. The workflow is to
add .adoc files to the fixtures folder with corresponding .html files to
act as the "golden" HTML output. Include files are in the includes
sub-folder so as to differentiate them from the main fixture files.

Failed tests show a diff against the golden output.

Tests are currently hidden behind a "fixtures" build tag. Use
"-tags=fixtures" when running tests to activate.

The existing test fixtures are mostly taken from the asciidoctor test
fixtures. The golden files were generated with asciidoctor v1.5.7.1
using the "-s" option.

Updates #79
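A minimal sketch of how such a golden-file workflow can be wired up (function names like `goldenName` and `loadPairs` are illustrative, not the PR's actual code; the PR additionally guards the test file with a `fixtures` build constraint so it only compiles under `-tags=fixtures`):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// goldenName maps a fixture source file to its expected "golden" HTML output,
// e.g. "strong.adoc" -> "strong.html".
func goldenName(adoc string) string {
	return strings.TrimSuffix(adoc, ".adoc") + ".html"
}

// loadPairs returns (input, golden) file pairs for every .adoc file in dir.
// Include files live in an includes/ sub-folder, so a top-level glob
// naturally skips them.
func loadPairs(dir string) ([][2]string, error) {
	matches, err := filepath.Glob(filepath.Join(dir, "*.adoc"))
	if err != nil {
		return nil, err
	}
	pairs := make([][2]string, 0, len(matches))
	for _, m := range matches {
		pairs = append(pairs, [2]string{m, goldenName(m)})
	}
	return pairs, nil
}

func main() {
	fmt.Println(goldenName("strong.adoc")) // strong.html
}
```

Each test then renders the `.adoc` input and compares the result against the content of the matching `.html` golden file.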

The previous .gitignore file was ignoring all .html files.
Member

@xcoulon xcoulon left a comment


Thanks for this awesome contribution, @moorereason! With 7/8 tests failing, there's room for more fixes :)

One comment: the tests seem to run in debug mode, so the output contains stats about the grammar rules that were used. I believe we don't need them for these tests, so we should find a way to run with a log level at info or higher.

fixtures_test.go Outdated
require.NoError(GinkgoT(), err)

for _, input := range matches {
Context("["+input+"]", func() {
Member


Do we really need to use a Context here? If we pass the .adoc file name directly into the It func, the output will be easier to read, IMO.

Contributor Author


Good idea. 👍 I just tested the removal of the Context, and it works.

Some background: I went through many iterations of changes trying to get Ginkgo to do what I wanted in that for-loop. The key was to save a local copy of input. After I got that working, I didn't think to try removing the Context.
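The loop-variable issue described here is the usual Go closure gotcha: in Go versions before 1.22, every closure created inside a `for ... range` loop captures the same loop variable, so all of them observe its final value unless a local copy is saved. A stripped-down illustration (the `captureNames` helper is hypothetical, not code from the PR):

```go
package main

import "fmt"

// captureNames builds one closure per input (as the Ginkgo for-loop builds
// one spec per .adoc file), then invokes them all and returns what each
// closure observed.
func captureNames(matches []string) []string {
	var specs []func() string
	for _, input := range matches {
		input := input // save a local copy so each closure captures its own value
		specs = append(specs, func() string { return input })
	}
	out := make([]string, 0, len(specs))
	for _, spec := range specs {
		out = append(out, spec())
	}
	return out
}

func main() {
	fmt.Println(captureNames([]string{"a.adoc", "b.adoc", "c.adoc"}))
}
```

Without the `input := input` line, every spec would report the last file in the slice.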

The target will run ginkgo with the "fixtures" build tag and focus only
on specs with descriptions containing "fixtures".
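Such a target might look something like this (the target name is illustrative; ginkgo's `-tags` and `-focus` flags are real):

```make
.PHONY: test-fixtures
test-fixtures:  ## run only the fixtures specs, compiled with the fixtures build tag
	ginkgo -tags=fixtures -focus=fixtures .
```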
Use the existing go-diff package and set logging level to Fatal during
fixtures tests.
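The actual change relies on the existing go-diff dependency and silences logrus (e.g. via `logrus.SetLevel(logrus.FatalLevel)`). As a dependency-free stand-in for the diff step, here is a sketch of a line-level golden comparison that reports the first mismatching line (`firstDiff` is a hypothetical helper, not the PR's code):

```go
package main

import (
	"fmt"
	"strings"
)

// firstDiff compares actual output against the golden content line by line
// and returns the 1-based number of the first differing line, or 0 if the
// two strings are identical.
func firstDiff(golden, actual string) int {
	g := strings.Split(golden, "\n")
	a := strings.Split(actual, "\n")
	n := len(g)
	if len(a) > n {
		n = len(a)
	}
	for i := 0; i < n; i++ {
		var gl, al string
		if i < len(g) {
			gl = g[i]
		}
		if i < len(a) {
			al = a[i]
		}
		if gl != al {
			return i + 1 // first mismatch, 1-based
		}
	}
	return 0
}

func main() {
	fmt.Println(firstDiff("<p>a</p>\n<p>b</p>", "<p>a</p>\n<p>B</p>")) // 2
}
```

go-diff produces a richer character-level diff, but the failure-reporting idea is the same: show where rendered output departs from the golden file.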
@xcoulon xcoulon merged commit 1b1008c into bytesparadise:master Oct 20, 2018
@moorereason moorereason deleted the iss79 branch October 30, 2018 13:27