Consider a refactoring for the tests #80

Open
javagl opened this issue Jun 15, 2022 · 0 comments
javagl commented Jun 15, 2022

The approach for the tests in wetzel is explained in a comment in test.js: Based on a set of example schemas, the tests automatically generate the property reference in different configurations and compare the resulting files with the "golden" output that is checked into the repository. The exact inputs and parameterizations are summarized in index.json.

The example schemas for the tests are apparently based on glTF. But additional features have been added to some of these schema files, and it is not always clear which aspects of the schema files are supposed to cover which functionality. One specific example: image.schema.json is largely a glTF image, but contains a test for fractions. I think it could make sense to break these tests down into smaller pieces that have a "semantic meaning". For example, these 'fractions' could be tested with a dedicated fractions.schema.json, as sketched below.
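A minimal sketch of what such a dedicated fractions.schema.json could look like (assuming that the 'fractions' test is about non-integer numeric constraints; the schema version, property names, and concrete values here are purely illustrative):

```json
{
    "$schema": "http://json-schema.org/draft-04/schema",
    "title": "fractions",
    "type": "object",
    "description": "Schema that only covers the documentation of properties with fractional (non-integer) numeric constraints.",
    "properties": {
        "ratio": {
            "type": "number",
            "description": "A hypothetical property with fractional bounds.",
            "minimum": 0.25,
            "maximum": 0.75
        },
        "step": {
            "type": "number",
            "description": "A hypothetical property with a fractional multipleOf.",
            "multipleOf": 0.125
        }
    },
    "additionalProperties": false
}
```

The top-level description then directly states which aspect of the generation the schema is supposed to exercise.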

Other aspects that could be covered with dedicated tests include:

  • circular references
  • nested type definitions
  • the handling of additionalProperties
  • details about strings (patterns, lengths, formats)
  • Maybe important: subtle differences between JSON Schema versions. For example, the meaning of minimum/exclusiveMinimum changed after Draft 04 - see Range for details; a sketch of the difference follows this list
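As an illustrative sketch of that last point (these snippets are not taken from the existing test schemas): the constraint "the value must be greater than 0" is expressed differently in Draft 04 and in later drafts, and the generator has to distinguish the two forms:

```json
{
    "description": "Draft 04: 'exclusiveMinimum' is a boolean flag that modifies 'minimum'",
    "type": "number",
    "minimum": 0,
    "exclusiveMinimum": true
}
```

```json
{
    "description": "Draft 06 and later: 'exclusiveMinimum' is itself the numeric limit",
    "type": "number",
    "exclusiveMinimum": 0
}
```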

The advantage would be that each of these schemas can be documented via its description, clearly explaining which aspect of the schema is tested there, and that it would be easier to apply specific changes or to add and test specific new functionality.

There are some questions that will certainly come up, either in this process or in the medium term in general:

  • Which parts of JSON Schema are supposed to be supported in the first place?
  • What should the generated documentation look like, exactly?
  • To what extent should details of the generated result be configurable (e.g. via CLI parameters)?

But I think that a few first steps toward creating such a set of example schemas could be taken independently.

(Note: All of this could be done as a pure addition to the current tests. But we might as well try to "clean up" the current schemas so that they more closely resemble the relevant parts of the current glTF schema, and use this part as a more coarse-grained "integration test". These changes would only affect the test schemas, and not any part of the actual documentation generation code, just to avoid regressions.)
