Using the interactive exploratory tests
Interactive test files provide flexibility and ease of use that is useful for experimental gap-analysis testing.
These tests allow you to experiment, quickly create tests, document them, and record the results. They are particularly useful for tests that are likely to produce slightly different results across a number of languages, or for testing features that involve many permutations in the input data. They are also useful simply for exploring what might happen in certain situations.
See the list of test results and tests for the currently available test pages, test results pages, and repositories.
See an example of a test. You can start with an empty test page, or modify an existing test.
When you open a test page, add whatever text you need to the uppermost text input box. When you click on Go, the text is displayed in one or more boxes below with orange borders.
Controls at the top of the page allow you to set various parameters, including the dimensions of the test box. Controls lower down are typically associated with CSS properties, and allow you to set various property values on the text in the orange box(es).
If you have created a test and want to point people to it, or record it in a GitHub repository, click on the Take a snapshot button at the bottom of the page. This produces a URL which you can copy wherever you like. Anyone who follows the URL will see the page exactly as it was when you took the snapshot.
Head to the list of test results and tests to find tests.
You can look them up in the GitHub repositories, where you can filter and sort the tests using labels to home in on what you are interested in. There are repositories for each of the major sections in the Language Enablement Index.
Alternatively, you can start from one of the test results pages. There is one such page for each of the subsections in the Language Enablement Index. These pages provide an easy-to-read overview of results, and also link to the more detailed test information stored in the GitHub issues.
If you have created a test that is worth preserving, follow these steps.
Go to one of these repositories and create a new issue. There is one repository for each of the major sections in the Language Enablement Index.
The format for describing a test in an issue is very simple, and the template provides guidance. It involves:
- an assertion of what is expected to happen
- a link to the test
- instructions for how to conduct the test and what to look for, and
- a brief description of the results for major browsers.
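Each repository defines the exact template wording, so the sketch below is only a hypothetical illustration of the four parts listed above; the placeholder text in angle brackets is an assumption, not the actual template.

```markdown
**Assertion**: <what you expect to happen, e.g. a statement about how lines should break>

**Test**: <the snapshot URL for your test>

**Instructions**: <how to run the test, and what to look for in the orange boxes>

**Results**:
- Firefox: <pass/fail, with a brief note>
- Chrome: <pass/fail, with a brief note>
- Safari: <pass/fail, with a brief note>
```

Replace each placeholder before submitting the issue; the template in the repository you choose may name or order these sections differently.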
W3C staff or editors will then review the test and assign labels to it. Once the labels are applied, the test will appear in the appropriate test results page.
If you need help with any of this, ask the W3C staff; they'll try to do it, or find someone who can.