Testing of Web Components
Contributions that include enhancements or new features should include tests that verify the enhancement or new feature works as expected. As general advice, follow Test-Driven Development (TDD) and Don’t Repeat Yourself (DRY) practices when preparing a contribution.
Importance of Testing
Adequate test coverage is what enables maintainers to refactor and enhance the product. The tests guarantee that the product still works as before and that new contributions don’t re-introduce issues that have already been fixed. The success of a refactor depends largely on the quality of the existing tests.
Preparations
Before writing a new test, start by familiarizing yourself with the existing tests. Those are located in the test/ folder.
The instructions for running and debugging the tests are in the README file of the corresponding repository; see, for example, the instructions for running the text field web component’s tests.
As for the technologies (testing framework, libraries, and so on), those can change in the future. The best way to track what’s used at the moment is to go through the repository’s README file and its metadata file (for example, package.json) to analyze the dependencies and scripts used for testing.
When creating a new test, there’s no need to run all the tests each time. You can isolate the test case during development and run it together with the other tests at the end.
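For instance, mocha (listed below among the tools in use) supports isolating a single test with .only:

```js
// While developing, run only this test; remember to remove `.only` before committing
it.only('should not accept invalid characters', () => {
  // ...
});
```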
At the moment, the following are used:

- mocha as the testing framework
- fixture from open-wc for rendering the component
- sinon for stubs and mocks
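As a minimal sketch of how these fit together (the component tag, event name, and assertion here are illustrative):

```js
import { expect, fixture, html } from '@open-wc/testing';
import sinon from 'sinon';

describe('vaadin-text-field', () => {
  let field;

  beforeEach(async () => {
    // fixture renders the element and waits for it to be ready
    field = await fixture(html`<vaadin-text-field></vaadin-text-field>`);
  });

  it('should fire value-changed when the value is set', () => {
    const spy = sinon.spy();
    field.addEventListener('value-changed', spy);
    field.value = 'foo';
    expect(spy.calledOnce).to.be.true;
  });
});
```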
Selecting a File
Files in the test/ folder are logically named and separated by the topic categories they cover. Select the file whose name matches the category the contribution is targeting.
For example, implementing the aria-describedby attribute for text-field based components requires tests to be added to test/accessibility.test.js.
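A hypothetical sketch of what such a test could look like (the inputElement accessor and the exact assertion are assumptions):

```js
// test/accessibility.test.js: hypothetical example
import { expect, fixture, html } from '@open-wc/testing';

it('should set aria-describedby on the input when invalid', async () => {
  const field = await fixture(html`
    <vaadin-text-field error-message="Required" invalid></vaadin-text-field>
  `);
  // `inputElement` is assumed to point at the native input
  const input = field.inputElement;
  expect(input.hasAttribute('aria-describedby')).to.be.true;
});
```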
If none of the files suits the context of the contribution, it’s possible to create a new file. Make sure that the tests in the newly created file pass.
Visual Tests
If a change affects the visual representation of the component, a visual test can be added. Those can be found under test/visual.
- Review the existing test files and construct a new one based on them.
- Update the test configuration file (for example, vaadin-text-field/test/visual/test.js) to include the new test, and run it for the existing themes (for example, Lumo and Material); a sketch follows this list.
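As an illustrative sketch only: assuming the repository uses gemini for its visual tests, registering the new test for both themes could look roughly like this (the suite name, URL, and selector are hypothetical):

```js
// vaadin-text-field/test/visual/test.js: hypothetical addition
gemini.suite('text-field', (rootSuite) => {
  ['lumo', 'material'].forEach((theme) => {
    rootSuite.suite(theme, (suite) => {
      suite
        .setUrl(`default.html?theme=${theme}`)
        .setCaptureElements('#text-field')
        .capture('default');
    });
  });
});
```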
If needed, open a discussion in the pull request to ask the maintainers to update the screenshot(s).
Note: Visual Tests Aren’t Mandatory
At the moment there is no way to update visual test screenshots without an account for, and access to, the automated testing platform used in the tests. Writing visual tests can therefore be omitted.
Not Reinventing the Wheel
It’s a good practice to check the existing tests for the behaviour that needs to be reproduced in the new test. For example, looking through the existing files, or searching for the keyword keydown in the web components’ tests, leads to mock-interactions usages for pressing specific keys.
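For instance, mock-interactions exposes helpers such as keyDownOn for dispatching keyboard events on an element:

```js
import { keyDownOn } from '@polymer/iron-test-helpers/mock-interactions.js';

// Simulate pressing Escape (key code 27) on the input element
keyDownOn(input, 27, [], 'Escape');
```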
Some components have common helpers exposed (for example, vaadin-combo-box/test/helpers.js). Following the DRY principle, all logic used in multiple files ends up in one file, and new logic can be added to it as needed.
In case the same logic is reused in multiple tests within the same test file, it can be exposed as a function (for example, scrollToIndex in the combo-box dynamic size tests).
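As a hypothetical illustration, repeated logic inside one file can be exposed like this:

```js
import { keyDownOn } from '@polymer/iron-test-helpers/mock-interactions.js';

// Hypothetical helper shared by several tests within the same file
function arrowDown(target) {
  keyDownOn(target, 40, [], 'ArrowDown');
}
```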
Test the Use Case
Start by going through the requirements for the feature or the description of the bug.
When writing a test, consider the use case itself rather than the implementation details. That comes naturally if the TDD principle is followed.
For example, say a contribution provides a field component with a feature to prevent invalid input. During development, a boolean property preventInvalidInput is added to control whether invalid characters can be typed into the field’s input part. The test created shouldn’t be testing the toggling of the preventInvalidInput property, but instead the use case of a user typing invalid characters into the field, and that those aren’t accepted by the field.
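A hypothetical sketch of such a use-case-driven test; how the component observes typing, and the inputElement accessor, are assumptions here:

```js
import { expect, fixture, html } from '@open-wc/testing';

it('should not accept characters rejected by the pattern', async () => {
  const field = await fixture(html`
    <vaadin-text-field prevent-invalid-input pattern="[0-9]*"></vaadin-text-field>
  `);
  const input = field.inputElement; // assumed accessor for the native input
  // Simulate the user typing a non-numeric character
  input.value = 'a';
  input.dispatchEvent(new Event('input'));
  // The use case: the invalid character isn't accepted by the field
  expect(field.value).to.equal('');
});
```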
Test Coverage
Make sure that test coverage isn’t decreased by the contribution. The easiest way to check is to remove the lines of the contribution one by one and verify that there is a corresponding test failure for each of them.
Avoid Asynchronous Tests
If possible, try to avoid asynchronous tests: it simplifies reviewing the test and helps future contributors. If that’s impossible, the delays should be explained in comments and magic numbers avoided (for example, a timeout of exactly 300 milliseconds).
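For example, the open-wc helpers make it possible to wait for a concrete event instead of an arbitrary timeout (the component and event names are illustrative):

```js
import { fixture, html, oneEvent } from '@open-wc/testing';

it('should notify when opened', async () => {
  const comboBox = await fixture(html`<vaadin-combo-box></vaadin-combo-box>`);
  // Start listening before triggering, then await the concrete event;
  // no magic-number timeout is needed
  const opened = oneEvent(comboBox, 'opened-changed');
  comboBox.open();
  await opened;
});
```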
Browser Differences
It’s important to keep in mind that browser behaviour can differ, so it’s a good habit to check that the test passes on all supported browsers. To see the list of supported browsers, check the release notes or the docs for that version.