- Published:
- By: Colin Tester
Adding the AVA test runner to a project which uses ES6 modules.
Setting up a JavaScript testing environment can be a complex process; there is much to learn and numerous tools to load and configure manually. AVA – the JavaScript test runner – offers a much simpler test API with minimal setup.
In this post I wish to demonstrate how simple – relatively speaking – it is to get AVA up and running tests against an example ES6 module, which implements a simple interface to interact with a browser DOM.
In my project working folder I have two sub-folders, labelled src and tests, which hold, respectively, our JavaScript source file and test file.
- ./src/example-module.js
- ./tests/example-module-test.js
Installing AVA and supporting dependencies.
AVA is a test runner for Node.js so we will install the required modules using npm. I am assuming that npm is installed already and initialised for this project.
We want to install AVA and Babel as our project dev dependencies, with additional support for compiling our ES6 source files.
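Assuming the package names used throughout this post, the install command might look like:

```shell
npm install --save-dev ava @babel/core @babel/preset-env @babel/register
```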
Babel configuration.
With the AVA and Babel modules installed, we will next create a .babelrc file and save it in the project’s root folder. Within the .babelrc file we will set the preset that Babel should use when compiling our source files.
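A minimal .babelrc using the preset installed earlier might look like this:

```json
{
  "presets": ["@babel/preset-env"]
}
```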
AVA configuration.
AVA operational options can be set within the project’s package.json file. We want to inform AVA where our test files and source files are located.
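For example (the glob patterns are assumptions based on the folder layout above):

```json
{
  "ava": {
    "files": ["tests/**/*-test.js"],
    "sources": ["src/**/*.js"],
    "failFast": true,
    "verbose": true
  }
}
```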
We also added, in the above code, further AVA options to make sure tests fail as soon as an error is encountered (failFast) and to produce a more verbose test result log (verbose).
AVA does not, by default, compile ES6 source (src) files, only the test and helper files. To compile ES6 source files, we need to use Babel’s register module, which can also be added to the AVA configuration.
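A sketch of that addition, using the @babel/register package installed earlier:

```json
{
  "ava": {
    "require": ["@babel/register"]
  }
}
```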
However, when using Babel’s register module, it will compile not only our source files but also the test files, which is not what we want, nor is it necessary. We can tell Babel’s register to ignore test files via an external JavaScript file.
The external file (_babel-register.js) is to be stored within our ./tests folder, and the file name is prefixed with an underscore because AVA ignores files with that prefix. The _babel-register.js file will contain the code to load (require) the @babel/register module and set the ignore option.
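A minimal _babel-register.js might look like this:

```javascript
// ./tests/_babel-register.js
// Load @babel/register and tell it not to compile
// our test files or anything in node_modules.
require('@babel/register')({
  ignore: [/tests/, /node_modules/]
});
```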
We have now specified that @babel/register ignores all files in our tests folder and – for good measure – our node_modules folder.
Now, back in the AVA configuration in the package.json file, we can substitute our external register file as the module to be preloaded before AVA runs any tests.
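The require entry then becomes (the path follows the layout assumed above):

```json
{
  "ava": {
    "require": ["./tests/_babel-register.js"]
  }
}
```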
Browser testing.
AVA currently provides no support for running tests in a browser. However, we can mock browser globals such as window and document in order to test DOM integrations. That mocking is assisted by the npm browser-env module.
Firstly, let’s install the browser-env module.
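As with the other modules, it is installed as a dev dependency:

```shell
npm install --save-dev browser-env
```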
Next we create a helper file for the browser-env module and store it in our tests folder. Again, the helper file name is prefixed with an underscore so that AVA ignores the file.
Within the helper file ./tests/_browser-env.js include the following code to load (require) the browser-env module:
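A minimal version, calling browser-env with no options:

```javascript
// ./tests/_browser-env.js
require('browser-env')();
```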
By default browser-env will add all global browser variables to the Node global scope. However, this is generally considered a bad idea, so we can reduce the browser globals created by specifying, via an array, only what the tests need.
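For example:

```javascript
// ./tests/_browser-env.js
// Create only the browser globals the tests need.
require('browser-env')(['window', 'document']);
```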
In the above code, we have specified the required globals for window and document only.
Finally, within the package.json file we can require that AVA preloads our helper _browser-env.js file.
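The AVA require array now preloads both helper files:

```json
{
  "ava": {
    "require": [
      "./tests/_babel-register.js",
      "./tests/_browser-env.js"
    ]
  }
}
```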
We are now ready to run AVA. As we have installed AVA locally, relative to our project folder, we need to run it from its location within the node_modules folder. This can be done in one of two ways:
- via npx, which runs a locally installed npm module as an executable.
- from an npm script defined within the package.json file.
To run AVA directly from the command line with npx, type:
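```shell
npx ava
```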
We can also add the --watch option to our command, allowing AVA to watch for changes made to our source and test files and run the tests automatically.
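```shell
npx ava --watch
```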
Note: when running in watch mode, Ctrl + c will stop the watch process.
However, let’s follow an npm convention and also add a test script to our package.json file.
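For example:

```json
{
  "scripts": {
    "test": "ava"
  }
}
```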
We can then run the AVA tests from the command line simply with:
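```shell
npm test
```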
Currently, we have no tests, so let’s create our example source JavaScript and some tests for it.
Writing tests.
Create the file example-module.js and save it within the ./src folder. Add the following code to the file:
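The exact markup the module builds is an assumption for illustration – here, a div with an id of view:

```javascript
// ./src/example-module.js
const exampleModule = {
  // Build a simple element and insert it into the document body.
  setupView() {
    const view = document.createElement('div');
    view.id = 'view';
    view.textContent = 'Hello from exampleModule';
    document.body.appendChild(view);
  }
};

export default exampleModule;
```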
Our example module defines an object, exampleModule, with a single interface method, setupView, which creates an HTML element and inserts it into the DOM, within the browser’s document body element.
Next let’s create a test file named example-module-test.js and save it in the ./tests folder. Add the following code to the test file:
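The two starter tests below simply check the module’s shape; the test titles and assertions are my own:

```javascript
// ./tests/example-module-test.js
import test from 'ava';
import module from '../src/example-module';

test('module is an object', t => {
  t.is(typeof module, 'object');
});

test('module has a setupView method', t => {
  t.is(typeof module.setupView, 'function');
});
```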
We import AVA as test and our example module file as module. We also have two tests defined, and if we now run AVA again on the command line, both tests should pass.
We also need to test that our module’s browser DOM integration works. Via the browser-env module, which we set to preload before AVA runs any tests, we have access to a mocked browser document object. We can therefore check our module’s DOM integrations just as if the module were running in a browser.
Let’s add another test to our test file:
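A sketch of such a test; the selector is an assumption, and simply queries for the first div the module inserted:

```javascript
test('setupView inserts an element into the document body', t => {
  module.setupView();
  const view = document.querySelector('div');
  t.not(view, null);
});
```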
In the above test, we first call the module interface function setupView, and then attempt to find, in the DOM, the element our module built and inserted.
Run the AVA tests again from the command line:
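```shell
npm test
```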
As you should see, we now have three passing tests.
I hope this post is helpful in getting the AVA test runner set up for your project; please see the links below for more on AVA and the other supporting modules. I would also highly recommend learning about test-driven development (TDD) as a better way to plan and implement tests for your next project.
Links: