Have you been using a tried-and-true testing framework for years and wondered whether it might ever be worth it to migrate over to one of the scrappy newcomers? Maybe you’ve wished someone would write up a practical comparison or put together a simple how-to guide to ease the transition. This was exactly the situation in which we found ourselves on a team that had long used Nodeunit. We decided to take the jump and switched a newer project over to Jest. What follows is a practical guide born out of that experience.
This post was originally written as documentation to help developers on that team make the transition as smoothly as possible. As such, it will be particularly helpful if you are migrating or have migrated from Nodeunit to Jest. Regardless, there’s a lot to glean for anyone considering Jest or using it in any capacity.
So here you are: your team has switched from Nodeunit to Jest, and everything is different and frustrating. Have no fear! Jest has particularly helpful guides and documentation, and here I’ve put together a how-to guide addressing the specific hiccups you are most likely to encounter. Each of the following points answers the question: “Wait. How do I __?”.
The first thing you’ll notice when you start working with Jest is that the structure and syntax are very different. This is the easy part, though. Jest’s syntax is very natural and intuitive. Here’s how you would add a new test, first in Nodeunit as a familiar starting point for comparison and then in Jest.
In Nodeunit, we used
module.exports to define test suites. Starting with a top level index.js file, we’d kick off a chain of requires and exports that would make up our test suite. Adding a new test in Nodeunit looked like adding a method to an exported object within that chain.
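As a sketch, a minimal Nodeunit setup might have looked like this (the file names and the `createUser` helper are hypothetical, for illustration only):

```javascript
// index.js — the top of the require/export chain
module.exports = {
  users: require('./users')
};

// users.js — adding a new test means adding a method to this exported object
module.exports = {
  'creates a user': function (test) {
    var user = createUser('Ada');       // hypothetical function under test
    test.equal(user.name, 'Ada');
    test.done();                        // Nodeunit tests must signal completion
  }
};
```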
Jest, on the other hand, uses path matching to automatically require and test each file individually. By default, all the
*.js(x) files inside the
__tests__ directory and
*.spec.js(x) files anywhere in the repository are tested. So, to create a new test, all that is required is adding a file matching those patterns (example:
__tests__/testsAreHere.js) which contains at least one test.
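A minimal sketch of such a file (the object under test is hypothetical):

```javascript
// __tests__/testsAreHere.js — picked up automatically by Jest's default patterns
test('creates a user', () => {
  const user = { name: 'Ada' };         // stand-in for real code under test
  expect(user.name).toBe('Ada');
});
```

No top-level index file, no export chain; Jest finds and runs the file on its own.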
Grouping tests is helpful for organization, selective test runs, and consolidating setup and teardown operations. This is one of the largest practical differences between Nodeunit and Jest.
In Nodeunit, tests were grouped together simply by exporting them within the same parent object.
Jest groups tests together in two ways. The first is by simply including them in the same file together. The second is by using the describe() function to wrap related tests in a named block.
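For instance, a sketch of a named group (the group and test names are hypothetical):

```javascript
describe('user creation', () => {
  test('assigns a name', () => {
    expect({ name: 'Ada' }.name).toBe('Ada');
  });

  test('rejects empty names', () => {
    expect(() => { throw new Error('empty name'); }).toThrow('empty name');
  });
});
```

Hooks like beforeEach() declared inside a describe() block apply only to the tests in that block, which makes it a natural unit for shared setup.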
Here’s where things start to get dicey. Good thing you have this trusty guide!
In Nodeunit, this was easy because a
tearDown() in an exported object would be inherited by all tests contained within the same object, regardless of whether they were in different files.
In Jest, this one’s a bit trickier. You can’t rely on a hierarchical
module.exports for setup and teardown inheritance, because each file is tested individually in isolation from the rest. But you’re in luck! There are alternative methods. Here are a couple:
One potential method for sharing setup and teardown between files is by managing context and methods like beforeEach and afterEach in a separate file. Then you can require that file into your tests wherever it is relevant.
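Here’s a sketch of what such a shared file might look like. The file name and the shape of the context are hypothetical, assumed only for illustration:

```javascript
// testHelpers/context.js — shared setup/teardown logic, required by test files
let context;

function setup() {
  // Stand-in for expensive shared setup, e.g. opening a database connection
  context = { connected: true, records: [] };
  return context;
}

function teardown() {
  context = null;
}

module.exports = { setup, teardown };
```

Each test file can then wire these into Jest’s hooks, e.g. `const { setup, teardown } = require('../testHelpers/context'); beforeEach(setup); afterEach(teardown);`.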
You’ll want to make sure you ignore the above file by utilizing the testPathIgnorePatterns configuration option with a value like:
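Assuming the helpers live in a hypothetical testHelpers/ directory, the configuration might look like this. Note that setting this option replaces the default, so you’ll usually want to keep /node_modules/ in the list:

```json
{
  "jest": {
    "testPathIgnorePatterns": ["/node_modules/", "<rootDir>/testHelpers/"]
  }
}
```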
Jest provides a setupFiles configuration option which allows you to define files which will be run once per test file, before the tests in that file. This is a great option for when there are general setup or teardown tasks which should apply to all tests.
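For example, assuming a hypothetical setup file:

```json
{
  "jest": {
    "setupFiles": ["<rootDir>/testHelpers/globalSetup.js"]
  }
}
```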
Sinon is an excellent and robust tool for mocking, stubbing, and spying in your tests. I’ve found, however, that I no longer need it since Jest provides very similar functionality. Here’s a comparison of a simple stub implementation in both Nodeunit/Sinon and Jest.
You’ll notice that in the Nodeunit example below, I’m recreating the stub before and fully restoring it after each test, unlike in the Jest example, where I simply reset it before each test. This is because Nodeunit doesn’t have an equivalent to Jest’s
afterAll(). So I need to restore it after each test to prevent any subsequent test from getting a stubbed version of
toMock.thing() when it may expect the real deal.
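Here’s a sketch of that comparison, built around the toMock.thing() dependency mentioned above (the module paths are hypothetical):

```javascript
// Nodeunit + Sinon (sketch)
var sinon = require('sinon');
var toMock = require('./toMock');        // hypothetical module path

var stub;

module.exports = {
  setUp: function (done) {
    // re-create the stub before every test
    stub = sinon.stub(toMock, 'thing').returns('stubbed');
    done();
  },
  tearDown: function (done) {
    // fully restore after every test, since there is no afterAll()
    stub.restore();
    done();
  },
  'uses the stub': function (test) {
    test.equal(toMock.thing(), 'stubbed');
    test.done();
  }
};
```

```javascript
// Jest (sketch)
const toMock = require('./toMock');      // hypothetical module path

const stub = jest.spyOn(toMock, 'thing').mockReturnValue('stubbed');

beforeEach(() => stub.mockClear());      // just reset call history between tests

afterAll(() => stub.mockRestore());      // restore the real implementation once, at the end

test('uses the stub', () => {
  expect(toMock.thing()).toBe('stubbed');
});
```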
Require Subvert is handy when you need to stub a function that a module exports directly (which Sinon cannot do), rather than as a method on an object.
Here it becomes even more difficult to clean up after stubbing. Require Subvert provides a helpful
cleanUp() method, but there’s no easy way to clean up after all tests are done without subverting and cleaning up after each test. That starts to get messy, since it requires re-creating the stub, re-subverting the thing you want to mock, and re-requiring the module that uses the mocked thing before every single test. So in most cases we just left the subverted require subverted and hoped for the best.
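A sketch of that per-test dance, assuming a hypothetical module named exported-thing and a hypothetical module that consumes it:

```javascript
// Nodeunit + Require Subvert (sketch)
var sinon = require('sinon');
var requireSubvert = require('require-subvert')(__dirname);

module.exports = {
  setUp: function (done) {
    // re-create the stub, re-subvert, and re-require before every single test
    this.stub = sinon.stub().returns('stubbed');
    requireSubvert.subvert('exported-thing', this.stub);
    this.subject = requireSubvert.require('./usesExportedThing');
    done();
  },
  tearDown: function (done) {
    requireSubvert.cleanUp();
    done();
  },
  'uses the stubbed export': function (test) {
    test.equal(this.subject(), 'stubbed');
    test.done();
  }
};
```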
Using Jest, this complicated operation becomes very simple.
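A sketch of the same stub with jest.mock() and a module factory (the module names are hypothetical):

```javascript
// Jest (sketch)
jest.mock('exported-thing', () => jest.fn(() => 'stubbed'));

const exportedThing = require('exported-thing');
const subject = require('./usesExportedThing');

test('uses the stubbed export', () => {
  expect(subject()).toBe('stubbed');
  expect(exportedThing).toHaveBeenCalled();
});
```

No manual cleanup is needed: Jest’s module registry is reset between test files, so the mock never leaks outside this one.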
Jest can also handle your coverage reporting similarly to Istanbul. Simply add
--coverage to your test script in your package.json file.
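That might look like this in package.json:

```json
{
  "scripts": {
    "test": "jest --coverage"
  }
}
```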
Jest provides a number of methods for selectively running specific tests. Here are a few of them:
Assuming you have the following test:
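For example, a hypothetical test:

```javascript
test('adds two numbers', () => {
  expect(1 + 2).toBe(3);
});
```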
Then you can run just that test (and any other test with the same name) like this:
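Using the -t (--testNamePattern) flag, and assuming a test named “adds two numbers”:

```shell
jest -t "adds two numbers"
```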
You can also use regex patterns to run multiple tests with names that match the pattern.
Given you have the following tests:
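For example, a hypothetical group of tests:

```javascript
describe('math', () => {
  test('adds two numbers', () => {
    expect(1 + 2).toBe(3);
  });

  test('subtracts two numbers', () => {
    expect(3 - 2).toBe(1);
  });
});
```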
The following command will run all the tests inside a matching describe() group, since the name pattern is matched against each test’s full name, including the name of its enclosing describe() block.
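Assuming a hypothetical describe() group named “math”, the name pattern flag would run everything inside it:

```shell
jest -t "math"
```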
Well, if you’ve made it this far, you’ve survived! Congratulations! We’ve already seen how Jest provides improved syntax and functionality, but let’s get out of survival mode and look at how Jest goes beyond simply replacing our old tooling and improves our testing workflow in new ways.
On our particular project, we were sinking hours into managing exceptionally complicated JSON files which represented expected output. A relatively simple code change could set off a ripple of small required changes dispersed throughout a file, each needing to be tediously manually entered. Then we discovered Snapshot testing.
Snapshot testing allows you to easily manage and compare changes to expected output. Usually it’s used to ensure a user interface doesn’t change unexpectedly, but it’s not limited to markup. You can create snapshots of any serializable value. As I mentioned above, in our case, we used it to manage complicated JSON object output expectations. Using snapshots, Jest creates the expected objects automatically the first time the test is run. Every time the test is run after that, Jest will check to see if the newly generated snapshot matches the previously created one. If not, the test will fail and a helpful diff will be reported to the console. If the change is expected, simply run the test again with the
-u flag to update the snapshot.
To start using snapshots, simply use
expect(someValue).toMatchSnapshot() in your test. The first run creates a snapshot file in a __snapshots__ directory next to your test file, and every subsequent run is compared against it.
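A minimal sketch (the object here is a hypothetical stand-in for real output):

```javascript
test('builds the expected user object', () => {
  const user = { name: 'Ada', roles: ['admin'] };
  expect(user).toMatchSnapshot();
});
```

Commit the generated __snapshots__ files to version control so that snapshot changes show up in code review alongside the changes that caused them.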
If you’re working on a project with a large or time-consuming test suite, you may find the
-o flag useful. When run with this flag, Jest will identify tests which should be run based on which files have been changed in your current git working tree. This allows you to quickly test for regressions based on changes you’ve made locally.
You can also use the --watch flag to watch code for changes and rerun tests when a change is detected. Running Jest in watch mode also provides a convenient prompt for running all tests, selective tests, or re-running tests. Here’s the prompt you’ll see when you start watching:
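The prompt looks roughly like the following (the exact options vary by Jest version):

```
Watch Usage
 › Press a to run all tests.
 › Press o to only run tests related to changed files.
 › Press p to filter by a filename regex pattern.
 › Press q to quit watch mode.
 › Press Enter to trigger a test run.
```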
A brief Google search will reveal that performance has historically been an issue for Jest. After we migrated to Jest, we found it generally took about twice as long as the same suite running on Nodeunit. It’s a legitimate trade-off. For some, the extended feature-set and improved syntax Jest offers is worth the hit in performance. For others, especially those with already time-consuming test suites, it may not be. There are, however, some ways to mitigate the performance hit you’ll likely see when switching to Jest. Here are some recommendations:
The default test environment is jsdom, but if your project does not require a browser-like environment, you’ll see a significant increase in performance if you switch the test environment to “node”. You can do this by configuring jest in your package.json.
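In package.json, that looks like:

```json
{
  "jest": {
    "testEnvironment": "node"
  }
}
```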
By default, Jest looks for *.js(x) files in the __tests__ directory as well as *.spec.js(x) files anywhere in the repository. Checking every file against multiple patterns takes time, so since all of our tests followed a single pattern, we set testMatch to contain a single value.
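Assuming all of your tests are plain .js files inside __tests__ directories, the configuration might look like:

```json
{
  "jest": {
    "testMatch": ["**/__tests__/**/*.js"]
  }
}
```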
Jest stores a cache in the
/tmp directory which, once created, significantly speeds up subsequent runs. In CI builds without proper configuration, however, the cache directory will usually get wiped out with every build. To resolve this in our case, we moved the cache directory to
.tmp/jest_cache at the root of our project and configured Travis to preserve that directory in its own cache between builds.
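That setup might look roughly like this, first in package.json and then in .travis.yml:

```json
{
  "jest": {
    "cacheDirectory": ".tmp/jest_cache"
  }
}
```

```yaml
cache:
  directories:
    - .tmp/jest_cache
```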
In our experience, tests within CI builds occasionally fail with random SyntaxError exceptions like
SyntaxError: missing ) after argument list or
SyntaxError: unexpected token }. It appears this is caused by a race condition where Jest attempts to write a transformed file to cache from multiple child processes at the same time. If you run into this bug and simply restarting the build is not a reasonable option, the workaround for now is to run Jest with either the
--no-cache or the
--runInBand flag. There is, however, an active issue and an open pull request for this bug on GitHub. So hopefully we’ll see a fix soon.
UPDATE (2017-08-28): A new pull request was merged which should address this issue. Let’s hope for a fix in v20.0.5! In the meantime, you can test it out by requiring
jest@test in your project.
Although we experienced some minor pain along the way, overall our team’s transition to Jest was smooth and beneficial. The team enjoyed the new tools at their disposal and most of us agreed the syntax and organizational differences in Jest were a huge step up. If you’ve been on the fence about switching to Jest, maybe you ought to give it a whirl? This guide can help!