Around 2019 we replaced Mocha with Jest for testing Node.js applications in our back-end team to simplify our testing setup. Let us share what we learned and some of the hidden traps we fell into.
After using Mocha on practically all of our internal and open-source projects, we noticed some issues. The main defect from my perspective (I was one of the main critics) was the complexity that comes with the openness of the framework: it is just a test runner.
Apart from Mocha we needed:
- Istanbul for coverage reporting (istanbul at first, nyc later),
- Chai for assertions,
- and other extensions of these libraries that add more testing sugar or integrations, e.g. JUnit reporting for the Jenkins pipeline we used at the time.
Back then, Jest seemed almost too good to be true: not only could we replace the numerous dependencies with a single module, jest (well, plus a JUnit formatter, to be frank), but many things got drastically simpler. Generating code coverage, for example, was no longer a voodoo script copied from the previous project, but just a matter of adding --coverage (and setting the formatter). On top of that, everything was well documented on a glamorous website, and we quickly got used to Jest's gimmick feature: snapshots.
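That coverage setup, for the record, boiled down to roughly this kind of configuration (a sketch; the jest-junit reporter and the exact options shown here are illustrative, not necessarily what we used at the time):

// jest.config.js -- coverage plus JUnit output without any custom scripts
module.exports = {
  collectCoverage: true, // or pass --coverage on the CLI instead
  coverageReporters: ['text', 'lcov'],
  // jest-junit is a separate package; the output directory is illustrative
  reporters: ['default', ['jest-junit', { outputDirectory: 'reports' }]],
};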
In retrospect, I had been too harsh on Mocha and judged it too hastily at the prospect of a shiny replacement. After some (both joyful and dreadful) time spent with Jest, I think it is time to look back and call out some of its features that make our life difficult.
TypeScript aids
After adopting TypeScript we ran tests the same way we ran any other application code: build, then run the JavaScript. This bugged me a bit, because it was a tad slow (the TypeScript compiler was slower back then and lacked some optimizations that are available now, e.g. incremental builds), and stack traces and debugging required additional configuration to work properly.
We solved the problem with ts-jest, which compiles the TypeScript files on the fly and neatly solves both issues.
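The setup itself is tiny; something along these lines, based on the ts-jest preset:

// jest.config.js -- ts-jest compiles .ts test files on the fly
module.exports = {
  preset: 'ts-jest',
  testEnvironment: 'node',
};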
We did notice the warning in the docs:
We DO NOT use SemVer for versioning.
But it did not stop us. Take that warning with a pinch of salt: it is almost SemVer, except that the major version is kept in line with Jest's. Still, needing the versions to line up with the framework caused us numerous issues:
- Dependency bots do not know that the two versions need to be bumped together.
- If something breaks after an update, the cause can be in jest or in ts-jest, which makes debugging harder.
- TypeScript support keeps getting better, but there were gaps for certain files (you could write suites in TypeScript, but not detached setup files, configs, etc.).
- You can upgrade jest only once a matching ts-jest version is ready.
Living with ts-jest and Jest is just a little painful. It would be great to have a node module that is completely decoupled from Jest's release cycle to solve this for us.
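For illustration, the coupling you maintain by hand looks something like this in package.json (the majors below are just an example of an aligned pair):

{
  "devDependencies": {
    "jest": "^27.0.0",
    "ts-jest": "^27.0.0"
  }
}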
Snapshot testing
To be fair, snapshots are not the only way to write assertions in Jest, and Jest is not the only library you can use for snapshot testing. But people using Jest tend to use snapshots more, and snapshots tend to be used with Jest, so I think a warning against overusing them is still relevant.
This is roughly how my view of snapshot testing evolved over the two years or so of using it:
- Snapshots are the ultimate assertion strategy, changing how people write tests.
- Writing tests with snapshots is kind of lazy: you do not specify the rules that matter, you just preserve everything. It is an excellent tool for prototyping.
- Maintaining snapshot tests is extremely challenging. Updates in the system produce large diffs in the snapshots that are hard to validate and review.
- External snapshots are difficult to review (creations and updates alike). Use inline snapshots by default.
- Do not use snapshots unless you need to; prefer short inline snapshots with carefully selected attributes.
Snapshot testing can be a good tool for specific use cases. The problem is that you get the benefit up front (the tests are easy to write) and feel the pain much later down the road (maintaining a large system with large snapshot data, where it is hard to say whether a change breaks the system or merely changes it), so people tend to overuse it.
If you start using Jest, be conservative about adopting snapshot testing. Everything comes with a price.
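A minimal sketch of what that conservatism looks like in practice (the response object, its values, and the serialized form, which varies between Jest versions, are all illustrative):

// lazy: dumps the entire object into an external .snap file
expect(response).toMatchSnapshot();

// preferred: pick the few attributes that matter and keep the snapshot inline
expect({ id: response.id, status: response.status }).toMatchInlineSnapshot(`
  Object {
    "id": "user-1",
    "status": "active",
  }
`);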
Magical life cycle
This is an example from the docs that showcases the test life cycle:
beforeAll(() => console.log('1 - beforeAll'));
afterAll(() => console.log('1 - afterAll'));
beforeEach(() => console.log('1 - beforeEach'));
afterEach(() => console.log('1 - afterEach'));
test('', () => console.log('1 - test'));
describe('Scoped / Nested block', () => {
beforeAll(() => console.log('2 - beforeAll'));
afterAll(() => console.log('2 - afterAll'));
beforeEach(() => console.log('2 - beforeEach'));
afterEach(() => console.log('2 - afterEach'));
test('', () => console.log('2 - test'));
});
// 1 - beforeAll
// 1 - beforeEach
// 1 - test
// 1 - afterEach
// 2 - beforeAll
// 1 - beforeEach
// 2 - beforeEach
// 2 - test
// 2 - afterEach
// 1 - afterEach
// 2 - afterAll
// 1 - afterAll
If you skip some tests, however, Jest used to behave in mysterious ways.
beforeAll(() => console.log('1 - beforeAll'));
afterAll(() => console.log('1 - afterAll'));
beforeEach(() => console.log('1 - beforeEach'));
afterEach(() => console.log('1 - afterEach'));
test.skip('', () => console.log('1 - test'));
describe('Scoped / Nested block', () => {
beforeAll(() => console.log('2 - beforeAll'));
afterAll(() => console.log('2 - afterAll'));
beforeEach(() => console.log('2 - beforeEach'));
afterEach(() => console.log('2 - afterEach'));
test.skip('', () => console.log('2 - test'));
});
// jest@24 and prior
// 1 - beforeAll
// 2 - beforeAll
// 2 - afterAll
// 1 - afterAll
// jest@25 onward
// (no output)
Furthermore, the output stays the same even if you skip the whole describe block. I do not wish to blame Jest for bugs long since fixed, but this behavior taught us not to trust it with setup and tear-down when modifying suites with skip / only.
Only works only within test suite
Want to run a single test? With Mocha, add .only. With Jest, add .only and also update your test task so that it runs only the suite you want to test. Or check the CLI docs for the parameters that accept some pattern (hopefully a regex) matching the test names (to be honest, I am not sure what exactly it matches). Or just read this guide, which explains it thoroughly over a few pages until you reach the voilà at the very end.
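For the record, the workflow ends up looking something like this (the file path and test name are invented):

// runs only this test within its own file...
test.only('creates a user', async () => {
  // ...
});

// ...but Jest still runs every other test file, so you narrow the run
// from the CLI as well, e.g.:
//   npx jest src/user.test.ts -t "creates a user"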
It is a feature of Jest that it runs every suite independently, each in a new process, so it only discovers the .only once it has launched all the suites. But more on that in the next point.
Random by default
Running the suites independently allows Jest to parallelize the runs. If you use Jest for integration tests against a single resource, that might not be what you really want (a test listing resources from shared storage might encounter data from a different test suite). In that case you probably know about the --runInBand option, which runs tests serially.
If your tests do not perform a 100% cleanup, however, and are even a little sensitive to the shared resource, you might encounter random failures. Unless explicitly configured otherwise, your test files run in a seemingly random order, even with the run-in-band option. You can make the order deterministic by writing a custom TestSequencer, which of course (coming back to the previous point) you cannot write in TypeScript (or at least could not a few months back).
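For completeness, a deterministic sequencer is only a few lines; here is a sketch based on the documented @jest/test-sequencer API, in plain JavaScript because of the limitation above:

// test-sequencer.js -- sorts test files alphabetically by path
const Sequencer = require('@jest/test-sequencer').default;

class AlphabeticalSequencer extends Sequencer {
  sort(tests) {
    return Array.from(tests).sort((a, b) => (a.path > b.path ? 1 : -1));
  }
}

module.exports = AlphabeticalSequencer;

// jest.config.js: add { testSequencer: './test-sequencer.js' }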
Global setup like a box of chocolates
Running different tests in different processes takes another toll: global setup. You can have a module that sets up your environment (e.g. connects to the database and runs migrations). You can use a variety of options: setupFiles, setupFilesAfterEnv, or globalSetup. Can you guess which is used for what?
- globalSetup is run only once (before spawning processes) at the beginning and can be used e.g. for aborting tests based on some configuration.
- setupFilesAfterEnv can be used for setting up individual suites (e.g. starting a server for integration tests).
- I have never used setupFiles and I have no idea what it does.
Life is simply too short to know the setup configuration options for Jest.
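Still, for the record, the two options described above end up wired together roughly like this (the file names are illustrative):

// jest.config.js
module.exports = {
  // runs once per jest invocation, before any worker process is spawned
  globalSetup: './test/global-setup.js',
  // runs inside every worker, once per test file, before the suite itself
  setupFilesAfterEnv: ['./test/setup-suite.js'],
};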
Invisible output ink
Standard output from the tests usually makes it to your terminal, but sometimes it does not. There is a long-running GitHub issue about this, open since 2016 and still being commented on. At least twice (#6871, #3895) I have seen Jest release notes claim that the issue was fixed or improved upon, but it is still not reliable.
I agree that logging is not a great debugging tool, but that does not make it useless, nor is it an excuse for neglecting the issue for years.
Summary: Jest vs. Mocha
Jest offers an all-in-one solution for testing that makes it ideal for smaller projects and libraries. It is a good fit if you can leverage the parallel runs to cut test time without worrying about shared state, which is often the case in front-end development.
If your tests operate on a shared resource and you cannot afford to run them in parallel, you might be better off with an alternative. If you want to use Jest with TypeScript, consider avoiding ts-jest and its fragile version coupling, and transpile your tests instead.
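If you go that way, Jest's documentation describes transpiling TypeScript through babel-jest; a sketch (this strips types without type-checking them, and assumes @babel/preset-env and @babel/preset-typescript are installed):

// babel.config.js -- picked up by babel-jest, which ships with Jest
module.exports = {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }],
    '@babel/preset-typescript',
  ],
};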
Would we choose it again today as a replacement for Mocha for larger projects with integration API testing? I know I would not.