Test coverage with Karma, browserify and CoffeeScript

I'm having trouble adding test code coverage. I'm using Karma, and the files passed to Karma are already bundled with browserify, so in karma.conf.coffee it looks like this:
files: [
  { pattern: 'bin/public/client/app.js', served: yes, included: yes }
  { pattern: 'src/lib/vendor/angular-mocks/angular-mocks.js', served: yes, included: yes }
  { pattern: 'bin/tests.js', served: yes, included: no }
]
That works for running the tests, but not for coverage.
I'm using the karma-coverage npm package, with this:
preprocessors: 'bin/public/client/app.js': ['coverage']
reporters: ['progress','coverage']
This actually does create coverage stat files, but they are completely wrong, because the report marks code that browserify pulled in from node_modules as uncovered (naturally, I don't have tests covering those).
Ideally I would gather the source maps that browserify generates and run coverage against those, but browserify embeds the source maps into the .js files. Using karma-sourcemap-loader lets me see the original CoffeeScript files of the tests when debugging (for some reason it only works in Chrome Canary, but nevertheless it works).
I tried preprocessors: 'src/client/**/*.coffee': ['coverage'], but that yields no stats at all, just 'No data to display'.
Do you have any ideas?
upd:
I've figured it out by running the browserify-istanbul transform right after coffeeify, which gave me a nice coverage diagram.
Now I need to somehow remove app.js from it, because it really doesn't matter and only confuses things.
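For reference, the transform ordering described above might look roughly like this with the browserify Node API (the entry point, output path and options here are placeholders, not the actual build):

var fs = require('fs');
var browserify = require('browserify');

// hypothetical build script: compile CoffeeScript first, then instrument it
browserify('./src/client/tests.coffee', { extensions: ['.coffee'], debug: true })
  .transform('coffeeify')            // CoffeeScript -> JS
  .transform('browserify-istanbul')  // instrument the compiled JS for coverage
  .bundle()
  .pipe(fs.createWriteStream('bin/tests.js'));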
upd:
Oh, and instead of the compiled JavaScript I have to point the coverage preprocessor at the coffee files:
preprocessors: {
  'bin/tests.js': ['sourcemap']
  'src/client/**/*.coffee': ['coverage']
}

Seems I answered my own question. Also, there appears to be a bug in the current version of karma-coverage - it throws an error when coverageReporter.type is html (which it is by default). I'm glad I figured it out; it's always nice to see how much code is covered by tests.
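If anyone hits that karma-coverage error, one possible workaround (assuming the html reporter is what's broken) is to set the reporter type explicitly in karma.conf; a sketch in plain JS syntax:

coverageReporter: {
  type: 'lcov',     // or 'text-summary'; anything other than the default 'html'
  dir: 'coverage/'
}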

Related

A constructor from a node module I'm importing works when using Create React App, but errors in ParcelJS. What is going on?

I'm converting a project that was built using Create React App to use ParcelJS as a bundler instead. Strangely, a dependency that I imported during development (@twilio/voice-sdk) works fine in the CRA version of the application, but I get the following error when I try to invoke the constructor in the Parcel version:
TypeError: (this._options.AudioHelper || audiohelper_1.default) is not a constructor
The package is identical in both (v2.1.1, the latest). I'm importing it using ESM syntax, so:
import { Device } from '@twilio/voice-sdk'
I tried using CommonJS syntax (require) and it still didn't work. I've dug into the compiled code, and that seems to be where the issue is. I imagine there are a lot of differences, but one that I've noticed is here:
On the left is the code compiled by Create React App, which does seem to be exporting something more substantial than the Parcel output on the right - is the Parcel export just an empty object? If so, it's no wonder I'm getting a constructor error.
Unfortunately, no amount of googling and SO sleuthing has clarified what I could do to make ParcelJS transpile this dependency properly, if that's even the issue. I've tried to make the Babel config for ParcelJS match CRA more closely by adding the following to a babel.config.json:
{
"plugins": [
"#babel/plugin-transform-modules-commonjs"
]
}
But no luck. Any ideas on where to go from here, or is it time to switch to Webpack?
It looks like the Twilio package has a problem when used with Parcel 2: https://github.com/twilio/twilio-voice.js/issues/101

babel-jest doesn't handle ES6 within modules

I am trying to set up Jest on a React-based project which uses ES6 modules. However, I seem to be having issues with ES6 modules. I am using babel-jest and believe I have it set up properly (Jest detects it automatically).
Jest doesn't seem to have a problem with ES6 imports in the test file itself; however, as soon as it hits an import statement within one of the imported modules, it chokes. It's as if only the initial test script is being transpiled and not any of the imported modules. I have tried various configurations and searched Google with no luck. Running tests without any imports works fine.
Here is the error:
({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,global,jest){import Predications from './predications';
^^^^^^
SyntaxError: Unexpected token import
Here are the relevant bits of config:
jest.conf.json
{
"testRegex": "\/test\/spec\/.*\\.js$",
}
.babelrc
{
"presets": ["es2015", "stage-0", "react"]
}
Test script
import React from 'react';
import { mount, shallow } from 'enzyme';
import Slider from 'react-slick';
import Carousel from '../../client/components/carousel/carousel.js'; // test chokes when I include this module
describe('carousel component', () => {
it('is a test test case', () => {
expect(1 + 2).toEqual(3);
});
});
Update:
As suggested, I have tried running the tests without jest.conf.json; however, the testRegex is needed in order for Jest to find my tests. I tried moving the tests to the default test directory and they still fail.
I would like to clarify that the tests themselves run fine; the issue is that one of my imported modules uses ES6. In the example above, if I don't import my carousel component the test runs fine; as soon as I import it, the test chokes on the import statement within that file. It seems as though the imported modules are not getting transpiled.
Update #2
After some investigation it appears the issue is that Babel is not transpiling ES6 within node_modules. I have created an example repo to demonstrate this here: https://github.com/jamiedust/babel-jest-example
I understand that third-party modules should handle their own transpiling; however, we have a number of modules hosted on our own npm registry that are re-used between projects. In those cases Webpack handles the transpiling, but for the Jest tests we need these node_modules to be transpiled by Babel, or a way of leveraging our Webpack setup to do it for us.
Solution
Add the following config in package.json (or Jest config file).
"jest": {
"transformIgnorePatterns": [
"/node_modules/(?!test-component).+\\.js$"
]
}
By default any code in node_modules is ignored by babel-jest; see the Jest config option transformIgnorePatterns. I've also created a PR on your example repo, so you can see it working.
While this works, I've found it to be extremely slow in real applications that have a lot of dependencies containing ES modules. The Jest codebase takes a slightly different approach to this, as described in babel-jest transforming dependencies. It can also take much longer on Windows; see Taking 10 seconds on an empty repo.
If doing "unit" testing, mocking is probably the better way to go.
You could try adding the transform-es2015-modules-commonjs plugin to your babel config file for testing only. Here is an example config file which tells babel to transpile modules only when in a testing environment. You can put it underneath your presets:
{
"presets": [
"react",
["es2015", {"modules": false, "loose": true}]
],
"env": {
"test": {
"plugins": ["transform-es2015-modules-commonjs"]
}
}
}
You can read about the plugin here:
https://www.npmjs.com/package/babel-plugin-transform-es2015-modules-commonjs
Then, when running your Jest tests on the command line, specify NODE_ENV=test. (You may need to add the --no-cache flag the first time after changing the Babel config, because Jest caches Babel output, but after that you can leave it off.)
NODE_ENV=test jest --no-cache
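If you'd rather not type the environment variable every time, you could put it in a package.json script (a sketch; add --no-cache by hand on the first run after a config change):

"scripts": {
  "test": "NODE_ENV=test jest"
}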
I learned about this issue in a React seminar by Brian Holt at Frontend Masters. https://frontendmasters.com/courses/
I faced the same issue and followed these steps to resolve it:
1. Install babel-jest.
2. In your Jest config, add this transform:
transform: {
'^.+\\.js?$': require.resolve('babel-jest')
}
3. Make sure you have a babel.config.js present (your config might be different from the one provided below):
module.exports = {
"env": {
"test": {
presets: [
[
'@babel/preset-env',
{
targets: {
node: 'current',
},
},
],
]
}
}
};
I faced the same problem (a node_module not being transpiled by babel-jest) without being able to solve it that way.
Instead, I finally succeeded by mocking the node_module, as described here: https://facebook.github.io/jest/docs/manual-mocks.html
NB: setting mocks in __mocks__ subfolders did not work for me, so I passed the mock as the second parameter of the jest.mock() function. Something like:
jest.mock('your_node_module', () => {})
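For example, if the module exports a class that the code under test constructs, the factory can return a stub constructor instead. The module and class names below are made up for illustration:

// 'some-es6-module' and Widget are placeholders, not a real dependency
jest.mock('some-es6-module', () => ({
  Widget: jest.fn().mockImplementation(() => ({
    render: jest.fn()
  }))
}));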
Another possible cause: Babel now ignores your .babelrc inside node_modules and uses the one provided by the dependency. If you have control of the dependency, you would have to add a .babelrc to it, and Babel would use those settings for it.
This can cause problems, though, if your dependency and your project use different Babel versions or modules.
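In that case the dependency's own .babelrc might only need something like the following (a guess; the exact preset depends on which Babel version the dependency targets):

{
  "presets": ["@babel/preset-env"]
}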

How can I fix tests in Ember testem with errors such as 'could not load', 'failed', 'could not find module' or 'died'?

I managed to get a couple of EAK/grunt-based Ember apps upgraded to 1.11 with HTMLBars, and then got them migrated to Ember CLI/Broccoli. The unit tests were set up for the karma test runner, so I'm looking at how to get those running in the CLI projects now, but I didn't write the tests and really have no experience with unit testing JavaScript modules.
Searching around the internet, I can see that others have also used karma because of its coverage output and are trying to get it to work with Ember CLI, but Ember Core isn't supporting it, though they say anyone should be able to get it set up with a custom addon. I'm also trying the 'testem' runner to see what sticks with that.
The Ember site does have an 'automating tests with runners' page for v1.10, with sections on 'testem' and 'karma', but it doesn't appear for v1.11, so I can't tell from that site what is or isn't relevant. It seems like I should be able to work out a solution for the karma test runner, so I added the old devDependencies to the project package.json:
"karma": "^0.12.31",
"karma-chai": "~0.1.0",
"karma-chrome-launcher": "~0.1.2",
"karma-coverage": "~0.2.1",
"karma-firefox-launcher": "~0.1.3",
"karma-junit-reporter": "~0.2.1",
"karma-mocha": "~0.1.3",
"karma-phantomjs-launcher": "~0.1.2",
"karma-sinon-chai": "~0.1.5"
I also dropped the old 'karma.conf.js' (along with a few other karma confs) into the project and updated the paths inside (from 'vendor' to 'bower_components'). I did find an 'ember-cli-karma' node module and installed it, but it seems to just have a 'package.json'. It has no docs and looks like a stubbed-out starter project with no implementation. I also installed the 'karma', 'karma-cli' and 'testem' node modules.
The testem docs say to add your src and test files to 'testem.json', but without examples I don't know what that means. A list of every src and test file? With what path: relative, absolute? Forward slashes, backslashes? Preceded with / or ./ or ../? I just left them out because I think the system finds the src and tests by convention.
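For what it's worth, an Ember CLI-style testem.json usually doesn't list individual source files at all; it points at the built test page, roughly like this (a sketch, not your project's actual config):

{
  "framework": "qunit",
  "test_page": "tests/index.html?hidepassed",
  "disable_watching": true,
  "launch_in_ci": ["PhantomJS"],
  "launch_in_dev": ["PhantomJS", "Chrome"]
}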
When I run 'karma init' I get:
readline.js:529
this.line = this.line.slice(this.cursor);
^
TypeError: Cannot read property 'slice' of undefined
When I run 'testem' I get:
TEST'EM 'SCRIPTS!
Open the URL below in a browser to connect.
http://localhost:7357/aN;0faN;NaNf
...then the project's '../tests/index.html' loads in a browser, but is not able to 'find' any of the asset files (css, js) so nothing executes or renders correctly. I just see template expressions ({{content-for 'head'}}, etc).
When I run 'ember test' I get:
Building...BuildingBuilding.Building..Building...Built project successfully.
1..0
# tests 0
# pass 0
# fail 0
# ok
No tests were run, please check whether any errors occurred in the page (ember test --server) and ensure that you have a test launcher (e.g. PhantomJS) enabled.
When I run 'ember test --server' I get:
The test index.html loads in a browser with a test report. When I uncheck 'hide passed tests', the report indicates '29 passed, 28 failed'. It has 11 sections where a particular test may have three problems, such as 'could not load', 'failed', 'could not find module', 'attempting to register an unknown factory' or 'died'.
With this I'm obviously running testem and not karma, so I may as well work on getting testem working and figure out karma later. If there were more examples and migration troubleshooting docs, I might have a systematic way to work through some of these problems.
I ran into "No tests were run,..." problem recently after a node upgrade. I fixed it with a:
npm install -g phantomjs
This provides some additional options as well:
https://github.com/ember-cli/ember-cli/issues/3969
I had the Cannot read property 'slice' of undefined error on MS Windows, running via MSys2. I solved it by running karma init from an ordinary cmd prompt.

How can I instrument mocha and a code coverage tool with CoffeeScript?

Istanbul seems great - I've seen lots of awesomeness there. Even blanket seems pretty cool. But neither seems to play nicely with the rest of my setup, and when one does, I can't figure out how to get it working with grunt, and if I can, I'm still left with the problem of compiling the CoffeeScript.
For example, this post gives a great example and it seems great, but no grunt! Any help?
I'm using one grunt library for that. It's called Grunt Mocha Test.
It works well when your mocha tests are for a Node.js backend project. I hope my configuration helps you:
mochaTest:
  options:
    require:
      - 'coffee-script/register'
      - './test/mocha.coffee'
      - 'coverage/blanket'
    quiet: true
    reporter: 'html-cov'
    captureFile: 'coverage.html'
  src:
    - 'test/**/*.coffee'
My grunt configuration files are in YAML, because I'm using another plugin that splits the tasks into separate files.
My mocha.coffee file looks something like this:
# Initialize Should for chai
global.chai = require 'chai'
global.chai.use require 'chai-as-promised'
global.chai.config.includeStack = true
global.should = chai.should()
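The 'coverage/blanket' entry required in the mochaTest options above is a small local file that configures blanket; it might look roughly like this (the pattern values are guesses for this project layout):

// coverage/blanket.js - configure blanket before the tests are required
require('blanket')({
  // only instrument our own source, never node_modules or the tests themselves
  pattern: ['src'],
  'data-cover-never': ['node_modules', 'test']
});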

my coffeescript file compiles but mocha gives an error

I have a project that uses "coffee-script": "^1.7.1" in its package.json.
The code has this line in it:
[{id: id, name: name}, ...] = result.rows
This compiles fine using coffeescript version 1.7.1
The problem is that I am trying to use mocha for unit tests and it gives me an error on this line:
Parse error on line xyz: Unexpected '...'
Apparently mocha uses an older coffeescript. Is there a way to make it work without adjusting the source for mocha?
EDIT:
My Gruntfile.coffee:
'use strict'
module.exports = ->
  @initConfig
    cafemocha:
      src: ['test/*.coffee']
      options:
        reporter: 'spec'
        ui: 'bdd'
    coffee:
      compile:
        files:
          'lib/mylib.js': ['src/*.coffee']
  @loadNpmTasks 'grunt-cafe-mocha'
  @loadNpmTasks 'grunt-contrib-coffee'
  @registerTask 'default', ['coffee', 'cafemocha']
I added mocha.opts to the test directory:
--require coffee-script/register
--compilers coffee:coffee-script/register
--reporter spec
--ui bdd
but still, when I run grunt, it gives me the same error. I am new to this environment and find it quite complicated; please help.
Starting from version 1.7.x, the CoffeeScript compiler must be explicitly registered (see the changelog for version 1.7.0).
So the problem is that the CoffeeScript compiler is not registered when you're running your mocha tests, and node.js treats all your .coffee files as .js files.
The best possible solution is to specify --compilers option for your mocha tests:
--compilers coffee:coffee-script/register
If you don't want to include it to every mocha call, you could set it up using mocha.opts file.
Here are some useful links:
issue about it on github
reference in mocha docs
the reason behind this breaking change in CoffeeScript engine
Update
Looks like your issue is much deeper than I thought.
First, grunt-cafe-mocha doesn't respect mocha.opts, because it runs tests by requiring mocha as a dependency instead of calling the mocha test runner.
So it would've been enough to add require('coffee-script/register') to the top of your Gruntfile, if not for this old grunt issue.
In short, grunt uses coffee-script 1.3.x, forcing all its tasks to use the same version of coffee. I had the same problem with grunt-contrib-connect, being unable to use the latest coffee-script in my express app.
So the only help I can offer is a small grunt task I wrote to solve a similar problem in one of my projects. It runs mocha in a separate child process, completely isolating it from grunt; a rough sketch is below.
N.B. I had a thought about releasing this task to npm, but considered it too minor.
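Such a task might look roughly like the following sketch (not the author's actual code; the task name and mocha arguments are illustrative):

// Gruntfile snippet: run mocha in a child process so grunt's own
// (older) coffee-script never touches the test files
module.exports = function (grunt) {
  grunt.registerTask('mocha-spawn', 'Run mocha in a separate process', function () {
    var done = this.async();
    grunt.util.spawn({
      cmd: './node_modules/.bin/mocha',
      args: ['--compilers', 'coffee:coffee-script/register', '--reporter', 'spec', 'test/']
    }, function (err, result) {
      grunt.log.writeln(result.stdout);
      done(!err);   // fail the grunt task if mocha exited with an error
    });
  });
};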