How can I use my webpack's html-loader imports in Jest tests?

I am just getting started with the Jest test framework and while straight-up unit tests work fine, I am having massive issues testing any component whose module (an ES module via babel+webpack) requires an HTML file.
Here is an example:
import './errorHandler.scss';
import template from './errorHandler.tmpl';
class ErrorHandler {
...
I am loading the component-specific SCSS file, which I have set up in Jest's package.json config to return an empty object, but when Jest tries to run the import template from './errorHandler.tmpl'; line it breaks saying:
/Users/jannis/Sites/my-app/src/scripts/errorHandler/errorHandler.tmpl.html:1
({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,global,jest){<div class="overlay--top">
^
SyntaxError: Unexpected token <
at transformAndBuildScript (node_modules/jest-runtime/build/transform.js:284:10)
My Jest config from package.json is as follows:
"jest": {
"setupTestFrameworkScriptFile": "<rootDir>/test/setupFile.js",
"moduleDirectories": ["node_modules"],
"moduleFileExtensions": ["js", "json", "html", "scss"],
"moduleNameMapper": {
"^.+\\.scss$": "<rootDir>/test/styleMock.js"
}
}
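For reference, the styleMock.js used in moduleNameMapper is just a stub module that returns the empty object mentioned above (minimal sketch):
// test/styleMock.js -- stands in for any imported .scss file
module.exports = {};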
It seems that webpack's html-loader is not working correctly with Jest, but I can't find any solution on how to fix this.
Does anyone know how I can make these html-loader imports work in my tests? They load my lodash template markup, and I'd rather not inline these (at times massive) HTML chunks in my .js files just so I can omit the import template from x part.
PS: This is not a react project, just plain webpack, babel, es6.

I encountered this specific problem recently and creating your own transform preprocessor will solve it. This was my setup:
package.json
"jest": {
"moduleFileExtensions": [
"js",
"html"
],
"transform": {
"^.+\\.js$": "babel-jest",
"^.+\\.html$": "<rootDir>/test/utils/htmlLoader.js"
}
}
NOTE: babel-jest is normally included by default, but if you specify a custom transform preprocessor, you seem to have to include it manually.
test/utils/htmlLoader.js:
const htmlLoader = require('html-loader');

module.exports = {
  process(src, filename, config, options) {
    // Run the raw HTML source through html-loader so the import
    // resolves to a module exporting the markup.
    return htmlLoader(src);
  }
};
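With that transform in place, a test can import the template like any other module (illustrative sketch; the assertion string is taken from the markup in the error output above):
// errorHandler.test.js
import template from './errorHandler.tmpl';

it('loads the template markup as a string', () => {
  expect(template).toContain('overlay--top');
});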

A bit late to the party, but I wanted to add that there is also the html-loader-jest npm package to do this, if you want to go that route.
Once you npm install it, add it to your Jest configuration with:
"transform": {
"^.+\\.js$": "babel-jest",
"^.+\\.html?$": "html-loader-jest"
}

For Jest > 28.x.x with html-loader:
Create a custom transformer as documented here.
jest/html-loader.js
const htmlLoader = require("html-loader");

module.exports = {
  process(sourceText) {
    return {
      code: `module.exports = ${htmlLoader(sourceText)};`,
    };
  },
};
Add it to your jest config.
jest.config.js
...
// A map from regular expressions to paths to transformers
transform: {
  "^.+\\.html$": "<rootDir>/jest/html-loader.js",
},
...
It will fix the error: Invalid return value: process() or/and processAsync() method of code transformer found at "<PATH>" should return an object or a Promise resolving to an object.

Maybe writing your own preprocessor file is the solution:
ScriptPreprocessor
Custom-preprocessors
scriptPreprocessor: The path to a module that provides a synchronous function for pre-processing source files. For example, if you wanted to be able to use a new language feature in your modules or tests that isn't yet supported by node (like, for example, ES6 classes), you might plug in one of many transpilers that compile ES6 to ES5 here.
I created my own preprocessor when I had problems with my tests after adding transform-decorators-legacy to my webpack module loaders.
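For illustration, a minimal preprocessor along those lines could delegate JavaScript to babel-jest and stub out HTML imports (a sketch only; the file name and the .html branch are my assumptions):
// test/preprocessor.js
const babelJest = require('babel-jest');

module.exports = {
  process(src, filename, config, options) {
    if (filename.endsWith('.html')) {
      // Export the raw markup as a string instead of letting Jest parse it as JS.
      return 'module.exports = ' + JSON.stringify(src) + ';';
    }
    return babelJest.process(src, filename, config, options);
  },
};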

html-loader-jest doesn't work for me. My workaround for this:
"transform": {
'\\.(html)$': '<rootDir>/htmlTemplateMock.html'
}
htmlTemplateMock.html is an empty file.
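A similar workaround (an untested sketch of mine, not what the answer above uses) is to map HTML imports to a JS stub via moduleNameMapper, mirroring the SCSS mock from the question:
"moduleNameMapper": {
  "\\.html$": "<rootDir>/test/htmlMock.js"
}
where test/htmlMock.js just exports an empty string:
// test/htmlMock.js -- stands in for any imported .html template
module.exports = '';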

For Jest 28+ you can use jest-html-loader to make Jest work with code that requires HTML files.
npm install --save-dev jest-html-loader
In your Jest config, add it as a transformer for .html files:
"transform": {
  "^.+\\.html?$": "jest-html-loader"
},

Related

How to resolve absolute path using Vite and ESLint in Svelte?

I have an import import icon from 'src/assets/icon.png', which I can't resolve.
I have "baseUrl": "." in jsconfig.json and
"settings": {
"import/resolver": {
"node": {
"paths": ["."]
}
}
}
in .eslintrc, but the thing is that if I use absolute import this way I get an error from Vite - [plugin:vite:import-analysis] Failed to resolve import "src/assets/icon.png" from "src\lib\Logo.svelte". Does the file exist?
At the same time, if I add a forward slash before src in the import, like so: import icon from '/src/assets/icon.png', it works fine with NO error from Vite, but ESLint's import/no-unresolved rule gives me a linting error.
I checked the Vite docs, but the only thing they offer is to use an alias for the path, which I'm not willing to do. Another workaround could be disabling the ESLint rule, which is not an option for me either.
Question: Is there a way to resolve this path 'src/assets/icon.png' using "import/resolver" or Vite's settings?
I found this page helpful in getting my setup working for absolute/aliased imports with Vite + React, but maybe it will be helpful to you too.
https://dev.to/abdeldjalilhachimi/how-to-avoid-long-path-import-using-react-with-ts-and-vite-4e2h
You don't have to define an alias for every folder; instead you use a single dynamic alias that resolves against the project directory.
Add this to your vite.config.ts to set up the dynamic alias:
import * as path from 'path';
import { defineConfig } from 'vite';
//...

export default defineConfig({
  // ...config
  resolve: {
    alias: {
      '~': path.resolve(__dirname, 'src'),
    },
  },
});
Then in your tsconfig.json you can define the alias like so:
"baseUrl": "./src",
"paths": {
"~/*": ["./*"]
},
The baseUrl by itself almost worked well enough, but this solution seems to be more robust and lets me do exactly the kind of asset and module imports that you're talking about.
import logoPNG from '~/assets/logo.png';
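If ESLint still complains about the aliased import, one option (an addition of mine, not covered by the linked article) is to install eslint-import-resolver-typescript and let eslint-plugin-import reuse the tsconfig paths via .eslintrc:
"settings": {
  "import/resolver": {
    "typescript": {}
  }
}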

Requiring config.js file in VSCode extension with absolute path (e.g. "C:\...") does not work

I am developing the Argdown VSCode extension. The Argdown parser can be configured using either argdown.config.json files or argdown.config.js files exporting a config object. Using JavaScript files is the easiest way to allow users to add custom plugins to the Argdown parser.
If the user tells the parser to use a JavaScript file, the file is loaded using import-fresh (which uses node's require, but deletes the cached version).
Using the Argdown command-line tool (@argdown/cli) this works fine, but in the VSCode extension the module of the config file cannot be found. The extension is using absolute file paths to require the config module (e.g. "C:\Users\my-username\projects\my-argdown-project\argdown.config.js"). These paths work with import-fresh outside of the VSCode extension.
Is there a security restriction for VSCode extensions that does not allow requiring modules with absolute file paths? Or is there some other reason why this does not work?
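For context, the loading code in the extension boils down to roughly this (a simplified sketch; the path is the example from above):
const importFresh = require('import-fresh');

// Loads the module without using a previously cached version.
const config = importFresh('C:\\Users\\my-username\\projects\\my-argdown-project\\argdown.config.js');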
This was not related to VSCode. The problem was caused by bundling up import-fresh with webpack. I thought that webpack would ignore dynamic imports, but it did not.
I was lucky: Since last month, webpack supports "magic comments" for require (not only for import). So I can use:
require(/* webpackIgnore: true */ file);
You have to activate magic comments support in your webpack config:
module.exports = {
  module: {
    parser: {
      javascript: {
        commonjsMagicComments: true,
      },
    },
  },
};
Now the next question is how to add the magic comments to the import-fresh package. For that I used the string-replace-loader:
module.exports = {
  module: {
    rules: [
      {
        enforce: "pre",
        test: /import-fresh[\/\\]index\.js/,
        loader: "string-replace-loader",
        options: {
          search:
            "return parent === undefined ? require(filePath) : parent.require(filePath);",
          replace:
            "return parent === undefined ? require(/* webpackIgnore: true */ filePath) : parent.require(/* webpackIgnore: true */ filePath);",
        },
      },
    ],
  },
};
After that, I could load the argdown.config.js files again, even after bundling everything with webpack.

jspm not loading bundles with --inject

Been experimenting with jspm and systemjs over the weekend. Everything is working fine except for the bundling jspm offers. I can load individual files, but jspm refuses to load the bundle file (which is optimized).
I'm creating the bundle file using:
jspm bundle lib/login assets/js/1-login.js --inject
This updates the config.js file which looks like:
System.config({
  baseURL: "/",
  defaultJSExtensions: true,
  transpiler: "babel",
  babelOptions: {
    "optional": [
      "optimisation.modules.system"
    ]
  },
  paths: {
    "github:*": "jspm_packages/github/*",
    "npm:*": "jspm_packages/npm/*"
  },
  bundles: {
    "1-login.js": [
      "lib/login.js",
      "lib/sample.js"
    ]
  },
  map: {....}
});
lib/login.js
import * as sample from 'lib/sample'

export function test() {
  sample.testMethod();
}
lib/sample.js
import $ from 'jquery'

export function testMethod() {
  console.log( $('body') );
}
So, according to the jspm docs:
As soon as one of these modules is requested, the request is intercepted and the bundle is loaded dynamically first, before continuing with the module load.
It's my understanding that running
System.import('lib/login.js');
should load the bundle (and optimised file), but it doesn't - it just loads the actual file. What am I missing here? And as a bonus question, why is jquery not in the bundle list?
Well, figured out where I went wrong.
I keep all the generated assets in assets/js, but in my config.js I didn't change the baseURL to reflect this. I did in fact have the baseURL set correctly in package.json, which is why jspm didn't throw a lot of errors.
This was the same reason jquery wasn't loading, so problem solved :)
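In other words, the loader needs to find the injected bundle at the path where it is actually generated. Roughly, this is the kind of alignment the fix amounts to (a sketch; the exact value is an assumption and has to match your output directory):
System.config({
  // must match where "jspm bundle ... --inject" writes the bundle
  baseURL: "/assets/js",
  bundles: {
    "1-login.js": [
      "lib/login.js",
      "lib/sample.js"
    ]
  }
});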

Importing external module with ES6 syntax and absolute path

So I have a project that looks something like this:
app/
bin/
lib/
src/
main/
submodule.ts
utilities/
common.ts
main.ts
tsconfig.json
gulpfile.js
and app/src/main/submodule.ts needs to import app/src/utilities/common.ts. I am trying to use the ES6 syntax for this. Thus I expect something like this in submodule.ts:
import {common} from '/utilities/common';
Where the root / is app/src/ since that is where tsconfig is found. Yes, app/src/utilities/common.ts does export a module named common.
The problem is that I get "cannot find module '/utilities/common'" errors. I have tried a variety of things:
utilities/common
/src/utilities/common
/app/src/utilities/common
None of these work. A relative path of ../utilities/common does work, but relative paths for common modules are a maintenance nightmare.
It may be worth noting that I just updated from TS 1.5 to 1.6: using utilities/common had worked in 1.5. I cannot find any mention of a breaking change along these lines in the 1.6 notes, though.
I mention the gulpfile.js and other folders because ultimately I want Gulp to get the TS files from src and put the compiled JS files in bin. I am reasonably confident that I have correctly configured Gulp for this, using gulp-typescript, but for completion's sake, here are my tsconfig.json and gulpfile.js.
tsconfig.json
{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es5",
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "noEmitOnError": true
  },
  "filesGlob": [
    "./**/*.ts",
    "!./typings/**/*.ts"
  ]
}
gulpfile.js
var gulp = require('gulp');
var ts = require('gulp-typescript');
var less = require('gulp-less');
var sourcemaps = require('gulp-sourcemaps');
var merge = require('merge2');
var path = require('path');

var tsProject = ts.createProject('src/tsconfig.json', { noExternalResolve: true });

gulp.task('html', function () {
  gulp.src([
    'src/**/*.html',
  ])
    .pipe(gulp.dest('bin'));
});

gulp.task('typescript', function () {
  tsProject.src()
    .pipe(sourcemaps.init())
    .pipe(ts(tsProject))
    .js
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('bin'));
});

gulp.task('less', function () {
  gulp.src([
    'src/**/*.less',
  ])
    .pipe(sourcemaps.init())
    .pipe(less())
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('bin'))
});

gulp.task('default', ['html', 'typescript', 'less']);
Finally solved this. Per What's New, 1.6 changed module resolution to behave like Node's. I have not yet investigated Node's module resolution to determine if a fix is possible using that behavior, but I have found a workaround:
The old behavior can be triggered by specifying "moduleResolution": "classic" in tsconfig.json.
Module resolution is performed relative to the current file if the path starts with ./ or ../.
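For example, the classic resolution switch mentioned above is just a one-line addition to the compilerOptions from the question (a sketch):
{
  "compilerOptions": {
    "module": "commonjs",
    "moduleResolution": "classic"
  }
}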
Here is a quick example using:
/
/src/
/src/thing.ts
/main/
/main/submodule.ts
/utilities/
/utilities/common.ts
So the correct statement to import common.ts into submodule.ts would be:
import {common} from '../utilities/common';
You can also use the following root-path (note that there is no leading / or any .s here):
import {common} from 'src/utilities/common';
This works for me in Visual Studio Code, with the parent folder of src opened as the working folder. In my case I am targeting ES5 with UMD modules.
You can also resolve a module if it can be found by traversing up from the current location (this is a feature of NodeJS). So you can import thing.ts into submodule.ts using:
import {something} from 'thing';
The resolver will check the current folder, then the parent, then the parent's parent... and so on.
Absolute vs Relative Paths
When it comes to links on web pages, I'm in agreement with you that absolute paths are easier to maintain, especially where resources are shared at multiple levels.
When it comes to modules, I'm not sure I see the same maintenance problem as the paths here are "relative to the file that the import statement appears in" not relative to the web page. I wonder if this may be the application of a very sensible rule in the wrong place.

Uglifying for require.js with shim doesn't work on play2 with CoffeeScript

I work on a Play 2.1.2 project, using Angular.js, CoffeeScript, require.js and bower to organize the front-end.
With bower, I use shim in my /app/assets/javascripts/main.coffee file.
Then I deploy using play clean stage and running target/start.
The problem is: during the stage phase, Play doesn't uglify resources.
In Build.scala:
val main = play.Project(appName, appVersion, appDependencies).settings(
  requireJs += "main",
  requireJsShim += "main.js"
)
Then after uglifying the CSS during stage:
Tracing dependencies for: main
Error: Load timeout for modules: angular-bootstrap,angular
http://requirejs.org/docs/errors.html#timeout
In module tree:
main
jquery
Error: Load timeout for modules: angular-bootstrap,angular
http://requirejs.org/docs/errors.html#timeout
In module tree:
main
jquery
[info] RequireJS optimization finished.
So nothing was uglified. In main.coffee:
require.config
  paths:
    jquery: "lib/jquery/jquery"
    angular: "lib/angular/angular"
    ...
  shim:
    angular: {deps: ["jquery"], exports: "angular"}
    ...

define [
  "angular-bootstrap"
  "angular"
  ...
], ->
  app = angular.module "app"
  ...
  app
It works perfectly on client side, all paths are correct and so on.
requireJsShim += "main.js" also looks correct: it looks like require.js optimization takes place after compiling assets, so main.coffee or just main doesn't work.
Any ideas what the root of the problem is? Has anyone faced it before?
I have an example application using the shim where I just answered a question very similar to yours. In a nutshell, the shim overwrites the app.build.js file.
What finally solved my problem was creating a custom shim.coffee with part of require.config in it:
require.config
  paths:
    jquery: "lib/jquery/jquery"
    angular: "lib/angular/angular"
    ...
Without the shim part.
Then I had to explicitly define the shimmed dependencies in the define clauses and use requireJsShim += "shim.js" -- not the same file that I use for the client-side configuration.
After that, uglifying and require.js optimization began to work!
I've encountered exactly this problem (almost; I'm not using CoffeeScript in my project), and it turned out to be easier to solve than I thought. To restate the issue: certain JavaScript resources, particularly those without an exports setting in their shim, would produce the "Load timeout for modules" error stated above. Worse, the problem appeared to be transient.
Separating the RequireJS configuration (e.g., paths, shim) from the module seemed to help, but compiling remained unreliable and it made working in development mode more complex.
I found that adding waitSeconds: 0 to the configuration object contributed to reliable builds. Why timeouts are even possible for accessing local resources during compilation is beyond me. See the RequireJS API waitSeconds documentation for details.
Here's a snippet from my RequireJS module, located in public/javascripts (your paths will likely differ).
require({
  /* Fixes an unexplained bug where module loads would timeout
   * at compilation. */
  waitSeconds: 0,
  paths: {
    'angular': '../vendor/angular/angular',
    'angular-animate': '../vendor/angular/angular-animate',
    /* ... */
    'jquery': '../vendor/jquery/jquery'
  },
  shim: {
    'angular': {
      deps: [ 'jquery' ],
      exports: 'angular'
    },
    'angular-animate': ['angular'],
    /* ... */
    'jquery': {
      exports: 'jQuery'
    }
  },
  optimize: 'uglify2',
  uglify2: {
    warnings: false,
    /* Mangling defeats Angular injection by function argument names. */
    mangle: false
  }
})
define(['jquery', 'angular'], function($, angular) {
  /* Angular bootstrap. */
})