I've been experimenting with jspm and SystemJS over the weekend. Everything is working fine except for the bundling jspm offers: I can load individual files, but jspm refuses to load the (optimized) bundle file.
I'm creating the bundle file using:
jspm bundle lib/login assets/js/1-login.js --inject
This updates the config.js file which looks like:
System.config({
baseURL: "/",
defaultJSExtensions: true,
transpiler: "babel",
babelOptions: {
"optional": [
"optimisation.modules.system"
]
},
paths: {
"github:*": "jspm_packages/github/*",
"npm:*": "jspm_packages/npm/*"
},
bundles: {
"1-login.js": [
"lib/login.js",
"lib/sample.js"
]
},
map: {....}
});
lib/login.js
import * as sample from 'lib/sample'
export function test() {
sample.testMethod();
}
lib/sample.js
import $ from 'jquery'
export function testMethod( ) {
console.log( $('body') );
}
So, according to the jspm docs:
As soon as one of these modules is requested, the request is intercepted and the bundle is loaded dynamically first, before continuing with the module load.
It's my understanding that running
System.import('lib/login.js');
should load the bundle (i.e. the optimized file), but it doesn't; it just loads the individual file. What am I missing here? And as a bonus question, why is jquery not in the bundle list?
Well, figured out where I went wrong.
I keep all the generated assets in assets/js, but in my config.js I didn't change the baseURL to reflect this. I did in fact have the baseURL set correctly in package.json, which is why jspm didn't throw a lot of errors.
This was the same reason jquery wasn't loading, so problem solved :)
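For anyone hitting the same thing, the relevant part of config.js just needs its baseURL to match the directory the generated assets are actually served from. A minimal sketch, assuming everything jspm-managed lives under /assets/js/ as described above:
System.config({
  // must match the jspm baseURL from package.json; otherwise bundle names
  // like "1-login.js" resolve against the wrong directory
  baseURL: "/assets/js/",
  bundles: {
    "1-login.js": [
      "lib/login.js",
      "lib/sample.js"
    ]
  }
});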
I am developing the Argdown VSCode extension. The Argdown parser can be configured using either argdown.config.json files or argdown.config.js files exporting a config object. Using JavaScript files is the easiest way to allow users to add custom plugins to the Argdown parser.
If the user tells the parser to use a JavaScript file, the file is loaded using import-fresh (which uses Node's require, but deletes the cached version).
Using the Argdown command-line tool (@argdown/cli) this works fine, but in the VSCode extension the config file's module cannot be found. The extension uses absolute file paths to require the config module (e.g. "C:\Users\my-username\projects\my-argdown-project\argdown.config.js"). These paths work with import-fresh outside of the VSCode extension.
Is there a security restriction for VSCode extensions that prevents requiring modules with absolute file paths? Or is there some other reason why this does not work?
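For reference, the loading code looks roughly like this (a hypothetical sketch, not the actual Argdown source; the path is the example from the question):
// load-config.js -- hypothetical sketch of loading a config module with import-fresh
const importFresh = require("import-fresh");

function loadConfig(absoluteConfigPath) {
  // import-fresh removes the cached entry before requiring, so edits to
  // argdown.config.js are picked up the next time the config is loaded
  return importFresh(absoluteConfigPath);
}

const config = loadConfig(
  "C:\\Users\\my-username\\projects\\my-argdown-project\\argdown.config.js"
);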
This was not related to VSCode. The problem was caused by bundling up import-fresh with webpack. I thought that webpack would ignore dynamic imports, but it did not.
I was lucky: Since last month, webpack supports "magic comments" for require (not only for import). So I can use:
require(/* webpackIgnore: true */ file);
You have to activate magic comments support in your webpack config:
module.exports = {
  module: {
    parser: {
      javascript: {
        commonjsMagicComments: true,
      },
    },
  },
};
Now the next question is how to add the magic comments to the import-fresh package. For that I used the string-replace-loader:
module.exports = {
  module: {
    rules: [
      {
        enforce: "pre",
        test: /import-fresh[\/\\]index\.js/,
        loader: "string-replace-loader",
        options: {
          search:
            "return parent === undefined ? require(filePath) : parent.require(filePath);",
          replace:
            "return parent === undefined ? require(/* webpackIgnore: true */ filePath) : parent.require(/* webpackIgnore: true */ filePath);",
        },
      },
    ],
  },
};
After that, I could load the argdown.config.js files again, even after bundling everything with webpack.
I am just getting started with the Jest test framework and while straight-up unit tests work fine, I am having massive issues testing any component whose module (an ES module via Babel + webpack) requires an HTML file.
Here is an example:
import './errorHandler.scss';
import template from './errorHandler.tmpl';
class ErrorHandler {
...
I am loading the component-specific SCSS file, which I have set up in Jest's package.json config to return an empty object, but when Jest hits the import template from './errorHandler.tmpl'; line it breaks, saying:
/Users/jannis/Sites/my-app/src/scripts/errorHandler/errorHandler.tmpl.html:1
({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,global,jest){<div class="overlay--top">
^
SyntaxError: Unexpected token <
at transformAndBuildScript (node_modules/jest-runtime/build/transform.js:284:10)
My Jest config from package.json is as follows:
"jest": {
"setupTestFrameworkScriptFile": "<rootDir>/test/setupFile.js",
"moduleDirectories": ["node_modules"],
"moduleFileExtensions": ["js", "json", "html", "scss"],
"moduleNameMapper": {
"^.+\\.scss$": "<rootDir>/test/styleMock.js"
}
}
It seems that the webpack html-loader is not working correctly with Jest but I can't find any solution on how to fix this.
Does anyone know how I can make these html-loader imports work in my tests? They load my lodash template markup, and I'd rather not inline these at-times-massive HTML chunks into my .js files just so I can omit the import template from x part.
PS: This is not a React project, just plain webpack, Babel and ES6.
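For context, the test/styleMock.js referenced in moduleNameMapper is presumably just a stub module along these lines:
// test/styleMock.js -- stands in for any imported .scss file
module.exports = {};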
I encountered this specific problem recently and creating your own transform preprocessor will solve it. This was my setup:
package.json
"jest": {
"moduleFileExtensions": [
"js",
"html"
],
"transform": {
"^.+\\.js$": "babel-jest",
"^.+\\.html$": "<rootDir>/test/utils/htmlLoader.js"
}
}
NOTE: babel-jest is normally included by default, but if you specify a custom transform preprocessor, you seem to have to include it manually.
test/utils/htmlLoader.js:
const htmlLoader = require('html-loader');
module.exports = {
process(src, filename, config, options) {
return htmlLoader(src);
}
}
A bit late to the party, but wanted to add that there is also this html-loader-jest npm package out there to do this if you wanted to go that route.
Once you npm install it, you can add it to your Jest configuration with:
"transform": {
"^.+\\.js$": "babel-jest",
"^.+\\.html?$": "html-loader-jest"
}
For Jest > 28.x.x with html-loader:
Create a custom transformer as documented here.
jest/html-loader.js
const htmlLoader = require("html-loader");
module.exports = {
process(sourceText) {
return {
code: `module.exports = ${htmlLoader(sourceText)};`,
};
},
};
Add it to your jest config.
jest.config.js
...
// A map from regular expressions to paths to transformers
transform: {
"^.+\\.html$": "<rootDir>/jest/html-loader.js",
},
...
It will fix the error: Invalid return value: process() or/and processAsync() method of code transformer found at "<PATH>" should return an object or a Promise resolving to an object.
Maybe your own preprocessor file will be the solution:
ScriptPreprocessor
Custom-preprocessors
scriptpreprocessor: The path to a module that provides a synchronous function for pre-processing source files. For example, if you wanted to be able to use a new language feature in your modules or tests that isn't yet supported by node (like, for example, ES6 classes), you might plug in one of many transpilers that compile ES6 to ES5 here.
I created my own preprocessor when I had problems with my tests after adding transform-decorators-legacy to my webpack module loaders.
html-loader-jest doesn't work for me. My workaround for this:
"transform": {
'\\.(html)$': '<rootDir>/htmlTemplateMock.html'
}
htmlTemplateMock.html is an empty file.
For Jest 28+ you can use jest-html-loader to make Jest work with code that requires HTML files.
npm install --save-dev jest-html-loader
In your Jest config, add it as a transformer for .html files:
"transform": {
"^.+\\.html?$": "jest-html-loader"
},
I'm trying to run unit tests using Karma and I'm getting the error You need to include some adapter that implements __karma__.start method!. I tried running with the grunt and karma start commands. I did some googling, but none of the solutions worked, and I'm not sure what I'm doing wrong. I included the right adapter, which comes with karma-jasmine and has the __karma__.start method, under plugins in my karma.conf.js file. Here's my configuration file:
module.exports = function(config){
config.set({
// root path location that will be used to resolve all relative paths in files and exclude sections
basePath : '../',
files : [
'bower_components/angular/angular.js',
'bower_components/angular-mocks/angular-mocks.js',
'node_modules/requirejs/require.js',
'node_modules/karma-jasmine/lib/adapter.js',
'app.js',
'mainapp/mainapp.js',
'mainapp/notes/notes.js',
'mainapp/notes/partial/create/create.js',
'mainapp/notes/partial/create/create-spec.js'
],
// files to exclude
exclude: [
'bower_components/angular/angular/*.min.js'
],
// karma has its own autoWatch feature but Grunt watch can also do this
autoWatch : false,
// testing framework, be sure to install the correct karma plugin
frameworks: ['jasmine', 'browserify', 'requirejs'],
// browsers to test against, be sure to install the correct browser launcher plugins
browsers : ['PhantomJS'],
// map of preprocessors that is used mostly for plugins
preprocessors: {
'mainapp/notes/partial/create/create-spec.js' : ['browserify']
},
reporters: ['progress'],
// list of karma plugins
plugins : [
'karma-teamcity-reporter',
'karma-chrome-launcher',
'karma-phantomjs-launcher',
'karma-babel-preprocessor',
'karma-requirejs',
'karma-jasmine',
'karma-browserify'
],
singleRun: true
})}
Using the requirejs framework turns off the automatic call to __karma__.start. You need to create a file that a) configures RequireJS and b) calls __karma__.start to kick off the tests. Here's an example. It scans through the files that Karma is serving to find the ones that contain tests, based on the naming convention that any file ending in spec.js or test.js is a test file. It then configures RequireJS, passing all the test files as deps so that they are loaded right away, and setting __karma__.start as the callback so that once everything in deps has loaded, the tests start.
var allTestFiles = [];
var TEST_REGEXP = /(spec|test)\.js$/i;
Object.keys(window.__karma__.files).forEach(function(file) {
if (TEST_REGEXP.test(file)) {
// Normalize paths to RequireJS module names.
allTestFiles.push(file);
}
});
require.config({
baseUrl: '/base',
paths: {
'chai': 'node_modules/chai/chai',
'chai-jquery': 'node_modules/chai-jquery/chai-jquery',
'jquery': 'node_modules/jquery/dist/jquery.min',
'underscore': '//cdnjs.cloudflare.com/ajax/libs/underscore.js/1.6.0/underscore-min',
'sn/sn-underscore': 'static/scripts/sn/sn-underscore',
'vendor/jquery-ui': '//cdnjs.cloudflare.com/ajax/libs/jqueryui/1.11.4/jquery-ui.min'
},
deps: allTestFiles,
callback: window.__karma__.start
});
You need to include this file in your files parameter in your karma.conf.js file. Since you use the requirejs framework, you just need to put it first in the list. For instance, if you call the file test-main.js (as suggested in Karma's documentation):
files: [
'test-main.js',
...
]
If you load RequireJS yourself by listing it in files, you need to put test-main.js after RequireJS.
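A sketch of the relevant ordering, reusing the paths from the question's config (other entries omitted):
files: [
  'node_modules/requirejs/require.js', // RequireJS itself must load first
  'test-main.js',                      // then the file that configures it and starts Karma
  // ...application and spec files; with RequireJS they are typically served
  //    with included: false so they are loaded as modules, not as script tags
],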
I am having trouble getting SystemJS to resolve node modules.
I have the following in my index.html:
<script src="./system.config.js"></script>
<script>
System.import('blast/test')
.then(null, console.error.bind(console));
</script>
This is my configuration:
System.config({
baseUrl: '/',
packages: {
'app': {
defaultExtension: 'js',
}
},
packageConfigPaths: ['./node_modules/*/package.json'],
paths: {
'blast/*': 'app/*'
}
});
This works fine so far. However, I want to be able to also resolve node modules like lodash. So I set paths to this:
paths: {
'blast/*': 'app/*',
'*': './node_modules/*'
}
Now I can import lodash fine, but when importing blast/test I get the error /app/test 404 (not found). It seems the package configuration isn't used anymore, i.e. the .js extension isn't appended. Does anyone have any hints on how to resolve this? I am using SystemJS 0.19.25 Standard.
Thanks, Robin
Try using map configuration instead for your local package:
System.config({
map: {
blast: './app'
}
});
The ./ is necessary to keep the name in URL space rather than having it resolve to the node_modules/app path (probably the reason you used paths here to begin with?).
It's also advisable to use baseURL: 'node_modules' instead of a wildcard paths entry (and they pretty much amount to the same thing).
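For the local package, a minimal sketch of the map-based setup might look like this (untested; the paths are assumptions based on the layout in the question):
System.config({
  baseURL: '/',
  map: {
    blast: './app'            // local package via map instead of a paths wildcard
  },
  packages: {
    './app': {
      defaultExtension: 'js'  // keeps the .js extension handling for blast/*
    }
  },
  packageConfigPaths: ['./node_modules/*/package.json']
});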
I work on a Play 2.1.2 project, using AngularJS, CoffeeScript, RequireJS and Bower to organize the front end.
With Bower, I use a shim in my /app/assets/javascripts/main.coffee file.
Then I deploy using play clean stage and running target/start.
The problem is: during stage phase, Play doesn't uglify resources.
In Build.scala:
val main = play.Project(appName, appVersion, appDependencies).settings(
requireJs += "main",
requireJsShim += "main.js"
)
Then, after uglifying the CSS in stage:
Tracing dependencies for: main
Error: Load timeout for modules: angular-bootstrap,angular
http://requirejs.org/docs/errors.html#timeout
In module tree:
main
jquery
Error: Load timeout for modules: angular-bootstrap,angular
http://requirejs.org/docs/errors.html#timeout
In module tree:
main
jquery
[info] RequireJS optimization finished.
So nothing was uglified. In main.coffee:
require.config
paths:
jquery: "lib/jquery/jquery"
angular: "lib/angular/angular"
...
shim:
angular: {deps: ["jquery"], exports: "angular"}
...
define [
"angular-bootstrap"
"angular"
...
], ->
app = angular.module "app"
...
app
It works perfectly on the client side; all paths are correct and so on.
requireJsShim += "main.js" also looks correct: it seems the require.js optimization takes place after the assets are compiled, so main.coffee or just main wouldn't work.
Any ideas what the root of the problem is? Has anyone faced this before?
I have an example application using the shim where I just answered a question very similar to yours. In a nutshell, the shim overwrites the app.build.js file.
What finally solved my problem is creating custom shim.coffee with part of require.config in it:
require.config
paths:
jquery: "lib/jquery/jquery"
angular: "lib/angular/angular"
...
Without the shim part.
Then I had to explicitly define the shimmed dependencies in the define clauses and use requireJsShim += "shim.js", which is not the same file that I use for the client-side configuration.
Then uglifying and the require.js optimization began to work!
I've encountered exactly this problem (almost; I'm not using CoffeeScript in my project), and it turned out to be easier to solve than I thought. To restate the issue: certain JavaScript resources—particularly those without an export setting in their shim—would produce the "Load timeout for modules" error stated above. Worse, the problem appeared to be transient.
Separating the RequireJS configuration (e.g., paths, shim) from the module seemed to help, but compiling remained unreliable and it made working in development mode more complex.
I found that adding waitSeconds: 0 to the configuration object contributed to reliable builds. Why timeouts are even possible for accessing local resources during compilation is beyond me. See the RequireJS API waitSeconds documentation for details.
Here's a snippet from my RequireJS module, located in public/javascripts (your paths will likely differ).
require({
/* Fixes an unexplained bug where module loads would timeout
* at compilation. */
waitSeconds: 0,
paths: {
'angular': '../vendor/angular/angular',
'angular-animate': '../vendor/angular/angular-animate',
/* ... */
'jquery': '../vendor/jquery/jquery'
},
shim: {
'angular': {
deps: [ 'jquery' ],
exports: 'angular'
},
'angular-animate': ['angular'],
/* ... */
'jquery': {
exports: 'jQuery'
}
},
optimize: 'uglify2',
uglify2: {
warnings: false,
/* Mangling defeats Angular injection by function argument names. */
mangle: false
}
})
define(['jquery', 'angular'], function($, angular) {
/* Angular bootstrap. */
})