ember-cli and brocfile app.import with glob

I just installed a bower library that has dozens of javascript files that I need to concatenate at
bower_components/library/extension/*.js
In my Brocfile.js I wanted to do something like
app.import('bower_components/library/extension/*.js');
But it's telling me I can't use glob patterns in app.import.
So I tried to concatenate these manually using broccoli-static-compiler:
var jsTree = concat('/', {
  inputFiles: [
    'bower_components/library/extension/*.js'
  ],
  outputFile: '/assets/app.js',
  separator: '\n'
});
But it's telling me it can't find the bower_components path.
How exactly is this done?
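One way to approach it (a sketch only, assuming broccoli-concat and broccoli-merge-trees are installed; the output path is illustrative) is to concatenate the directory yourself in the Brocfile and merge the result into the app tree:

var concat = require('broccoli-concat');
var mergeTrees = require('broccoli-merge-trees');

// Concatenate every .js file under the extension directory into a single asset.
var extensionJs = concat('bower_components/library/extension', {
  inputFiles: ['**/*.js'],
  outputFile: '/assets/library-extension.js',
  separator: '\n'
});

// Merge the concatenated file with the rest of the ember-cli build output.
module.exports = mergeTrees([app.toTree(), extensionJs]);

Since app.import only accepts single file paths, the alternative is to call it once per file, which gets unwieldy with dozens of files.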

Related

How to use stage 3 syntax in svelte/sapper?

I want to use class properties and private fields in my Sapper project. Apparently they have to be preprocessed by Babel right now.
I tried to add the corresponding Babel plugins to rollup.config.js, only to realize a few things:
- the Babel Rollup plugin is only used in legacy mode;
- the server part doesn't use Babel at all.
I tried to add the Babel Rollup plugin to the end of the server plugins, like this:
babel({
  extensions: ['.js', '.mjs', '.html', '.svelte'],
  runtimeHelpers: true,
  exclude: ['node_modules/@babel/**'],
  plugins: [
    '@babel/plugin-proposal-class-properties',
    '@babel/plugin-proposal-private-methods',
  ],
}),
But it doesn't seem to take effect at all.
I also added it to the client plugins (before the legacy entry), but it complained that I needed to add @babel/plugin-syntax-dynamic-import, so it looks like Babel has to recognize the whole syntax in order to preprocess, and I don't really want to compile dynamic import for modern browsers.
How do I enable the use of esnext syntax in sapper?
You would need to preprocess the contents of <script>, using the preprocess option in rollup-plugin-svelte:
plugins: [
  svelte({
    // ...
    preprocess: {
      script: ({ content }) => {
        return transformWithBabel(content);
      }
    },
    // ...
  })
]
In an ideal world we'd have a ready-made preprocessor plugin for doing this; as it is, the transformWithBabel function is left as an exercise to the reader for now. Essentially it would involve import * as babel from '@babel/core' and using the Babel API directly, which I guarantee will be lots of fun.
Note that @babel/plugin-syntax-dynamic-import doesn't compile dynamic import, it only allows Babel to parse it. Without it, Babel can't generate a valid AST from the code inside <script>.
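For what it's worth, a minimal sketch of such a transformWithBabel helper, assuming the plugin list from the question (this is not an official implementation):

import * as babel from '@babel/core';

async function transformWithBabel(content) {
  // Transform only the proposal syntax; dynamic import is parsed but left as-is.
  const { code, map } = await babel.transformAsync(content, {
    plugins: [
      '@babel/plugin-syntax-dynamic-import',      // lets Babel parse dynamic import without compiling it
      '@babel/plugin-proposal-class-properties',
      '@babel/plugin-proposal-private-methods',
    ],
  });
  // The svelte preprocess script hook accepts { code, map } (or a promise resolving to it).
  return { code, map };
}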

What Path is the Babel Plugin module-alias Actually Using?

I'm trying to use Babel's "module-alias" plug-in with the "proxyquire" testing library, but I'm not having a lot of luck.
Library Backstory
(feel free to skip if you are familiar with both module-alias/proxyquire)
Proxyquire lets you mock out a module's dependencies for testing, like so:
const someFunctionToTest =
  proxyquire(pathToSomeFunctionToTestsModule, {
    pathToDependency: fakeVersionOfDependency
  });
Babel's module-alias plugin lets you make your import paths more convenient and consistent. For instance, I can specify (in .babelrc):
"plugins": [
["module-alias", [
{ "src": "./", "expose": "~" }
]],
and then instead of having to type require('../../../someModule') (when importing from a module nested three directories deep), I can just type require('~/someModule').
The Problem
My problem is, they don't work together. If I have someModule, which depends on someDependency:
// src/someModule.js
const someDependency = require('~/src/someDependency');
doSomethingWith(someDependency);
and then I want to test someModule with a mock version of someDependency, I should be able to do:
const proxiedSomeModule =
  proxyquire('~/src/someModule', {
    '~/src/someDependency': fakeVersionOfSomeDependency
  });
... but proxyquire tells me Error: Cannot find module '~/src/someModule'.
Presumably ("behind the scenes") Babel is converting '~/src/someModule' into its real path, so when Proxyquire looks for the aliased path it can't find it.
The Question
My question is: is there any way I can find out what the real path of '~/src/someModule' is, after Babel converts it (i.e. when proxyquire deals with it)? Or alternatively, is there any way to get proxyquire to just work with the aliased paths?
It turns out the "real" path (for '~/someModule') generated by module resolver is just the ../../someModule path. However, it also turns out that there is no need to convert paths by hand.
The module resolver plug-in will convert the arguments to any functions on its transformFunctions list. This means that you can convert any string to its non-aliased form by doing the following:
1. Define a simple passthrough function, e.g. const resolveModulePath = path => path;
2. Add that function (along with proxyquire) to the transformFunctions list in .babelrc:
["module-resolver", {
"transformFunctions": ["proxyquire", "resolveModulePath"]
}]
3. Wrap any paths which aren't arguments to a function with resolveModulePath:
proxyquire('~/some/path/someModule', {
  [resolveModulePath('~/some/other/path')]: {
    someFunction: fakeSomeFunction
  }
})
Note that the first path in the above doesn't need to be wrapped, as it's an argument to a transformed function. Only the second path ('~/some/other/path') needs to be wrapped, because it's part of an object which is an argument; the string itself isn't an argument until it's wrapped.
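Putting the steps together, a hypothetical spec file could look like this (module names and the fake dependency are made up for illustration):

const proxyquire = require('proxyquire');

// Passthrough helper; babel-plugin-module-resolver rewrites its argument
// because it is listed in transformFunctions.
const resolveModulePath = path => path;

const fakeSomeDependency = { someFunction: () => 'stubbed' };

// '~/src/someModule' is rewritten directly, since proxyquire itself is a transformed function;
// the object key has to be wrapped because it is nested inside an argument.
const someModule = proxyquire('~/src/someModule', {
  [resolveModulePath('~/src/someDependency')]: fakeSomeDependency
});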
For further info see: https://github.com/tleunen/babel-plugin-module-resolver/issues/241#issuecomment-350109168

Include .json file in .js module

I have a translations file at:
frontend/locales/en/translations.js
which looks like:
export default {
  back: "Back",
  or: "Or",
  cancel: "Cancel",
};
I want to move the translations hash into a .json file:
frontend/locales/en/translations.json
and somehow include it in the .js file.
What's the best way to do this?
I found a pretty slick solution: ember-cli-json-module.
After installing this add-on, I can do:
// master.json:
{
  "back": "Back",
  "or": "Or",
  "cancel": "Cancel"
}
// translations.js:
import masterFile from './master';
export default masterFile;
ECMAScript 6 modules don't allow you to import .json (or anything else that isn't JavaScript); they are focused on JavaScript only.
For reference, see: How to import a json file in ecmascript 6?
You can do it using a bundler like webpack, which has a specific json-loader.
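For example, a minimal webpack 1-style configuration for json-loader might look like this (a sketch only; webpack 2 and later handle .json files out of the box, and the entry/output values are illustrative):

// webpack.config.js
module.exports = {
  entry: './frontend/app.js',
  output: { filename: 'bundle.js' },
  module: {
    loaders: [
      // Run every required .json file through json-loader.
      { test: /\.json$/, loader: 'json-loader' }
    ]
  }
};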
If you don't use any bundler, consider using JavaScript instead of JSON: JSON is just JavaScript Object Notation, and the only real difference is that JSON is a plain string (which you have to parse).
So use JavaScript directly, without any JSON.parse.
If you need to use JSON files, you can load them via JavaScript...
Have a look at https://blog.gospodarets.com/fetch_in_action
fetch('/path/to/locale/en.json').then(res => res.json()).then(translations => console.log('translations', translations));

How do I write a Webpack plugin to generate index.js files on demand?

In general, I want to know how to do code-generation/fabrication in a Webpack plugin on demand. I want to generate contents for files that do not exist when they are "required."
Specifically, I want a plugin that, when I require a directory, automatically requires all files in that directory (recursively).
For example, suppose we have the directory structure:
foo
bar.js
baz.js
main.js
And main.js has:
var foo = require("./foo");
// ...
I want webpack to automatically generate foo/index.js:
module.exports = {
  bar: require("./bar"),
  baz: require("./baz")
};
I've read most of the webpack docs. github.com/webpack/docs/wiki/How-to-write-a-plugin has an example of generating assets. However, I can't find an example of how to generate an asset on demand. It seems this should be a Resolver, but resolvers seem to only output file paths, not file contents.
Actually for your use case:
Specifically, I want a plugin that, when I require a directory, automatically requires all files in that directory (recursively).
you don't need a plugin. See How to load all files in a subdirectories using webpack without require statements
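The core of that approach is webpack's built-in require.context; a quick sketch (directory name and filter are illustrative):

// Grab every .js file under ./foo, recursively.
var context = require.context('./foo', true, /\.js$/);

var foo = {};
context.keys().forEach(function (key) {
  // './bar.js' -> 'bar'
  var name = key.replace(/^\.\//, '').replace(/\.js$/, '');
  foo[name] = context(key);
});

// foo.bar and foo.baz now hold the exports of foo/bar.js and foo/baz.js.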
Doing code-generation/fabrication on demand can be done in plain JavaScript quite easily, so why restrict your code generation to only run when something is "required" by webpack?
As Node.js itself will look for an index.js when you require a directory, you can quite easily generate arbitrary exports:
// index.js generating dynamic exports
var time = new Date();
var dynamicExport = {
  staticFn: function() {
    console.log('Time is:', time);
  }
};
// dynamically create a function as a property in dynamicExport;
// here you could add some file-processing logic that requires stuff on demand and exports it accordingly
dynamicExport['dyn' + time.getDay()] = function() {
  console.log('Take this Java!');
};
module.exports = dynamicExport;
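Requiring that directory then works as usual; a hypothetical consumer (the path is illustrative):

var dynamicExport = require('./foo');
dynamicExport.staticFn(); // logs the Date captured when the module was first required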

Karma - Unexpected token when including an html file

I'm trying to include a simple HTML file in my Karma config file so I can access the HTML elements from my JavaScript files and test this with Jasmine in Karma.
But I always get an Unexpected token < error. I saw in a webcast that it is possible to include HTML files, but I don't know why it doesn't work with my configuration.
Thanks in advance
In the Karma config you have to register every file you want to use in your tests. You can do that by adding patterns to the files array in your config.
A file pattern looks like this:
files: [
  {
    pattern: 'test/unit/*.js',
    watched: true,   // watch file modifications for test autorun
    included: true,  // include as <script src="..."> in the runner html file
    served: true     // used by tests (the test server should serve it)
  }
]
You can also use the short pattern syntax:
files: [
  'test/unit/*.js'
]
This means exactly the same as the previous pattern.
If you don't want your file included in the runner as <script src="file.js"></script>, then you have to use included: false and load the file from within your tests via AJAX. Be aware that you should use relative URLs for that, because the domain of the test server can always change...
The pattern order is important too; it uses reverse overriding:
files: [
  {pattern: '*.js', included: true},
  {pattern: '*.js', included: false}
]
In this case every js file will be included, so the first pattern overrides the second.
For example, with HTML files:
karma config:
files: [
  {pattern: 'bootstrap.js', included: true},
  {pattern: 'x.html', included: false}
]
bootstrap.js:
var xhr = new XMLHttpRequest();
xhr.open('get', '/base/x.html', false);
xhr.send();
console.log(xhr.status, xhr.responseText); //200, '<html content>'
The URI always has a /base prefix because Karma stores its own index.html and js files on the top level, so it needs a subfolder. If you want to find out what URIs are served, you can use this code:
var servedUris = Object.keys(window.__karma__.files);
I wrote some basic fs stuff to support sync reads for Yadda: https://github.com/acuminous/yadda/blob/master/lib/shims/karma-fs.js You might be able to move it to a different project, extend it, and use fs instead of AJAX in the browser. At least that's how I would do it with Browserify.
I ended up making a simple patch to the lib/middleware/karma.js file in my Karma installation. It uses an extension .literal to denote "just echo the entire file onto the page". I've always found it strange that Karma doesn't have an inbuilt way to do this. My way is certainly not brilliantly-constructed but it might be good enough for you.
With this you can just have in your karma.conf.js:
// ...
files: [
  "path/to/my_html.literal", // Echoed directly onto the page
  "path/to/other.js",        // Normal behaviour
  // ...
],
My nested scriptTags function (inside createKarmaMiddleware) just looks like this:
var scriptTags = files.included.map(function(file) {
  var filePath = file.path;
  var fileExt = path.extname(filePath);
  // Include .literal files directly into the body of the HTML.
  if (filePath.match(/\.literal$/)) {
    var fs = require("fs");
    return fs.readFileSync(filePath, "utf-8");
  }
  // ...
The diff is here: https://github.com/amagee/karma/commit/c09346a71fc27c94cc212a2bd88b23def19ec3a4
You need to use the html2js preprocessor (or the ng-html2js preprocessor if using AngularJS). See the preprocessor description for how it works.
Since Karma 0.10, the html2js preprocessor is shipped with Karma and configured by default:
module.exports = function(config) {
  config.set({
    files: [
      'templates/*.html'
    ],
    preprocessors: {
      '**/*.html': ['html2js']
    }
  });
};
Then you can access the HTML content as a JS string:
var elm = $(__html__['templates/x.html']);
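For instance, a small Jasmine spec using that string could look like this (the template path and selector are illustrative):

describe('templates/x.html', function () {
  beforeEach(function () {
    // Inject the preprocessed template into the page before each spec.
    document.body.innerHTML = __html__['templates/x.html'];
  });

  it('contains the expected element', function () {
    expect(document.querySelector('.some-element')).not.toBeNull();
  });
});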
The error you are getting means that the HTML files are not preprocessed. By default, all HTML files under the basePath are preprocessed, so maybe your HTML files are somewhere else, in which case you need to put them in the preprocessors config explicitly. Share your karma.conf.js if you need more help.