Make Babel transpile a group of files independently without "Couldn't find intersection" error - babeljs

I have a set of JavaScript files in a src directory. These files are completely independent from one another: nothing requires anything, they are not modules, and they are intended to be used the old-fashioned way, via <script> tags in different HTML files. I have set up Babel using package.json (name, version, description etc. omitted):
{
  "scripts": {
    "build": "babel src --out-dir ./ --source-maps"
  },
  "babel": {
    "presets": [
      "@babel/preset-env",
      "babel-preset-minify"
    ],
    "comments": false
  },
  "devDependencies": {
    "@babel/cli": "^7.20.7",
    "@babel/core": "^7.20.12",
    "@babel/preset-env": "^7.20.2",
    "babel-preset-minify": "^0.5.2"
  }
}
When two particular JS files are put in the src directory together, Babel fails to build, producing this error:
Error: D:\temp\babel\src\some_script.js: Couldn't find intersection
    at NodePath.getDeepestCommonAncestorFrom (D:\temp\babel\node_modules\@babel\traverse\lib\path\ancestry.js:113:11)
    at getSegmentedSubPaths (D:\temp\babel\node_modules\babel-plugin-minify-builtins\lib\index.js:244:14)
    at BuiltInReplacer.replace (D:\temp\babel\node_modules\babel-plugin-minify-builtins\lib\index.js:92:31)
    at PluginPass.exit (D:\temp\babel\node_modules\babel-plugin-minify-builtins\lib\index.js:205:27)
    at newFn (D:\temp\babel\node_modules\@babel\traverse\lib\visitors.js:143:21)
    at NodePath._call (D:\temp\babel\node_modules\@babel\traverse\lib\path\context.js:45:20)
    at NodePath.call (D:\temp\babel\node_modules\@babel\traverse\lib\path\context.js:35:17)
    at NodePath.visit (D:\temp\babel\node_modules\@babel\traverse\lib\path\context.js:88:8)
    at TraversalContext.visitQueue (D:\temp\babel\node_modules\@babel\traverse\lib\context.js:86:16)
    at TraversalContext.visitSingle (D:\temp\babel\node_modules\@babel\traverse\lib\context.js:65:19) {
  code: 'BABEL_TRANSFORM_ERROR'
}
The error does not occur if I delete either one of those files and run npm run build again, and it does not occur for every combination of files in src; it only occurs for this particular pair. From this I assume the files are not transpiled independently: Babel apparently analyzes the files together, looking for "intersections" of some kind, which is behavior I never asked for. How can I tell Babel to stop doing that, if that is possible?
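For reference, the stack trace points at babel-plugin-minify-builtins, which is pulled in by babel-preset-minify. Babel presets accept options as a two-element array, so one possible workaround to experiment with (a sketch based on the preset's documented options, not a verified fix) is to switch off its builtIns option so that plugin never runs:

"babel": {
  "presets": [
    "@babel/preset-env",
    ["babel-preset-minify", { "builtIns": false }]
  ],
  "comments": false
}

The other minify transforms still apply; only the built-ins optimization that raised "Couldn't find intersection" would be skipped.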

Related

Running nx target only if file doesn't exist

I have a project with a build step; however, I need to make sure that the file firebase.config.json exists before running the build command.
With that, I have two NPM scripts:
// package.json
{
  ...,
  "nx": {
    "targets": {
      "prepare": {
        "outputs": ["firebase.config.json"]
      },
      "build": {
        "outputs": ["dist"],
        "dependsOn": [
          {
            "target": "prepare",
            "projects": "self"
          }
        ]
      }
    }
  },
  "scripts": {
    "prepare": "firebase apps:sdkconfig web $FIREBASE_APP_ID_SHOP --json | jq .result.sdkConfig > firebase.config.json",
    "build": "VITE_FIREBASE_CONFIG=$(cat ./firebase.config.json) vite build",
  },
  ...
}
So with the above, every time I run nx build app it will first run prepare and build the firebase.config.json file.
However, every time I make a change to any of the source files inside my project, prepare re-runs even though the firebase.config.json is already present.
Is it possible for nx to only run a target if the file declared under outputs is not present?
If you are in a bash environment you can modify your prepare script to be the following (note the original command has been shortened with ellipses for readability).
// package.json
{
  "scripts": {
    "prepare": "CONFIG=firebase.config.json; [ -f \"$CONFIG\" ] || firebase apps:sdkconfig ... | jq ... > \"$CONFIG\""
  }
}
The above prepare script will still run, but it should not spend any time reproducing the configuration file if it already exists.
CONFIG=firebase.config.json just puts the file name in a shell variable so we can use it in multiple places (which helps prevent typos). [ -f "$CONFIG" ] returns true if $CONFIG holds the name of an existing file; when it does, it short-circuits the || (OR), so the command on the right never runs.
If you want further verification of this technique, you can test this concept at the terminal with the command [ -f somefile.txt ] || echo "File does not exist". If somefile.txt does not exist, then the echo will run. If the file does exist, then the echo will not run.
A slightly-related side-note: while you clearly can do this all in the package.json configuration, if your nx workspace is going to grow to include other libraries or applications, I highly recommend splitting up all your workspace configuration into the default nx configuration files: nx.json, workspace.json, and the per-project project.json files for the sake of readability/maintainability.
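For illustration, here is a minimal sketch of what the per-project project.json could contain for this app (only the targets are shown; fields like root and sourceRoot are omitted, and the exact shape depends on your Nx version):

{
  "targets": {
    "prepare": {
      "outputs": ["firebase.config.json"]
    },
    "build": {
      "outputs": ["dist"],
      "dependsOn": [{ "target": "prepare", "projects": "self" }]
    }
  }
}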
Best of luck!

file copy on save or upload in vs code

I frequently work on custom JavaScript solutions for SharePoint. The annoying part at the moment is that I either need to set up a watcher that checks for file changes and overwrites the file in the directory where it will live in SharePoint, or I just move the files manually. Is there any way to set something up with a config file, like you can with the tsc compiler's watch mode?
All I want is, for example, to run robocopy when I save a file.
Or even just to overwrite the same file at a specified location, preferably not dependent on file extensions.
Thank you all.
For those wondering: I was able to achieve it in the simplest way I can think of, by using fs.watch or the node-watch package.
I created a JS file with the code below to watch for file changes in the src dir and act depending on the event.
watch.js:
// if you are using the built-in fs module instead of node-watch:
// var fs = require('fs');
var watch = require('node-watch');

// use fs.watch here instead if you are using the built-in fs module
watch('./src/', { recursive: true }, async function (evt, name) {
  if (evt === 'remove') {
    // on delete: run a REST call to delete the file in SharePoint
  }
  if (evt === 'update') {
    // on create or modify: run a REST call to upload the file
  }
});
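If, as the question suggests, all that is needed is to copy the saved file to another local folder rather than call a REST endpoint, the update branch can do that with Node's fs module. A minimal sketch, where the destination folder is a placeholder you would replace with your own path:

// mirror changed files from ./src/ into a target folder on save
var fs = require('fs');
var path = require('path');
var watch = require('node-watch');

var DEST = 'C:/path/to/your/target/folder'; // placeholder destination

watch('./src/', { recursive: true }, function (evt, name) {
  if (evt === 'update') {
    var target = path.join(DEST, path.relative('./src/', name));
    fs.mkdirSync(path.dirname(target), { recursive: true }); // make sure subfolders exist
    fs.copyFileSync(name, target); // overwrite the copy on every save
  }
});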
With that in place you can either run node watch.js from the command line,
or create a VS Code task that runs on folderOpen.
.vscode/tasks.json:
{
  "version": "2.0.0",
  "tasks": [
    {
      "type": "npm",
      "script": "watch",
      "label": "whatever label you want",
      "isBackground": true,
      "runOptions": {
        "runOn": "folderOpen"
      }
    }
  ]
}
package.json:
"scripts": {
"watch": "node watch.js"
}
For the task to run on folderOpen you also need to allow automatic tasks in the folder: open the Command Palette (Ctrl+Shift+P) and run "Tasks: Manage Automatic Tasks in Folder" to enable it.

How do I define an extension for coffeeify with Budo dev server?

I'm trying to use coffeeify with budo so that I do not have to add the extension to my require statements. I have tried passing these options through to browserify via budo:
budo src/app.coffee --live --serve bundle.js -- -t coffeeify --extension=".coffee"
budo src/app.coffee --live --serve bundle.js -- -t [coffeeify --extension=".coffee"]
I also tried adding the browserify transform to my package.json:
"browserify": {
  "transform": ["coffeeify", {"extension": ".coffee"}]
}
Here is something that works for me (it took me forever to figure out, the hard part being getting watchify to work with CoffeeScript). Everything is in the package.json. Invoke npm start from your top folder and it will do the trick. npm puts all the locally installed node binaries on your PATH for you (they normally live under node_modules/.bin).
{
  "name": "my-package",
  "version": "1.0.0",
  "private": true,
  "scripts": {
    "start": "(cd src; budo app.coffee:bundle.js --dir . --live --verbose -- --extension=.coffee | garnish)"
  },
  "browserify": {
    "extension": [ ".coffee" ],
    "transform": [ ["coffeeify"], ["brfs"] ]
  },
  "devDependencies": {
    "brfs": "1.4.1",
    "browserify": "11.1.0",
    "budo": "^5.1.5",
    "coffee-script": "latest",
    "coffeeify": "^1.1.0",
    "garnish": "^3.2.1",
    "watchify": "3.4.0"
  }
}
I have my source code under the src folder, and a file named app.coffee which includes (or requires, in Node.js terms) my whole application. There is an index.html in my src folder which references bundle.js from an HTML script tag.
The command that starts budo is inside my package.json; it cds into my src folder first.
The trick is to specify some configuration in the browserify block: the extension .coffee needs to be present, and a list of transforms as well. I tried to have everything on the command line but never got it to work.
After npm start is invoked, since I pass the --live argument to budo, everything works like magic: edits/saves to my files trigger a browser reload/refresh.
To deploy or release you'll probably need another target that minifies with UglifyJS. I still have a script that does that manually in two steps: the first step calls browserify and the second step calls uglifyjs explicitly.
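A sketch of that two-step script, with file names assumed to match the layout above:

browserify src/app.coffee -t coffeeify --extension=.coffee -o bundle.js
uglifyjs bundle.js --compress --mangle -o bundle.min.js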
As a remark, recent versions of budo do the piping into garnish for you, I've heard.
Another tip is to look at what the React folks are doing to transform their .jsx files, as it is in theory extremely close to what CoffeeScript users need to do. There is huge momentum around React, so hopefully the React people will have figured out these build problems first.

Gulp + Browserify: CoffeeScript not loading when loading files from node_modules

After setting up the folder structure for my Gulp project, I was wondering how to do paths in browserify, and found this page: https://github.com/substack/browserify-handbook#organizing-modules. It recommends putting common application parts in a subfolder of node_modules. This appears to be working (it finds the files), but it's not applying my coffeeify transform, so it throws errors because it tries to interpret them as JS. Any ideas how to fix this? This is my browserify config:
browserify: {
  // Enable source maps
  debug: true,
  // Additional file extensions to make optional
  extensions: ['.coffee', '.hbs'],
  // A separate bundle will be generated for each
  // bundle config in the list below
  bundleConfigs: [{
    entries: src + '/javascript/app.coffee',
    dest: dest,
    outputName: 'app.js'
  }, {
    entries: src + '/javascript/head.coffee',
    dest: dest,
    outputName: 'head.js'
  }]
}
and these are the relevant bits from my package.json:
"browserify": {
"transform": [
"coffeeify",
"hbsfy"
]
}
Transforms aren't applied to files in node_modules unless they are marked as global: https://github.com/substack/node-browserify#btransformtr-opts. If you choose to make it global, be warned that the documentation advises against it:
Use global transforms cautiously and sparingly, since most of the time
an ordinary transform will suffice.
You won't be able to specify the transform in package.json:
You can also not configure global transforms in a package.json like
you can with ordinary transforms.
The two options are to do it programmatically, by passing { global: true } as options, or at the command line with the -g option:
browserify -g coffeeify main.coffee > bundle.js
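And a minimal sketch of the programmatic route, outside of the Gulp setup, with the entry path taken from the config above and the output name as a placeholder:

// build.js: apply coffeeify as a global transform programmatically
var fs = require('fs');
var browserify = require('browserify');

browserify('./src/javascript/app.coffee', { extensions: ['.coffee', '.hbs'] })
  .transform('coffeeify', { global: true }) // global: also transform files inside node_modules
  .transform('hbsfy')
  .bundle()
  .pipe(fs.createWriteStream('./app.js'));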

JSDoc: Lookup tutorials from different directories

Is there a way to ask JSDoc (either on the command line or through the grunt-jsdoc plugin) to look up tutorials from different directories?
As per the documentation, -u lets you specify the directory in which JSDoc should search for tutorials (it says "the directory", not "directories").
I tried the following with no luck:
specifying different strings separated by a space or a comma
specifying one string with a shell/Ant-style pattern
As suggested by @Vasil Vanchuk, a solution is to create links to all tutorial files within a single directory. JSDoc3 will then be happy and will proceed with the generation of all tutorials.
Creating/maintaining links manually would be tedious, so for people using Grunt, grunt-contrib-symlink comes in handy. Using this plugin, the solution is reduced to a config task.
My Gruntfile.js looks like the following:
module.exports = function (grunt) {
  grunt.initConfig({
    clean: ['tmp', 'doc'],
    symlink: {
      options: {
        overwrite: false,
      },
      tutorials: {
        files: [{
          cwd: '../module1/src/main/js/tut',
          dest: 'tmp/tutorial-generation-workspace',
          expand: true,
          src: ['*'],
        }, {
          cwd: '../module2/src/main/js/tut',
          dest: 'tmp/tutorial-generation-workspace',
          expand: true,
          src: ['*'],
        }]
      }
    },
    jsdoc: {
      all: {
        src: [
          '../module1/src/main/js/**/*.js',
          '../module2/src/main/js/**/*.js',
          './README.md',
        ],
        options: {
          destination: 'doc',
          tutorials: 'tmp/tutorial-generation-workspace',
          configure: "jsdocconf.json",
          template: 'node_modules/grunt-jsdoc/node_modules/ink-docstrap/template',
        },
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-clean'); // needed for the clean task (not in the original snippet)
  grunt.loadNpmTasks('grunt-contrib-symlink');
  grunt.loadNpmTasks('grunt-jsdoc');
  grunt.registerTask('build', ['clean', 'symlink', 'jsdoc']);
  grunt.registerTask('default', ['build']);
};
Integrating a new module then just means updating the symlink and jsdoc tasks.
You could just copy the files instead of linking to a bunch of directories.
E.g. create a dir in your project for documentation into which you copy all relevant tutorials from wherever they live.
Then, in your npm scripts you can have something like this:
"copy:curry": "cp node_modules/#justinc/jsdocs/tutorials/curry.md doc/tutorials",
"predocs": "npm run copy:curry",
The docs script (not shown) runs jsdoc. predocs runs automatically before docs and, in this case, copies a tutorial from one of my packages over to doc/tutorials. You can then pass doc/tutorials as the single directory housing all your tutorials.
In predocs you can keep adding things to copy with bash's &&, or, if that's not available for whatever reason, you'll find npm packages which let you do this (so you don't rely on whatever shell you're using).
Now that I think about it, it's best to also delete doc/tutorials in predocs:
"predocs": "rm -rf doc/tutorials && mkdir -p doc/tutorials && npm run copy:tutorials",
That way any tutorials you once copied there (but are now not interested in) will be cleared each time you generate the docs.
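Putting the pieces together, the scripts block could look roughly like this; the copy:tutorials and docs entries and the jsdoc flags are illustrative assumptions:

"scripts": {
  "copy:curry": "cp node_modules/@justinc/jsdocs/tutorials/curry.md doc/tutorials",
  "copy:tutorials": "npm run copy:curry",
  "predocs": "rm -rf doc/tutorials && mkdir -p doc/tutorials && npm run copy:tutorials",
  "docs": "jsdoc -c jsdoc.json -u doc/tutorials -d doc src"
}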
btw, I opened an issue for this: https://github.com/jsdoc3/jsdoc/issues/1330