Using other npm packages in my own Cordova plugin - ionic-framework

I created my own Cordova plugin for use in an Ionic 3 project.
Its JS module requires other npm/Node packages (in my case for XML processing), e.g.
var builder = require('xmlbuilder');
I have read numerous articles about hooks and other crazy stuff, but nothing really helpful.
Is there no easy way to add such a trivial thing to the build chain with some simple configuration of the dependencies?
Edit:
Let's take this simple plugin as an example: https://github.com/don/cordova-plugin-hello
The greet function in https://github.com/don/cordova-plugin-hello/blob/master/www/hello.js should be able to use xmlbuilder.
So I changed the file to:
/*global cordova, module*/
var builder = require('xmlbuilder');

module.exports = {
    greet: function (name, successCallback, errorCallback) {
        cordova.exec(successCallback, errorCallback, "Hello", "greet", [name]);
    }
};
I added the module via npm install to the Ionic project, but that doesn't help. How can I make such modules available to the plugin?
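One possible approach (a hedged sketch, not taken from this thread) is to pre-bundle the plugin's www module with a bundler such as Browserify, so that require('xmlbuilder') is inlined before Cordova copies the file. The src/hello.js source path below is hypothetical:

// build-plugin.js - hedged sketch: bundle the plugin source so the npm
// dependency (xmlbuilder) is inlined into www/hello.js.
// Assumes the unbundled source lives in src/hello.js (hypothetical path).
var browserify = require('browserify');
var fs = require('fs');

browserify('./src/hello.js', { standalone: 'hello' })
    // standalone wraps the bundle in UMD, so Cordova's module wrapper
    // still receives the entry file's module.exports
    .bundle()
    .pipe(fs.createWriteStream('./www/hello.js'));

Running node build-plugin.js before packaging the plugin produces a self-contained www/hello.js that no longer relies on resolving npm packages at runtime.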

Related

gatsby build fails when adding mapbox-gl-geocoder to gatsby.js

I have a gatsby.js application with mapbox-gl and it all worked successfully until I tried to include the mapbox-gl-geocoder package to add search functionality.
I installed it via npm as follows:
npm install mapbox/mapbox-gl-geocoder --save
then, added it to the React component:
import '@mapbox/mapbox-gl-geocoder/lib/mapbox-gl-geocoder.css';
import MapboxGeocoder from '@mapbox/mapbox-gl-geocoder';
and inside a useEffect hook:
map.addControl(
  new MapboxGeocoder({
    accessToken: mapboxgl.accessToken,
    zoom: 20,
    placeholder: 'Enter City',
    mapboxgl: mapboxgl
  }), 'top-left'
);
Locally, when I run gatsby develop it works as it should, but when I deploy it with Netlify, I get the following error:
1:24:21 PM: error Generating SSR bundle failed
1:24:21 PM: Can't resolve 'electron' in '/opt/build/repo/node_modules/@mapbox/mapbox-sdk/node_modules/got'
1:24:21 PM: If you're trying to use a package make sure that 'electron' is installed. If you're trying to use a local file make sure that the path is correct.
It seems that mapbox-gl-geocoder depends on @mapbox/mapbox-sdk, and mapbox-sdk depends on got, which somehow needs electron?
The only thing I can find about Electron on the got npm page is the following:
Electron support has been removed
The Electron net module is not consistent with the Node.js http module. See #899 for more info.
In any case, I tried installing electron with npm to see if the error would go away, and the build then failed with a new error indicating that the window object is not available.
10:35:45 AM: error "window" is not available during server side rendering.
Googling that error led me to some answers about not loading the packages in gatsby-node.js but that didn't help either.
When dealing with third-party modules that use window in Gatsby, you need to add a null loader to Gatsby's webpack configuration to avoid transpiling them during SSR (server-side rendering). This is because gatsby develop runs in the browser, while gatsby build runs on the Node server, where there is no window or other browser globals. The snippet below belongs in your project's gatsby-node.js:
exports.onCreateWebpackConfig = ({ stage, loaders, actions }) => {
  if (stage === "build-html") {
    actions.setWebpackConfig({
      module: {
        rules: [
          {
            test: /@mapbox/,
            use: loaders.null(),
          },
        ],
      },
    })
  }
}
Keep in mind that the test value is a regular expression that matches a folder under node_modules, so ensure that /@mapbox/ is the right name.
Using patch-package may work, but keep in mind that you are adding another package and more bundled files, which may affect performance. The proposed snippet is a built-in solution that runs when you build your application.
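Separately, for the "window is not available during server-side rendering" error, here is a hedged sketch (my own addition, not part of the answer above, and not a replacement for the null-loader fix for the resolve error): load the geocoder dynamically inside the effect so it only ever runs in the browser. map and mapboxgl are assumed to be in scope as in the question:

useEffect(() => {
  // SSR guard: window only exists in the browser
  if (typeof window === 'undefined') return;

  // dynamic import keeps the geocoder out of the server-rendered code path
  import('@mapbox/mapbox-gl-geocoder').then(({ default: MapboxGeocoder }) => {
    map.addControl(
      new MapboxGeocoder({
        accessToken: mapboxgl.accessToken,
        zoom: 20,
        placeholder: 'Enter City',
        mapboxgl: mapboxgl
      }),
      'top-left'
    );
  });
}, [map]);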

How do you install a specific package version with Spago?

With tools like npm we can install a specific version
npm install foo@1.2.0
How do you install a specific version using spago install?
Firstly, that's not what spago install does. Instead of "adding a package to your project", spago install downloads all packages that are currently referenced in your spago.dhall file.
Secondly, the idea with Spago is that you don't choose a specific package version. Instead what you choose is a "snapshot", which is a collection of certain versions of all available packages that are guaranteed to compile and work together. This is a measure intended to prevent version conflicts and versioning hell (and it is similar to how Haskell's Stack works).
The snapshot is defined in your packages.dhall file, and then you specify the specific packages that you want to use in spago.dhall. The version for each package comes from the snapshot.
But if you really need to install a very specific version of a package, and you really know what you're doing, then you can modify the snapshot itself, which is described in packages.dhall.
By default your packages.dhall file might look something like this:
let upstream =
      https://raw.githubusercontent.com/purescript/package-sets/psc-0.13.5-20200103/src/packages.dhall sha256:0a6051982fb4eedb72fbe5ca4282259719b7b9b525a4dda60367f98079132f30

let additions = {=}

let overrides = {=}

in  upstream // additions // overrides
This is the default template that you get after running spago new.
In order to override the version for a specific package, add it to the overrides map like this:
let overrides =
      { foo =
          upstream.foo // { version = "v1.2.0" }
      }
And then run spago install. Spago should pull in version 1.2.0 of the foo package for you.

Polymer build with custom babel plugins?

We'd like to be able to add custom functionality to polymer build and polymer serve by configuring custom babel plugins.
For example, since polymer-cli uses babel internally, we would add a babel.config.js file to our workspace/project-root, e.g.:
module.exports = function (api) {
  api.cache(true);
  const presets = [];
  const plugins = [
    "@babel/plugin-proposal-optional-chaining"
  ];
  return {
    presets,
    plugins
  };
}
...and then we could serve or build our project with support for optional chaining, etc. This would allow us to do all sorts of things by writing additional Babel plugins to handle stuff like minification inside template HTML strings...
Unfortunately, this doesn't currently work. polymer-build seems to load the configuration (due to its use of babel/core?), but polymer-analyze doesn't. An error is generated in the build-optimization step performed by polymer-analyze as soon as it encounters optional-chaining syntax in our source:
error: Error: Unable to get document file:///.../somefile.js: This experimental syntax requires enabling the parser plugin:
'optionalChaining' (423:6)
at BuildAnalyzer.<anonymous> (/usr/local/share/.config/yarn/global/node_modules/polymer-build/lib/analyzer.js:342:23)
at Generator.next (<anonymous>)
at fulfilled (/usr/local/share/.config/yarn/global/node_modules/polymer-build/lib/analyzer.js:17:58)
at process._tickCallback (internal/process/next_tick.js:68:7)
polymer serve also generates an error:
Error { SyntaxError: This experimental syntax requires enabling the parser plugin: 'optionalChaining' (423:6)
at Parser.raise (/usr/local/share/.config/yarn/global/node_modules/babylon/lib/index.js:776:15)
at Parser.expectPlugin (/usr/local/share/.config/yarn/global/node_modules/babylon/lib/index.js:2084:18)
...
pos: 13056, loc: Position { line: 423, column: 6 },
missingPlugin: [ 'optionalChaining' ] }
In both cases, I've confirmed that the babel.config.js file is being loaded. But Babel is included by several different packages used in polymer-cli, so my suspicion is that in some of them Babel is being used without @babel/core having loaded the configuration info.
Can anyone involved with the polymer project confirm whether I'm correct in identifying the main issue? I'm looking into the possibility of contributing a fix/enhancement if the scope isn't too large.
Thanks.
I think for this you need to write your own custom build. polymer-cli provides tooling for this as well. Have a look at this example:
https://github.com/PolymerElements/generator-polymer-init-custom-build
For us, this issue was preventing us from using modern JS language features (like optional chaining and the nullish coalescing operator) which have wide support in modern browsers.
The only solution we could come up with was forking the Polymer tools monorepo and adding in support for the appropriate Babel plugins ourselves.
The file in question is /packages/build/src/js-transform.ts. Both serve and build use this file for Babel transforms. We switched to using Rollup for our build process, but we still needed a development server and couldn't get any others to work, so we forked the repo and built our own version of the standalone polyserve package. We would love to some day switch to Modern Web's @web/dev-server.
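The answer doesn't include its Rollup setup, but a minimal sketch of that kind of configuration (my assumption, using @rollup/plugin-babel; entry and output paths are hypothetical) could look like this:

// rollup.config.js - hedged sketch, not the answer's actual config
import { babel } from '@rollup/plugin-babel';

export default {
  input: 'src/my-app.js',                 // hypothetical entry point
  output: { dir: 'build', format: 'es' },
  plugins: [
    babel({
      babelHelpers: 'bundled',
      // enable the syntax the stock polymer-cli toolchain chokes on
      plugins: ['@babel/plugin-proposal-optional-chaining']
    })
  ]
};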

How to develop Babel 6 plugins

What process do you use to develop Babel 6 plugins?
Here's what I came up with to develop a plugin (babel-plugin-test):
1) In an empty folder run:
npm install babel-cli babel-preset-es2015
2) Create the file src/test.js (to test the plugin) with just:
class Person {
}
3) Create the folder node_modules/babel-plugin-test with the following contents
node_modules/babel-plugin-test/package.json
{
  "name": "babel-plugin-test",
  "version": "0.1.0",
  "main": "lib/index.js",
  "dependencies": {
    "babel-runtime": "^5.0.0"
  },
  "devDependencies": {
    "babel-helper-plugin-test-runner": "^6.3.13"
  }
}
node_modules/babel-plugin-test/src/index.js
export default function ({ types: t }) {
  return {
    visitor: {
      // the first argument is actually a NodePath, not a raw node
      ClassDeclaration: function (path) {
        console.log("XXX");
      }
    }
  };
}
4) Create a script that runs:
(cd node_modules/babel-plugin-test && babel --presets es2015 --out-dir lib src)
babel --plugins babel-plugin-test --presets es2015 --out-dir out src
This compiles the plugin and then compiles test.js using the plugin, and I can see the console log and the output file (in this example I'm not changing anything).
There has to be a better way to do this. Maybe some way to use WebStorm or another Node debugger to set a breakpoint and play around (at least be able to inspect variables).
The way to go is to run Babel programmatically, so you can get the transpiled code and compare it against what you expect.
First transpile your plugin as you did; then, in the test folder:
var transformFileSync = require('babel-core').transformFileSync
var path = require('path')
var fs = require('fs')
var assert = require('assert')
var plugin = require('../lib/index').default

describe('My test', function () {
  it('Transform', function () {
    var transformed = transformFileSync(path.join(__dirname, 'testFilePath'), {
      plugins: [[plugin]]
    }).code
    var expectedCode = fs.readFileSync(..)
    assert.equal(transformed, expectedCode)
  })
})
You can find an example here. It is also possible to write the tests in ES6 using module imports; for that you need to provide Mocha with a compiler, see here for an example.
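To address the breakpoint/inspection part of the question, here is a small hedged sketch (not from the answer): transform an inline string with babel-core and set a breakpoint inside the visitor from any Node debugger.

// debug-plugin.js - hedged sketch for poking at the plugin in a Node debugger
var babel = require('babel-core');            // Babel 6 programmatic API
var plugin = require('./lib/index').default;  // path assumes you run this from the plugin folder

var result = babel.transform('class Person {}', {
  presets: ['es2015'],
  plugins: [plugin]
});

// set a breakpoint on the next line (or inside the visitor) to inspect state
console.log(result.code);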
For basic testing I use the wonderful AST Explorer (astexplorer.net). Just set the parser to babylon6, turn on transform, and set it to babelv6. Then you can put your code in one pane, see the AST in another, put your Babel plugin in a third pane, and see the plugin's output in the fourth.
For a more complicated test I create two projects: one for the plugin and another to test the plugin on. I npm link the two and configure the test project to use my plugin in its .babelrc. Then, whenever my plugin changes, I just run Babel on my file/project and see the results.
I'm not sure if this is the best way, but it works for me.

How to use blueimp-file-upload with webpack?

I'm using blueimp-file-upload in my website, and I'm using webpack to organize my js code.
I installed blueimp-file-upload and jquery.ui.widget from NPM
npm install --save blueimp-file-upload
npm install --save jquery.ui.widget
and I require blueimp-file-upload in my entry file
require('blueimp-file-upload')
but when I run webpack, I get this error:
ERROR in ./~/blueimp-file-upload/js/jquery.fileupload.js
Module not found: Error: Cannot resolve module 'jquery.ui.widget' in E:\app-parent\cooka-common-web\src\main\resources\static\node_modules\blueimp-file-upload\js
# ./~/blueimp-file-upload/js/jquery.fileupload.js 19:8-22:19
If you're working with images: webpack was complaining about some modules that weren't in the blueimp-file-upload package. Here is how I got this working:
Install missing dependencies:
npm i -S blueimp-load-image
npm i -S blueimp-canvas-to-blob
Configure Webpack:
config.resolve = {
  extensions: ['', '.js'],
  alias: {
    'load-image': 'blueimp-load-image/js/load-image.js',
    'load-image-meta': 'blueimp-load-image/js/load-image-meta.js',
    'load-image-exif': 'blueimp-load-image/js/load-image-exif.js',
    'canvas-to-blob': 'blueimp-canvas-to-blob/js/canvas-to-blob.js',
    'jquery-ui/widget': 'blueimp-file-upload/js/vendor/jquery.ui.widget.js'
  }
};
Include scripts in your app:
import "blueimp-file-upload/js/vendor/jquery.ui.widget.js";
import "blueimp-file-upload/js/jquery.iframe-transport.js";
import "blueimp-file-upload/js/jquery.fileupload.js";
import "blueimp-file-upload/js/jquery.fileupload-image.js";
Disable both AMD and CommonJS and use the browser-global jQuery:
/* The jQuery UI widget factory, can be omitted if jQuery UI is already included */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/vendor/jquery.ui.widget.js');
/* The Iframe Transport is required for browsers without support for XHR file uploads */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/jquery.iframe-transport.js');
/* The basic File Upload plugin */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/jquery.fileupload.js');
/* The File Upload processing plugin */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/jquery.fileupload-process.js');
/* The File Upload validation plugin */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/jquery.fileupload-validate.js');
/* The File Upload Angular JS module */
require('imports?define=>false&exports=>false!blueimp-file-upload/js/jquery.fileupload-angular.js');
This is the configuration I'm using to integrate webpack and blueimp-file-upload with Angular. Alternatively, you can configure this in your webpack.config.js as a regex to avoid repeating the loader prefix.
resolve: {
  extensions: ['', '.js'],
  alias: {
    'jquery-ui/widget': 'blueimp-file-upload/js/vendor/jquery.ui.widget.js'
  }
}
I had an almost identical problem, except that the error complained about 'jquery/ui/widget' rather than 'jquery.ui.widget'.
For me, @Gowrav's answer was the wrong way.
After days of straying I solved it in a simple way. I just did:
npm install jquery-ui
The fact is that jquery.fileupload.js searches for its vendor dependency:
But in the context where jquery.fileupload.js tries to import that dependency, it of course can't be found (resolved). So I added it to the project instead.
P.S. This is just my opinion about how it all works, but this approach helped me.
jquery.fileupload.js checks for an AMD require first, which results in this error. You can teach webpack not to use the AMD style for this file (make sure to npm install imports-loader for this method to work):
require('imports?define=>false!blueimp-file-upload')
It should correctly register the module as CommonJS and will require the jquery.ui.widget from the right location.
Read more here: http://webpack.github.io/docs/shimming-modules.html#disable-some-module-styles
You can add an alias to jquery.ui.widget's main file - it unfortunately doesn't specify one in its package.json, so webpack can't find it otherwise.
resolve: {
  alias: {
    "jquery.ui.widget": "node_modules/jquery.ui.widget/jquery.ui.widget.js"
  }
},
First install the two packages:
npm i blueimp-file-upload --save
npm i jquery-ui --save
then require it in webpack:
require('blueimp-file-upload/js/jquery.fileupload')
Actually, you can solve this by changing your webpack config: just add the path to the resolve section (for example, I am using Bower):
resolve: {
  extensions: ['', '.js', '.jsx'],
  modulesDirectories: [
    'node_modules',
    'bower_components',
    'bower_components/blueimp-file-upload/js/vendor'
  ]
}
In webpack 3.x, the syntax will look like this:
{
  test: require.resolve("blueimp-file-upload"),
  use: "imports-loader?define=>false"
}
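For context, here is a hedged sketch of where such a rule would sit in a webpack 3.x config (the surrounding structure is my assumption, not part of the answer):

// webpack.config.js (webpack 3.x) - sketch of the rule in context
module.exports = {
  // ...entry, output, and other settings
  module: {
    rules: [
      {
        test: require.resolve("blueimp-file-upload"),
        use: "imports-loader?define=>false"
      }
    ]
  }
};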