Nothing happens when using spawn in a gulp task to execute commands in a subfolder - command-line

Yesterday, I was pointed to a reference suggesting child_process.spawn for my need. I'd like gulp to execute commands for me, so I don't have to type the commands for my dependency every time I compile my main project.
I've got something that seems OK, with no errors in the logs, but nothing happens, as if my commands were never executed.
Does anyone have feedback about my code? I have another task like this one to compile the dependency.
var spawn = require("child_process").spawn;

gulp.task("my-dependency-install", function(done) {
    spawn("ft", ["install"], {
        cwd: "node_modules/app/my-dependency/"
    })
        .on("error", function (err) {
            throw err;
        })
        .on("close", done);
});
Thanks

Here is how I fixed it:
var spawn = require("child_process").spawn;

spawn("ft.cmd", ["install"], {
    cwd: "node_modules/app/my-dependency/"
})
    .on("error", function (err) {
        throw err;
    });
Can anyone explain why I had to add .cmd? It's because of the Windows OS, isn't it?
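For what it's worth, that seems to be exactly it: on Windows, npm installs package binaries as .cmd shim scripts rather than real executables, and spawn without shell: true can only launch real executables, so the bare name ft isn't found there. A common cross-platform sketch, assuming ft is such an npm-installed binary, is to pick the command name by platform:

var spawn = require("child_process").spawn;

// On Windows, npm exposes CLI tools as ".cmd" shims; elsewhere the plain
// binary name works. (Hypothetical: assumes "ft" is an npm-installed tool.)
var cmd = process.platform === "win32" ? "ft.cmd" : "ft";

spawn(cmd, ["install"], {
    cwd: "node_modules/app/my-dependency/"
})
    .on("error", function (err) {
        throw err;
    });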

Related

How can I prevent a Gulp task with a dynamic import from being asynchronous?

I want to use gulp-imagemin to minify images. The relevant part of my gulpfile.js looks like this:
const gulp = require('gulp');
// a couple more require('')s
function minifyImages(cb) {
    import('gulp-imagemin')
        .then(module => {
            const imagemin = module.default;
            gulp.src('src/img/**/*')
                .pipe(imagemin())
                .pipe(gulp.dest('img'));
            cb();
        })
        .catch(err => {
            console.log(err);
            cb();
        });
}
function buildCSS(cb) { /* ... */ }
exports.build = gulp.series(buildCSS, minifyImages);
The reason I'm using a dynamic import here is that I think I have to: gulp-imagemin doesn't support the require('') syntax, and when I write import imagemin from 'gulp-imagemin' I get an error saying "Cannot use import statement outside a module".
I would expect the build task to only finish after minifyImages has finished. After all, I'm calling cb() only at the very end, at a point where the promise should be resolved.
However, build seems to finish early, while minifyImages is still running. This is the output I get:
[21:54:47] Finished 'buildCSS' after 6.82 ms
[21:54:47] Starting 'minifyImages'...
[21:54:47] Finished 'minifyImages' after 273 ms
[21:54:47] Finished 'build' after 282 ms
<one minute later>
[21:55:49] gulp-imagemin: Minified 46 images (saved 5.91 MB - 22.8%)
How can I make sure the task doesn't finish early, and all tasks are run in sequence?
Let me know if there's something wrong with my assumptions; I'm somewhat new to gulp and importing.
Streams are always asynchronous, so if the cb() callback is called right after the gulp stream is created, as in your then handler, the stream will naturally not have finished by that time (in fact, it won't even have started).
The simplest way to invoke a callback once the gulp.dest stream has finished is stream.pipeline, i.e.:
function minifyImages(cb) {
    const { pipeline } = require('stream');
    return import('gulp-imagemin')
        .then(module => {
            const imagemin = module.default;
            pipeline(
                gulp.src('src/img/**/*'),
                imagemin(),
                gulp.dest('img'),
                cb
            );
        })
        .catch(cb);
}
Or similarly, with an async function.
async function minifyImages(cb) {
    const { pipeline } = require('stream');
    const { default: imagemin } = await import('gulp-imagemin');
    return pipeline(
        gulp.src('src/img/**/*'),
        imagemin(),
        gulp.dest('img'),
        cb
    );
}
Another approach I have seen is to split the task into two sequential sub-tasks: the first sub-task imports the plugin module and stores it in a variable, and the second sub-task uses the plugin already loaded by the first to create and return the gulp stream in the usual way.
Then the two sub-tasks can be combined with gulp.series.
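A rough sketch of that split (hypothetical task names, same plugin as above):

let imagemin; // loaded by the first sub-task, used by the second

// Sub-task 1: returns a promise, so gulp waits for the dynamic import.
function loadImagemin() {
    return import('gulp-imagemin').then(module => {
        imagemin = module.default;
    });
}

// Sub-task 2: returns the stream, so gulp waits for it to finish.
function minifyImages() {
    return gulp.src('src/img/**/*')
        .pipe(imagemin())
        .pipe(gulp.dest('img'));
}

exports.build = gulp.series(buildCSS, loadImagemin, minifyImages);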

How to execute function after stream is closed in Dart/Flutter?

So basically I am using the flutter_uploader package to upload files to a server and I'd like to execute a function after the upload is complete:
final StreamSubscription<UploadTaskProgress> subscription =
    _uploader.progress.listen(
  (e) {
    print(e.progress);
  },
  onError: (ex, stacktrace) {
    throw Exception("Something went wrong updating the file...");
  },
  onDone: () {
    myFunction(); // won't run
  },
  cancelOnError: true,
);
The problem is that the onDone function doesn't execute, which means myFunction never runs. I've done some digging and found that onDone gets called when we close the stream, but there is no such method on the subscription variable. I have not used streams much and am therefore pretty bad with them.
My question is: how can I run myFunction once the stream is complete? I thought onDone would get called when that happens, but I guess not.
Thank you!
I haven't used that package before, but I read a little bit about it, and I think you can execute your function inside the main listener block; the other callbacks are there to handle internal processes, like stopping a background job, or external things, like reporting the error to an error-monitoring tool. This is what I propose:
final StreamSubscription<UploadTaskProgress> subscription =
    _uploader.progress.listen(
  (e) {
    // UploadTaskStatus.complete is the public constant for a finished upload.
    if (e.status == UploadTaskStatus.complete) {
      myFunction();
    }
    print(e.progress);
  },
  onError: (ex, stacktrace) {
    throw Exception("Something went wrong updating the file...");
  },
  cancelOnError: true,
);
Just to be clear, I'm not sure of the specific implementation; it's just an idea I got from the docs. It seems the event also contains a status property, which has a constant for when the upload is complete:
https://pub.dev/documentation/flutter_uploader/latest/flutter_uploader/UploadTaskProgress/UploadTaskProgress.html
https://pub.dev/documentation/flutter_uploader/latest/flutter_uploader/UploadTaskStatus-class.html
Hope this helps you :D

VSCode Extension executes local python (bash) code and append to output channel

In my VS Code extension, I'd like to invoke a python file and append standard output to VSCode's output channel. I'm aware that I can create an output channel via vscode.window.createOutputChannel("..."); and I'm also aware I can execute such python (or for this matter, any local bash) script via these APIs: https://nodejs.org/api/child_process.html. Would it be possible to append standard out information to this output channel in real-time (i.e., as the script runs)?
What I currently have is the following:
import { spawn } from 'child_process';
import * as vscode from 'vscode';

export function execute(cmd: string, callback: any, logging?: vscode.OutputChannel) {
    const spawnedProcess = spawn(cmd, [], {shell: true, detached: true});
    console.log(`spawned pid ${spawnedProcess.pid} with command ${cmd}`);
    spawnedProcess.stdout.on('data', (data: any) => {
        console.log(data);
        logging?.appendLine("stdout:" + data);
    });
    spawnedProcess.stderr.on('data', (data: any) => {
        console.error(`spawned pid ${spawnedProcess.pid} pushed something to stderr`);
        logging?.appendLine(data);
    });
    spawnedProcess.on('exit', function(code: any) {
        if (code !== 0) {
            console.log('Failed: ' + code);
        }
        else {
            console.log(`pid ${spawnedProcess.pid} finished`);
        }
        callback();
    });
}
where callback() is something to be executed after the script is done. I got this structure from here https://stackoverflow.com/a/32872753/14264786.
However, when I run a simple Python script that sleeps for 3 seconds, prints something, and then does this once more, the standard output is still not displayed in the logging channel in real time. Instead, it is all output together after the script finishes.
Any idea on what's a potential solution?
After some more digging, I found out that one solution is to invoke the Python script with python -u, which unbuffers the Python output.
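For example, with the execute helper above (the script path is hypothetical, and outputChannel is assumed to come from vscode.window.createOutputChannel):

// "-u" makes Python flush stdout/stderr immediately, so each print()
// reaches the 'data' handler as it happens instead of at exit.
execute('python -u /path/to/script.py', () => {
    console.log('script finished');
}, outputChannel);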

How to pass GitHub action event hook from a javascript to a bash file?

I want to create a GitHub action that is simple and only runs a bash-script file; see my previous
question: How to execute a bash script from a JavaScript
With this javascript action, I want to pass values to the bash-script from the JSON payload given by GitHub.
Can this be done with something as simple as an exec command?
...
exec.exec(`export FILEPATH=${filepath}`)
...
I wanted to do something like this, but found there to be much more code needed than I originally expected. So while this is not simple, it does work and will block the action script while the bash script runs:
const core = require('@actions/core');

function run() {
    try {
        // This is just a thin wrapper around bash, and runs a file called "script.sh"
        //
        // TODO: Change this to run your script instead
        //
        const script = require('path').resolve(__dirname, 'script.sh');
        var child = require('child_process').execFile(script);
        child.stdout.on('data', (data) => {
            console.log(data.toString());
        });
        child.on('close', (code) => {
            console.log(`child process exited with code ${code}`);
            process.exit(code);
        });
    }
    catch (error) {
        core.setFailed(error.message);
    }
}

run()
Much of the complication is handling output and error conditions.
You can see my debugger-action repo for an example.
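As for passing values from the event payload into the script: a separate exec.exec(`export FILEPATH=${filepath}`) won't work, because each exec call runs in its own shell and the exported variable is gone afterwards. One option is to hand the values to execFile as environment variables instead. A sketch, assuming filepath already holds the value extracted from the payload:

const script = require('path').resolve(__dirname, 'script.sh');
// Variables placed in env here are visible inside script.sh as $FILEPATH;
// spreading process.env keeps the default environment intact.
var child = require('child_process').execFile(script, [], {
    env: { ...process.env, FILEPATH: filepath }
});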

multiple tasks in nodeunit with mongo fail

I've taken How do I get an asynchronous result back with node unit and mongoose? and VERY slightly modified it, to show my failure more simply.
var mongoose = require('mongoose');
var db;

module.exports = {
    setUp: function(callback) {
        try {
            //db.connection.on('open', function() {
            mongoose.connection.on('open', function() {
                console.log('Opened connection');
                callback();
            });
            db = mongoose.connect('mongodb://localhost/test_1');
            console.log('Started connection, waiting for it to open');
        } catch (err) {
            console.log('Setting up failed:', err.message);
            callback(err);
        }
    },
    tearDown: function(callback) {
        console.log('In tearDown');
        try {
            console.log('Closing connection');
            db.disconnect();
            callback();
        } catch (err) {
            console.log('Tearing down failed:', err.message);
            callback(err);
        }
    },
    test1: function(test) {
        test.ifError(null);
        test.done();
    },
    test2: function(test) {
        test.ifError(null);
        test.done();
    }
};
When running this with nodeunit I get the following:
stam2_test.js
Started connection, waiting for it to open
Opened connection
In tearDown
Closing connection
✔ test1
Started connection, waiting for it to open
Opened connection
FAILURES: Undone tests (or their setups/teardowns):
- test2
To fix this, make sure all tests call test.done()
Some more info:
If in the setUp/tearDown I don't use mongo but just test code, like increasing a counter, it all works.
If I have only one test, everything works.
Adding another test AND having mongo in the setup consistently fails it, so I guess I'm doing something wrong in the setup.
Thank you in advance.
The reason for the failure seems to be that the event subscription in mongoose.connection.on('open', ...) remains bound to the callback from test1 even after the disconnect and connect for test2. The extra call to the previous callback is what causes the trouble.
You should make sure to remove the subscription when you are done with it. Since the mongoose connection is based on the Node.js EventEmitter, a simple solution might be to replace the call
mongoose.connection.on('open'...)
with
mongoose.connection.once('open'...)
but you could also use the general add/removeListener() as needed.
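Applied to the setUp above, that is a one-word change (sketch):

setUp: function(callback) {
    // once() removes the listener after the first 'open' event, so the
    // reconnect for test2 cannot re-invoke test1's callback.
    mongoose.connection.once('open', function() {
        console.log('Opened connection');
        callback();
    });
    db = mongoose.connect('mongodb://localhost/test_1');
},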
On a different note, it seems unnecessary to connect and disconnect in each test of a unit test. You could connect just once by requiring a module that connects to your test database, as in require('db_connect_test'); the module db_connect_test would just call mongoose.connect(...), and all the tests would run with the same connection (or pool, as mongoose creates).
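A minimal sketch of such a module (hypothetical file name db_connect_test.js):

// db_connect_test.js -- every test file that requires this module shares
// the single connection (pool) created here.
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test_1');
module.exports = mongoose;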
Have a good one!