VS Code extension: execute a local Python (bash) script and append its output to an output channel

In my VS Code extension, I'd like to invoke a Python file and append its standard output to VS Code's output channel. I'm aware that I can create an output channel via vscode.window.createOutputChannel("..."); and I'm also aware I can execute such a Python (or, for that matter, any local bash) script via the child_process APIs: https://nodejs.org/api/child_process.html. Would it be possible to append standard output to this output channel in real time (i.e., as the script runs)?
What I currently have is the following:
import * as vscode from 'vscode';
import { spawn } from 'child_process';

export function execute(cmd: string, callback: any, logging?: vscode.OutputChannel) {
    const spawnedProcess = spawn(cmd, [], { shell: true, detached: true });
    console.log(`spawned pid ${spawnedProcess.pid} with command ${cmd}`);

    // Forward stdout to the output channel as it arrives
    spawnedProcess.stdout.on('data', (data: any) => {
        console.log(data);
        logging?.appendLine("stdout: " + data);
    });

    // Forward stderr as well
    spawnedProcess.stderr.on('data', (data: any) => {
        console.error(`spawned pid ${spawnedProcess.pid} pushed something to stderr`);
        logging?.appendLine(data);
    });

    spawnedProcess.on('exit', function (code: any) {
        if (code !== 0) {
            console.log('Failed: ' + code);
        } else {
            console.log(`pid ${spawnedProcess.pid} finished`);
        }
        callback();
    });
}
where callback() is something to be executed after the command finishes. I got this structure from https://stackoverflow.com/a/32872753/14264786.
However, when I run a simple Python script that sleeps for 3 seconds, prints something, and then does this again, the standard output still isn't displayed in the output channel in real time. Instead, everything is printed together after the script finishes.
Any idea on what's a potential solution?

After some more digging, I found out that one solution is to invoke the script with python -u, which unbuffers Python's output.
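For example, a minimal sketch using the execute helper above (the script path is hypothetical; setting PYTHONUNBUFFERED=1 in the child's environment would achieve the same thing as -u):

const channel = vscode.window.createOutputChannel("My Extension");
// -u disables Python's stdout/stderr buffering, so 'data' events arrive as lines are printed
execute('python -u /path/to/script.py', () => {
    console.log('script finished');
}, channel);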

Related

How can I prevent a Gulp task with a dynamic import from being asynchronous?

I want to use gulp-imagemin to minify images. The relevant part of my gulpfile.js looks like this:
const gulp = require('gulp');
// a couple more require('')s

function minifyImages(cb) {
    import('gulp-imagemin')
        .then(module => {
            const imagemin = module.default;
            gulp.src('src/img/**/*')
                .pipe(imagemin())
                .pipe(gulp.dest('img'));
            cb();
        })
        .catch(err => {
            console.log(err);
            cb();
        });
}

function buildCSS(cb) { /* ... */ }

exports.build = gulp.series(buildCSS, minifyImages);
The reason I'm using a dynamic import here is that I think I have to: gulp-imagemin doesn't support the require('') syntax, and when I write import imagemin from 'gulp-imagemin' I get an error saying "Cannot use import statement outside a module".
I would expect the build task to only finish after minifyImages has finished. After all, I'm calling cb() only at the very end, at a point where the promise should be resolved.
However, build seems to finish early, while minifyImages is still running. This is the output I get:
[21:54:47] Finished 'buildCSS' after 6.82 ms
[21:54:47] Starting 'minifyImages'...
[21:54:47] Finished 'minifyImages' after 273 ms
[21:54:47] Finished 'build' after 282 ms
<one minute later>
[21:55:49] gulp-imagemin: Minified 46 images (saved 5.91 MB - 22.8%)
How can I make sure the task doesn't finish early, and all tasks are run in sequence?
Let me know if there's something wrong with my assumptions; I'm somewhat new to gulp and importing.
Streams are always asynchronous, so if the cb() callback is called just after creating the gulp stream, as in your then handler, it's no surprise that the stream has not finished by that time (in fact, it hasn't even started).
The simplest solution to call a callback when the gulp.dest stream has finished is using stream.pipeline, i.e.:
function minifyImages(cb) {
    const { pipeline } = require('stream');
    return import('gulp-imagemin')
        .then(module => {
            const imagemin = module.default;
            pipeline(
                gulp.src('src/img/**/*'),
                imagemin(),
                gulp.dest('img'),
                cb
            );
        })
        .catch(cb);
}
Or similarly, with an async function.
async function minifyImages(cb) {
    const { pipeline } = require('stream');
    const { default: imagemin } = await import('gulp-imagemin');
    return pipeline(
        gulp.src('src/img/**/*'),
        imagemin(),
        gulp.dest('img'),
        cb
    );
}
Another approach I have seen is to split the task into two sequential sub-tasks: the first sub-task imports the plugin module and stores it in a variable, and the second sub-task uses the plugin already loaded by the previous sub-task to create and return the gulp stream in the usual way.
The two sub-tasks can then be combined with gulp.series, as sketched below.
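A minimal sketch of that two-task approach (the task names are illustrative, not from the original answer):

let imagemin; // populated by the first sub-task

function loadImagemin() {
    // Load the ESM-only plugin and stash its default export;
    // returning the promise lets gulp wait for the import to finish
    return import('gulp-imagemin').then(module => {
        imagemin = module.default;
    });
}

function minifyImages() {
    // Returning the stream lets gulp wait for its completion
    return gulp.src('src/img/**/*')
        .pipe(imagemin())
        .pipe(gulp.dest('img'));
}

exports.build = gulp.series(buildCSS, loadImagemin, minifyImages);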

dart:io Process - How to run 'CMD start' on Windows?

On Linux we are able to open URLs using a simple io.Process call:
io.Process.run("xdg-open", [ url, ])
But when we try the equivalent thing on Windows,
io.Process.run("start", [url]);
it fails with:
The system cannot find the file specified.
I'm guessing we need the path of cmd.exe, which is stored in %ComSpec%. I tried an 'echo %ComSpec%', but got the same error. I also tried hard-coding the path, with no success.
Here is our complete function:
ProcessResult result;
try {
    if (Platform.isLinux) {
        result = await io.Process.run("xdg-open", [ url, ]);
    }
    else if (Platform.isWindows) {
        result = await io.Process.run("start", [url]);
    }
} on ProcessException catch (e) {
    Log.e(e?.message);
}
return result?.exitCode == 0;
[Edit] Updated title to be more accurate
The title of your question doesn't match your code; that isn't trying to run cmd.exe, it's trying to run an executable called start. The reason it doesn't work is that start isn't an executable, it's a command built into cmd.exe's interpreter. (Try running where start at a command prompt; compare with where cmd.)
If you want to run a cmd.exe command like start, you need to pass runInShell: true to Process.run. However, keep in mind that if you do you may need to be careful about special characters in your arguments.
(The answer to the question in your title is: Process.run('cmd', [...]);. But since what you want to do is run a command in the shell it's easier to use runInShell: true than to invoke cmd with /c and your command as a string.)
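For illustration, a minimal sketch of both variants (assuming url is defined; this is my sketch, not from the original answer):

// Option 1: let the shell resolve the built-in start command
var result = await io.Process.run('start', [url], runInShell: true);

// Option 2: invoke cmd.exe explicitly with /c
// var result = await io.Process.run('cmd', ['/c', 'start', url]);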
Working example for Windows:
Future<io.ProcessResult> processRun() {
    var result = io.Process.run(
        'C:\\Program Files (x86)\\1cv8\\common\\1cestart.exe', [],
        runInShell: true);
    result.then((value) {
        print(value.exitCode);
    });
    return result;
}

How to pass a GitHub Actions event hook from JavaScript to a bash file?

I want to create a GitHub action that is simple and only runs a bash script file; see my previous question: How to execute a bash script from JavaScript.
With this JavaScript action, I want to pass values to the bash script from the JSON payload given by GitHub.
Can this be done with something as simple as an exec command?
...
exec.exec(`export FILEPATH=${filepath}`)
...
I wanted to do something like this, but found there to be much more code needed than I originally expected. So while this is not simple, it does work and will block the action script while the bash script runs:
const core = require('@actions/core');

function run() {
    try {
        // This is just a thin wrapper around bash, and runs a file called "script.sh"
        //
        // TODO: Change this to run your script instead
        //
        const script = require('path').resolve(__dirname, 'script.sh');
        var child = require('child_process').execFile(script);
        child.stdout.on('data', (data) => {
            console.log(data.toString());
        });
        child.on('close', (code) => {
            console.log(`child process exited with code ${code}`);
            process.exit(code);
        });
    }
    catch (error) {
        core.setFailed(error.message);
    }
}

run()
Much of the complication is handling output and error conditions.
You can see my debugger-action repo for an example.
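To pass payload values into the bash script, one option (a sketch under my assumptions, not from the original answer) is to forward them as environment variables through execFile's options, rather than via exec.exec(`export ...`), which only affects a throwaway shell:

const { execFile } = require('child_process');

// filepath is assumed to have been read from the event payload beforehand
const child = execFile(script, [], {
    env: { ...process.env, FILEPATH: filepath },
});
// script.sh can then read $FILEPATH like any other environment variable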

Nothing happens using spawn in a gulp task to execute commands in a subfolder

Yesterday I was referred to child_process.spawn for my needs. I'd like gulp to execute commands for me, so I can avoid typing the commands for my dependency whenever I need to compile my main project.
I have something that seems OK, with no errors in the logs, but nothing happens, as if my commands were never executed.
Does anyone have feedback on my code? I have another task like this one to compile the dependency.
var spawn = require("child_process").spawn;

gulp.task("my-dependency-install", function(done) {
    spawn("ft", ["install"], {
        cwd: "node_modules/app/my-dependency/"
    })
    .on("error", function (err) {
        throw err;
    })
    .on("close", done);
});
Thanks
Here is how I fixed it:
var spawn = require("child_process").spawn;

spawn("ft.cmd", ["install"], {
    cwd: "node_modules/app/my-dependency/"
})
.on("error", function (err) {
    throw err;
});
Can anyone explain why I had to add .cmd? It's because of Windows, isn't it?

Send commands over socket, but wait every time for response (Node.js)

I need to send several commands over telnet to a server. If I try to send them without a time delay between every command, the server freaks out:
var net = require('net');
var conn = net.createConnection(8888, 'localhost');

conn.on('connect', function() {
    conn.write(command_1);
    conn.write(command_2);
    conn.write(command_3);
    //...
    conn.write(command_n);
})
I guess the server needs some time to respond to command n before I send it command n+1. One way is to write something to the log and fake a "wait":
var net = require('net');
var conn = net.createConnection(8888, 'localhost');

conn.on('connect', function() {
    console.log('connected to server');
    console.log("I'm about to send command #1");
    conn.write(command_1);
    console.log("I'm about to send command #2");
    conn.write(command_2);
    console.log("I'm about to send command #3");
    conn.write(command_3);
    //...
    console.log("I'm about to send command #n");
    conn.write(command_n);
})
It might also be that conn.write() is asynchronous, and issuing one command after another doesn't guarantee the correct order?
Anyway, what is the correct pattern to ensure the correct order and enough time between two consecutive commands for the server to respond?
First things first: if this is truly a telnet server, then you should do something with the telnet handshake (where terminal options are negotiated between the peers; this is the binary data you can see when opening the socket).
If you don't want to get into that (it will depend on your needs), you can ignore the negotiation and go straight to business, but you will have to read this data and ignore it yourself.
Now, in your code, you're sending the data as soon as the server accepts the connection. This may be the cause of your troubles. You're not supposed to "wait" for the response; the response will get to you asynchronously thanks to nodejs :) So you just need to send each command as soon as you get the "right" response from the server (this is actually useful, because you can see if there were any errors, etc.).
I've tried this code (based on yours) against a device I've got at hand that has a telnet server. It will do a login and then a logout. See how the events are dispatched according to the server's response:
var net = require('net');
var conn = net.createConnection(23, '1.1.1.1');
var commands = [ "logout\n" ];
var i = 0;

conn.setEncoding('ascii');

conn.on('connect', function() {
    conn.on('login', function () {
        conn.write('myUsername\n');
    });
    conn.on('password', function () {
        conn.write('myPassword\n');
    });
    conn.on('prompt', function () {
        conn.write(commands[i]);
        i++;
    });
    conn.on('data', function(data) {
        console.log("got: " + data + "\n");
        if (data.indexOf("login") != -1) {
            conn.emit('login');
        }
        if (data.indexOf("password") != -1) {
            conn.emit('password');
        }
        if (data.indexOf(">#") != -1) {
            conn.emit('prompt');
        }
    });
});
See how the commands are in an array, so you can send them iteratively (the prompt event triggers the next command). The "right" response from the server is thus the next prompt: when the server sends (in this case) the string >#, the next command is sent.
Hope it helps :)
The order of writes is guaranteed. However:

1. You must subscribe to the data event; conn.on('data', function (data) {}) will do.
2. You must check the return value of each write: if a write fails, you must wait for the 'drain' event. So check whether any write really fails; if one does, fix the problem, and if none does, you can leave the current dirty solution as is.
3. You must check whether your server supports request pipelining (sending multiple requests without waiting for responses). If it doesn't, you must not send the next request before receiving a data event for the previous one.
4. You must ensure that the commands you send are real telnet commands: telnet expects a \0 byte after \r\n (see the RFC), so certain servers may freak out if \0 is not present.
So:

var net = require('net');
var conn = net.createConnection(8888, 'localhost');

conn.on('connect', function() {
    console.log(conn.write(command_1) &&
                conn.write(command_2) &&
                conn.write(command_3) &&
                //...
                conn.write(command_n))
})

conn.on('data', function () {})
If it logs false, some write's buffer was full and you must wait for 'drain' before writing more (a sketch follows below). If it logs true and the server still misbehaves, you must implement waiting for responses as described in point 3. I discourage the event-based solution and suggest looking at the async or Step npm modules instead.
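For completeness, a minimal sketch (my own, not from the answer above) of honoring backpressure when write() returns false:

function writeWithDrain(conn, chunk, done) {
    // net.Socket.write() returns false when the internal buffer is full;
    // in that case, wait for 'drain' before sending more data.
    if (conn.write(chunk)) {
        process.nextTick(done);
    } else {
        conn.once('drain', done);
    }
}

// Usage: chain the commands so each waits for the previous buffer to flush
writeWithDrain(conn, command_1, function () {
    writeWithDrain(conn, command_2, function () {
        // ...
    });
});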