How to use continuation monad transformer with spawn/exec in purescript? - callback

I want to execute a command using spawn from Node.ChildProcess, but I have no clue how to hook the function that spawns the command into the rest of the application. I have a vague idea that I need to use ContT to hook up the error and success callbacks, but I am not able to figure out how to express the data pipeline as a single program.
This is the program I am trying to write:
Wait for a request (let's say as an HTTP server)
On request, write something to a file
Fire a terminal command
Collect output from terminal command
Send response

I would go for the easier solution and use Aff. I have an example of how I'm doing spawned child processes here: https://github.com/justinwoo/vidtracker/blob/b5756099a4f683d262bc030d33b68343a47f14d7/src/GetIcons.purs#L44-L54
curl :: forall e. String -> String -> Aff _ Unit
curl url path = do
  cp <- liftEff $ spawn "curl" [url, "-o", path] defaultSpawnOptions
  makeAff \e s -> do
    onError cp (e <<< Exc.error <<< unsafeStringify)
    onClose cp (s <<< const unit)
I have another example here where I'm collecting the output: https://github.com/justinwoo/simple-rpc-telegram-bot/blob/09e894c81493f913fe316f6689cb94ce1f5056e6/src/Main.purs#L149

Related

Does BaseX support running a basex script file (.bxs) with the jobs module or a combination of query with proc module and jobs module?

This has been a thorn in my side and I'm wondering if I'm missing something simple or not. I need to run .bxs scripts from the jobs scheduler.
I tried to start a service with a .bxs script file via the jobs module: it registers as a service, but the script does not run.
let $home := Q{org.basex.util.Prop}HOMEDIR()
let $job := $home || 'webapp/sync/update_jira.bxs'
let $job2 := $home || 'webapp/sync/update_commit_data.bxs'
return (jobs:eval(xs:anyURI($job), (), map { 'id':'update_jira_job', 'start':'14:54:02', 'interval':'P1D', 'service': true(), 'log': 'update_jira_job'}),
jobs:eval(xs:anyURI($job2), (), map { 'id':'update_commit_data', 'start':'15:03:02', 'interval':'P1D', 'service': true(), 'log': 'update_commit_data'}))
I also tried to run a query that executes the scripts via the command line. For example, update_jira.xq contains the line proc:execute('basex update_jira.bxs'), and it is started from an initial query that looks something like this...
let $home := Q{org.basex.util.Prop}HOMEDIR()
let $job := $home || '/srv/webapp/sync/update_jira.xq'
let $job2 := $home || '/src/webapp/sync/update_commit_data.xq'
return (jobs:eval(xs:anyURI($job), (), map { 'id':'update_jira_job', 'start':'14:54:02', 'interval':'P1D', 'service': true(), 'log': 'update_jira_job'}),
jobs:eval(xs:anyURI($job2), (), map { 'id':'update_commit_data', 'start':'15:03:02', 'interval':'P1D', 'service': true(), 'log': 'update_commit_data'}))
When this ran as a service, the database did not update as expected and I got this output in the log:
22:47:02.001 JOB:update_commit_data admin OK 0.30 update_commit_data
22:41:00.000 JOB:update_jira_job admin ERROR 0.00 update_jira_job; Unexpected end of query: '0'.
But that is strange, because the query itself (the one that starts the service with jobs:eval) actually ran fine when I executed it directly the first time:
16:42:52.257 10.244.144.142:57444 admin 200 221563.70 [GET] /rest?run=sync/update_jira.bxs
16:49:39.862 10.244.144.142:57591 admin 200 101413.21 [GET] /rest?run=sync/update_commit_data.bxs
This is my latest attempt: the query runs initially, but then doesn't seem to execute at the service interval. I added base-uri as the path to the query, and I hope that's the right way to do that.
let $home := Q{org.basex.util.Prop}HOMEDIR()
return jobs:eval(proc:execute('/usr/local/bin/basex', '/srv/basex/webapp/sync/update_jira.bxs'), (),
map { 'id':'update_jira_job', 'interval':'PT5M', 'base-uri': '/srv/basex/webapp/sync/',
'service': true(), 'log': 'update_jira_job'})
When I run this through the database admin tool query window, it runs right away
14:02:54.494 10.244.144.142:54402 admin 200 296095.32 [POST] /dba/query-update
And then after the 5-minute interval, a 0.05 ms log entry shows up when the service kicks off:
14:57:50.564 JOB:update_jira_job admin OK 0.05 update_jira_job
Please note that BaseX command scripts contain plain database commands, whereas the functions in the Jobs Module were tailored to execute XQuery code. If you want to use jobs:eval, the best solution is to rewrite the contents of your command scripts to XQuery.
If you want to stick with the command scripts, you could indeed try to invoke BaseX via proc:execute, but you should be aware that the two BaseX instances will run independently of each other, which could lead to corrupt databases (see https://docs.basex.org/wiki/Startup#Concurrent_Operations).
If the invocation fails…
Cannot run program "basex": CreateProcess error=2, ...
…you may need to address BaseX with the full path:
(: Windows installation :)
proc:execute('c:\Program Files (x86)\BaseX\bin\basex.bat', 'commands.bxs')
(: Linux :)
proc:execute('/path/to/basex', 'commands.bxs')

SCP command not working in karate project - it throws command error:cannot run program scp.exe: CreateProcess error=2 [duplicate]

I'm trying to execute a bash script using Karate. I'm able to execute the script from karate-config.js and also from a .feature file, and I'm also able to pass arguments to the script.
The problem is that if the script fails (exits with something other than 0), the test execution continues and finishes as successful.
I found out that when the script echoes something, I can access it as the result of the script, so I could echo the exit value and assert on it (in some reusable feature), but this seems like a workaround rather than a clean solution. Is there a clean way of accessing the exit code without echoing it? Am I missing something?
script
#!/bin/bash
#possible solution
#echo 3
exit 3;
karate-config.js
var result = karate.exec('script.sh arg1')
feature file
def result = karate.exec('script.sh arg1')
Great timing. We very recently did some work for CLI testing which I am sure you can use effectively. Here is a thread on Twitter: https://twitter.com/maxandersen/status/1276431309276151814
And we have just released version 0.9.6.RC4, and now we have a new karate.fork() option that returns an instance of Command on which you can call exitCode.
Here's an example:
* def proc = karate.fork('script.sh arg1')
* proc.waitSync()
* match proc.exitCode == 0
You can get more ideas here: https://github.com/intuit/karate/issues/1191#issuecomment-650087023
Note that the argument to karate.fork() can take multiple forms. If you are using karate.exec() (which will block until the process completes) the same arguments work.
string - full command line as seen above
string array - e.g. ['script.sh', 'arg1']
json where the keys can be
line - string (OR)
args - string array
env - optional environment properties (as JSON)
redirectErrorStream - boolean, true by default which means Sys.err appears in Sys.out
workingDir - working directory
useShell - default false, auto-prepend cmd /c or sh -c depending on OS
And since karate.fork() is async, you need to call waitSync() if needed as in the example above.
Do provide feedback and we can tweak further if needed.
EDIT: here's a very advanced example that shows how to listen to the process output / log, collect the log, and conditionally exit: fork-listener.feature
Another answer which can be a useful reference: Conditional match based on OS
And here's how to use cURL for advanced HTTP tests ! https://stackoverflow.com/a/73230200/143475
In case you need to do a lot of local file manipulation, you can use the karate.toJavaFile() utility so you can convert a relative path or a "prefixed" path to an absolute path.
* def file = karate.toJavaFile('classpath:some/file.txt')
* def path = file.getPath()

How to transform monads in a conduit pipeline?

I am trying to copy a file from disk to a File in MongoDB GridFS with the Database.MongoDB packages.
main :: IO ()
main = do
  pipe <- MDB.connect (host "127.0.0.1")
  _ <- access pipe master "baseball" run
  close pipe

run :: MDB.Action IO GFS.File
run = do
  uploadImage "sandbox/bat.jpg"

uploadImage :: Text -> MDB.Action IO GFS.File
uploadImage src = do
  bucket <- GFS.openDefaultBucket
  runConduitRes $ sourceFileBS (unpack src) .| (hole $ GFS.sinkFile bucket src)
This does not work because sourceFileBS expects a Resource in the base monad, while GFS.sinkFile wants a MongoDB Action (a specialized Reader).
What is an elegant way to connect these pieces of a conduit together?
Without all of the types and functions available, it's a bit hard to tell you the best way to do it. However, one way that should work looks something like this:
withBinaryFile (unpack src) ReadMode $ \h ->
  runMongo $ runConduit $ sourceHandle h .| GFS.sinkFile bucket src

Execute command line order from Coffee-script

Is it possible to execute a command-line command such as ll, pwd, or whatever from a CoffeeScript script?
I've tried to find examples without luck so far.
Thanks!
If you execute CoffeeScript via Node.js you will have full access to the abilities of your OS. Use the spawn method of the child_process module to create a new process:
{spawn} = require 'child_process'
ls = spawn 'ls', ['array', 'of', 'options']
# receive all output and process
ls.stdout.on 'data', (data) -> console.log data.toString().trim()
# receive error messages and process
ls.stderr.on 'data', (data) -> console.log data.toString().trim()

Output when watching CoffeeScript files from a cakefile task

I would like to make a Cakefile task to watch some CoffeeScript files just like if I had run coffee -c -w js/*.coffee.
It's watching and recompiling them successfully, but it doesn't log the usual output to the terminal when there's a compile error, like it would if I just ran the command from the terminal. Any idea how to make this happen?
exec = require('child_process').exec

task 'watch', 'watch all files and compile them as needed', (options) ->
  exec 'coffee -c -w js/*.coffee', (err, stdout, stderr) ->
    console.log stdout
Also, if there's a better way to invoke a CoffeeScript command from a Cakefile than running exec, please post that too.
spawn instead of exec?
{spawn} = require 'child_process'
task 'watch', -> spawn 'coffee', ['-cw', 'js'], customFds: [0..2]
I've used spawn to solve this, here is an example cake file:
{spawn, exec} = require 'child_process'

option '-p', '--prefix [DIR]', 'set the installation prefix for `cake install`'

task 'build', 'continually build with --watch', ->
  coffee = spawn 'coffee', ['-cw', '-o', 'lib', 'src']
  coffee.stdout.on 'data', (data) -> console.log data.toString().trim()
You can see it in action with the docco project:
https://github.com/jashkenas/docco/blob/master/Cakefile
The problem with your original code was that exec only calls its callback once, after the child process has terminated. (The Node docs aren't so clear on this.) So instead of defining that callback, you should try:
child = exec 'coffee -c -w js/*.coffee'
child.stdout.on 'data', (data) -> sys.print data
Let me know if that works for you.