Erlang: running custom module - import

Windows 7 x64, Erlang-OTP 17.
I wrote a simple module like this:
-module(somequery).
-export([fbquery/2]).

fbquery(P1, P2) ->
    inets:start(),
    ssl:start(),
    Token = "78a8shd67tyajsndweiu03hr83h19j",
    Encoded = {"Authorization", "Basic " ++ base64:encode_to_string(lists:append([Token, ":", ""]))},
    ContentType = "application/xml",
    Headers = [Encoded, {"Content-Type", ContentType}],
    Options = [{body_format, binary}],
    {ok, File} = file:read_file(P1),
    Res = httpc:request(post, {"https://datapi.com/api/xml4-8", Headers, ContentType, File}, [], Options),
    file:write_file(P2, io_lib:fwrite("~p.\n", [Res])).
This code works in interactive mode (werl.exe) and compiles into a beam file.
The question is: how do I use the *.erl or compiled *.beam module now? How do I import it and run the fbquery/2 function?

First of all, you need to start erl with the argument -pa Dir1 Dir2 ..., naming the directory that contains your compiled beam. This adds the directory to the Erlang code path, and you will then be able to type somequery:fbquery(Arg1, Arg2) in your shell.
Then, you can use the argument -s module function [args...] to launch erl and run the specified function directly.
You can read about both flags in the Erlang documentation for erl.
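For example, a minimal sketch (the file names and paths are placeholders, not taken from the question):
erlc somequery.erl
werl -pa .
1> somequery:fbquery("request.xml", "response.txt").
One caveat with -s: erl -pa . -s somequery fbquery a b calls somequery:fbquery([a, b]) with a single list of atoms, which does not match fbquery/2, so for a function that expects separate string arguments an -eval expression (or the interactive call above) is the simpler route.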

Related

Add bash script as an entrypoint to Python package with Poetry

Is it possible to add a bash script as an entry point (console script) to a Python package via Poetry? It looks like it only accepts Python files (see code here).
I want entry.sh to be the entry script
#!/usr/bin/env bash
set -e
echo "Running entrypoint"
via setup.py
entry_points={
    "console_scripts": [
        "entry=entry.sh",
    ],
},
On the other hand, setuptools seems to support shell scripts (see code here).
Is it possible to include a shell script in a package and have it added to the entry points on installation when working with Poetry?
UPD: setuptools does not support that either (it generates the code below):
def importlib_load_entry_point(spec, group, name):
    dist_name, _, _ = spec.partition('==')
    matches = (
        entry_point
        for entry_point in distribution(dist_name).entry_points
        if entry_point.group == group and entry_point.name == name
    )
    return next(matches).load()

globals().setdefault('load_entry_point', importlib_load_entry_point)
Is this a design decision? It seems to me that packaging should provide such a feature, so that complex applications can be delivered as a single bundle.
So I ended up using this workaround: keep the script in place, add it to the bundle via package_data, and call it from Python code that I expose as the entry point.
import subprocess

def _run(bash_script):
    return subprocess.call(bash_script, shell=True)

def entrypoint():
    return _run("./scripts/my_entrypoint.sh")

def another_entrypoint_if_needed():
    return _run("./scripts/some_other_script.sh")
and in pyproject.toml:
[tool.poetry.scripts]
entrypoint = 'bash_runner:entrypoint'
another = 'bash_runner:another_entrypoint_if_needed'
The same works for console_scripts in a setup.py file.
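One caveat: the relative "./scripts/..." paths above only resolve when the command is launched from the project root. A sketch of a more robust variant, assuming the scripts ship inside the package next to this module (the "scripts" directory name is an assumption, not part of the original answer):
import os
import subprocess

# Assumed layout: the shell scripts are bundled with the package
# (via package_data / include) in a "scripts" directory next to this module.
_SCRIPT_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "scripts")

def _run(script_name):
    # Run the bundled script through the shell and propagate its exit code.
    return subprocess.call(os.path.join(_SCRIPT_DIR, script_name), shell=True)

def entrypoint():
    return _run("my_entrypoint.sh")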

Protobufs import from another directory

While trying to compile a proto file named UserOptions.proto, which imports Account.proto, using the command below,
protoc --proto_path=/home/project_new1/account --java_out=/home/project_new1/source /home/project_new1/settings/Useroptions.proto
I get the following error:
/home/project_new1/settings/UserOptions.proto: File does not reside within any path specified using --proto_path (or -I). You must specify a --proto_path which encompasses this file.
PS: UserOptions.proto, present in the directory /home/project_new1/settings, imports Account.proto, present in the directory /home/project_new1/account.
Proto descriptor files:
UserOptions.proto
package settings;
import "Account.proto";
option java_outer_classname = "UserOptionsVOProto";
Account.proto
package account;
option java_outer_classname = "AccountVOProto";
message Object
{
    optional string userId = 1;
    optional string service = 2;
}
As the error message states, the file you pass on the command line needs to be in one of the --proto_paths. In your case, you have only specified one --proto_path of:
/home/project_new1/account
But the file you're passing is:
/home/project_new1/settings/UserOptions.proto
Notice that the file is not in the account subdirectory; it's in settings instead.
You have two options:
(Not recommended) Pass a second --proto_path argument to add .../settings to the path.
(Recommended) Use the root of your source tree as the proto path. E.g.:
protoc --proto_path=/home/project_new1/ --java_out=/home/project_new1 /home/project_new1/settings/UserOptions.proto
In this case, to import Account.proto, you'll need to write:
import "acco‌​unt/Account.proto";
For those of us who want this really spelled out, here is an example where I have installed the protoc beta for gRPC using the NuGet packages Google.Protobuf, Grpc.Core and Grpc.Tools. My solution packages are one level above my Grpc directory (i.e. at BruTrader\packages). My .proto files are at BruTrader\Grpc\protos.
1. My .proto file:
syntax = "proto3";
import "timestamp.proto";
import "enums.proto";
package BruTrader.Grpc;
message DividendMessage {
    double amount = 1;
    google.protobuf.Timestamp dateUnix = 2;
}
2. My GenerateProto.bat file:
..\packages\Google.Protobuf.3.0.0-beta2\tools\protoc.exe -I..\Grpc\protos -I..\packages\Google.Protobuf.3.0.0-beta2\tools\google\protobuf --csharp_out=..\Grpc\Generated --grpc_out=..\Grpc\Generated --plugin=protoc-gen-grpc=..\packages\Grpc.Tools.0.13.0\tools\grpc_csharp_plugin.exe %1
3. My BuildProtos.bat:
call GenerateProto ..\Grpc\protos\masterinstrument.proto
call GenerateProto .\protos\instrument.proto
etc.
4. BuildProtos.bat is executed as a Pre-build event on my Grpc project like this:
CD $(ProjectDir)
CALL "$(ProjectDir)BuildProtos.bat"
For my environment (Windows 10 Pro and the C++ programming language), I used protoc-3.12.2-win64.zip, which you can download from here. You should open a Windows PowerShell inside the protoc-3.12.2-win64\bin path and then execute one of the following commands:
.\protoc.exe -I=C:\Users\UserName\Desktop\SRC --cpp_out=C:\Users\UserName\Desktop\DST C:\Users\UserName\Desktop\SRC\addressbook.proto
Or
.\protoc.exe --proto_path=C:\Users\UserName\Desktop\SRC --cpp_out=C:\Users\UserName\Desktop\DST C:\Users\UserName\Desktop\SRC\addressbook.proto
Note:
1. My source folder is: C:\Users\UserName\Desktop\SRC
2. My destination folder is: C:\Users\UserName\Desktop\DST
3. My .proto file is: C:\Users\UserName\Desktop\SRC\addressbook.proto

how to execute a command in scala?

I want to execute the command "dot -Tpng overview.dot > overview.png", which is used to generate an image with Graphviz.
The code in scala:
Process(Seq("dot -Tpng overview.dot > overview.png"))
It does not work.
I also want to open this image from Scala. I work under Ubuntu, where images are opened with the image viewer by default. But when I type "eog overview.png" in a terminal, it reports the error
** (eog:18371): WARNING **: The connection is closed
Thus, I do not know how to let Scala open this image.
Thanks in advance.
You can't redirect stdout using > in the command string. You should use the #> and #| operators instead. See the examples in the process package documentation.
This writes test into test.txt:
import scala.sys.process._
import java.io.File
// use scala.bat instead of scala on Windows
val cmd = Seq("scala", "-e", """println(\"test\")""") #> new File("test.txt")
cmd.!
In your case:
val cmd = "dot -Tpng overview.dot" #> new File("overview.png")
cmd.!
Or just this (since dot accepts the output file name via -ooutfile):
"dot -Tpng overview.dot -ooverview.png".!

Browserify a mix of coffeescript and livescript files

I have a main coffee file and a mix of other coffee and livescript files.
# main.coffee
require 'LiveScript'
one = require './one.coffee'
two = require './two.ls'
console.log one.fun(), two.fun()
# one.coffee
module.exports.fun = -> 1
# two.ls
module.exports.fun = -> 2
I can run
coffee main.coffee
But trying to run
browserify -t coffeeify main.coffee
Gives an error:
module.exports.fun = -> 2
^
ParseError: Unexpected token >
The only workaround I see is to compile the .ls files to .js first. Is there a simpler, more direct way to mix .ls and .coffee files?
require 'LiveScript' is only sufficient for Node.js. Browserify does not support require.extensions, and is trying to parse the LiveScript as JavaScript.
You need a transform for LiveScript as well, for example Liveify.
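With both transforms in place, the bundling command would look something like this (assuming the coffeeify and liveify packages are installed as dev dependencies; the output file name is arbitrary):
npm install --save-dev coffeeify liveify
browserify -t coffeeify -t liveify main.coffee -o bundle.js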
You might try Webpack. With proper loaders, e.g. livescript-loader, coffee-loader and others, you can compose your program with different js flavors.

copy task in Cakefile

I am trying to copy all the files in a list of directories and paste them into an output directory. The problem is that whenever I use an *, the output says that no file or directory by that name exists. Here is the specific error output:
cp: cannot stat `tagbox/images/*': No such file or directory
cp: cannot stat `votebox/images/*': No such file or directory
If I just put the name of a specific file instead of *, it works.
Here is my Cakefile:
fs = require 'fs'
util = require 'util'
{spawn} = require 'child_process'

outputImageFolder = 'static'
imageSrcFolders = [
  'tagbox/images/*'
  'votebox/images/*'
]

task 'cpimgs', 'Copy all images from the respective images folders in tagbox, votebox, and omnipost into static folder', ->
  for imgSrcFolder in imageSrcFolders
    cp = spawn 'cp', [imgSrcFolder, outputImageFolder]
    cp.stderr.on 'data', (data) ->
      process.stderr.write data.toString()
    cp.stdout.on 'data', (data) ->
      util.log data.toString()
You are using the * character, probably because that works for you in your shell. Using * and other wildcard characters that expand to match multiple paths is called "globbing", and while your shell does it automatically, most other programs, including node/javascript/coffeescript, will not do it by default. The cp binary itself doesn't do globbing either, as you are discovering: the shell does the globbing and then passes the list of matching files/directories as arguments to cp.
Look into the node module node-glob to do the globbing and give you back a list of matching files/directories, which you can then pass to cp as arguments if you like. Note that you could also use a filesystem module that has this type of functionality built in. Note, however, that putting async code directly into a Cakefile can be problematic, as documented here.
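A sketch of how the cpimgs task could look with node-glob (the glob dependency and the stream-based copy are assumptions, not part of the original answer):
glob = require 'glob'   # npm install glob
fs   = require 'fs'
path = require 'path'

outputImageFolder = 'static'
imageSrcPatterns = [
  'tagbox/images/*'
  'votebox/images/*'
]

task 'cpimgs', 'Copy all images into the static folder', ->
  for pattern in imageSrcPatterns
    for src in glob.sync pattern
      dest = path.join outputImageFolder, path.basename src
      # Copy each matched file by piping a read stream into a write stream.
      fs.createReadStream(src).pipe fs.createWriteStream(dest)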