Execute curl command in Scala to get shipping charges from EasyPost API

I am trying to execute a curl command in Scala. The curl command fetches the shipping charges from the EasyPost API. Below is the code I am using:
import sys.process._
val data="curl -X POST https://api.easypost.com/v2/shipments -u <Easypost Test API Key>: -d 'shipment[to_address][zip]=90277' -d 'shipment[from_address][zip]=94104' -d 'shipment[parcel][length]=20.2' -d 'shipment[parcel][width]=10.9' -d 'shipment[parcel][height]=5' -d 'shipment[parcel][weight]=65.9'".!!
println("Shipping data is "+data)
I am getting:
Shipping data is {"error":{"code":"SHIPMENT.INVALID_PARAMS","message":"Unable to create shipment, one or more parameters were invalid.","errors":[]}}
But in the terminal it responds with:
{"created_at":"2017-03-03T05:31:03Z","is_return":false,"messages":[],"mode":"test","options":{"currency":"USD","label_date":null,"date_advance":0},"reference":null,"status":"unknown","tracking_code":null,"updated_at":"2017-03-03T05:31:03Z","batch_id":null,"batch_status":null,"batch_message":null,"customs_info":null,"from_address":{"id":"adr_382aaa644ccb4ecfb3f14db65275dc47","object":"Address","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","name":null,"company":null,"street1":null,"street2":null,"city":null,"state":null,"zip":"94104","country":"US","phone":null,"email":null,"mode":"test","carrier_facility":null,"residential":null,"federal_tax_id":null,"state_tax_id":null,"verifications":{}},"insurance":null,"order_id":null,"parcel":{"id":"prcl_adf352eee75d43339279f959b8cd1118","object":"Parcel","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","length":20.2,"width":10.9,"height":5.0,"predefined_package":null,"weight":65.9,"mode":"test"},"postage_label":null,"rates":[{"id":"rate_9799f33dbc99420abeba4101d6a0d31f","object":"Rate","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","mode":"test","service":"Express","carrier":"USPS","rate":"37.08","currency":"USD","retail_rate":"41.80","retail_currency":"USD","list_rate":"37.08","list_currency":"USD","delivery_days":null,"delivery_date":null,"delivery_date_guaranteed":false,"est_delivery_days":null,"shipment_id":"shp_54916a9085114979a300c0ba7b10efd7","carrier_account_id":"ca_bba7f2862b2e4a6aa682dcf5eeb0de38"},{"id":"rate_df1211dd22ab4aefa83cc34e206acb9c","object":"Rate","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","mode":"test","service":"Priority","carrier":"USPS","rate":"8.91","currency":"USD","retail_rate":"11.95","retail_currency":"USD","list_rate":"9.19","list_currency":"USD","delivery_days":2,"delivery_date":null,"delivery_date_guaranteed":false,"est_delivery_days":2,"shipment_id":"shp_54916a9085114979a300c0ba7b10efd7","carrier_account_id":"ca_bba7f2862b2e4a6aa682dcf5eeb0de38"},{"id":"rate_37f9ba8a68304222b02c15019a02918a","object":"Rate","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","mode":"test","service":"ParcelSelect","carrier":"USPS","rate":"9.19","currency":"USD","retail_rate":"9.19","retail_currency":"USD","list_rate":"9.19","list_currency":"USD","delivery_days":5,"delivery_date":null,"delivery_date_guaranteed":false,"est_delivery_days":5,"shipment_id":"shp_54916a9085114979a300c0ba7b10efd7","carrier_account_id":"ca_bba7f2862b2e4a6aa682dcf5eeb0de38"}],"refund_status":null,"scan_form":null,"selected_rate":null,"tracker":null,"to_address":{"id":"adr_9aa339f8acd244059e5ffb775c541dba","object":"Address","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","name":null,"company":null,"street1":null,"street2":null,"city":null,"state":null,"zip":"90277","country":"US","phone":null,"email":null,"mode":"test","carrier_facility":null,"residential":null,"federal_tax_id":null,"state_tax_id":null,"verifications":{}},"usps_zone":4,"return_address":{"id":"adr_382aaa644ccb4ecfb3f14db65275dc47","object":"Address","created_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","name":null,"company":null,"street1":null,"street2":null,"city":null,"state":null,"zip":"94104","country":"US","phone":null,"email":null,"mode":"test","carrier_facility":null,"residential":null,"federal_tax_id":null,"state_tax_id":null,"verifications":{}},"buyer_address":{"id":"adr_9aa339f8acd244059e5ffb775c541dba","object":"Address","c
reated_at":"2017-03-03T05:31:03Z","updated_at":"2017-03-03T05:31:03Z","name":null,"company":null,"street1":null,"street2":null,"city":null,"state":null,"zip":"90277","country":"US","phone":null,"email":null,"mode":"test","carrier_facility":null,"residential":null,"federal_tax_id":null,"state_tax_id":null,"verifications":{}},"forms":[],"fees":[],"id":"shp_54916a9085114979a300c0ba7b10efd7","object":"Shipment"}
Am I doing something wrong here? Please suggest.

It looks like curl is successfully contacting the service, but it is rejecting your request.
Are you sure you are using the exact same arguments on the command line as in the Scala version? The trailing ":" on the API key looks suspicious to me.
I wonder if this is a shell escaping problem. When you run a plain String with .!!, scala.sys.process (at least in older Scala versions) splits the command on whitespace and does not interpret shell quoting, so the single quotes around each -d value are passed to curl literally, which would explain the invalid-parameters error. Maybe try the Seq[String] form of ProcessBuilder to avoid any shell escaping:
import sys.process._
val data = List("curl", "-X", "POST", "https://api.easypost.com/v2/shipments", "-u", "<Easypost Test API Key>:", "-d", "shipment[to_address][zip]=90277", "-d", "shipment[from_address][zip]=94104", "-d", "shipment[parcel][length]=20.2", "-d", "shipment[parcel][width]=10.9", "-d", "shipment[parcel][height]=5", "-d", "shipment[parcel][weight]=65.9").!!
println("Shipping data is " + data)

Related

Passing Python modules on HDFS through Livy

On the /user/usr1/ path in HDFS, I placed two scripts, pySparkScript.py and relatedModule.py. relatedModule.py is a Python module that will be imported into pySparkScript.py.
I can run the scripts with spark-submit pySparkScript.py
However, I need to run these scripts through Livy. Normally, I run single scripts successfully as the following:
curl -H "Content-Type:application/json" -X POST -d '{"file": "/user/usr1/pySparkScript.py"}' livyNodeAddress/batches
However, when I run the above code, it fails as soon as it gets to the import of relatedModule. I realize I should also pass the path to relatedModule in the Livy parameters. I tried the following option:
curl -H "Content-Type:application/json" -X POST -d '{"file": "/user/usr1/pySparkScript.py", "files": ["/user/usr1/relatedModule.py"]}' livyNodeAddress/batches
How should I pass both files to Livy?
Try using the pyFiles property.
Please refer to the Livy REST API docs.
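A minimal, untested sketch of that request, based on the Livy REST API docs and reusing the file paths from the question, would be:
curl -H "Content-Type: application/json" -X POST -d '{"file": "/user/usr1/pySparkScript.py", "pyFiles": ["/user/usr1/relatedModule.py"]}' livyNodeAddress/batches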

How to execute a jar-packaged scala program via Apache Livy on Spark that responds with a result directly to a client request?

What I intend to achieve is to have a Scala Spark program (in a jar) receive a POST message from a client (e.g. curl), take some argument values, do some Spark processing, and then return a result value to the calling client.
From the available Apache Livy documentation I cannot find a way to invoke a compiled and packaged Spark program from a client (e.g. curl) via Livy in interactive (i.e. session) mode. Such a request/reply scenario via Livy can be done with Scala code passed as plain text to the Spark shell. But how can I do it with a Scala class in a packaged jar?
curl -k --user "admin:mypassword" -v \
-H "Content-Type: application/json" -X POST \
-d @Curl-KindSpark_ScalaCode01.json \
"https://myHDI-Spark-Clustername.azurehdinsight.net/livy/sessions/0/statements" \
-H "X-Requested-By: admin"
Instead of Scala source code as data (-d @Curl-KindSpark_ScalaCode01.json) I would rather pass the path and file name of the jar file, a class name, and argument values. But how?
Make an uber jar of your Spark app with the sbt-assembly plugin.
Upload the jar file from the previous step to your HDFS cluster:
hdfs dfs -put /home/hduser/PiNumber.jar /user/hduser
Execute your job via Livy (the session setup is shown here; submitting the statement itself is sketched after the check output below):
curl -X POST -d '{"conf": {"kind": "spark" , "jars": "hdfs://localhost:8020/user/hduser/PiNumber.jar"}}' -H "Content-Type: application/json" -H "X-Requested-By: user" localhost:8998/sessions
Check it:
curl localhost:8998/sessions/0/statements/3
{"id":3,"state":"available","output":{"status":"ok","execution_count":3,"data":{"text/plain":"Pi
is roughly 3.14256"}}}
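The submission that produced statement 3 above is not shown in the original answer; a rough, untested sketch of posting such a statement (the package, object, and method names here are hypothetical placeholders) would be:
curl -X POST -H "Content-Type: application/json" -H "X-Requested-By: user" -d '{"code": "import mypackage.PiNumber; PiNumber.run(sc)"}' localhost:8998/sessions/0/statements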
p.s.
The Livy API for Scala/Java requires an uber jar file. sbt-assembly doesn't build the fat jar instantly, which annoys me.
Usually I use the Python API of Livy for smoke tests and tweaking.
Sanity checks with Python:
curl localhost:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"print(\"Sanity check for Livy\")"}'
You can put more complicated logic into the code field.
BTW, this is how popular notebooks for Spark work - they send the source code to the cluster via Livy.
Thanks, I will try this out. In the meantime I found another solution:
$ curl -k --user "admin:" -v -H "Content-Type: application/json" -X POST -d @Curl-KindSpark_BrandSampleModel_SessionSetup.json "https://mycluster.azurehdinsight.net/livy/sessions"
with a JSON file containing
{
"kind": "spark",
"jars": ["adl://skylytics1.azuredatalakestore.net/skylytics11/azuresparklivy_2.11-0.1.jar"]
}
and with the jar containing the Scala object uploaded to the Azure Data Lake Gen1 account, and then post the statement
$ curl -k --user "admin:myPassword" -v -H "Content-Type: application/json" -X POST -d @Curl-KindSpark_BrandSampleModel_CodeSubmit.json "https://mycluster.azurehdinsight.net/livy/sessions/4/statements" -H "X-Requested-By: admin"
with the content
{
"code": "import AzureSparkLivy_GL01._; val brandModelSamplesFirstModel = AzureSparkLivyFunction.SampleModelOfBrand(sc, \"Honda\"); brandModelSamplesFirstModel"
}.
So I told Livy to start an interactive Spark session and load the specified jar, and then passed some code to invoke a member of the object in the jar. It works. I will check your advice too.

Upload secret file credentials to Jenkins with REST / CLI

How can I create a Jenkins credential via the REST API or the Jenkins CLI? The credential should be of type "secret file", rather than a username/password combination.
The question is similar to this question, but not the same or a duplicate.
You can do it as follows:
curl -X POST \
https://jenkins.local/job/TEAM-FOLDER/credentials/store/folder/domain/_/createCredentials \
-F secret=@/Users/maksym/secret \
-F 'json={"": "4", "credentials": {"file": "secret", "id": "test",
"description": "HELLO-curl", "stapler-class":
"org.jenkinsci.plugins.plaincredentials.impl.FileCredentialsImpl",
"$class":
"org.jenkinsci.plugins.plaincredentials.impl.FileCredentialsImpl"}}'
I just finished with it today: https://www.linkedin.com/pulse/upload-jenkins-secret-file-credential-via-api-maksym-lushpenko/?trackingId=RDcgSk0KyvW5RxrBD2t1RA%3D%3D
To create Jenkins credentials via the CLI you can use the create-credentials-by-xml command:
java -jar jenkins-cli.jar -s <JENKINS_URL> create-credentials-by-xml system::system::jenkins _ < credential-name.xml
The best way to know the syntax of this is to create a credential manually, and then dump it:
java -jar jenkins-cli.jar -s <JENKINS_URL> get-credentials-as-xml system::system::jenkins _ credential-name > credential-name.xml
Then you can use this XML example as a template; it should be self-explanatory.
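For orientation, the dumped XML for a secret-file credential looks roughly like the sketch below. The exact fields and the encoding of secretBytes can vary by plugin version, so trust the output of get-credentials-as-xml over this sketch:
<org.jenkinsci.plugins.plaincredentials.impl.FileCredentialsImpl>
  <scope>GLOBAL</scope>
  <id>test</id>
  <description>my secret file</description>
  <fileName>secret.txt</fileName>
  <secretBytes>BASE64-ENCODED-CONTENT</secretBytes>
</org.jenkinsci.plugins.plaincredentials.impl.FileCredentialsImpl>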
If you want to update an existing secret file, the simplest way I found was to delete and re-create it.
A delete request, to extend lumaks' answer (i.e. with the same hostname, folder name, and credentials ID), looks like:
curl -v -X POST \
-u "user:password" \
https://jenkins.local/job/TEAM-FOLDER/credentials/store/folder/domain/_/credential/test/doDelete
This will return HTTP status code 302 Found or 404 Not Found for an existing or non-existing credentials file, respectively.

Refreshing an Elastic Search Index / Realtime Searching

I'm writing unit tests for an Elasticsearch datasource; however, I am getting mixed results. The problem is that a match_all query isn't finding records that I submitted, yet when I run the commands by hand using curl, in the same order the unit test does, I am able to find the records.
I believe the index isn't being refreshed, so I started running the "refresh" API command after submitting records; however, this didn't work either. Here is my list of commands - it would be helpful if anyone had suggestions on how to make sure these commands work even when they are run in immediate succession.
Commands the unit test runs:
curl -XGET 'http://localhost:9200/test_index/_mapping'
curl -XDELETE 'http://localhost:9200/test_index/test_models'
curl -XPOST 'http://localhost:9200/test_index/test_models/_refresh' -d '{}'
curl -XPUT 'http://localhost:9200/test_index/test_models/_mapping' -d '{"test_models":{"properties":{"TestModel":{"properties":{"id":{"type":"string","index":"not_analyzed"},"string":{"type":"string"},"created":{"type":"date","format":"yyyy-MM-dd HH:mm:ss"},"modified":{"type":"date","format":"yyyy-MM-dd HH:mm:ss"}},"type":"object"}}}}'
curl -XPOST 'http://localhost:9200/test_index/test_models/_bulk' -d '{"index":{"_index":"test_index","_type":"test_models","_id":"test-model"}}
{"TestModel":{"id":"test-model","string":"Analyzed for terms","created":"2012-01-01 00:00:00","modified":"2012-02-01 00:00:00"}}
'
curl -XPOST 'http://localhost:9200/test_index/test_models/_refresh' -d '{}'
curl -XGET 'http://localhost:9200/test_index/_mapping'
curl -XGET 'http://localhost:9200/test_index/test_models/_search' -d '{"query":{"match_all":{}},"size":10}'
This question has also been posted to the (super awesome) ElasticSearch mailing list:
https://groups.google.com/forum/?fromgroups#!topic/elasticsearch/Nxv0XpLDY4k
-DK
The problem is with the _refresh command.
You can't refresh a type, only an index. I changed the refresh command to:
curl -XPOST 'http://localhost:9200/test_index/_refresh'
And it is now fixed!

Unable to use CURL within GROOVY script for a REST PUT call

I am trying to do a simple PUT request using curl. It is simple in a terminal, but I am not able to get it working within my Groovy script.
Here is a snippet of it:
class Test {
//Throws 415 Cannot Consume Content Type
void testPUT () {
println "curl -i -X PUT -H \"Content-Type: application/json\" -d '{\"Key1\":1, \"Key2\":\"Value2\"}' http://<hostname>/foo/".execute().text
}
// Works Perfectly Fine
void testGET () {
println "curl -i -X GET -H \"Content-Type: application/json\" http://<hostname>/foo".execute().text
}
}
I also tried to enclose the command in triple quotes, like:
"""curl -i -X PUT -H "Content-Type:application/json" -d '{"Key1":1,"Key2":"Value2"}' http://<hostname>/foo""".execute().text
All my attempts just give 415 Content Type Cannot Be Consumed.
When I simply use the curl command in a terminal window, both PUT and GET methods work fine.
Am I missing something? Any help would be appreciated!
Thank You!
Try using the list variation of the string and see if that works:
println ["curl", "-i", "-X PUT", "-H 'Content-Type:application/json'", "-d '{\"Key1\":1, \"Key2\":\"Value2\"}'", "http://<hostname>/foo/"].execute().text
I was having a similar problem and this was the only way I could find to solve it. Groovy will split the string into arguments at each space, and I suspect this was tripping up curl and the -H argument pair. By using the list variant, each item stays together as a single argument.
Building on Bruce's answer, you will also need to tokenize "-X PUT". Tested on Groovy 2.3.6: ["curl", "-H", "Content-Type: application/json", "-H", "Accept: application/json", "-X", "PUT", "-d", data, uri].execute()
This works in my terminal:
groovy -e "println 'curl -i -H \'Content-Type:application/json\' -XPUT -d \'{\"test\":4}\' http://google.fr/'.execute().text"
If it does not work for you, then this is likely not a Groovy problem.
Thanks xynthy for the list variation hint. I was still seeing the dreaded "Content type 'application/x-www-form-urlencoded' not supported" with your example, but breaking up the -H and content type strings worked.
This is confirmed working in Groovy 1.8:
["curl", "-H", "Content-Type: application/json", "-H", "Accept: application/json", "-X PUT", "-d", data, uri].execute().text
First I installed the Groovy Postbuild plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Groovy+Postbuild+Plugin
Then I included the Groovy Postbuild plugin in the post-build configuration of my Jenkins job
and used the command
"curl --request POST http://172.16.100.101:1337/jenkins/build".execute().text
Here my endpoint is http://172.16.100.101:1337/jenkins/build.