Sending data before execution of scenarios - scala

I am working on a Scala application. I have some JSON files in the resources folder of my project. I want to load all of them as strings and send them over to a Kafka topic. I already have the Kafka producer code, but I don't know how to load all the files and send them. I am using the following code:
Source.fromResource(path_of_file).mkString
With this I can only send the one file whose path I pass in. How can I write generic code that loads all of the files and sends them one by one? I need to do this in the BeforeAll hook of my Cucumber tests; in short, I want to send these files before any scenario begins to execute.

Which sbt version are you using? Please note that sbt 1.2.8 has a bug in listing directories. Otherwise, the following should list the files in that directory:
new File(getClass.getResource("/my-dir").getFile).listFiles().foreach(println)
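If the listing works, a minimal sketch of the whole flow could look like the following; the /json resource directory, the topic name, and the broker address are placeholders for your setup:

import java.io.File
import java.util.Properties
import scala.io.Source
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object SeedTopic {
  // Reads every .json file under the /json resource directory and sends each
  // file's content as one record; call this from your Cucumber BeforeAll hook.
  def sendAll(): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker address
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    val dir = new File(getClass.getResource("/json").getFile)
    dir.listFiles().filter(_.getName.endsWith(".json")).foreach { file =>
      val json = Source.fromFile(file).mkString
      producer.send(new ProducerRecord("my-topic", file.getName, json)) // placeholder topic
    }
    producer.flush()
    producer.close()
  }
}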

Related

Updating the kafka dependency in camus is causing messages not to be read by EtlRecordReader

Camus has been used in my project for a long time and has never been updated.
The camus project uses Kafka version 0.8.2.2. I want to find a workaround to use Kafka 1.0.0.
So I cloned the directory and updated the dependency. When I do that, the Message constructor requires additional parameters, as shown here.
As given in the GitHub link above, the code compiles, but the messages are not read from Kafka due to the condition here.
Is it possible to update the Kafka dependency along with the appropriate data constructors of kafka.message.Message and make it work?

Build code in VS Code using an external HTTP server

Our code-building process is done via an HTTP server, which starts the build after receiving a project UUID from the build command. Once the server starts the compilation, GCC-compatible output can be fetched from it.
Note: only my extension knows the project UUID, which is different per workspace.
As far as I understand, I can implement this by:
Programmatically adding a task which will call a script with the correct workspace UUID. Is this possible?
Having my extension manage the build process. This seems to be far from supported.
Bottom line: I'm trying to avoid asking the user to add anything to the configuration files, and I want to manage the build process completely.
Thanks!
As I didn't find a suitable VS Code-only solution, I did the following:
Defined a helper script which I executed as the task; the helper script was responsible for the communication with the HTTP server (sketched below).
I registered the task using the vscode.workspace.registerTaskProvider API, and made sure to register it only after figuring out the UUID.
Then, in the task itself, I executed the helper script.
(A nice task-registration example can be found here: https://github.com/Microsoft/vscode-extension-samples/tree/master/task-provider-sample)
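For completeness, here is a rough sketch (in Scala) of what the helper script does; the /build and /output endpoints, the uuid parameter, and the DONE marker are assumptions about our internal server's API, not anything VS Code-specific:

import scala.io.Source

object BuildHelper {
  def main(args: Array[String]): Unit = {
    val uuid = args(0) // the workspace-specific project UUID, passed in by the task

    // Ask the server to start compiling this project
    Source.fromURL(s"http://buildserver.local/build?uuid=$uuid").mkString

    // Poll for GCC-compatible output and echo it to stdout, where the task's
    // $gcc problem matcher can pick it up
    var done = false
    while (!done) {
      val chunk = Source.fromURL(s"http://buildserver.local/output?uuid=$uuid").mkString
      print(chunk)
      done = chunk.contains("DONE") // assumed end-of-build marker
      if (!done) Thread.sleep(1000)
    }
  }
}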

How to send a REST request with build parameters in Jenkins Post Build Actions?

I am working on a CI system with Jenkins, but now I have a problem. I need to do the following steps:
1: Jenkins builds.
2: Deploy to Tomcat.
3: Find a way to send the build parameters (job name, build number, ...) to a web server (I am using REST now).
4: The web server triggers the testing system.
5: Jenkins gets the result from the testing system.
6: Update the build status.
7: Send emails.
I have a problem with step 3: I need to send that info after the deploy. I am thinking of the following approach:
Write those parameters to a file during the build step, then call a script or Java program to process the file and send the info out via REST.
But that is ugly. Are there any better ways to do it?
Side questions:
Can Groovy do this?
How do I import the Groovy http-builder library into Jenkins?
I found a workaround:
1: Ran an echo command during the build to print the build ID to the log.
2: Wrote a small Java program that fetches the JSON response of the build and then sends the necessary info as a REST request to the server you set; the program acts as a message forwarder (sketched below).
3: In the post-build actions, used Groovy Postbuild to fetch the log and then call the Java program.
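A minimal sketch of that forwarder (step 2), shown here in Scala rather than Java; the Jenkins host, job coordinates, and test-server URL are placeholders, though Jenkins does expose build metadata under .../api/json:

import scala.io.Source

object BuildForwarder {
  def main(args: Array[String]): Unit = {
    val Array(jobName, buildNumber) = args

    // Fetch the build's metadata from the Jenkins JSON API
    val buildJson =
      Source.fromURL(s"http://jenkins.example.com/job/$jobName/$buildNumber/api/json").mkString

    // Forward it to the testing server as a plain HTTP POST
    val conn = new java.net.URL("http://testserver.example.com/builds")
      .openConnection().asInstanceOf[java.net.HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setDoOutput(true)
    conn.setRequestProperty("Content-Type", "application/json")
    conn.getOutputStream.write(buildJson.getBytes("UTF-8"))
    println(s"Test server responded: ${conn.getResponseCode}")
  }
}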

scala online code execution

I'd like to develop a web-based application that allows users to submit Scala code from their web browser client and compile/execute that code on the server.
I was trying to use the scala.tools.nsc.IMain / ILoop classes to load the client file and then execute the file on the server. How do I do this?
How does using the IMain / ILoop classes compare to forking off an external process to compile and execute the code?
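For concreteness, the kind of minimal embedding I have in mind looks like this (a sketch assuming Scala 2.x with scala-compiler on the classpath, and with the client code hard-coded as a string):

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object RunSnippet {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true // expose the JVM classpath to the interpreter

    val interpreter = new IMain(settings)
    // In the real application this string would be the code submitted by the client
    interpreter.interpret("""println("hello from submitted code")""")
    interpreter.close()
  }
}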
Zeppelin is another open-source project worth taking a look at.
It has a Scala interpreter embedded for Apache Spark.
https://github.com/NFLabs/zeppelin
I guess it's also worth taking a look at https://github.com/Bridgewater/scala-notebook and http://vimeo.com/user18356272/review/66548724/53e2b222c1

Restful DDS execution

I downloaded the restful-dds-1.0-src.tgz file from http://code.google.com/p/restful-dds/downloads/list. I am using a Linux environment. Following the ReadMe.txt file, I executed the chatter application (CHATROOM TEST) up to scripts/startRESTfulDDS.sh and also viewed the HTML file at http://ipaddress:8182/static/ajaxTest.html. After that comes "run the Chatter application in the Tutorial directory by running scripts/Chatter.{sh,bat}." Here my problem arises: I am not able to see a scripts folder or a chatter.sh file inside the Tutorial folder. Please help me figure out what I did wrong.
I am using:
OpenSplice DDS v5.5
GWT 2.4.0
JDK 1.6
Restlet v2.0.14
Gson v2.2.2
I am not able to see scripts folder and chatter.sh file inside the Tutorial folder
The Tutorial folder that is created is an exact copy of the OpenSpliceDDS tutorial, found in $OSPL_HOME/examples/dcps/standalone/Java/Tutorial. There seems to be a mismatch between the description in the restful-dds README and this tutorial because indeed, there is no chatter.sh. However, there is a README.txt inside the Tutorial directory which explains how to run Chatter:
Chatter [userid] [username]
userid: an integer number that uniquely identifies the sender of a message
(Transmit a message with userid = -1 to terminate the MessageBoard.)
username: the user-name other chatters will see when they receive one of your
chat messages.
The executable classes are located in the chatroom package, but should be
started from the current directory in the following way:
...
java -classpath $OSPL_HOME/jar/dcpssaj.jar:bld chatroom.Chatter 1 Bill
Following this procedure, you should be able to run Chatter. Of course, you should first run ospl start to initialize the infrastructure.
By the way, you are not required to run the Java version of the tutorial -- any supported language should do. The OpenSpliceDDS installation itself should give you more information about running Chatter for different languages. The restful DDS webservice will pick up any data found on the DDS bus and expose it via HTTP, no matter what language the originating process was written in.