How can we subscribe to resources in Scala?

How can we subscribe to a file that is present as a resource in a Scala project, so that any live change to the file can be detected by the service?
For example, there is Scala code that calculates the sum of numbers from a text file; how can we subscribe to that file in the code so that the program can react immediately when new numbers are added to the file?

In Scala you can use Java classes and APIs.
You can use the Java Watch Service API in java.nio.file.
You can read about it here.
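As an illustration, here is a minimal sketch of using the WatchService API from Scala (2.13+). The directory, file name, and object name are assumptions; the idea is to watch the directory that holds the file and recompute whenever it changes:

import java.nio.file.{FileSystems, Path, Paths, StandardWatchEventKinds}
import scala.jdk.CollectionConverters._

object FileWatcher extends App {
  val dir     = Paths.get("src/main/resources")        // directory that contains the file
  val watcher = FileSystems.getDefault.newWatchService()
  dir.register(watcher, StandardWatchEventKinds.ENTRY_MODIFY)

  while (true) {
    val key = watcher.take()                            // blocks until an event arrives
    key.pollEvents().asScala.foreach { event =>
      val changed = event.context().asInstanceOf[Path]
      if (changed.endsWith("numbers.txt")) {
        // re-read the file and recompute the sum here
        println(s"$changed was modified, recomputing the sum")
      }
    }
    key.reset()                                         // re-arm the key for further events
  }
}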

Related

Amqp test framework - write tests without coding [duplicate]

We have a requirement where we need to send an .avro file as an input request to our APIs. We are really stuck at this point. A detailed example would be much appreciated.
Just use Java interop: https://github.com/intuit/karate#calling-java
You need to write a helper (start with a static method) to convert JSON to Avro and vice versa. I know teams using this for gRPC. Read this thread for tips: https://github.com/intuit/karate/issues/412
Also there is even a "karate-grpc" project: https://github.com/pecker-io/karate-grpc
Also see:
https://twitter.com/KarateDSL/status/1128170638223364097
https://twitter.com/KarateDSL/status/1417023536082812935
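For illustration, here is a minimal sketch of such a helper written in Scala (the AvroConverter and jsonToAvro names are made up, and it assumes the Apache Avro Java library is on the classpath; Scala objects get static forwarders, so Karate can call this via Java interop):

import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object AvroConverter {
  // Convert a JSON string to Avro binary bytes for the given schema.
  def jsonToAvro(json: String, schemaJson: String): Array[Byte] = {
    val schema  = new Schema.Parser().parse(schemaJson)
    val reader  = new GenericDatumReader[GenericRecord](schema)
    val decoder = DecoderFactory.get().jsonDecoder(schema, json)
    val record  = reader.read(null, decoder)

    val out     = new ByteArrayOutputStream()
    val writer  = new GenericDatumWriter[GenericRecord](schema)
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    writer.write(record, encoder)
    encoder.flush()
    out.toByteArray
  }
}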

How to write a value from the response to a file in Gatling?

I have a script which creates a new referenceId each time it is executed. I used
.check(regex("orders.(.*?)\"").saveAs("referenceId"))
to extract the referenceId. Now, how can I write/append it to a file without impacting the script even if I run it as a load test?
I used session in .exec to write my value into a file. Here it is:
.exec( session => {
  scala.tools.nsc.io.File("../user-files/data/referenceId.csv")
    .appendAll(session("referenceId").as[String] + "\n")
  session
})
Your solution works, but...
First of all, do not use anything (unless you have to) from the scala.tools.nsc.io package. It is an internal package intended only for the Scala compiler; it is not a public API included in the Scala runtime library (see the official Scaladoc). More about the topic here. Scala does not have its own abstraction for writing to files, hence one needs to use the usual java.io.File & co.
Secondly, opening the file on every execution can (but may not) slow down your load test. It strongly depends on the rate at which you are making requests. At higher rates you can experience contention when several concurrent executions try to write to the same file. The simplest solution is to write to different files, but then you can still run out of the maximum number of open files. Another solution is to use a shared java.io.FileOutputStream or java.io.FileWriter for the target file with proper synchronisation (it will be accessed from various threads), which is still blocking IO. Yet another solution is to use the Java NIO API to write to a shared file via a Channel (non-blocking) or an OutputStream (not sure if non-blocking).
Of course, the solutions differ in difficulty of implementation.
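As a sketch of the shared-writer approach (the ReferenceIdLog name is made up; the file path is taken from the question):

import java.io.{FileWriter, PrintWriter}

// Opened once per simulation and shared by all virtual users.
object ReferenceIdLog {
  private val writer = new PrintWriter(new FileWriter("../user-files/data/referenceId.csv", true))

  // synchronized because Gatling runs sessions on multiple threads
  def append(line: String): Unit = synchronized {
    writer.println(line)
    writer.flush()
  }
}

// In the scenario, instead of opening the file on every execution:
.exec { session =>
  ReferenceIdLog.append(session("referenceId").as[String])
  session
}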

Scalatra file organisation

How should I manage my files in Scalatra? After encountering the following error, my fundamental understanding of "code separation" in Scala has been destroyed.
Working in Scalatra, I defined a class in one file and received an error after attempting to define a class with the same name in another file. I was somewhat confused about the error because I was working under the impression that there was some degree of isolation afforded to each file (a Node.js-inspired assumption).
I am currently working on an application that requires Actors, Routes, Classes, etc. How should I organize these things?
Classes of the same name need to be in different packages. You can use imports to avoid having to type the full path (packagename.ClassName), but if you don't create separate packages there is no way to unambiguously refer to the class you mean. That quickly grows unworkable in code bases of more than moderate size.
So, no, separate files are not enough.
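For example (package and class names are made up; each snippet represents a separate file), two classes with the same name can coexist in different packages, and an import decides which one the short name refers to:

// src/main/scala/com/example/api/User.scala
package com.example.api
case class User(id: Long, name: String)

// src/main/scala/com/example/db/User.scala
package com.example.db
case class User(id: Long, passwordHash: String)

// src/main/scala/com/example/Routes.scala
package com.example
import com.example.api.User              // the short name now means the API model

object Routes {
  def render(u: User): String = u.name
  def key(row: db.User): Long = row.id   // the other class via its package path
}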

Working with Linux shared objects in Scala

How can I access a *.so object and its methods in Scala? Here is a Python example: https://github.com/soulseekah/py-libgfshare/blob/master/gfshare.py where the ctypes library is used to interact with libgfshare.so. What tools do I need to use to achieve the same in Scala?
If you would like to interact with a native library which doesn't support JNI (Java Native Interface), that is, one not designed especially for interacting with the Java VM, try JNA (Java Native Access). There is also the Scala Native Access project on Google Code, which seems to provide a more "Scala-friendly" API, but it appears inactive (the last commit was in 2010).
The previous answer is quite correct that JNI is the way to go but getting it all to work requires a little perseverance. An example of a multi-platform Scala interface to a real world native library can be found here. As a summary, the steps you need to take are detailed below.
First, define the Scala interface that you want to use to access your native library with. An example of a native interface declaration in Scala is:
package foo.bar
object NativeAPI {
  @native def test(data: Array[Byte]): Long
}
Compile this file using scalac and then use javah to output a suitable C/C++ header file for the JNI native stub. The javah header generator (part of the Java SDK) needs to be invoked as
javah -classpath <path-to-scala-libs> foo.bar.NativeAPI$
Note that the $ is added to Scala objects by the Scala compiler. The functions generated in this header will be called when you call the API from the JVM. For this example, the javah generated C header's declaration for this function would look like this:
JNIEXPORT jlong JNICALL Java_foo_bar_NativeAPI_00024_test(JNIEnv *, jobject, jbyteArray);
At this point, you need to create a C file which then maps this function from your JVM api to the native library you intend to use. The resulting C file needs to be compiled to a shared library (.so in Linux) which references the native library you want to call. The C interface into the JVM is described here.
For this example, let's call this library libjni-native.so and assume it references a 3rd party library called libfoo.so.0. If both these libraries are available in the dynamic library search path of your OS, then you need to instruct the JVM to load the library and call the function as follows:
System.loadLibrary("jni-native")   // loadLibrary expects the name without the "lib" prefix and ".so" suffix
val data = new Array[Byte](100)
// Populate 'data'
NativeAPI.test(data)
On Linux and OSX machines, the dynamic linker will know that libfoo.so.0 (.dylib for OSX) is a dependency of libjni-native.so and will load it automatically. You should now be able to call foo.bar.NativeAPI.test() and have the native function executed.
In the real world, you probably need to bundle the .so libraries into a JAR file for distribution. To do this, you can place the shared libraries in a suitable directory under the resources directory of your project. These libraries need to be copied from the JAR file to a temporary directory at run time and then loaded using System.load("/tmppath/libjni-native.so"). LibLoader from the example shows one way this can be achieved.
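A minimal sketch of that idea (the object name and resource path are assumptions, not the actual LibLoader from the linked example):

import java.nio.file.{Files, StandardCopyOption}

object NativeLibLoader {
  // Copies a shared library bundled under /native in the JAR to a temp file and loads it.
  def loadFromResources(resourceName: String): Unit = {
    val in = getClass.getResourceAsStream(s"/native/$resourceName")
    require(in != null, s"$resourceName not found on the classpath")
    val tmp = Files.createTempFile("native-", "-" + resourceName)
    try Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING)
    finally in.close()
    tmp.toFile.deleteOnExit()
    System.load(tmp.toAbsolutePath.toString)
  }
}

// Usage: NativeLibLoader.loadFromResources("libjni-native.so")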

How can I perform dynamic reconfiguration in Scala (like Dyre or XMonad)?

A fairly common method of configuration for Haskell applications is having the program as a library, with a main function provided with a bunch of optional parameters for configuration. Upon being run, the executable itself looks for a dotfile containing a main function that uses this default function, which it then compiles and runs instead. This sort of configuration scheme allows the user to add arbitrarily complex functionality without recompiling the entire program. Examples of this are the Dyre library and the XMonad window manager. How can this be done cleanly in Scala? It appears that SBT does something similar internally.
Using SBT externally would require having the sources of the whole program somewhere, and lacks the cleanliness of just having a single dotfile. Typesafe Config, Configrity, Bee Config, and fig all seem to be meant only for normal string-based configuration.
https://github.com/typesafehub/config is a great config library.
It supports files in three formats: Java properties, JSON, and a human-friendly JSON superset.
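For example, a minimal sketch of reading an application.conf (HOCON) from the classpath with this library; the keys shown are hypothetical:

import com.typesafe.config.ConfigFactory

object AppConfig {
  // Loads application.conf / application.json / application.properties from the classpath.
  private val config = ConfigFactory.load()

  val name: String    = config.getString("app.name")
  val maxRetries: Int = config.getInt("app.max-retries")
}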