Scala Client Library for Apache OpenWhisk - scala

Is there a Scala client library for Apache OpenWhisk?

The short answer is that there isn't an Apache community-supported Scala client. The Scala REST and CLI bindings defined by this interface, https://github.com/apache/openwhisk/blob/fcbe9ca83829f2194b47f7c61a166396838c6a44/tests/src/test/scala/common/WskOperations.scala#L163, provide quite a bit of material to seed such an implementation, but they are not a standalone client.
If this is something you're interested in contributing to the project and community, reach out on the Apache project dev list, as noted in one of the comments above.
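In the meantime, you can call the OpenWhisk REST API directly from Scala. Below is a minimal sketch using Java 11's built-in HttpClient to invoke an action with a blocking request; the API host, auth key, namespace and action name are placeholders you would replace with your own values (the auth key is the user:password pair reported by wsk property get --auth).

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64

object InvokeOpenWhiskAction {
  def main(args: Array[String]): Unit = {
    // Placeholders - substitute your own API host and auth key.
    val apiHost = "https://openwhisk.example.com"
    val authKey = "00000000-0000-0000-0000-000000000000:changeme"
    val basic   = Base64.getEncoder.encodeToString(authKey.getBytes("UTF-8"))

    // Blocking invocation of an action named "hello" in the default namespace ("_").
    val request = HttpRequest.newBuilder()
      .uri(URI.create(s"$apiHost/api/v1/namespaces/_/actions/hello?blocking=true&result=true"))
      .header("Authorization", s"Basic $basic")
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString("""{"name": "world"}"""))
      .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(s"${response.statusCode()} ${response.body()}")
  }
}

A proper client library would wrap calls like this (plus authentication and error handling) behind a typed API, which is essentially what the test bindings linked above do against the CLI and REST interfaces.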

Related

REST API documentation for an Apache CXF application

We use Apache CXF with JAX-RS in our application to build a REST API, deployed on a Tomcat 8.5 server. As of now, there is no documentation of the available endpoints.
I have done some research on how to find a solution and understand that Swagger can be used.
However, I did not find enough documentation on using Swagger with Apache CXF.
I understand that these types of questions are prohibited on the site; at the same time, I am not sure which chat room to use for this purpose.
Any information on this would help me a lot.
Depending on the CXF version that you are using, I would suggest using OpenApiFeature (OpenAPI is newer than Swagger), as described here: http://cxf.apache.org/docs/openapifeature.html
You can also find multiple sample projects with Swagger or OpenAPI here: https://github.com/apache/cxf/tree/master/distribution/src/main/release/samples/jax_rs
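To make the idea concrete, here is a minimal sketch, in Scala, of registering OpenApiFeature on an embedded CXF JAX-RS endpoint. It assumes CXF 3.2+ with the javax.ws.rs annotations, the JAX-RS frontend, an HTTP transport (e.g. the Jetty one) and the OpenAPI v3 service-description module on the classpath; HelloResource is just a placeholder resource, and in a Tomcat deployment you would register the feature on your JAX-RS server bean instead.

import javax.ws.rs.{GET, Path, Produces}
import org.apache.cxf.jaxrs.JAXRSServerFactoryBean
import org.apache.cxf.jaxrs.openapi.OpenApiFeature

// Placeholder JAX-RS resource, used only to have something to document.
@Path("/hello")
class HelloResource {
  @GET
  @Produces(Array("text/plain"))
  def hello(): String = "hello"
}

object OpenApiServer {
  def main(args: Array[String]): Unit = {
    val sf = new JAXRSServerFactoryBean()
    sf.setResourceClasses(classOf[HelloResource])
    sf.setAddress("http://localhost:8080/api")
    // OpenApiFeature scans the registered resources and serves the
    // generated OpenAPI description alongside them.
    sf.getFeatures.add(new OpenApiFeature())
    sf.create()
  }
}

CXF also provides a Swagger2Feature for the older Swagger 2.0 format, configured in much the same way.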

Does spark-cassandra-connector support built-in load balancing?

I have a Scala-based application and I need to connect it to Cassandra.
I found the DataStax Enterprise drivers very useful in this regard; they have a lot of cool features, like built-in load balancing for Cassandra, and that is really important for me.
Unfortunately, there is no native DSE driver for Scala. I know we can use the DSE Java driver, but in that case we lose a lot of Scala's cool features.
I also found the spark-cassandra-connector, which is also built by DataStax, but built-in load balancing is really important to me and I don't know whether the spark-cassandra-connector supports it.
In Java-based applications using the DSE Java driver, I configure the built-in load balancer in a configuration file as below:
datastax-java-driver.basic.load-balancing-policy {
  class = DefaultLoadBalancingPolicy
}
I don't know the equivalent way to do this in Scala using the spark-cassandra-connector, and I'm not even sure whether it is possible.
Any help would be appreciated. Thanks.
In Scala you can just use the Java driver. Out of the box you don't get support for the base Scala types, but you can solve this by importing java-driver-scala-extras into your project (as source code) - it works at least for driver 3.x. Another issue is support for Option, but this can be handled via Java's Optional, for which the Java driver has an extra codec.
Regarding the customization of the driver - that part should work from Scala without changes. Regarding support for the default policy in Spark - the Spark Cassandra Connector has its own load balancing policy for a specific reason: it's close to the Java driver's default policy, but with Spark-specific behavior.
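For illustration, here is a minimal sketch of using the DataStax Java driver 4.x directly from Scala. Assuming the driver is on the classpath and the datastax-java-driver configuration block from the question is in application.conf, CqlSession.builder() picks that configuration up automatically; the contact point and datacenter name are placeholders.

import java.net.InetSocketAddress
import com.datastax.oss.driver.api.core.CqlSession

object CassandraFromScala {
  def main(args: Array[String]): Unit = {
    // Builder settings override application.conf; anything not set here
    // (such as basic.load-balancing-policy) is read from the config file.
    val session = CqlSession.builder()
      .addContactPoint(new InetSocketAddress("127.0.0.1", 9042)) // placeholder contact point
      .withLocalDatacenter("datacenter1")                        // placeholder DC name
      .build()
    try {
      val row = session.execute("SELECT release_version FROM system.local").one()
      println(s"Connected, Cassandra version: ${row.getString("release_version")}")
    } finally {
      session.close()
    }
  }
}

With the Spark Cassandra Connector, by contrast, you don't build the session yourself: you set spark.cassandra.* options on the SparkConf and the connector manages sessions and its own Spark-aware load balancing policy for you.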

Using Hawkular for REST services built using CXF

I am new to distributed tracing / Hawkular, and I would like to experiment with tracing my distributed CXF REST services using Hawkular.
Is it possible to trace CXF services using Hawkular? If anyone has documentation or a reference sample app, that would be great.
Also, is there any other tracing tool that can meet this requirement (tracing Java CXF REST services)? Zipkin/Brave has a feature for this which I am also looking at.
I'd recommend instrumenting your application using the OpenTracing API, and choosing a concrete implementation later. Under the Hawkular project, there's the Hawkular APM module, which provides a solution for capturing, visualizing and making sense of the data. However, we (Hawkular APM) recently decided to join the Jaeger project, to have better support for the OpenTracing case. We expect to have similar features from Hawkular APM ported to Jaeger "soon".
For OpenTracing, there are quite a few "framework integrations" under the OpenTracing Contrib organization, including JAX-RS, which might serve as a base or reference for a CXF-specific implementation. If nothing suits you, I'm certain we'd welcome a contribution.
If you are just looking to learn OpenTracing, I'd suggest taking a look at Hawkular APM's examples directory, including a vertx-opentracing example.
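To give a flavour of what OpenTracing instrumentation looks like, here is a minimal sketch in Scala against the OpenTracing Java API (opentracing-api/opentracing-util 0.31+). The span name and tag are illustrative only, and a concrete tracer implementation (Jaeger, Hawkular APM, ...) would have to be registered with GlobalTracer at startup; without one, the calls are no-ops.

import io.opentracing.util.GlobalTracer

object TracedCxfCall {
  def main(args: Array[String]): Unit = {
    // Resolves to whichever tracer implementation was registered at startup,
    // or a no-op tracer if none was.
    val tracer = GlobalTracer.get()

    val span = tracer.buildSpan("call-cxf-service").start()
    try {
      span.setTag("http.method", "GET")
      // ... invoke the CXF/JAX-RS client here and record the outcome on the span ...
    } finally {
      span.finish() // reports the span to the configured backend
    }
  }
}

The JAX-RS integration mentioned above does essentially this for you via JAX-RS client/server filters, so for CXF you would typically register those filters rather than create spans by hand.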

Is a Fortify code scan possible with Scala?

Can I use Fortify to scan Scala code or the generated Java (jar) files? I know I can technically do the jar option, but are there any known challenges with respect to the generated Java code?
Fortify SCA now officially supports Scala (since December 2017).
Adding this support was a collaborative project between Lightbend and Micro Focus.
I did most of the engineering work on the Lightbend side, writing a compiler plugin that translates Scala code to an intermediate form that Fortify understands. Micro Focus added Scala-specific security rules and made any necessary adjustments to the Fortify back end. (They also made sure that existing Java rules also worked for equivalent Scala code, when appropriate.)
See:
https://www.lightbend.com/blog/developing-secure-scala-applications-with-fortify-for-scala (45 minute webinar)
https://lightbend.com/fortify (form to ask Lightbend sales for more info)
https://developer.lightbend.com/docs/fortify/current/ (technical documentation)
Note that Fortify SCA is commercial software and so is the new Scala plugin. To use them, you must:
have a Fortify SCA license (or use Fortify on Demand)
also be a Lightbend subscriber
I saw a response from James Roper (Play) to this question.
https://groups.google.com/forum/#!topic/play-framework/MtatDozyDjg
Basically, he says that any issue that could be found by a static code analysis tool is a mistake in the API and should be fixed there; Java cannot do this because of backwards compatibility.
I have tested Scala code with Fortify SCA engines 3.8 and 4.21, and Fortify found no issues. If I recall correctly, I saw a lot of warnings during the translation stage, so I assume that Fortify does not have a native parser for Scala code.

Upgrade CometD from 2.4/2.5 to 2.9.1?

Can I expect that if I replace the Java .jar files and client-side .js files of CometD 2.4 or 2.5, in existing software that is running fine, with the same files from 2.9.1, everything will run the same?
1- Are the CometD APIs exactly the same across all 2.x versions?
2- Is there an upgrade guide that I can use?
Also, I noticed that on the client side CometD 2.4/2.5 is not AMD-based and ships as a single file, while 2.9.1 is AMD-based. Is there a single .js file that contains all the client-side CometD code?
You can expect upgrades from 2.4/2.5 to 2.9.x to be either painless or to require very few changes, so yes, it should typically be a drop-in replacement.
While you're upgrading, I suggest moving to CometD 3. You can find the migration guide from CometD 2.x here.
CometD 2.9.x is AMD compliant, and the single file you should include in your HTML is typically org/cometd.js, along with a binding for a toolkit (either jquery or dojo).
If you use extensions, you should add those as well; see for example http://docs.cometd.org/3/reference/#_primer, or, if you don't want to use Maven, this other section.
Also follow the tutorials; they should get you going.
Full documentation link.