Is a Fortify code scan possible with Scala?

Can I use Fortify to scan Scala code, or the generated Java (jar) files? I know the jar option is technically possible, but are there any known challenges with respect to the generated Java code?

Fortify SCA now officially supports Scala (since December 2017).
Adding this support was a collaborative project between Lightbend and Micro Focus.
I did most of the engineering work on the Lightbend side, writing a compiler plugin that translates Scala code to an intermediate form that Fortify understands. Micro Focus added Scala-specific security rules and made any necessary adjustments to the Fortify back end. (They also made sure that existing Java rules work for equivalent Scala code, where appropriate.)
See:
https://www.lightbend.com/blog/developing-secure-scala-applications-with-fortify-for-scala (45 minute webinar)
https://lightbend.com/fortify (form to ask Lightbend sales for more info)
https://developer.lightbend.com/docs/fortify/current/ (technical documentation)
Note that Fortify SCA is commercial software and so is the new Scala plugin. To use them, you must
have a Fortify SCA license (or use Fortify on Demand)
also be a Lightbend subscriber
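For orientation, wiring the plugin into an sbt build looks roughly like the sketch below. Treat the artifact coordinates and the scalac option as placeholders from memory; the exact values (plus the resolver and credentials for Lightbend's commercial repository) are in the technical documentation linked above.

```scala
// build.sbt (sketch; check the Lightbend docs for the exact coordinates and options)

// The plugin runs inside scalac and emits intermediate files that Fortify's
// sourceanalyzer can scan directly, so no "generated Java" step is involved.
addCompilerPlugin("com.lightbend" %% "scala-fortify" % "<version>")

scalacOptions += s"-P:fortify:build=${name.value}"  // Fortify build id later picked up by the scan
```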

I saw a response from James Roper (Play) to this question.
https://groups.google.com/forum/#!topic/play-framework/MtatDozyDjg
Basically, he says that any issue a static code analysis tool could find is really a mistake in an API and should be fixed in the API itself; Java can't do this because of backwards compatibility.

I have tested Scala code with Fortify SCA engines 3.8 and 4.21. Fortify found no issues. If I recall correctly, I saw a lot of warnings during the translation stage, so I assume Fortify does not have a native parser for Scala code.

Related

Scala Client Library for Apache OpenWhisk

Is there a Scala client library for Apache OpenWhisk?
The short answer is that there isn't an Apache community-supported Scala client. There is quite a bit to seed such an implementation from in the Scala REST and CLI bindings defined by this interface (https://github.com/apache/openwhisk/blob/fcbe9ca83829f2194b47f7c61a166396838c6a44/tests/src/test/scala/common/WskOperations.scala#L163), but it is not a standalone client.
If this is something you're interested in contributing to the project and community, you should reach out on the Apache project dev list, as noted in one of the comments above.
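As a rough starting point (not an official client; the API host, credentials, and action name below are placeholders), invoking an action through OpenWhisk's documented REST API from Scala with the JDK 11+ HTTP client looks like this:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64

object InvokeAction {
  def main(args: Array[String]): Unit = {
    // Placeholders: point these at your own OpenWhisk deployment.
    val apiHost = "https://openwhisk.example.com"
    val authKey = "uuid:secret"   // the whisk auth key, as shown by `wsk property get --auth`
    val action  = "hello"         // action in the default namespace "_"

    val basic = Base64.getEncoder.encodeToString(authKey.getBytes("UTF-8"))
    val request = HttpRequest.newBuilder()
      .uri(URI.create(s"$apiHost/api/v1/namespaces/_/actions/$action?blocking=true&result=true"))
      .header("Authorization", s"Basic $basic")
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString("""{"name":"world"}"""))
      .build()

    // Blocking invocation: the response body is the action's result JSON.
    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println(s"${response.statusCode()} ${response.body()}")
  }
}
```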

Restricting Target Platform's API usage when developing Eclipse plugins

I'm developing an Eclipse plugin and I've run into this problem several times already.
I always keep my Target Platform updated to the latest (stable) Eclipse release so that I test my code against all the recent updates, fixes, etc.
However, this may result (and has resulted) in accidental breakage of my plugin's backward compatibility, e.g. when I accidentally use new API that did not exist in the Eclipse version I aim to support.
Or, a more sneaky example: in 4.6 Eclipse moved to Java 8 and some interface methods got default implementations. Now when I implement these interfaces my IDE doesn't automatically generate empty implementations for those methods and no error is reported. If I install and run this code against a previous Eclipse version, these methods will throw AbstractMethodError since no implementation has been provided.
So my question is: is there a tool to further restrict the API my Target Platform provides to some earlier Eclipse API version?
Is API Baseline an appropriate tool for this? I couldn't get it to work like this. (It allowed even non-baseline method calls, not to mention the more complex default-methods example.)
You can use multiple target platforms, switching between them doesn't take long. For testing Stack Overflow questions I have one Eclipse install with 10 target platforms.
So have a target platform for the oldest release you want to support as well as your current release target platform and check the code runs against that.
It is particularly important to test with the actual Target Platform if you want to support Eclipse 3 releases, as there were large changes going from Eclipse 3 to 4.
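To make the "sneaky" default-method case from the question concrete, here is a schematic Scala sketch; the SelectionHandler interface is made up for illustration, not a real Eclipse API:

```scala
// How the interface looks in the NEW target platform (4.6+): the second method
// has a default body, so implementors are not forced to override it.
trait SelectionHandler {
  def selectionChanged(id: String): Unit
  def selectionCleared(): Unit = selectionChanged("")  // added with a default in the new release
}

// Plugin code: compiles cleanly against the new platform without overriding
// selectionCleared(), and the IDE reports no error.
class MyHandler extends SelectionHandler {
  override def selectionChanged(id: String): Unit = println(s"selected: $id")
}

object Demo extends App {
  // Works when compiled AND run against the new interface. If the same compiled
  // MyHandler runs against the OLD platform, where selectionCleared() is abstract,
  // this call throws java.lang.AbstractMethodError at run time.
  new MyHandler().selectionCleared()
}
```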

Upgrade CometD from 2.4/2.5 to 2.9.1?

Can I expect that if I replace the CometD 2.4/2.5 Java .jar files and client-side .js files in existing software that is running fine with the corresponding 2.9.1 files, everything will run the same?
1- Is the API of CometD exactly the same across all 2.x versions?
2- Is there an upgrade guide that I can use?
Also, I noticed that on the client side CometD 2.4/2.5 is not AMD-based and ships as a single file, but 2.9.1 is AMD-based. Is there a single .js file that contains all the client-side CometD code?
You can expect upgrades from 2.4/2.5 to 2.9.x to be either problem-free or to require very few changes, so yes, it should typically be a drop-in replacement.
While you're upgrading, I suggest moving to CometD 3. You can find the migration guide from CometD 2.x here.
CometD 2.9.x is AMD compliant, and the single file you should include in your HTML is typically org/cometd.js, along with a binding for a toolkit (either jQuery or Dojo).
If you use extensions, you should also add those; see for example http://docs.cometd.org/3/reference/#_primer, or, if you don't want to use Maven, this other section.
Also follow the tutorials; they should get you going.
Full documentation link.
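For what it's worth, the Java client API is essentially unchanged between 2.x and 3.x, so a minimal smoke test after swapping the jars can look roughly like the sketch below (Scala calling the Java client; the server URL and channel are placeholders, and the exact transport constructor may differ slightly between versions):

```scala
import org.cometd.bayeux.Message
import org.cometd.bayeux.client.ClientSessionChannel
import org.cometd.client.BayeuxClient
import org.cometd.client.transport.LongPollingTransport
import org.eclipse.jetty.client.HttpClient

object CometdSmokeTest {
  def main(args: Array[String]): Unit = {
    val httpClient = new HttpClient()  // Jetty HTTP client used by the long-polling transport
    httpClient.start()

    // Placeholder URL; point this at your CometD servlet.
    val cometd = new BayeuxClient("http://localhost:8080/cometd",
      new LongPollingTransport(new java.util.HashMap[String, Object](), httpClient))

    cometd.handshake()
    if (!cometd.waitFor(5000, BayeuxClient.State.CONNECTED))
      sys.error("handshake failed")

    // Placeholder channel; just checks that subscribe/publish still behave after the upgrade.
    cometd.getChannel("/test/upgrade").subscribe(new ClientSessionChannel.MessageListener {
      override def onMessage(channel: ClientSessionChannel, message: Message): Unit =
        println(s"received: ${message.getData}")
    })
    cometd.getChannel("/test/upgrade").publish("hello after upgrade")

    Thread.sleep(1000)
    cometd.disconnect()
    cometd.waitFor(1000, BayeuxClient.State.DISCONNECTED)
    httpClient.stop()
  }
}
```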

Are there any known issues with SpringSource-TC-Server and Java7?

We are using SpringSource tc Server and we are considering upgrading to Java 7 (currently using Java 6).
We have not seen any reports of SpringSource tc Server not working well with Java 7, but we do not know of any noteworthy projects that have migrated to such an environment.
I'm looking for answer(s) about the following:
Are there any known issues?
Are there any projects who migrated and can report on how it went?
Java 7 is officially supported since vFabric tc Server 2.7.0:
http://www.vmware.com/support/vfabric-tcserver/doc/vfabric-tcserver-rn-2.7.0.html#whatsnew
Since you're using tc Server instead of plain Tomcat, probably for the commercial support, it is reasonable to migrate the underlying JDK to the latest version only when it is officially supported by the version of tc Server you employ. Otherwise, you'd be running in an unsupported configuration, which isn't far from running a plain, unsupported open-source version of Tomcat.
Operating tc Server on Java 7 in an officially supported combination of versions gives you two advantages:
It will have been thoroughly tested by VMware for incompatibilities, so you don't have to do that testing yourself.
If any problems do occur, you can always get support from VMware in resolving them.
I know it doesn't directly address your questions, as we in my company also haven't upgraded yet and are only planning to do so.
I just had an impression that your approach makes no sense for a commercially supported product and wanted to outline the reasonable (IMO) approach that is in wide use.
As to known issues, Java 7 is known for its backward-incompatible changes to the XML stack, especially the migration to JAXB 2.2, which changes the handling of java.lang.Boolean objects (see this other question: What are the pitfalls when upgrading to Java 7). This can surface in many different places. I've seen it cause problems in Apache CXF's cxf-codegen-plugin, which generates Java stubs from WSDL, because the wsdl2java tool it launches uses JAXB: the generated accessors for boolean elements were no longer of the form java.lang.Boolean isSomeBooleanProperty() but of the form java.lang.Boolean getSomeBooleanProperty(), which broke code depending on those stubs.
So perform thorough testing if you deal with SOAP web services or XML in general.
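To illustrate the kind of breakage described above, here is a small Scala sketch; the class and property names are invented stand-ins for the stubs wsdl2java generates, not real classes:

```scala
// Illustrative stand-ins for generated stubs; these are not real CXF/JAXB classes.

// What the OLD toolchain generated for an optional xs:boolean element:
class OrderTypeOld {
  def isSomeBooleanProperty(): java.lang.Boolean = java.lang.Boolean.TRUE
}

// What JAXB 2.2 (as picked up by the Java 7 era toolchain) generates instead:
class OrderTypeNew {
  def getSomeBooleanProperty(): java.lang.Boolean = java.lang.Boolean.TRUE
}

object JaxbAccessorRename extends App {
  // Code written against the old stub:
  val before = new OrderTypeOld
  println(before.isSomeBooleanProperty())

  // After regenerating the stubs, the is-prefixed accessor no longer exists,
  // so every caller has to be changed to the get-prefixed form:
  val after = new OrderTypeNew
  println(after.getSomeBooleanProperty())
}
```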

How to get API Tooling to work in Eclipse

I have been having a real hard time getting API Tooling to work in Eclipse 3.4.2. It keeps telling me:
The minor version should be incremented in version 3.4.0.qualifier, since new APIs have been added since version 3.4.0.40001
That said, I generated the plugins used for the baseline from the exact same code that is being analyzed against them. The API Tools docs say the tooling compares the current code against the baseline to find differences, and I can't see how there could be differences if the built version comes from the current code.
The way that I tested it:
Create a new eclipse workspace
Create a new Plug-in Project with API Analysis turned on
Add a simple class to that plugin and export the package with that class in it
Build/Export that plugin to some location on your hard drive
Set the workspace baseline to that location and do a full build
You get an error for the project in your problems view.
Thanks,
-One very perplexed user
It looks like this is something that got fixed in 3.5. Too bad my company doesn't want us using 3.5 in case there are any incompatibility issues (there were some from 3.3 to 3.4).
My recommendation to anyone who wants to do Eclipse API Analysis is to use 3.5.
First off, I apologize for jumping on this thread long after its active period, but I am currently running into this exact situation, with Eclipse Helios 3.6.
From your answer, you noted that something was fixed in 3.5. Do you know what that exact fix was, and have you been able to verify that it works under Eclipse Helios 3.6?
I would really like to have PDE API Tooling working, but I'm nearing the end of the time allowed for this effort and need to move on to some pending tasks.
Thanks!
EDIT: I would have posted this in a followup link but did not see any such links available.