Generating Scala models from Swagger spec

I'm trying to build a Vert.x app using Scala, and I'm generating the routes from an OpenAPI 3 spec through the OpenAPI3RouterFactory.
I need to generate the models described in my spec as Scala classes.
Is there any simple and straightforward way to accomplish this?
I'm using SBT to build my app, and I've already tried some sbt codegen plugins for Swagger, but none of them seem to work.

With vertx-web-api-contract, the Router and validation handlers are created at runtime, so you don't need to generate the routes. You can just start from the router factory and mount the handlers you want, as if it were a plain Vert.x Web Router. If you want to bootstrap a new project, there is a community tool called vertx-starter, but it has no Scala support at the moment.
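A minimal sketch of that runtime approach, calling the Java API from Scala (the spec path and the "listPets" operation id are placeholders, and Scala 2.12+ SAM conversion is assumed):

import io.vertx.core.Vertx
import io.vertx.ext.web.api.contract.openapi3.OpenAPI3RouterFactory

object ApiVerticle {
  def main(args: Array[String]): Unit = {
    val vertx = Vertx.vertx()
    // Build the router factory from the spec; routes are derived at runtime,
    // so nothing has to be generated at build time.
    OpenAPI3RouterFactory.create(vertx, "spec/petstore.yaml", ar =>
      if (ar.succeeded()) {
        val factory = ar.result()
        // Attach a handler to an operation id declared in the spec
        factory.addHandlerByOperationId("listPets", ctx => ctx.response().end("[]"))
        val router = factory.getRouter
        // mount `router` on your HTTP server as usual
      } else {
        ar.cause().printStackTrace()
      }
    )
  }
}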
Talking about the models: what you can do is organize your OpenAPI specification in different files, putting all the data model definitions under a specific directory such as spec/models (you can find a good guide here). Then you can configure jsonschema2pojo (via its sbt plugin) to generate a Scala case class for each schema inside that directory. Finally, if you want to repack the spec into a single file, you can configure a tool like swagger-cli to run during compilation and bundle the spec back into one file.
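For example, a hypothetical spec/models/Pet.yaml schema with an integer id, a string name and an optional string tag would, under that setup, come out as something like the following case class (the names and exact shape are illustrative, not necessarily what the generator produces):

package com.example.models

// Illustrative generated model for a "Pet" schema; optional properties
// map naturally to Option fields.
case class Pet(
  id: Long,
  name: String,
  tag: Option[String] = None
)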

Related

Automatic META-INF/services generation in Scala and SBT for ServiceLoader

Is there a way, in Scala and SBT, to automatically generate META-INF/services/* resource files for later use by java.util.ServiceLoader by annotating classes, like Google Auto Service does for Java projects?
i.e.
package foo.bar
import my.exported.ServiceInterface
@AutoService[ServiceInterface]
class MyService extends ServiceInterface {
// …
}
to automatically generate the file META-INF/services/my.exported.ServiceInterface in the resources folder. The file will contain:
foo.bar.MyService
(I don't think I can use Google Auto Service directly, as it doesn't work with Scala classes -- see this comment on a realm-java github issue.)
Please consider using https://github.com/nyavro/spi-plugin.
The approach used in this plugin differs from annotation-based tools: instead of annotating individual classes, it takes whole packages of interfaces as input and applies them to the packages containing the interface implementations.
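If the plugin does not fit, the service file can also be produced by a plain (hand-maintained) resource generator in build.sbt. It is not annotation-driven, but it keeps the file out of the static resources folder; this is only a sketch, reusing the interface and implementation names from the question:

// build.sbt (sbt 1.x): emit META-INF/services/my.exported.ServiceInterface
// into managed resources, listing the implementations by hand.
Compile / resourceGenerators += Def.task {
  val file = (Compile / resourceManaged).value /
    "META-INF" / "services" / "my.exported.ServiceInterface"
  IO.write(file, "foo.bar.MyService\n")
  Seq(file)
}.taskValue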

How to use AspectJ with Sbt to generate SWF client classes

I need to do some work with Amazon SWF in an sbt Scala project, and I am having problems generating the SWF client classes. If any of you have used SWF in a Scala project, please tell me how I can generate the SWF client classes using sbt AspectJ.
AspectJ is not used to generate the client classes; it is used only to inject interceptors for the @Asynchronous and @ExponentialRetry annotations. The client-side classes are generated by the SWF annotation processor.
I don't know anything about Scala, but in Java you can write workflows without the generated client classes by using the generic API that the generated code relies on. To get these generic clients, use the getActivityClient and getWorkflowClient methods of DecisionContext.

Can SBT scopes be used for custom libraryDependencies for specific code blocks?

I've a simple SBT project, in which one code block reads from HDFS (needs a certain version of Hadoop's libraryDependencies) and another code block (needs another version of Hadoop's libraryDependencies) writes the filtered result to Cassandra.
Can SBT scopes be used to assign a different libraryDependencies to the two code blocks?
You can do this, but you have to split your code over one of the scope axes: project, configuration, or task. The only axis that can be used for your purpose is the project axis, so you have to create a multi-project sbt build and split your code across its subprojects.
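A minimal build.sbt sketch of that split (the subproject names and Hadoop versions are placeholders):

// Each subproject pulls its own Hadoop version.
lazy val hdfsReader = (project in file("hdfs-reader"))
  .settings(libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.7")

lazy val cassandraWriter = (project in file("cassandra-writer"))
  .settings(libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "3.3.6")

lazy val root = (project in file(".")).aggregate(hdfsReader, cassandraWriter)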
But this will not solve your problem, because you will not be able to run the resulting application: the Java class loader has no way to decide when to use one version of Hadoop and when to use the other. It will load one version of the classes in question and then use it in all cases.
For this you need a context-aware class loader. One example is an OSGi container such as Apache Felix. OSGi is version-aware and can load different versions of the same library in the same Java process; it then resolves to the classes of the correct version depending on the context in which the library is used.
To be more precise: you must convert the different versions of the Hadoop library into OSGi bundles, then split your code into multiple OSGi bundles, each declaring a dependency on the correct version of the Hadoop bundle in its metadata (manifest file). To start your application, you must run it in an OSGi container.
This can be done, but it is quite complex. It is better to clean up your code so that you depend on only one version of the Hadoop library.

SecureSocial not using extended classes in Play! 2.1 project inside SBT Multi-Project

Currently I have a Play! 2.1 project that is a sub-project of an SBT Multi-Project that is a front-end interface. The Play! project uses SecureSocial for typical authentication.
I will typically first start the SBT console to run my internal services locally in separate terminals. Then I run a play "project interface" "~run 9000" command in a new window to start up the interface sub-project with Play!. The problem is that on a fresh load (even after a clean), SecureSocial does not use my extended services and providers and instead falls back on its own.
If I make a source change and reload, SecureSocial will then use my own classes but suddenly starts throwing ClassCastException errors between two instances of the same type, which indicates conflicting class loaders.
Is there a proper way to set this up so this doesn't happen? Thanks for your help!
Though not a real solution, I have in the meantime developed a workaround: I manually instantiate my own extended UserService class and bring the current Application instance into scope. I also wrote my own providers and SecureAction wrappers and designed them to use the custom UserService. It's a lot of extra code, but it works around the problem.

log4j-over-slf4j missing class org.apache.log4j.varia.NullAppender logback

We are using a third-party jar that depends on the org.apache.log4j.varia.NullAppender class. We are transitioning to logback but are unable to change the third-party jar, so we plan to use the log4j-over-slf4j bridge for now, with the rest of our code still making the existing log4j calls.
Since the log4j-over-slf4j jar does not contain the NullAppender class, what is the best way to spoof it? Should we write our own and put it on the class path? Should we augment the log4j-over-slf4j bridge jar with this class?
What have others done who have run into this and does the logback project plan to support backwards compatibility? This is a real hassle for those of us who want to migrate when existing jar files use log4j classes that are not in the bridge.
Any suggestions or best practices?
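For what it's worth, the "write our own" option can be a very small stub. This is only a sketch: it assumes your log4j-over-slf4j version ships org.apache.log4j.AppenderSkeleton and org.apache.log4j.spi.LoggingEvent (recent versions do) and that the third-party jar only needs the class to be present on the classpath:

// Put this in your own sources so org.apache.log4j.varia.NullAppender resolves;
// every event is simply discarded.
package org.apache.log4j.varia

import org.apache.log4j.AppenderSkeleton
import org.apache.log4j.spi.LoggingEvent

class NullAppender extends AppenderSkeleton {
  override def append(event: LoggingEvent): Unit = ()   // drop the event
  override def close(): Unit = ()
  override def requiresLayout(): Boolean = false
}
// If the jar calls the static NullAppender.getNullAppender(), add a companion
// object with that method as well (Scala emits a static forwarder for it).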