I need to do some work with Amazon SWF in an sbt Scala project, and I'm having trouble generating the SWF client classes. If any of you have used SWF in a Scala project, please tell me how I can generate the SWF client classes using sbt AspectJ.
AspectJ is not used to generate client classes. It is used only to inject interceptors for the @Asynchronous and @ExponentialRetry annotations. The client-side classes are generated by the SWF annotation processor.
I don't know anything about Scala, but in Java you can write workflows without generated client classes by using the generic API that the generated code relies on. To get these generic clients, use the getActivityClient and getWorkflowClient methods of DecisionContext.
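For a rough idea in Scala of what the generic-client approach looks like (a sketch only; the accessor names follow this answer, but verify them against your AWS SDK version, where the activities accessor may be spelled getActivitiesClient):

import com.amazonaws.services.simpleworkflow.flow.DecisionContextProviderImpl

// Obtain the current decision context inside a workflow implementation.
val decisionContext = new DecisionContextProviderImpl().getDecisionContext

// The generic clients the generated code relies on; no generated stubs needed.
val genericWorkflowClient = decisionContext.getWorkflowClient
val genericActivityClient = decisionContext.getActivityClient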
I'm trying to build a Vert.x app using Scala, generating the routes from an OpenAPI 3 spec through the OpenAPI3RouterFactory.
I need to generate the models described in my spec as Scala classes.
Is there any simple and straightforward way to accomplish this?
I'm using SBT to build my app, and I've already tried some sbt codegen plugins for Swagger, but none of them seem to work.
With vertx-web-api-contract, the Router and validation handlers are generated at runtime, so you don't need to generate the routes. You can just start using the router factory and mount the handlers you want as if it were a plain Vert.x Web Router. If you want to bootstrap a new project, there is a community tool called vertx-starter, but there is no Scala support at the moment.
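To make that concrete, here is a minimal sketch in Scala (calling the Java API directly with Scala 2.12 SAM syntax; the spec path and the listPets operation id are made-up examples):

import io.vertx.core.Vertx
import io.vertx.ext.web.api.contract.openapi3.OpenAPI3RouterFactory

val vertx = Vertx.vertx()

// The factory parses the spec and attaches validation handlers at runtime.
OpenAPI3RouterFactory.create(vertx, "spec/openapi.yaml", ar => {
  if (ar.succeeded()) {
    val factory = ar.result()
    // Mount a handler for an operation declared in the spec.
    factory.addHandlerByOperationId("listPets", ctx => ctx.response().end("[]"))
    vertx.createHttpServer().requestHandler(factory.getRouter).listen(8080)
  }
})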
As for the models, what you can do is organize your OpenAPI specification into different files, putting all the data model definitions under a specific directory like spec/models (you can find a good guide here). You can then configure jsonschema2pojo (via its sbt plugin) to generate a Scala case class for each schema inside that directory. Finally, if you want to repack the spec into a single file, you can configure a tool like swagger-cli to run during compilation and bundle the spec back into one file.
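As a sketch of that last step, you could shell out to swagger-cli from sbt before compiling (the bundleSpec task name is made up, and this assumes swagger-cli is installed, e.g. via npm):

import scala.sys.process._

lazy val bundleSpec = taskKey[Unit]("Repack the split OpenAPI spec into a single file")

bundleSpec := {
  // swagger-cli resolves the $ref entries under spec/models and inlines them.
  val exit = "swagger-cli bundle spec/openapi.yaml -o target/openapi.yaml".!
  if (exit != 0) sys.error("swagger-cli bundle failed")
}

compile in Compile := ((compile in Compile) dependsOn bundleSpec).value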
Let's say I have a Thrift file:
#@namespace scala com.project.artifact.thrift
namespace java com.project.artifact.java.thrift

service SomeService {
  string helloWorld()
}
where different namespaces are provided for Java and Scala.
Now, I know I can make Scrooge generate Java code by changing the language setting:
scroogeLanguage in Compile := "java"
But how can I have generation tasks for both Java and Scala?
The reason I want to do this is so I can provide pre-made clients for both Java and Scala projects, without them having to pull in Scrooge or Thrift.
I know it's generally considered an anti-pattern to bundle the generated classes rather than having consumers use the IDL to build what they need, but the existing non-Thrift projects we've produced follow the pattern of including clients, so it keeps things consistent (many of the consuming projects won't be using Thrift/Scrooge anyway).
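One way to get both (a sketch, not a tested build; scroogeThriftSourceFolder is the sbt-scrooge key for pointing at a shared Thrift source directory, but verify it against your plugin version) is to define two sub-projects that compile the same Thrift files with different target languages:

lazy val thriftScala = (project in file("thrift-scala"))
  .settings(
    scroogeLanguage in Compile := "scala",
    scroogeThriftSourceFolder in Compile := baseDirectory.value / ".." / "src" / "main" / "thrift"
  )

lazy val thriftJava = (project in file("thrift-java"))
  .settings(
    scroogeLanguage in Compile := "java",
    scroogeThriftSourceFolder in Compile := baseDirectory.value / ".." / "src" / "main" / "thrift"
  )

Publishing both sub-projects then gives consumers a pre-made client artifact per language.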
When using Finagle, you can place Thrift files in the directory src/main/thrift/, and at compile time the code is generated automatically under target/scala-2.10/src_managed/main/.
How can I get the Play framework to do the same thing automatically?
You just need to configure the SBT plugin or the Maven plugin. Note, though, that this will still produce Finagle-oriented implementations; Scrooge only really supports those. (I have a pull request outstanding to add support for other kinds of server implementations.)
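For the SBT route, the wiring is roughly this in project/plugins.sbt (the coordinates follow the Scrooge docs and the version is illustrative; check the latest release):

addSbtPlugin("com.twitter" %% "scrooge-sbt-plugin" % "4.20.0")

With the plugin enabled, Thrift files under src/main/thrift are compiled on each build and the generated sources land under target/scala-*/src_managed/main, just as in the Finagle setup.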
Currently I have a Play! 2.1 project that is a sub-project of an SBT multi-project build and serves as a front-end interface. The Play! project uses SecureSocial for typical authentication.
I typically start the SBT console first to run my internal services locally in separate terminals. Then I run play "project interface" "~run 9000" in a new window to start the interface sub-project with Play!. The problem is that on a fresh load (even after a clean), SecureSocial does not use my extended services and providers and instead falls back on its own.
When I make a source change and reload, SecureSocial then picks up my own classes, but it suddenly starts throwing ClassCastException errors between two instances of the same type, indicating that there are conflicting ClassLoaders.
Is there a proper way to set this up so this doesn't happen? Thanks for your help!
Though not a real solution, I have in the meantime developed a workaround where I manually instantiate my own extended UserService class and bring the current Application instance into scope. I also wrote my own providers and SecureAction wrappers and designed them to use the custom UserService. It's a lot of extra code, but it works around the problem.
We are using a third-party jar that has a dependency on the org.apache.log4j.varia.NullAppender class. We are transitioning to logback but are unable to change the third-party jar, so we plan to use the log4j-over-slf4j bridge for now, with the rest of our code still making the existing log4j calls.
Since the log4j-over-slf4j jar does not contain the NullAppender class, what is the best way to spoof it? Should we write our own and put it on the classpath? Should we augment the log4j-over-slf4j bridge jar with this class?
What have others done who have run into this, and does the logback project plan to support backwards compatibility? This is a real hassle for those of us who want to migrate when existing jar files use log4j classes that are not in the bridge.
Any suggestions or best practices?
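One way to spoof it is to compile your own no-op class under the same package and put it on the classpath. A minimal sketch in Scala (this assumes org.apache.log4j.AppenderSkeleton is available from the bridge or elsewhere on the classpath; if it isn't, you would need to stub that type as well):

package org.apache.log4j.varia

import org.apache.log4j.AppenderSkeleton
import org.apache.log4j.spi.LoggingEvent

// No-op stand-in for the class the third-party jar expects.
class NullAppender extends AppenderSkeleton {
  override protected def append(event: LoggingEvent): Unit = ()  // discard everything
  override def close(): Unit = ()
  override def requiresLayout(): Boolean = false
}

If the jar also calls static members such as NullAppender.getNullAppender(), mirror them with a companion object so Scala emits the static forwarders.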