When does Play load application.conf? - scala

Is application.conf already loaded when the code in Global.scala is executed? I'm asking because I've tried to read some configuration items from Global.scala and I always get None. Is there any workaround?

In Java it's already available in beforeStart(Application app):
public class Global extends GlobalSettings {
    public void beforeStart(Application app) {
        String secret = Play.application().configuration().getString("application.secret");
        play.Logger.debug("Before start secret is: " + secret);
        super.beforeStart(app);
    }
}
As it's required for things like configuring the DB connection, Scala most probably works the same way (I can't check right now).
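For reference, a minimal Scala sketch of the same check, assuming Play 2.x with GlobalSettings (where app.configuration.getString returns an Option):

import play.api.{Application, GlobalSettings, Logger}

object Global extends GlobalSettings {
  override def onStart(app: Application): Unit = {
    // By the time onStart runs, application.conf is fully loaded,
    // so this should be a Some rather than None.
    val secret = app.configuration.getString("application.secret")
    Logger.debug(s"On start secret is: $secret")
    super.onStart(app)
  }
}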

Below is how to read the configuration just after it has been loaded, but before the application actually starts:
import java.io.File

import play.api.{Configuration, GlobalSettings, Mode}
import utils.apidocs.InfoHelper

object Global extends GlobalSettings {

  override def onLoadConfig(
      config: Configuration,
      path: File,
      classloader: ClassLoader,
      mode: Mode.Mode): Configuration = {
    InfoHelper.loadApiInfo(config)
    config
  }
}
And here, just for your info, is the source of InfoHelper.loadApiInfo – it just loads API info for Swagger UI:
package utils.apidocs

import play.api.Configuration
import com.wordnik.swagger.config._
import com.wordnik.swagger.model._

object InfoHelper {

  def loadApiInfo(config: Configuration) = {
    config.getString("application.name").map { appName =>
      config.getString("application.domain").map { appDomain =>
        config.getString("application.emails.apiteam").map { contact =>
          val apiInfo = ApiInfo(
            title = s"$appName API",
            description = s"""
              Fantastic application that makes you smile. You can find out
              more about $appName at $appDomain.
            """,
            termsOfServiceUrl = s"//$appDomain/terms",
            contact = contact,
            license = s"$appName Subscription and Services Agreement",
            licenseUrl = s"//$appDomain/license"
          )
          ConfigFactory.config.info = Some(apiInfo)
        }
      }
    }
  }
}
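As an aside, the three nested map calls can be flattened into a single for-comprehension over the Option results; a minimal sketch of the same logic:

def loadApiInfo(config: Configuration) =
  for {
    appName   <- config.getString("application.name")
    appDomain <- config.getString("application.domain")
    contact   <- config.getString("application.emails.apiteam")
  } yield {
    // Identical ApiInfo construction to the version above
    ConfigFactory.config.info = Some(ApiInfo(
      title = s"$appName API",
      description = s"Fantastic application that makes you smile. " +
        s"You can find out more about $appName at $appDomain.",
      termsOfServiceUrl = s"//$appDomain/terms",
      contact = contact,
      license = s"$appName Subscription and Services Agreement",
      licenseUrl = s"//$appDomain/license"
    ))
  }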
I hope it helps.

Related

Proto + Kafka Producer: Error serializing object to JSON: Direct self-reference leading to cycle (through reference chain: unknownFields)

Stack: Micronaut-Kafka, Proto3, gRPC, BloomRPC
Goal: receive a gRPC request from BloomRPC and post it to a Kafka topic
Context: I coded a gRPC endpoint which receives a simple call successfully, but when trying to post the object to a Kafka topic I get:
12:38:18.775 [DefaultDispatcher-worker-1] ERROR i.m.r.intercept.RecoveryInterceptor - Type [com.mybank.producer.DebitProducer$Intercepted] executed with error: Exception sending producer record for method [void sendRequestMessage(String key,DebitRequest message)]: Error serializing object to JSON: Direct self-reference leading to cycle (through reference chain: com.mybank.endpoint.DebitRequest["unknownFields"]->com.google.protobuf.UnknownFieldSet["defaultInstanceForType"])
io.micronaut.messaging.exceptions.MessagingClientException: Exception sending producer record for method [void sendRequestMessage(String key,DebitRequest message)]: Error serializing object to JSON: Direct self-reference leading to cycle (through reference chain: com.mybank.endpoint.DebitRequest["unknownFields"]->com.google.protobuf.UnknownFieldSet["defaultInstanceForType"])
at io.micronaut.configuration.kafka.intercept.KafkaClientIntroductionAdvice.wrapException(KafkaClientIntroductionAdvice.java:564)
I guess my issue is somehow related to "...Error serializing object to JSON". Do I really have to convert the object to JSON? Can I not just post it to the Kafka topic? What am I missing here? Well, I can easily work around the issue by creating a new object and copying each property over from the request object, like this:
endpoint (controller) method
override suspend fun sendDebit(request: DebitRequest): DebitReply {
    val dtoDebiter = Debiter()
    dtoDebiter.id = request.id.toString()
    dtoDebiter.name = request.name
    // Capture the returned status so it can be sent back in the reply
    val postStatus: String = transactionService.postDebitTransaction(dtoDebiter)
    return DebitReply.newBuilder().setMessage(postStatus).build()
}
Model created only for conversion (doesn't this solution seem weird?):
class Debiter {
    lateinit var id: String
    lateinit var name: String
}
So, my direct question is: can I reuse the same stub autogenerated from the proto file as a model, both for receiving the request and for posting it to the Kafka topic? If so, what is wrong in the code below? If not, what is the recommended approach?
Full code:
transaction.proto
syntax = "proto3";
option java_multiple_files = true;
option java_package = "com.mybank.endpoint";
option java_outer_classname = "TransactionProto";
option objc_class_prefix = "HLW";
package com.mybank.endpoint;
service Account {
rpc SendDebit (DebitRequest) returns (DebitReply) {}
}
message DebitRequest {
int64 id = 1;
string name = 2;
}
message DebitReply {
string message = 1;
}
transaction endpoint (controller)
package com.mybank.endpoint

import com.mybank.dto.Debiter
import com.mybank.service.TransactionService
import javax.inject.Singleton

@Singleton
@Suppress("unused")
class TransactionEndpoint(val transactionService: TransactionService) : AccountGrpcKt.AccountCoroutineImplBase() {
    override suspend fun sendDebit(request: DebitRequest): DebitReply {
        val postStatus: String = transactionService.postDebitTransaction(request)
        return DebitReply.newBuilder().setMessage(postStatus).build()
    }
}
transaction service (not really relevant here)
package com.mybank.service

import com.mybank.dto.Debiter
import com.mybank.dto.Transaction
import com.mybank.dto.Transactions
import com.mybank.endpoint.DebitRequest
import com.mybank.producer.DebitProducer
import javax.inject.Inject
import javax.inject.Named
import javax.inject.Singleton

@Singleton
class TransactionService {
    @Inject
    @Named("debitProducer")
    lateinit var debitProducer: DebitProducer

    fun postDebitTransaction(debit: DebitRequest): String {
        debitProducer.sendRequestMessage("1", debit)
        return "posted"
    }
}
kafka producer
package com.mybank.producer

import com.mybank.dto.Transaction
import com.mybank.dto.Transactions
import com.mybank.endpoint.DebitRequest
import io.micronaut.configuration.kafka.annotation.KafkaClient
import io.micronaut.configuration.kafka.annotation.KafkaKey
import io.micronaut.configuration.kafka.annotation.Topic

@KafkaClient
public interface DebitProducer {
    @Topic("debit")
    fun sendRequestMessage(@KafkaKey key: String?, message: DebitRequest?) {
    }
}
build.gradle
plugins {
    id "org.jetbrains.kotlin.jvm" version "1.3.72"
    id "org.jetbrains.kotlin.kapt" version "1.3.72"
    id "org.jetbrains.kotlin.plugin.allopen" version "1.3.72"
    id "application"
    id 'com.google.protobuf' version '0.8.13'
}

version "0.2"
group "account-control"

repositories {
    mavenLocal()
    jcenter()
}

configurations {
    // for dependencies that are needed for development only
    developmentOnly
}

dependencies {
    kapt(enforcedPlatform("io.micronaut:micronaut-bom:$micronautVersion"))
    kapt("io.micronaut:micronaut-inject-java")
    kapt("io.micronaut:micronaut-validation")
    implementation(enforcedPlatform("io.micronaut:micronaut-bom:$micronautVersion"))
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8:${kotlinVersion}")
    implementation("org.jetbrains.kotlin:kotlin-reflect:${kotlinVersion}")
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-core:$kotlinxCoroutinesVersion")
    implementation("io.micronaut:micronaut-runtime")
    // implementation("io.micronaut.grpc:micronaut-grpc-runtime")
    implementation("io.micronaut.grpc:micronaut-grpc-server-runtime:$micronautGrpcVersion")
    implementation("io.micronaut.grpc:micronaut-grpc-client-runtime:$micronautGrpcVersion")
    implementation("io.grpc:grpc-kotlin-stub:${grpcKotlinVersion}")
    // Kafka
    implementation("io.micronaut.kafka:micronaut-kafka")
    // vertx
    implementation("io.micronaut.sql:micronaut-vertx-mysql-client")
    //implementation("io.micronaut.configuration:micronaut-vertx-mysql-client")
    compile 'io.vertx:vertx-lang-kotlin:3.9.4'
    // mongodb
    implementation("org.mongodb:mongodb-driver-reactivestreams:4.1.1")
    runtimeOnly("ch.qos.logback:logback-classic:1.2.3")
    runtimeOnly("com.fasterxml.jackson.module:jackson-module-kotlin:2.9.8")
    kaptTest("io.micronaut:micronaut-inject-java")
    testImplementation enforcedPlatform("io.micronaut:micronaut-bom:$micronautVersion")
    testImplementation("org.junit.jupiter:junit-jupiter-api:5.3.0")
    testImplementation("io.micronaut.test:micronaut-test-junit5")
    testImplementation("org.mockito:mockito-junit-jupiter:2.22.0")
    testRuntime("org.junit.jupiter:junit-jupiter-engine:5.3.0")
    testRuntime("org.jetbrains.spek:spek-junit-platform-engine:1.1.5")
}

test.classpath += configurations.developmentOnly

mainClassName = "account-control.Application"

test {
    useJUnitPlatform()
}

allOpen {
    annotation("io.micronaut.aop.Around")
}

compileKotlin {
    kotlinOptions {
        jvmTarget = '11'
        // Will retain parameter names for Java reflection
        javaParameters = true
    }
}

//compileKotlin.dependsOn(generateProto)

compileTestKotlin {
    kotlinOptions {
        jvmTarget = '11'
        javaParameters = true
    }
}

tasks.withType(JavaExec) {
    classpath += configurations.developmentOnly
    jvmArgs('-XX:TieredStopAtLevel=1', '-Dcom.sun.management.jmxremote')
}

sourceSets {
    main {
        java {
            srcDirs 'build/generated/source/proto/main/grpc'
            srcDirs 'build/generated/source/proto/main/grpckt'
            srcDirs 'build/generated/source/proto/main/java'
        }
    }
}

protobuf {
    protoc { artifact = "com.google.protobuf:protoc:${protocVersion}" }
    plugins {
        grpc { artifact = "io.grpc:protoc-gen-grpc-java:${grpcVersion}" }
        grpckt { artifact = "io.grpc:protoc-gen-grpc-kotlin:${grpcKotlinVersion}" }
    }
    generateProtoTasks {
        all()*.plugins {
            grpc {}
            grpckt {}
        }
    }
}
In case it is relevant, here is a piece of the stub autogenerated from the proto file:
// Generated by the protocol buffer compiler.  DO NOT EDIT!
// source: transaction.proto

package com.mybank.endpoint;

/**
 * Protobuf type {@code com.mybank.endpoint.DebitRequest}
 */
public final class DebitRequest extends
    com.google.protobuf.GeneratedMessageV3 implements
    // @@protoc_insertion_point(message_implements:com.mybank.endpoint.DebitRequest)
    DebitRequestOrBuilder {
  private static final long serialVersionUID = 0L;

  // Use DebitRequest.newBuilder() to construct.
  private DebitRequest(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {
    super(builder);
  }

  private DebitRequest() {
    name_ = "";
  }

  @java.lang.Override
  @SuppressWarnings({"unused"})
  protected java.lang.Object newInstance(UnusedPrivateParameter unused) {
    return new DebitRequest();
  }

  @java.lang.Override
  public final com.google.protobuf.UnknownFieldSet getUnknownFields() {
    return this.unknownFields;
  }

  private DebitRequest(
      com.google.protobuf.CodedInputStream input,
      com.google.protobuf.ExtensionRegistryLite extensionRegistry)
      throws com.google.protobuf.InvalidProtocolBufferException {
    this();
    ...
*** First edit
@KafkaClient(
    id = "debit-client",
    acks = KafkaClient.Acknowledge.ALL,
    properties = [Property(name = ProducerConfig.RETRIES_CONFIG, value = "5")]
    // How to add these two properties in order to use the Protobuf serializer?
    // kafka.producers.*.key-serializer
    // kafka.producers.*.value-serializer
)
public interface DebitProducer {
    @Topic("debit")
    fun sendRequestMessage(@KafkaKey key: String?, message: DebitRequest?) {
    }
}
*** Second edit
I tried adding the serializer via application.yml:
micronaut:
  application:
    name: account-control
grpc:
  server:
    port: 8082
kafka:
  producers:
    product-client:
      value:
        serializer: io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer
build.gradle
//kafka-protobuf-serializer
implementation("io.confluent:kafka-protobuf-serializer:6.0.0")
and now I get the following during the Gradle build:
Execution failed for task ':extractIncludeProto'.
> Could not resolve all files for configuration ':compileProtoPath'.
> Could not resolve com.squareup.wire:wire-schema:3.2.2.
Required by:
project : > io.confluent:kafka-protobuf-serializer:6.0.0 > io.confluent:kafka-protobuf-provider:6.0.0
> The consumer was configured to find a component, preferably only the resources files. However we cannot choose between the following variants of com.squareup.wire:wire-schema:3.2.2:
- jvm-api
- jvm-runtime
- metadata-api
All of them match the consumer attributes:
- Variant 'jvm-api' capability com.squareup.wire:wire-schema:3.2.2 declares a component, packaged as a jar:
- Unmatched attributes:
- Provides release status but the consumer didn't ask for it
- Provides an API but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'jvm' but the consumer didn't ask for it
- Variant 'jvm-runtime' capability com.squareup.wire:wire-schema:3.2.2 declares a component, packaged as a jar:
- Unmatched attributes:
- Provides release status but the consumer didn't ask for it
- Provides a runtime but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'jvm' but the consumer didn't ask for it
- Variant 'metadata-api' capability com.squareup.wire:wire-schema:3.2.2:
- Unmatched attributes:
- Doesn't say anything about its elements (required them preferably only the resources files)
- Provides release status but the consumer didn't ask for it
- Provides a usage of 'kotlin-api' but the consumer didn't ask for it
- Provides attribute 'org.jetbrains.kotlin.platform.type' with value 'common' but the consumer didn't ask for it
* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':extractIncludeProto'.
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:38)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:41)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:370)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:357)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:350)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.lambda$run$0(DefaultPlanExecutor.java:127)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:191)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
Caused by: org.gradle.api.internal.artifacts.ivyservice.DefaultLenientConfiguration$ArtifactResolveException: Could not resolve all files for configuration ':compileProtoPath'.
at org.gradle.api.internal.artifacts.configurations.DefaultConfiguration.rethrowFailure(DefaultConfiguration.java:1265)
at org.gradle.api.internal.artifacts.configurations.DefaultConfiguration.access$1800(DefaultConfiguration.java:141)
at org.gradle.api.internal.artifacts.configurations.DefaultConfiguration$ConfigurationFileCollection.visitContents(DefaultConfiguration.java:1242)
at org.gradle.api.internal.artifacts.configurations.DefaultConfiguration$ConfigurationFileCollection.visitContents(DefaultConfiguration.java:1235)
at org.gradle.api.internal.artifacts.configurations.DefaultConfiguration.visitContents(DefaultConfiguration.java:489)
at org.gradle.api.internal.file.AbstractFileCollection.visitStructure(AbstractFileCollection.java:265)
at org.gradle.api.internal.file.CompositeFileCollection.visitContents(CompositeFileCollection.java:152)
at org.gradle.api.internal.file.AbstractFileCollection.visitStructure(AbstractFileCollection.java:265)
at org.gradle.internal.fingerprint.impl.DefaultFileCollectionSnapshotter.snapshot(DefaultFileCollectionSnapshotter.java:50)
at org.gradle.internal.fingerprint.impl.AbstractFileCollectionFingerprinter.fingerprint(AbstractFileCollectionFingerprinter.java:47)
at ...
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:182)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:124)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
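A side note on the edits above: the application.yml in the second edit configures a producer id of product-client, while the @KafkaClient in the first edit declares id = "debit-client"; the serializer has to be configured under whichever id the client actually uses (or under default). A hedged sketch of that YAML with the ids aligned, also assuming Confluent's KafkaProtobufSerializer needs a schema registry URL (the localhost address is a placeholder):

kafka:
  producers:
    debit-client:   # must match the id declared on @KafkaClient (or use "default")
      value:
        serializer: io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer
      # Assumption: Confluent serializers also need to know where the schema registry lives
      schema:
        registry:
          url: http://localhost:8081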

Loader constraint violation when using Dropwizard Histogram metric in Apache Flink

I am running a streaming job on a Flink 1.9.1 cluster and trying to get a histogram of values into our Prometheus metric collector. Per the recommendation in the Flink docs, I used the Dropwizard histogram implementation with the Flink-provided wrapper; however, when submitting the job to the cluster, it crashes with the following stack trace:
java.lang.LinkageError: loader constraint violation: when resolving method "org.apache.flink.dropwizard.metrics.DropwizardHistogramWrapper.<init>(Lcom/codahale/metrics/Histogram;)V" the class loader (instance of org/apache/flink/util/ChildFirstClassLoader) of the current class, com/example/foo/metrics/FooMeter, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/apache/flink/dropwizard/metrics/DropwizardHistogramWrapper, have different Class objects for the type com/codahale/metrics/Histogram used in the signature
at com.example.foo.metrics.FooMeter.<init>(FooMeter.scala:11)
at com.example.foo.transform.ValidFoos$.open(ValidFoos.scala:15)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
at org.apache.flink.streaming.api.operators.StreamFlatMap.open(StreamFlatMap.java:43)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:532)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:396)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
I have found a similar error in the mailing list; however, using the shadowJar plugin in Gradle didn't help.
Is there something I am missing?
Relevant code here:
import com.codahale.metrics.{Histogram, SlidingWindowReservoir}
import org.apache.flink.dropwizard.metrics.DropwizardHistogramWrapper
import org.apache.flink.metrics.{MetricGroup, Histogram => FlinkHistogram}

class FooMeter(metricGroup: MetricGroup, name: String) {
  private var histogram: FlinkHistogram = metricGroup.histogram(
    name, new DropwizardHistogramWrapper(new Histogram(new SlidingWindowReservoir(500))))

  def record(fooValue: Long): Unit = {
    histogram.update(fooValue)
  }
}
object ValidFoos extends RichFlatMapFunction[Try[FooData], Foo] {
  @transient private var fooMeter: FooMeter = _

  override def open(parameters: Configuration): Unit = {
    fooMeter = new FooMeter(getRuntimeContext.getMetricGroup, "foo_values")
  }

  override def flatMap(value: Try[FooData], out: Collector[FooData]): Unit = {
    Transform.validFoo(value) foreach (foo => {
      fooMeter.record(foo.value)
      out.collect(foo)
    })
  }
}
build.gradle:
plugins {
    id 'scala'
    id 'application'
    id 'com.github.johnrengelman.shadow' version '2.0.4'
}

ext {
    flinkVersion = "1.9.1"
    scalaBinaryVersion = "2.11"
    scalaVersion = "2.11.12"
}

dependencies {
    implementation(
        "org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}",
        "org.apache.flink:flink-connector-kafka_${scalaBinaryVersion}:${flinkVersion}",
        "org.apache.flink:flink-runtime-web_${scalaBinaryVersion}:${flinkVersion}",
        "org.apache.flink:flink-json:${flinkVersion}",
        "org.apache.flink:flink-metrics-dropwizard:${flinkVersion}",
        "org.scala-lang:scala-library:${scalaVersion}"
    )
}
shadowJar {
    relocate("org.apache.flink.dropwizard", "com.example.foo.shaded.dropwizard")
    relocate("com.codahale", "com.example.foo.shaded.codahale")
}

jar {
    zip64 = true
    archiveName = rootProject.name + '-all.jar'
    manifest {
        attributes('Main-Class': 'com.example.foo.Foo')
    }
    from {
        configurations.compileClasspath.collect {
            it.isDirectory() ? it : zipTree(it)
        }
        configurations.runtimeClasspath.collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}
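One hedged thing to check, given the build above: the plain jar task also assembles a fat jar (rootProject.name + '-all.jar') from the raw classpaths, without the shadowJar relocations, so if that artifact is the one submitted to the cluster, the com.codahale relocation never takes effect. A sketch that makes the relocated shadow jar the only fat jar the build produces:

// Hedged sketch: stop the plain jar task from producing an unrelocated fat jar,
// so only the shadowJar output (with codahale relocated) can be deployed.
jar.enabled = false
assemble.dependsOn shadowJar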
Further Info:
Running the code locally works
Flink cluster is custom-compiled with the following directory structure:
# find /usr/lib/flink/
/usr/lib/flink/
/usr/lib/flink/plugins
/usr/lib/flink/plugins/flink-metrics-influxdb-1.9.1.jar
/usr/lib/flink/plugins/flink-s3-fs-hadoop-1.9.1.jar
/usr/lib/flink/plugins/flink-metrics-graphite-1.9.1.jar
/usr/lib/flink/plugins/flink-metrics-prometheus-1.9.1.jar
/usr/lib/flink/plugins/flink-cep_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-python_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-queryable-state-runtime_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-sql-client_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-metrics-slf4j-1.9.1.jar
/usr/lib/flink/plugins/flink-state-processor-api_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-oss-fs-hadoop-1.9.1.jar
/usr/lib/flink/plugins/flink-metrics-statsd-1.9.1.jar
/usr/lib/flink/plugins/flink-swift-fs-hadoop-1.9.1.jar
/usr/lib/flink/plugins/flink-gelly-scala_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-azure-fs-hadoop-1.9.1.jar
/usr/lib/flink/plugins/flink-metrics-datadog-1.9.1.jar
/usr/lib/flink/plugins/flink-shaded-netty-tcnative-dynamic-2.0.25.Final-7.0.jar
/usr/lib/flink/plugins/flink-s3-fs-presto-1.9.1.jar
/usr/lib/flink/plugins/flink-cep-scala_2.11-1.9.1.jar
/usr/lib/flink/plugins/flink-gelly_2.11-1.9.1.jar
/usr/lib/flink/lib
/usr/lib/flink/lib/flink-metrics-influxdb-1.9.1.jar
/usr/lib/flink/lib/flink-metrics-graphite-1.9.1.jar
/usr/lib/flink/lib/flink-metrics-prometheus-1.9.1.jar
/usr/lib/flink/lib/flink-table_2.11-1.9.1.jar
/usr/lib/flink/lib/flink-metrics-slf4j-1.9.1.jar
/usr/lib/flink/lib/log4j-1.2.17.jar
/usr/lib/flink/lib/slf4j-log4j12-1.7.15.jar
/usr/lib/flink/lib/flink-metrics-statsd-1.9.1.jar
/usr/lib/flink/lib/flink-metrics-datadog-1.9.1.jar
/usr/lib/flink/lib/flink-table-blink_2.11-1.9.1.jar
/usr/lib/flink/lib/flink-dist_2.11-1.9.1.jar
/usr/lib/flink/bin/...

Swagger grails Integration

I am new to Swagger and I want to integrate Swagger into a RESTful API project using the Grails framework. Please help if anybody has any idea what I am doing wrong.
My Grails specification is as below:
| Grails Version: 3.0.7
| Groovy Version: 2.4.4
| JVM Version: 1.8.0_71
I did some setup for Swagger as below:
in build.gradle:
dependencies {
    ...
    compile "io.swagger:swagger-core:1.5.3"
    compile "io.swagger:swagger-jaxrs:1.5.3"
    ...
}
in resources.groovy
import io.swagger.jaxrs.config.BeanConfig

beans = {
    swaggerConfig(BeanConfig) {
        def serverUrl = "http://localhost:8080/"
        def hostName = "localhost:8080"
        resourcePackage = "grails.rest.example"
        host = hostName
        basePath = "/api"
        version = 'v0' // Default "1".
        title = 'Core Registration API, Version V0'
        description = 'API for Accessing secured resources'
        contact = 'testtest@mailinator.com'
        license = ''
        licenseUrl = ''
    }
    corsFilter(CorsFilter)
}
Added a Controller ApiDocController.groovy:
package grails.rest.example.apidoc

import grails.web.mapping.LinkGenerator

class ApiDocController {
    LinkGenerator grailsLinkGenerator
    def apiDocService

    def index = {
        String basePath = grailsLinkGenerator.serverBaseURL
        render(view: 'index', model: [apiDocsPath: "${basePath}/api/swagger-json"])
        //render(view: 'index', model: [apiDocsPath: "localhost:8080/api/swagger-json"])
        //render(view: 'index', model: [apiDocsPath: "localhost:8080/dist/index.html"])
    }

    def swaggerJson = {
        render apiDocService.generateJSON()
    }
}
Added URL mappings for the controllers:
"/api/info"(controller: 'ApiDoc')
"/"(controller: 'Index')
"500"(controller: 'InternalServerError')
"404"(controller: 'NotFound')
Added a service ApiDocService.groovy:
//package com.care.apidoc
package grails.rest.example.apidoc

import io.swagger.jaxrs.config.BeanConfig
import grails.transaction.Transactional
import io.swagger.util.Json

@Transactional
class ApiDocService {
    def swaggerConfig

    /*
     * Generates the Swagger JSON.
     */
    def generateJSON() {
        String[] schemes = ["http"] as String[]
        swaggerConfig.setSchemes(schemes)
        swaggerConfig.setScan(true)
        def swagger = swaggerConfig.getSwagger()
        Json.mapper().writeValueAsString(swagger)
    }
}
Added Swagger UI in the src/main/webapp/dist folder, with a working customised API URL "http://localhost:8080/api/orders" in index.html.
Added the CorsFilter setting in src/main/groovy/CorsFilter.groovy:
import org.springframework.web.filter.OncePerRequestFilter

import javax.annotation.Priority
import javax.servlet.FilterChain
import javax.servlet.ServletException
import javax.servlet.http.HttpServletRequest
import javax.servlet.http.HttpServletResponse
import java.io.IOException

@Priority(Integer.MIN_VALUE)
public class CorsFilter extends OncePerRequestFilter {

    public CorsFilter() { }

    @Override
    protected void doFilterInternal(HttpServletRequest req, HttpServletResponse resp, FilterChain chain)
            throws ServletException, IOException {
        String origin = req.getHeader("Origin");
        boolean options = "OPTIONS".equals(req.getMethod());
        resp.addHeader("Access-Control-Allow-Headers", "origin, authorization, accept, content-type, x-requested-with");
        resp.addHeader("Access-Control-Allow-Methods", "GET, HEAD, POST, PUT, DELETE, TRACE, OPTIONS");
        resp.addHeader("Access-Control-Max-Age", "3600");
        resp.addHeader("Access-Control-Allow-Origin", origin == null ? "*" : origin);
        resp.addHeader("Access-Control-Allow-Credentials", "true");
        if (!options) chain.doFilter(req, resp);
    }
}
On starting the server, the API for orders works correctly; however, when I try to load the API in the Swagger UI index file, it shows:
No operations defined in spec!
as shown in the attached screenshot.
Have you looked at springfox?
Here is a sample Grails application hosted on Heroku that demonstrates the capabilities of springfox integrating with it to produce the service description in the Open API specification 2.0 (formerly known as Swagger). The source code for the demo is available here.
You can see this demo running live here, demonstrating the Open API specification generated by the Grails application and rendered using swagger-ui.
The library that makes this possible is the springfox-grails-integration library. It is about to be released and probably needs a little bit of work to make it a Grails plugin. There is some preliminary documentation on how to configure this in the library repository.
NOTE: This only works with Grails 3.x.
Also, it was a notable library showcased in the SHOW US YOUR GRAILS contest. Feedback to improve this library is much appreciated.
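Separately, regarding the original "No operations defined in spec!" message: swagger-jaxrs's BeanConfig with setScan(true) only picks up classes under resourcePackage that carry JAX-RS/Swagger annotations, which plain Grails controllers don't have. A hedged, hypothetical sketch of the kind of annotated resource it expects (assuming swagger-annotations and the javax.ws.rs API are on the classpath; all names here are illustrative):

import io.swagger.annotations.Api
import io.swagger.annotations.ApiOperation
import javax.ws.rs.GET
import javax.ws.rs.Path

@Api(value = "orders")
@Path("/api/orders")
class OrdersResource {
    @GET
    @ApiOperation(value = "List all orders")
    String listOrders() {
        // Illustrative only; a real implementation would return the orders JSON
        return "[]"
    }
}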

Serving an entire directory tree with spray.io

I would like to translate this JS code to Scala, using spray.io.
How can I translate this line below to Scala using spray.io ?
app.use('/', express.static(path.join(__dirname, 'public')));
In other words, how can I serve an entire directory tree using spray.io?
As the comment above says, Spray is deprecated, but the directives are similar in akka-http. Here is what you probably need (getFromResourceDirectory in your case):
pathPrefix("docs") {
get {
path("swagger.json") {
getFromResource("swagger.json", ContentTypes.`application/json`)
} ~
(pathEnd | pathSingleSlash) {
redirect("docs/index.html", StatusCodes.TemporaryRedirect)
} ~
getFromResourceDirectory("swagger-ui")
}
}
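For completeness, here is a minimal, self-contained sketch of the express.static equivalent, assuming akka-http 10.2+ (the object name, port, and directory are arbitrary):

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

// Rough equivalent of app.use('/', express.static('public')):
// serve the ./public directory tree at the root path.
object StaticServer extends App {
  implicit val system: ActorSystem = ActorSystem("static-server")

  // getFromDirectory serves files under the given base directory, recursively
  val route = getFromDirectory("public")

  Http().newServerAt("localhost", 8080).bind(route)
}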
This serves files (recursively) from the ./web/ directory:
package com.softwaremill.spray.server

import akka.actor.ActorSystem
import spray.routing.SimpleRoutingApp

object Step1Complete extends App with SimpleRoutingApp {
  implicit val actorSystem = ActorSystem()

  startServer(interface = "localhost", port = 3300) {
    get {
      path("hello") {
        complete {
          "Welcome to Amber Gold!"
        }
      }
    } ~
    pathPrefix("web") {
      getFromDirectory("./web/")
    }
  }
}

Scalate ResourceNotFoundException in Scalatra

I'm trying the following based on scalatra-sbt.g8:
class FooWeb extends ScalatraServlet with ScalateSupport {
  beforeAll { contentType = "text/html" }

  get("/") {
    templateEngine.layout("/WEB-INF/scalate/templates/hello-scalate.jade")
  }
}
but I'm getting the following exception (even though the file exists) - any clues?
Could not load resource: [/WEB-INF/scalate/templates/hello-scalate.jade]; are you sure it's within [null]?
org.fusesource.scalate.util.ResourceNotFoundException: Could not load resource: [/WEB-INF/scalate/templates/hello-scalate.jade]; are you sure it's within [null]?
FWIW, the innermost exception is coming from org.mortbay.jetty.handler.ContextHandler.getResource line 1142: _baseResource==null.
Got an answer from the scalatra mailing list. The problem was that I was starting the Jetty server with:
import org.mortbay.jetty.Server
import org.mortbay.jetty.servlet.{Context, ServletHolder}

val server = new Server(8080)
val root = new Context(server, "/", Context.SESSIONS)
root.addServlet(new ServletHolder(new FooWeb()), "/*")
server.start()
I needed to insert this before start():
root.setResourceBase("src/main/webapp")
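With that line in place, the corrected bootstrap looks like this:

import org.mortbay.jetty.Server
import org.mortbay.jetty.servlet.{Context, ServletHolder}

val server = new Server(8080)
val root = new Context(server, "/", Context.SESSIONS)
// Give Jetty a resource base so /WEB-INF/... lookups can resolve
root.setResourceBase("src/main/webapp")
root.addServlet(new ServletHolder(new FooWeb()), "/*")
server.start()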