SOAP jaxws maven plugin deployed on JBoss EAP 7.2 - soap

I have a Spring Boot 2.2.6 web app. I have to consume a SOAP web service, and therefore I have generated the client classes with jaxws-maven-plugin in my pom as follows:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>jaxws-maven-plugin</artifactId>
    <version>2.6</version>
    <executions>
        <execution>
            <goals>
                <goal>wsimport</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <wsdlDirectory>${project.basedir}/src/main/resources/</wsdlDirectory>
        <packageName>it.mypackage.ws.client</packageName>
        <sourceDestDir>${project.build.directory}/generated-sources/</sourceDestDir>
    </configuration>
</plugin>
I also have a configuration class as follows:
@Configuration
public class SOAPClientConfiguration {

    @Value("${jks.auth.path}")
    private String jksPath; // .jks client keystore generated from the .pfx issued with the certificate

    @Value("${jks.auth.pass}")
    private String jksPass; // password for the .jks

    private final URL url = getClass().getClassLoader().getResource("my.wsdl");

    @Bean
    public Services services() {
        try {
            SSLContext sc = SSLContext.getInstance("TLSv1.2");
            sc.getClientSessionContext().setSessionTimeout(1);
            KeyManagerFactory kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
            KeyStore ks = KeyStore.getInstance("JKS");
            ks.load(new FileInputStream(jksPath), jksPass.toCharArray());
            kmf.init(ks, jksPass.toCharArray());
            sc.init(kmf.getKeyManagers(), null, null);

            Api api = new Api(url);
            Services port = api.getApiPort();
            ((BindingProvider) port).getRequestContext().put(
                "com.sun.xml.internal.ws.transport.https.client.SSLSocketFactory",
                sc.getSocketFactory());
            return port;
        } catch (GeneralSecurityException | IOException e) {
            throw new IllegalStateException("Could not initialize SOAP client", e);
        }
    }
}
Now everything works fine when I run the application from Boot Dashboard with the embedded Tomcat. But when I deploy the application to my JBoss EAP 7.2, none of the calls go through.
In the log I often see Apache CXF doing something, but I don't use it in my application and I don't understand why JBoss tries to use it.
Is there a way to exclude CXF via jboss-deployment-structure.xml? Has anyone had the same problem?
How can I make it work inside JBoss?
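One approach worth trying (a sketch, not verified against this exact setup): JBoss EAP's built-in webservices subsystem is backed by Apache CXF, and it attaches itself to deployments that contain JAX-WS classes. You can opt the deployment out of it with a WEB-INF/jboss-deployment-structure.xml, so that the JAX-WS client code bundled with the application is used instead:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<jboss-deployment-structure>
    <deployment>
        <!-- Keep JBoss from wiring its CXF-based webservices subsystem
             into this deployment, so the app's own JAX-WS client is used -->
        <exclude-subsystems>
            <subsystem name="webservices"/>
        </exclude-subsystems>
    </deployment>
</jboss-deployment-structure>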

Related

EBean enhancement issue [POJO not enhanced]

I am using Ebean and Vert.x for my cron jobs, but for some reason the entities are not being enhanced by the ebean-maven-plugin.
Here is what I am using:
<plugin>
    <groupId>io.ebean</groupId>
    <artifactId>ebean-maven-plugin</artifactId>
    <version>12.1.12</version>
    <executions>
        <execution>
            <id>main</id>
            <phase>process-classes</phase>
            <configuration>
                <transformArgs>debug=1</transformArgs>
            </configuration>
            <goals>
                <goal>enhance</goal>
            </goals>
        </execution>
    </executions>
</plugin>
My entity:
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "targeting_locations")
public class TargetingLocations {

    @Id
    @Column(name = "id")
    public Long id;

    // other properties
}
Here is the error:
2020-04-15 19:39:26,671 i.e.s.d.BeanDescriptorManager - Error in deployment
java.lang.IllegalStateException: Bean class com.xxx.model.TargetingLocations is not enhanced?
at io.ebeaninternal.server.deploy.BeanDescriptorManager.setEntityBeanClass(BeanDescriptorManager.java:1414)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.createByteCode(BeanDescriptorManager.java:1286)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.readDeployAssociations(BeanDescriptorManager.java:1208)
at io.ebeaninternal.server.deploy.BeanDescriptorManager.readEntityDeploymentAssociations(BeanDescriptorManager.java:711)
From various posts, I could not really figure out what is causing this issue.
Your domain class should extend io.ebean.Model.
The rest looks fine.
Hi, I had the same problem recently and switched from ebean-maven-plugin to tiles-maven-plugin, which will enhance your Ebean entities. This is the plugin:
<plugin>
    <groupId>io.repaint.maven</groupId>
    <artifactId>tiles-maven-plugin</artifactId>
    <version>2.10</version>
    <extensions>true</extensions>
    <configuration>
        <tiles>
            <tile>org.avaje.tile:java-compile:1.1</tile>
            <tile>io.ebean.tile:enhancement:5.3</tile>
        </tiles>
    </configuration>
</plugin>
By using tiles-maven-plugin you only need this ebean dependency:
<dependency>
    <groupId>io.ebean</groupId>
    <artifactId>ebean</artifactId>
    <version>11.22.4</version>
</dependency>
Hope it helps. Thank you.

Is it possible to generate Q classes by gradle (Kotlin-DSL) for Kotlin MongoDB Documents?

I have a project with Maven, Kotlin, QueryDSL, Spring Boot and MongoDB. It works quite well, but I thought that migrating to Gradle could speed up the build. Everything was fine until I began moving the module with QueryDSL: it turned out that I cannot generate Q classes for Kotlin classes annotated with @Document.
So is there a way to solve this?
Document example (placed in /src/main/kotlin/com/company, in the kotlin source directory):
package ...

import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document

@Document(collection = "myDocument")
data class MyDocument(
    val smth: String
)
Maven (the part responsible for the generation):
<plugin>
    <groupId>org.jetbrains.kotlin</groupId>
    <artifactId>kotlin-maven-plugin</artifactId>
    <version>${kotlin.version}</version>
    <configuration>
        <args>
            <arg>-Werror</arg>
        </args>
        <annotationProcessors>
            org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor
        </annotationProcessors>
        <compilerPlugins>
            <plugin>spring</plugin>
        </compilerPlugins>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-maven-noarg</artifactId>
            <version>${kotlin.version}</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.kotlin</groupId>
            <artifactId>kotlin-maven-allopen</artifactId>
            <version>${kotlin.version}</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <id>compile</id>
            <phase>compile</phase>
            <configuration>
                <sourceDirs>
                    <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
                    <sourceDir>${project.basedir}/src/main/java</sourceDir>
                </sourceDirs>
            </configuration>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
        <execution>
            <id>kapt</id>
            <goals>
                <goal>kapt</goal>
            </goals>
            <configuration>
                <sourceDirs>
                    <sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
                    <sourceDir>${project.basedir}/src/main/java</sourceDir>
                </sourceDirs>
            </configuration>
        </execution>
        <execution>
            <id>test-compile</id>
            <phase>test-compile</phase>
            <configuration>
                <sourceDirs>
                    <sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
                </sourceDirs>
            </configuration>
            <goals>
                <goal>test-compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
For Gradle + Kotlin, AFAIU, we have to use kapt to generate the Q classes like this:
kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
but it does not work for me. My new build.gradle.kts:
import org.jetbrains.kotlin.gradle.plugin.KotlinSourceSet
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    id("org.springframework.boot") version "2.2.0.RELEASE"
    id("io.spring.dependency-management") version "1.0.8.RELEASE"
    kotlin("jvm") version "1.3.50"
    kotlin("kapt") version "1.3.50"
    kotlin("plugin.jpa") version "1.3.50"
    id("org.jetbrains.kotlin.plugin.spring") version "1.3.21"
}

apply(plugin = "kotlin")
apply(plugin = "kotlin-kapt")
apply(plugin = "kotlin-jpa")
apply(plugin = "org.springframework.boot")
apply(plugin = "io.spring.dependency-management")

group = "com.example"
version = "0.0.1-SNAPSHOT"
java.sourceCompatibility = JavaVersion.VERSION_1_8

repositories {
    mavenCentral()
}

dependencies {
    implementation("org.springframework.boot:spring-boot-starter-data-jpa")
    implementation("com.querydsl:querydsl-jpa")
    implementation("com.querydsl:querydsl-apt")
    kapt("com.querydsl:querydsl-apt:4.2.1:jpa")
    kapt("org.springframework.boot:spring-boot-starter-data-jpa")
    kapt("org.springframework.boot:spring-boot-configuration-processor")
    kapt("org.springframework.data:spring-data-mongodb:2.2.0.RELEASE")
    implementation("org.springframework.boot:spring-boot-starter-data-mongodb")
    implementation("org.jetbrains.kotlin:kotlin-reflect")
    implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    testImplementation("org.springframework.boot:spring-boot-starter-test") {
        exclude(group = "org.junit.vintage", module = "junit-vintage-engine")
    }
}

//sourceSets { main["kotlin"].srcDirs += [generated] }
//val querydslSrcDir = "src/main/generated"

tasks.withType<Test> {
    useJUnitPlatform()
}

tasks.withType<KotlinCompile> {
    kotlinOptions {
        freeCompilerArgs = listOf("-Xjsr305=strict")
        jvmTarget = "1.8"
    }
}
In Maven I can set the annotation processor precisely (org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor), but in Gradle I cannot figure out how to achieve that.
You should add implementation("com.querydsl:querydsl-mongodb") and kapt("com.querydsl:querydsl-apt") to the dependencies section.
Then add the following after the dependencies section:
kapt {
    annotationProcessor("org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor")
}
Also, don't forget to remove the JPA dependencies.
This is a working example I created.
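Putting the answer together, the relevant part of build.gradle.kts would look roughly like this (a sketch; the version numbers are taken from the question and may need adjusting):

```kotlin
dependencies {
    implementation("org.springframework.boot:spring-boot-starter-data-mongodb")
    implementation("com.querydsl:querydsl-mongodb:4.2.1")
    // kapt runs the annotation processor that generates the Q classes
    kapt("com.querydsl:querydsl-apt:4.2.1")
}

// Restrict kapt to the MongoDB annotation processor, mirroring the
// <annotationProcessors> setting from the Maven build
kapt {
    annotationProcessor("org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor")
}
```

Note that the JPA-classified querydsl-apt artifact and the plugin.jpa plugin from the question should be dropped, since they make the processor look for JPA annotations rather than @Document.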

Webjars not working on vert.x application

What is the proper way to configure webjars in a vert.x application? I have this simple app:
class WebVerticle : CoroutineVerticle() {
    override suspend fun start() {
        val router = Router.router(vertx)

        // Serve static resources from the default webroot directory
        router.route("/").handler(StaticHandler.create())

        val json = ConfigRetriever.create(vertx).getConfigAwait()
        val port = json.getInteger("port")
        try {
            vertx.createHttpServer().requestHandler(router).listenAwait(port)
            println("HTTP server started on port $port - redeploy enabled")
        } catch (ex: Exception) {
            error("Could not spawn web server at port $port")
        }
    }
}
pom.xml
<dependency>
    <groupId>io.vertx</groupId>
    <artifactId>vertx-web</artifactId>
    <version>${vertx.version}</version>
</dependency>
<dependency>
    <groupId>org.webjars</groupId>
    <artifactId>vue</artifactId>
    <version>2.5.16</version>
</dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>io.fabric8</groupId>
            <artifactId>vertx-maven-plugin</artifactId>
            <version>1.0.13</version>
            <executions>
                <execution>
                    <id>vmp</id>
                    <goals>
                        <goal>package</goal>
                        <goal>initialize</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <redeploy>true</redeploy>
            </configuration>
        </plugin>
    </plugins>
</build>
The file structure is as follows:
src/main/resources/webroot
| index.html
When I hit localhost:8888 it works, but localhost:8888/vue/vue.js does not.
Is there something else I need to configure?
Change the router configuration to:
router.route().handler(StaticHandler.create("META-INF/resources"))
Then point your browser to:
http://localhost:8888/webjars/vue/2.5.16/vue.js
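If you prefer to keep the application's own webroot at the root path, a variant (a sketch, untested here) is to mount the webjars under their own prefix. Webjar contents live on the classpath under META-INF/resources/webjars, and StaticHandler resolves its directory from the classpath:

```kotlin
// Serve webjar contents first, e.g. GET /webjars/vue/2.5.16/vue.js
router.route("/webjars/*").handler(
    StaticHandler.create("META-INF/resources/webjars")
)
// Everything else falls through to the default webroot directory
router.route().handler(StaticHandler.create())
```

Registering the more specific /webjars/* route first keeps it from being shadowed by the catch-all static handler.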

TypeSafe config : reference.conf values not overridden by application.conf

I am using a Typesafe application conf to provide all external config in my Scala project. While creating a shaded jar with maven-shade-plugin and running it, the conf somehow gets packaged into the jar itself, so it cannot be overridden by changing values in the external application conf.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>${maven.shade.plugin.version}</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <createDependencyReducedPom>false</createDependencyReducedPom>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>test.myproj.DQJobTrigger</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
I am not sure of the behaviour when loading all configs from the application conf itself using ConfigFactory:
trait Config {
    private val dqConfig = System.getenv("CONF_PATH")
    private val config = ConfigFactory.parseFile(new File(dqConfig))
    private val sparkConfig = config.getConfig("sparkConf")
    private val mysqlConfig = config.getConfig("mysql")
}
where CONF_PATH is set to the path of the application.conf that exists when running the jar.
application.conf
sparkConf = {
  master = "yarn-client"
}

mysql = {
  url = "jdbc:mysql://127.0.0.1:3306/test"
  user = "root"
  password = "MyCustom98"
  driver = "com.mysql.jdbc.Driver"
  connectionPool = disabled
  keepAliveConnection = true
}
So now, even if I change the properties in the external application conf, it still takes the configs that were present while packaging the jar.
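A common cause with shaded jars is twofold: the reference.conf files of the individual dependencies overwrite each other during shading, and an application.conf that happens to be on the build classpath gets baked into the jar where it takes precedence. A sketch of a shade configuration addressing both (adapt to your build; only the transformers/filters parts are new compared to the question):

```xml
<configuration>
    <createDependencyReducedPom>false</createDependencyReducedPom>
    <transformers>
        <!-- Concatenate every dependency's reference.conf instead of
             letting one overwrite the others -->
        <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
        </transformer>
        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>test.myproj.DQJobTrigger</mainClass>
        </transformer>
    </transformers>
    <filters>
        <!-- Keep the build-time application.conf out of the shaded jar so
             the external file pointed to by CONF_PATH is the only one seen -->
        <filter>
            <artifact>*:*</artifact>
            <excludes>
                <exclude>application.conf</exclude>
            </excludes>
        </filter>
    </filters>
</configuration>
```

With the packaged application.conf removed, ConfigFactory.parseFile on the CONF_PATH file behaves as expected, with the merged reference.conf still available as the fallback defaults.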

hadoop 2.6 cluster cannot be initialized. Successfully run with local jars, but not maven dependency

I'm trying to debug the wordcount sample using Apache Hadoop 2.6.0. I created the project in Eclipse. My first try was to configure the build path and include all the Hadoop jar files (extracted from the Hadoop folder) in the build path; I can successfully run the word count and get the result. My second try was to make this a Maven project and use pom.xml to specify the needed Hadoop jars (removing the local jars from the build path). Here comes the problem. This time the following exception is thrown:
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1266)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1262)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1261)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1290)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1314)
at WordCount.main(WordCount.java:59)
My wordcount code is the simple, classic one:
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/home/jsun/share/wc/input"));
        FileOutputFormat.setOutputPath(job, new Path("/home/jsun/share/wc/output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
And the pom.xml for maven:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>wordcount2</groupId>
    <artifactId>wordcount2</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <repositories>
        <repository>
            <id>apache</id>
            <url>http://central.maven.org/maven2/</url>
        </repository>
    </repositories>
    <build>
        <sourceDirectory>src</sourceDirectory>
        <resources>
            <resource>
                <directory>src</directory>
                <excludes>
                    <exclude>**/*.java</exclude>
                </excludes>
            </resource>
        </resources>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.6.0</version>
            <type>jar</type>
        </dependency>
    </dependencies>
</project>
What is the difference between using local Hadoop jars and using Maven dependencies?
Is this a problem with the cluster, the wordcount code, or Maven?
Thanks in advance.
Please check this link.
I had the same issue, and I don't have Hadoop installed on my machine. You can't run the program without an installation; I think it looks for some environment variables to run Hadoop commands.
Hope this helps.
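One thing worth checking (an assumption, since the full classpath isn't shown): with only hadoop-common and hadoop-mapreduce-client-core declared, the client-side pieces that actually connect to the cluster (the ClientProtocolProvider implementations) are not on the classpath, which is a known cause of the "Cannot initialize Cluster" error. The local-jars setup worked because the extracted Hadoop folder contains all of these jars. Adding the jobclient artifact (or the hadoop-client umbrella artifact, which pulls in everything a client needs) may resolve it:

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.6.0</version>
</dependency>
```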