Unable to set serde, producer and consumer properties at the topic/binder level in Spring Cloud Kafka - apache-kafka

I'm trying to bring up a simple pub-sub application using the Spring Cloud Stream Kafka binder. However, I'm unable to set the serializer/deserializer properties and other producer and consumer properties in application.yml. I consistently get serialization/deserialization errors, and even the Kafka logs in the Spring Boot project show that the producer and consumer configuration still uses ByteArraySerializer. Below is the code sample.
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.4.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>io.github.kprasad99.kafka</groupId>
<artifactId>kp-kafka-streams-example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>kp-kafka-streams-example</name>
<description>Demo project for Spring Boot</description>
<properties>
<java.version>11</java.version>
<spring-cloud.version>Hoxton.SR1</spring-cloud.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- <dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-binder-kafka-streams</artifactId>
</dependency> -->
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.kafka</groupId>
<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-devtools</artifactId>
<scope>runtime</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-configuration-processor</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>org.junit.vintage</groupId>
<artifactId>junit-vintage-engine</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-test-support</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>${spring-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Processor.java
public interface Processor {

    String INPUT = "k-msg-source";
    String OUTPUT = "k-msg-sink";

    @Input(INPUT)
    SubscribableChannel input();

    @Output(OUTPUT)
    MessageChannel output();
}
KRest.java
@RestController
public class KRest {

    @Autowired
    private Processor processor;

    @GetMapping("/send")
    public ResponseEntity<Void> send(@RequestParam("key") String key, @RequestParam("msg") String text) {
        processor.input().send(MessageBuilder.withPayload(Message.builder().text(text).build())
                .setHeader(KafkaHeaders.MESSAGE_KEY, key).build());
        return ResponseEntity.ok().build();
    }
}
Message.java
@Data
@NoArgsConstructor
@AllArgsConstructor
@Builder
public class Message {
    private String text;
}
And finally application.yml
spring:
  cloud:
    stream:
      bindings:
        k-msg-source:
          binder: kafka
          content-type: application/json
          destination: topic.kp.msg
          group: kp.msg.source
        k-msg-sink:
          binder: kafka
          content-type: application/json
          destination: topic.kp.msg
          group: kp.msg.sink
          producer:
            partition-count: 10
      binders:
        kafka:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder:
              brokers: localhost:9092
              configuration:
                value.serde: JsonSerde
                key.serde: StringSerde
              producer:
                value.serde: JsonSerde
                key.serde: StringSerde
              replication-factor: 1
Versions
spring-boot: 2.2.4
spring-cloud: Hoxton.SR1
spring-cloud-stream-kafka-binder: 3.0.1

Serdes are only used by the Kafka Streams binder.
With the MessageChannel binder, the properties are key.serializer/value.serializer on the producer side and key.deserializer/value.deserializer on the consumer side.
You also have to specify the fully qualified names of the classes.
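For example, with native decoding and the JsonDeserializer in place, a listener receives the payload already converted to the domain type. Below is a minimal sketch, assuming the Processor interface and Message class from the question; the listener class name is made up, and the consumer properties (group, use-native-decoding) must be set on whichever binding the listener attaches to:
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;

// Hypothetical listener bound to the Processor interface from the question.
// With use-native-decoding=true and JsonDeserializer configured in the binder,
// the payload arrives as a Message object rather than a byte[].
@EnableBinding(Processor.class)
public class MessageListener {

    @StreamListener(Processor.INPUT)
    public void onMessage(Message message) {
        System.out.println("Received: " + message.getText());
    }
}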

Final application.yml
spring.cloud.stream:
  bindings:
    k-msg-source:
      binder: kafka
      destination: topic.kp.msg
      content-type: application/json
      producer:
        partition-count: 10
        use-native-encoding: true
    k-msg-sink:
      binder: kafka
      destination: topic.kp.msg
      group: ne-publisher
      content-type: application/json
      consumer:
        use-native-decoding: true
  binders:
    kafka:
      type: kafka
      environment:
        spring.cloud.stream.kafka.binder:
          brokers: PLAINTEXT://localhost:19092,PLAINTEXT://localhost:29092,PLAINTEXT://localhost:39092
          configuration:
            key.serializer: org.apache.kafka.common.serialization.StringSerializer
            value.serializer: org.springframework.kafka.support.serializer.JsonSerializer
            key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
            value.deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
          consumer-properties:
            spring.json.trusted.packages: "*"

Related

Why do I get an error "cannot find symbol, symbol: class MongoClient"?

I am trying to create a new collection in an existing database.
try {
    MongoCollection collection = null;
    MongoClient mongoClient = new MongoClient("127.0.0.1", 27017);
    MongoTemplate mongoTemplate = new MongoTemplate(mongoClient, "udata");
    collection = mongoTemplate.createCollection("MyNewCollection");
} catch (Exception e) {
    e.printStackTrace();
}
I am using Spring Boot and importing the following packages:
import com.mongodb.client.MongoClients;
import com.oegems.ems.EmsMongoOps;
import com.mongodb.client.MongoCollection ;
This is My pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.5.4</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.oegems</groupId>
<artifactId>ems</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>ems</name>
<description>Demo project for ..</description>
<properties>
<java.version>11</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.11.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.11.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
Your import contains this:
import com.mongodb.client.MongoClients;
which is a valid class: it is a factory for MongoClient instances (mind the extra 's' at the end).
Then, you use:
MongoClient mongoClient
Note that the compiler has recognised MongoCollection from the import above, which lives in the same package as MongoClient, but MongoClient itself is never imported.
So import the class you actually use (or both):
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
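For illustration, here is a minimal sketch that uses the MongoClients factory end to end; the connection string and database name are assumptions taken from the snippet above:
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;

public class CreateCollectionExample {
    public static void main(String[] args) {
        // MongoClients (note the trailing 's') is the factory; it returns a MongoClient
        try (MongoClient mongoClient = MongoClients.create("mongodb://127.0.0.1:27017")) {
            MongoTemplate mongoTemplate = new MongoTemplate(mongoClient, "udata");
            MongoCollection<Document> collection = mongoTemplate.createCollection("MyNewCollection");
            System.out.println("Created: " + collection.getNamespace().getCollectionName());
        }
    }
}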

Apache Camel routing API call to message queue

I have two applications that talk to each other using a REST API.
I would like to know if I can use Apache Camel as a proxy that could "persist" the API calls, for example storing them as messages in ActiveMQ, and then later route the requests to the actual API endpoint.
Practically, I would like to use Apache Camel to "enhance" the API endpoints by adding persistence, throttling of requests, etc.
Which component would you suggest using?
You can always try to bridge your HTTP requests into a queue while making the calling thread wait, by forcing the exchangePattern to InOut.
See this example :
import org.apache.activemq.broker.BrokerService;
import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Main {

    private static final Logger logger = LoggerFactory.getLogger(SimpleRouteBuilder.class);

    public static void main(String[] args) throws Exception {
        org.apache.camel.main.Main main = new org.apache.camel.main.Main();
        main.addRouteBuilder(new SimpleRouteBuilder());
        logger.info("Next call is blocking, ctrl-c to exit\n");
        main.run();
    }
}

class SimpleRouteBuilder extends RouteBuilder {

    private static final Logger logger = LoggerFactory.getLogger(SimpleRouteBuilder.class);

    public void configure() throws Exception {
        // launching an activemq in background
        final BrokerService broker = new BrokerService();
        broker.setBrokerName("activemq");
        broker.addConnector("tcp://localhost:61616");
        Runnable runnable = () -> {
            try {
                broker.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
        };
        runnable.run();

        // receiving http requests but queuing them
        from("jetty:http://127.0.0.1:10000/input")
            .log(LoggingLevel.INFO, logger, "received request")
            .to("activemq:queue:persist?exchangePattern=InOut"); // InOut has to be forced with JMS

        // dequeuing and calling the backend
        from("activemq:queue:persist")
            .log(LoggingLevel.INFO, logger, "requesting to destination")
            .removeHeaders("CamelHttp*")
            .setHeader("Cache-Control", constant("private, max-age=0,no-store"))
            .to("jetty:http://perdu.com?httpMethod=GET");
    }
}
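Since the question also asks about throttling, a hedged sketch of the same dequeuing route with Camel's Throttler added could look like the following (the class name and the 10-requests-per-second limit are purely illustrative, assuming the Camel 2.x Java DSL):
import org.apache.camel.builder.RouteBuilder;

class ThrottledRouteBuilder extends RouteBuilder {
    public void configure() throws Exception {
        // Variant of the dequeuing route above, limited to roughly 10 requests
        // per second before calling the backend (the numbers are illustrative).
        from("activemq:queue:persist")
            .throttle(10).timePeriodMillis(1000)
            .removeHeaders("CamelHttp*")
            .setHeader("Cache-Control", constant("private, max-age=0,no-store"))
            .to("jetty:http://perdu.com?httpMethod=GET");
    }
}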
If you are using Maven, here is the pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>be.jschoreels.camel</groupId>
<artifactId>camel-simple</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>2.19.2</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jms</artifactId>
<version>2.19.2</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jetty</artifactId>
<version>2.19.2</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-camel</artifactId>
<version>5.15.3</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-all</artifactId>
<version>5.15.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.activemq/activemq-kahadb-store -->
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-kahadb-store</artifactId>
<version>5.15.3</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
</dependencies>
</project>

java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror

java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at org.elasticsearch.spark.serialization.ReflectionUtils$.org$elasticsearch$spark$serialization$ReflectionUtils$$checkCaseClass(ReflectionUtils.scala:42)
at org.elasticsearch.spark.serialization.ReflectionUtils$$anonfun$checkCaseClassCache$1.apply(ReflectionUtils.scala:84)
It seems like a Scala version incompatibility, but according to the Spark documentation, Spark 2.1.0 with Scala 2.11.8 should be fine.
This is my pom.xml, and it is just a test of writing to Elasticsearch from Spark with es-hadoop. I have no idea how to solve this exception.
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cn.jhTian</groupId>
<artifactId>sparkLink</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.artifactId}</name>
<description>My wonderfull scala app</description>
<inceptionYear>2015</inceptionYear>
<licenses>
<license>
<name>My License</name>
<url>http://....</url>
<distribution>repo</distribution>
</license>
</licenses>
<properties>
<encoding>UTF-8</encoding>
<scala.version>2.11.8</scala.version>
<scala.compat.version>2.11</scala.compat.version>
</properties>
<repositories>
<repository>
<id>ainemo</id>
<name>xylink</name>
<url>http://10.170.209.180:8081/nexus/content/groups/public/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.4</version><!-- 2.64 -->
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<!--<dependency>-->
<!--<groupId>org.scala-lang</groupId>-->
<!--<artifactId>scala-compiler</artifactId>-->
<!--<version>${scala.version}</version>-->
<!--</dependency>-->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.6.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>3.1.0</version>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-hadoop</artifactId>
<version>5.3.0 </version>
</dependency>
<!-- Test -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.specs2</groupId>
<artifactId>specs2-core_${scala.compat.version}</artifactId>
<version>2.4.16</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.compat.version}</artifactId>
<version>2.2.4</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
This is my code:
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

/**
 * Created by jhTian on 2017/4/19.
 */
object EsWrite {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf()
      .set("es.nodes", "1.1.1.1")
      .set("es.port", "9200")
      .set("es.index.auto.create", "true")
      .setAppName("es-spark-demo")
    val sc = new SparkContext(sparkConf)
    val job1 = Job("C开发工程师", "http://job.c.com", "c公司", "10000")
    val job2 = Job("C++开发工程师", "http://job.c++.com", "c++公司", "10000")
    val job3 = Job("C#开发工程师", "http://job.c#.com", "c#公司", "10000")
    val job4 = Job("Java开发工程师", "http://job.java.com", "java公司", "10000")
    val job5 = Job("Scala开发工程师", "http://job.scala.com", "java公司", "10000")
    // val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
    // val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
    // val rdd = sc.makeRDD(Seq(numbers, airports))
    val rdd = sc.makeRDD(Seq(job1, job2, job3, job4, job5))
    rdd.saveToEs("job/info")
    sc.stop()
  }
}

case class Job(jobName: String, jobUrl: String, companyName: String, salary: String)
Generally NoSuchMethodError implies the caller was compiled with a different version than was found on the classpath at runtime (or you have multiple versions on the CP).
In your case, I'd guess that es-hadoop is built against a different version of Scala. I've not used Maven in a little while, but I think the command you need to get some useful info is mvn dependency:tree. Use the output to see which version of Scala es-hadoop is built with, and then configure your project to use the same Scala version.
To get stable/reproducible builds I'd recommend using something like the maven-enforcer-plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>1.4.1</version>
<executions>
<execution>
<id>enforce</id>
<configuration>
<rules>
<dependencyConvergence />
</rules>
</configuration>
<goals>
<goal>enforce</goal>
</goals>
</execution>
</executions>
</plugin>
It can be annoying initially, but once you have all your dependencies sorted out you shouldn't get issues like this anymore.
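As a quick sanity check at runtime, you can also print the Scala version that actually gets loaded and compare it with what es-hadoop expects. This is a hedged sketch, assuming scala-library is on the classpath; the class name is made up for illustration:
public class ScalaVersionCheck {
    public static void main(String[] args) {
        // scala.util.Properties is a Scala object; MODULE$ is its singleton instance.
        // Prints something like "2.11.8" for the scala-library found on the classpath.
        System.out.println("Scala library version: "
                + scala.util.Properties$.MODULE$.versionNumberString());
    }
}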
Use a dependency like this:
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-spark-20_2.11</artifactId>
<version>5.2.2</version>
</dependency>
for Spark 2.0 and Scala 2.11.

OSGI : CXF exception with Karaf JAX-RS: No resource methods have been found for resource

I am getting an exception while creating a simple CXF web service with OSGi and Karaf.
I would really appreciate any comments/suggestions for a fix.
Here is the description of my project and steps I followed:
Downloaded Apache Karaf 4.0.7
Ran "mvn clean install" on the command line with the path to my project
Started the Karaf console and ran the following commands:
feature:repo-add cxf 3.1.8
feature:install cxf/3.1.8
Interface:
public interface MyRestService {
    String pingMe(String echo);
}
Implementation:
#Path("/")
public class MyRestServiceImpl implements MyRestService {
#GET
#Path("/{echo}")
#Produces(MediaType.APPLICATION_XML)
public String pingMe(#PathParam("echo") String echo) {
return "Knock Knock: " + echo + " !!";
}
}
OSGI blueprint.xml:
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0"
xmlns:cxf="http://cxf.apache.org/blueprint/core" xmlns:jaxrs="http://cxf.apache.org/blueprint/jaxrs"
xmlns:jaxws="http://cxf.apache.org/blueprint/jaxws"
xsi:schemaLocation="
http://www.osgi.org/xmlns/blueprint/v1.0.0 http://www.osgi.org/xmlns/blueprint/v1.0.0/blueprint.xsd
http://www.osgi.org/xmlns/blueprint-ext/v1.1.0 https://svn.apache.org/repos/asf/aries/tags/blueprint-0.3.1/blueprint-core/src/main/resources/org/apache/aries/blueprint/ext/blueprint-ext.xsd
http://cxf.apache.org/blueprint/jaxws http://cxf.apache.org/schemas/blueprint/jaxws.xsd
http://cxf.apache.org/blueprint/jaxrs http://cxf.apache.org/schemas/blueprint/jaxrs.xsd
http://cxf.apache.org/blueprint/core http://cxf.apache.org/schemas/blueprint/core.xsd
http://camel.apache.org/schema/blueprint http://camel.apache.org/schema/blueprint/camel-blueprint.xsd
http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0 http://aries.apache.org/schemas/blueprint-cm/blueprint-cm-1.1.0.xsd
">
<cxf:bus id="myBusId">
<cxf:features>
<cxf:logging></cxf:logging>
</cxf:features>
</cxf:bus>
<jaxrs:server id="myRestService" address="/RestProject/SimpleRestCall">
<jaxrs:serviceBeans>
<ref component-id="myRestImpl" />
</jaxrs:serviceBeans>
</jaxrs:server>
<bean id="myRestImpl" class="com.rest.api.impl.MyRestServiceImpl" />
</blueprint>
Features.xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0"
name="restAPI">
<repository>mvn:org.apache.cxf.karaf/apache-cxf/3.1.8/xml/features</repository>
<feature name="test" description="simple test" version="1.0.0-SNAPSHOT">
<details>A Rest server</details>
<!-- CXF and dependencies -->
<feature version="3.1.8">cxf-jaxrs</feature>
<!-- Jackson and dependencies -->
<bundle>mvn:org.apache.servicemix.specs/org.apache.servicemix.specs.jsr339-api-2.0/2.6.0</bundle>
<bundle>mvn:com.github.fge/jackson-coreutils/1.8</bundle>
<bundle>mvn:com.fasterxml.jackson.core/jackson-core/2.7.4</bundle>
<bundle>mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-base/2.7.4</bundle>
<bundle>mvn:com.fasterxml.jackson.core/jackson-annotations/2.7.4</bundle>
<bundle>mvn:com.fasterxml.jackson.core/jackson-databind/2.7.4</bundle>
<bundle>mvn:com.fasterxml.jackson.jaxrs/jackson-jaxrs-json-provider/2.7.4</bundle>
<bundle>mvn:com.github.fge/msg-simple/1.1</bundle>
<bundle>mvn:com.google.guava/guava/16.0.1</bundle>
<bundle>mvn:com.github.fge/btf/1.2</bundle>
<bundle>wrap:mvn:com.google.code.findbugs/jsr305/2.0.1</bundle>
<!-- The myRest Server -->
<bundle>mvn:com.rest.ebb.test/myRest/1.0.0-SNAPSHOT</bundle>
</feature>
</features>
Pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.rest.ebb.test</groupId>
<artifactId>myRest-parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<groupId>com.rest.ebb.test</groupId>
<artifactId>myRest</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>bundle</packaging>
<name>MyRestServer</name>
<description>A server for test</description>
<url>.example.com</url>
<properties>
<cxf.version>3.1.8</cxf.version>
<skipTests>true</skipTests>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.4.0</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-Activator>com.ebb.rest.Activator</Bundle-Activator>
<Embed-Dependency>!org.osgi.core,*</Embed-Dependency>
<Embed-Transitive>true</Embed-Transitive>
<Import-Package>
!com.google.protobuf,!com.google.protobuf.*,
!com.jcraft.*,
!com.ning.*,
!groovy.lang,
!javassist,
!lzma.*,
!net.jpountz.*,
!org.apache.*,
!org.bouncycastle.*,
!org.codehaus.*,
!org.eclipse.*,
!org.jboss.*,
!rx,
!sun.security.*,
!net.bytebuddy.*,
!org.HdrHistogram,
!org.slf4j.event,
!org.xerial.*,
!sun.reflect,
!sun.misc,
!sun.util.calendar,
!com.sun.jdi.*,
!jersey.repackaged.com.google.common.*,
!javax.servlet,
*
</Import-Package>
</instructions>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.core</artifactId>
<version>5.0.0</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-server</artifactId>
<version>2.24</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-grizzly2-http</artifactId>
<version>2.24</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0</version>
<type>bundle</type>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>com.google.inject</groupId>
<artifactId>guice</artifactId>
<version>4.1.0</version>
</dependency>
<dependency>
<groupId>com.google.inject.extensions</groupId>
<artifactId>guice-multibindings</artifactId>
<version>4.1.0</version>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.1.0</version>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>1.3.2</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.0</version>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>3.7</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.21</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.2</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-frontend-jaxrs</artifactId>
<version>${cxf.version}</version>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-transports-http</artifactId>
<version>${cxf.version}</version>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-transports-http-jetty</artifactId>
<version>${cxf.version}</version>
</dependency>
</dependencies>
</project>
Errors:
I am not able to deploy the OSGi bundles in Karaf and am getting the following errors:
2017-01-06 11:00:39,340 | WARN | pool-9-thread-1 | ResourceUtils | 141 - org.apache.cxf.cxf-rt-frontend-jaxrs - 3.1.8 | No resource methods have been found for resource class com.rest.api.impl.MyRestServiceImpl
2017-01-06 11:00:39,373 | ERROR | pool-9-thread-1 | AbstractJAXRSFactoryBean | 141 - org.apache.cxf.cxf-rt-frontend-jaxrs - 3.1.8 | No resource classes found
2017-01-06 11:00:39,375 | WARN | pool-9-thread-1 | BeanRecipe | 33 - org.apache.aries.blueprint.core - 1.6.2 | Object to be destroyed is not an instance of UnwrappedBeanHolder, type: null
2017-01-06 11:00:39,385 | ERROR | pool-9-thread-1 | BlueprintContainerImpl | 33 - org.apache.aries.blueprint.core - 1.6.2 | Unable to start blueprint container for bundle mvn:com.abb.ecc.my/myRest/1.0.0-SNAPSHOT
org.osgi.service.blueprint.container.ComponentDefinitionException: Unable to initialize bean myRestService
at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:738)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:848)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:811)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)[33:org.apache.aries.blueprint.core:1.6.2]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)[:1.8.0_51]
at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:255)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:186)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:724)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:411)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:276)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:300)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:269)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:265)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BlueprintExtender.modifiedBundle(BlueprintExtender.java:255)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:500)[43:org.apache.aries.util:1.1.1]
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:433)[43:org.apache.aries.util:1.1.1]
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$AbstractTracked.track(BundleHookBundleTracker.java:725)[43:org.apache.aries.util:1.1.1]
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.bundleChanged(BundleHookBundleTracker.java:463)[43:org.apache.aries.util:1.1.1]
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$BundleEventHook.event(BundleHookBundleTracker.java:422)[43:org.apache.aries.util:1.1.1]
at org.apache.felix.framework.util.SecureAction.invokeBundleEventHook(SecureAction.java:1179)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.util.EventDispatcher.createWhitelistFromHooks(EventDispatcher.java:731)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:486)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4541)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.Felix.startBundle(Felix.java:2172)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:998)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.felix.framework.BundleImpl.start(BundleImpl.java:984)[org.apache.felix.framework-5.4.0.jar:]
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.startBundle(FeaturesServiceImpl.java:1286)[10:org.apache.karaf.features.core:4.0.7]
at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:846)[10:org.apache.karaf.features.core:4.0.7]
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1176)[10:org.apache.karaf.features.core:4.0.7]
at org.apache.karaf.features.internal.service.FeaturesServiceImpl$1.call(FeaturesServiceImpl.java:1074)[10:org.apache.karaf.features.core:4.0.7]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)[:1.8.0_51]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[:1.8.0_51]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[:1.8.0_51]
at java.lang.Thread.run(Thread.java:745)[:1.8.0_51]
Caused by: org.apache.cxf.service.factory.ServiceConstructionException
at org.apache.cxf.jaxrs.JAXRSServerFactoryBean.create(JAXRSServerFactoryBean.java:219)
at org.apache.cxf.jaxrs.JAXRSServerFactoryBean.init(JAXRSServerFactoryBean.java:142)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)[:1.8.0_51]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)[:1.8.0_51]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_51]
at java.lang.reflect.Method.invoke(Method.java:497)[:1.8.0_51]
at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:299)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:980)[33:org.apache.aries.blueprint.core:1.6.2]
at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:736)[33:org.apache.aries.blueprint.core:1.6.2]
... 34 more
Caused by: org.apache.cxf.service.factory.ServiceConstructionException: No resource classes found
at org.apache.cxf.jaxrs.AbstractJAXRSFactoryBean.checkResources(AbstractJAXRSFactoryBean.java:317)
at org.apache.cxf.jaxrs.JAXRSServerFactoryBean.create(JAXRSServerFactoryBean.java:159)
... 42 more
Try to put the jax-rs annotations on the service interface.
Another possible cause is that the annotation package is not imported. You block a lot of imports in your pom. Can you check without these exclusions?
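For reference, a hedged sketch of what the service interface could look like with the JAX-RS annotations moved onto it (the paths are copied from the implementation in the question):
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

// JAX-RS annotations on the interface; the implementation class then only
// needs to implement the method.
@Path("/")
public interface MyRestService {

    @GET
    @Path("/{echo}")
    @Produces(MediaType.APPLICATION_XML)
    String pingMe(@PathParam("echo") String echo);
}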
On my side, your code works, but I had to change the following things:
Remove the cxf:bus from the blueprint
Remove the @Path on the implementation class
Change the import section in the pom.xml
Implementation:
package com.mycompany.testcxf;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
public class MyRestServiceImpl implements MyRestService {

    @Override
    @GET
    @Path("/{echo}")
    @Produces(MediaType.APPLICATION_XML)
    public String pingMe(@PathParam("echo") String echo) {
        return "Knock Knock: " + echo + " !!";
    }
}
pom.xml:
<osgi.export.package>{local-packages}</osgi.export.package>
<osgi.import.package>
com.mycompany.testcxf*,
*
</osgi.import.package>
and I only import the dependency:
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-frontend-jaxrs</artifactId>
<version>2.7.13</version>
</dependency>
Note that I cannot test CXF version 3 as you do, because my OSGi container is bound to that version (2.7.13). However, this setup works perfectly for me, so it could be an issue with the CXF version and some incompatibility with the embedded HTTP server (Jetty?).
My setup is ServiceMix 5.1.4 with:
Version 2.3.9 of Apache Karaf
Version 2.13.3 of Camel
Version 2.7.13 of CXF

Spring Cloud Ribbon: Load balancer rule configuration doesn't work

Here is my application.properties:
spring.application.name=person
server.port=8080
eureka.client.service-url.defaultZone=http://localhost:8761/eureka
# this line of config doesn't work
person.ribbon.NFLoadBalancerRuleClassName=asdfasdfasdf
With person.ribbon.NFLoadBalancerRuleClassName set to asdfasdfasdf there should be errors shown in the console output, but there are none, which means this config isn't being applied. I cannot tell what's going on.
Here are the dependencies:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-eureka-server</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-hystrix</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-hystrix-dashboard</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-feign</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-ribbon</artifactId>
</dependency>
</dependencies>
The version of spring-cloud is Brixton.SR3.
I successfully configured Ribbon with the following configuration class:
@Configuration
@RibbonClient(name = "person", configuration = RibbonConfiguration.RibbonConfig.class)
public class RibbonConfiguration {

    static class RibbonConfig {

        @Bean
        public IRule rule() {
            return new WeightedResponseTimeRule();
        }
    }
}