spring cloud stream binder kafka doesn't work - apache-kafka

I'm trying to create a message service with Kafka using spring-cloud-stream-binder-kafka, but it isn't working.
Configuration:
Spring Boot 1.4.2
build.gradle:
compile "org.springframework.cloud:spring-cloud-stream:2.0.1.RELEASE"
compile "org.springframework.cloud:spring-cloud-stream-binder-kafka:2.0.1.RELEASE"
code:
@EnableBinding(MessagePublish.class)
class MessageConfiguration {
}

interface MessagePublish {
    @Output("test")
    MessageChannel publish();
}

class TestService {
    @Autowired
    MessagePublish messagePublish;

    public void doSomething() {
        // do something, then publish a test message
        messagePublish.publish().send(MessageBuilder.withPayload("test").build());
    }
}
It fails when I start the project, with this error log:
Caused by: org.springframework.boot.autoconfigure.condition.OnBeanCondition$BeanTypeDeductionException: Failed to deduce bean type for org.springframework.cloud.stream.config.BindingServiceConfiguration.bindingService
....
Caused by: java.lang.ClassNotFoundException: org.springframework.integration.support.converter.ConfigurableCompositeMessageConverter
I suspect my Spring Boot version, which is quite old. I think spring-cloud-stream-binder-kafka 2.x may not work below Spring Boot 2.0, or there may be another reason. I don't know what to do or how to investigate this situation. Any advice would be much appreciated.

If you are using a Spring Boot 1.4.x version, then you should use the Spring Cloud Camden release train.
https://github.com/spring-projects/spring-cloud/wiki/Spring-Cloud-Camden-Release-Notes
In particular, you should use the following versions:
compile "org.springframework.cloud:spring-cloud-stream:1.1.2.RELEASE"
compile "org.springframework.cloud:spring-cloud-stream-binder-kafka:1.1.2.RELEASE"

Related

Spring Boot Postgres: Failed to determine a suitable driver class

I am trying to develop a web application using Spring Boot and a Postgres database. However, on connecting to the application, I get the error "Failed to determine a suitable driver class".
As advised in older posts, I have tried a different version of the JDBC driver and also tried creating the bean for NamedParameterJdbcTemplate manually. I also validated that the libraries are present, are accessible from Java code, and are on the classpath, but it is still giving the same issue.
I am using Gradle to import all jars into the build path.
Here is the git repository for the code:
https://github.com/ashubisht/sample-sbs.git
Gradle dependency code:
apply plugin: 'idea'
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management'
dependencies {
compile("org.springframework.boot:spring-boot-starter-web")
compile("org.springframework.boot:spring-boot-starter-websocket")
compile("org.springframework.boot:spring-boot-starter-jdbc")
//compile("org.postgresql:postgresql")
compile("org.postgresql:postgresql:9.4-1206-jdbc42")
testCompile("org.springframework.boot:spring-boot-starter-test")
testCompile group: 'junit', name: 'junit', version: '4.12'
}
Code for building the bean:
@Configuration
@PropertySource("classpath:application.properties")
public class Datasource {

    @Value("${db.driverClassName}")
    private String driverClass;

    @Value("${db.url}")
    private String url;

    @Value("${db.username}")
    private String username;

    @Value("${db.password}")
    private String password;

    @Bean
    public NamedParameterJdbcTemplate namedParameterJdbcTemplate() throws Exception {
        System.out.println(driverClass + " " + url + " " + username + " " + password);
        DriverManagerDataSource source = new DriverManagerDataSource();
        source.setDriverClassName(driverClass);
        source.setUrl(url);
        source.setUsername(username);
        source.setPassword(password);
        return new NamedParameterJdbcTemplate(source);
    }
}
Here is application.properties
server.port=8086
#spring.datasource.driverClassName=org.postgresql.Driver
#spring.datasource.url= jdbc:postgresql://localhost:5432/testdb
#spring.datasource.username=postgres
#spring.datasource.password=password
#spring.datasource.platform=postgresql
#spring.jpa.hibernate.ddl-auto=create-drop
db.driverClassName=org.postgresql.Driver
db.url=jdbc:postgresql://localhost:5432/testdb
db.username=postgres
db.password=password
The issue was resolved by creating two separate beans: one for the DataSource and one for the NamedParameterJdbcTemplate.
@Bean
public DataSource dataSource() {
    System.out.println(driverClass + " " + url + " " + username + " " + password);
    DriverManagerDataSource source = new DriverManagerDataSource();
    source.setDriverClassName(driverClass);
    source.setUrl(url);
    source.setUsername(username);
    source.setPassword(password);
    return source;
}

@Bean
public NamedParameterJdbcTemplate namedParameterJdbcTemplate() {
    return new NamedParameterJdbcTemplate(this.dataSource());
}
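For completeness, a hypothetical usage sketch of the injected template; the users table, name column, and UserQueryRepository class below are made up purely to show named-parameter binding:
import java.util.Collections;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class UserQueryRepository {

    @Autowired
    private NamedParameterJdbcTemplate jdbcTemplate;

    // Counts rows in a hypothetical "users" table matching the given name
    public Integer countByName(String name) {
        Map<String, Object> params = Collections.singletonMap("name", name);
        return jdbcTemplate.queryForObject(
                "SELECT COUNT(*) FROM users WHERE name = :name", params, Integer.class);
    }
}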
For me the issue was a misspelling of postgresql (it has only one s). Replace
spring.datasource.url=jdbc:postgres://localhost:5432/databaseName
or
spring.datasource.url=jdbc:postgressql://localhost:5432/databaseName
with
spring.datasource.url=jdbc:postgresql://localhost:5432/databaseName
Also check the same thing in the Hibernate dialect: replace PostgresSQLDialect with PostgreSQLDialect.
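In properties form, the corrected lines would look something like the following (the spring.jpa key shown is just the common way to set the dialect; adjust to however your project configures Hibernate):
spring.datasource.url=jdbc:postgresql://localhost:5432/databaseName
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect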
I had the same problem.
The solution for me was to change the application.properties file into application.yml.
For me the error was:
Failed to configure a DataSource: 'url' attribute is not specified and no embedded datasource could be configured.
Reason: Failed to determine a suitable driver class
Action:
Consider the following:
If you want an embedded database (H2, HSQL or Derby), please put it on the classpath.
If you have database settings to be loaded from a particular profile you may need to activate it (no profiles are currently active).
The issue was a missing profile, so I added the following property and it worked:
spring.profiles.active=dev
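If the datasource settings live in a profile-specific file, the Spring Boot convention is a file named after the profile. A sketch assuming a dev profile (values taken from the question above):
# application-dev.properties, picked up when spring.profiles.active=dev
spring.datasource.url=jdbc:postgresql://localhost:5432/testdb
spring.datasource.username=postgres
spring.datasource.password=password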
Try the following R2DBC configuration:
spring.r2dbc.url=r2dbc:postgresql://ip:port/datafeed?currentSchema=user_management
spring.r2dbc.username=username
spring.r2dbc.password=12345
spring.r2dbc.driver=postgresql
Hope this helps!
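Note that these spring.r2dbc.* properties only take effect if an R2DBC driver is on the classpath. A hedged Gradle sketch of the dependencies typically needed (exact coordinates vary by Spring Boot version; newer versions use org.postgresql:r2dbc-postgresql instead of io.r2dbc):
dependencies {
    // Spring Data R2DBC starter plus the reactive Postgres driver
    compile "org.springframework.boot:spring-boot-starter-data-r2dbc"
    compile "io.r2dbc:r2dbc-postgresql"
}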
I got the same error. It can happen when you install STS version 3.
I found the solution by trial and error. The error occurred because my application.properties was not actually being picked up: I had changed the port in application.properties to 9090, but while running the application the console still showed the default port 8080.
So run a Maven clean and a Maven build of your Spring Boot application.
After that, run it normally as a Spring Boot application; the database will get connected and the application will start.

Spring Cloud Stream unable to detect message router

I'm trying to set up a simple Spring Cloud Stream Sink but keep running into the following error.
I've tried several binders and they all give the same error.
"SEVERE","logNameSource":"org.springframework.boot.diagnostics.LoggingFailureAnalysisReporter","message":"
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method binderAwareRouterBeanPostProcessor in org.springframework.cloud.stream.config.BindingServiceConfiguration required a bean of type '[Lorg.springframework.integration.router.AbstractMappingMessageRouter;' that could not be found.
Action:
Consider defining a bean of type '[Lorg.springframework.integration.router.AbstractMappingMessageRouter;' in your configuration.
I'm trying to use a simple Sink to log an incoming message from a Kafka topic:
@EnableBinding(Sink.class)
public class ReadEMPMesage {

    private static Logger logger = LoggerFactory.getLogger(ReadEMPMesage.class);

    public ReadEMPMesage() {
        System.out.println("In constructor");
    }

    @StreamListener(Sink.INPUT)
    public void loggerSink(String ccpEvent) {
        logger.info("Received" + ccpEvent);
    }
}
and my configuration is as follows
# Test consumer properties
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.group-id=testEmbeddedKafkaApplication
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
# Binding properties
spring.cloud.stream.bindings.output.destination=testEmbeddedOut
spring.cloud.stream.bindings.input.destination=testEmbeddedIn
spring.cloud.stream.bindings.output.producer.headerMode=raw
spring.cloud.stream.bindings.input.consumer.headerMode=raw
spring.cloud.stream.bindings.input.group=embeddedKafkaApplication
and my pom
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream</artifactId>
</dependency>
TL;DR: check your version of Spring Boot and try upgrading it a few minor revs.
I ran into this problem on a project after upgrading from Spring Cloud Dalston.RELEASE to Spring Cloud Edgware.SR4. It was strange because other projects worked fine, but this single one didn't.
After further investigation I realized that the troublemaker project was using Spring Boot 1.5.3.RELEASE while the others were using 1.5.9.RELEASE.
After upgrading Spring Boot to 1.5.9.RELEASE, things started working.
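A minimal sketch of how the Boot version can be pinned in the pom, assuming the project inherits from spring-boot-starter-parent (the version shown is the one the answer upgraded to):
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.5.9.RELEASE</version>
</parent>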

Spring STS - unable to resolve properties

This question is not related to Spring itself, but to the STS tool suite / Spring Eclipse IDE. Given the following class declaration:
@Configuration
@Import({ WebSharedConfig.class, SpringSecurityConfig.class })
@ComponentScan({ "com.finovera.web", "com.finovera.platformServices", "com.finovera.authentication" })
@PropertySources(value = {
    @PropertySource({ "${FINOVERA_PROPERTIES}" }),
    @PropertySource(value = { "${STATIC_OVERRIDE_PROPERTIES}" }, ignoreResourceNotFound = true) })
@Scope("singleton")
@EnableTransactionManagement
public class CabinetConfig extends WebMvcConfigurationSupport {
}
I am seeing the following exception in the STS plugin (org.springframework.ide.eclipse.beans.core):
org.springframework.beans.factory.BeanDefinitionStoreException: Failed to parse configuration class [com.finovera.web.config.CabinetConfig]; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'FINOVERA_PROPERTIES' in string value "${FINOVERA_PROPERTIES}"
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:181)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:321)
The error is simple: the properties file name is passed to the JVM when the application starts. The application's runtime code works fine, but STS does not, and a lot of functionality is missing because the scan of the main configuration class fails. If I comment out the @PropertySources annotation, everything works fine.
How do I pass the property value to STS, or convince it to ignore the @PropertySources annotation?
I think this is a limitation in the current implementation. Please file an enhancement request against: https://issuetracker.springsource.com/browse/STS and we can try to fix this for the next release of STS and Spring IDE.
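One possible workaround, sketched under the assumption that a fallback file such as default.properties exists on the classpath: Spring's placeholder syntax accepts a default value after a colon, so giving FINOVERA_PROPERTIES a fallback may let the configuration class parse even when the JVM system property is absent (whether the STS tooling honors this is a separate question):
@PropertySources(value = {
    // falls back to the hypothetical classpath:default.properties when FINOVERA_PROPERTIES is not set
    @PropertySource({ "${FINOVERA_PROPERTIES:classpath:default.properties}" }),
    @PropertySource(value = { "${STATIC_OVERRIDE_PROPERTIES:classpath:override.properties}" }, ignoreResourceNotFound = true) })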

Autowiring issues with a class in Spring data Mongo repository

For a variety of reasons, I ended up using Spring Boot 1.2.0.RC2.
A Spring Data Mongo application that worked fine on Spring Boot 1.1.8 is now having issues. No code was changed except for the bump to Spring Boot 1.2.0.RC2, which was required because the snapshot version of Spring Cloud moved to that Spring Boot version.
The repository class is as follows
@Repository
public interface OAuth2AccessTokenRepository extends MongoRepository<OAuth2AuthenticationAccessToken, String> {
    OAuth2AuthenticationAccessToken findByTokenId(String tokenId);
    OAuth2AuthenticationAccessToken findByRefreshToken(String refreshToken);
    OAuth2AuthenticationAccessToken findByAuthenticationId(String authenticationId);
    List<OAuth2AuthenticationAccessToken> findByClientIdAndUserName(String clientId, String userName);
    List<OAuth2AuthenticationAccessToken> findByClientId(String clientId);
}
This worked quite well before the version bump; now I see this in the log:
19:04:35.510 [main] DEBUG o.s.c.a.ClassPathBeanDefinitionScanner - Ignored because not a concrete top-level class: file [/Users/larrymitchell/rpilprojects/corerpilservicescomponents/channelMap/target/classes/com/cisco/services/rpil/mongo/repository/oauth2/OAuth2AccessTokenRepository.class]
I do have another Mongo repository that is recognized, but it was defined as a class implementation:
@Component
public class ChannelMapRepository { ... }
This one is recognized (I defined it as an implementation class as a workaround for another problem I had) and seems to work fine.
19:04:35.513 [main] DEBUG o.s.c.a.ClassPathBeanDefinitionScanner - Identified candidate component class: file [/Users/larrymitchell/rpilprojects/corerpilservicescomponents/channelMap/target/classes/com/cisco/services/rpil/services/Microservice.class]
Does anyone have an idea why? I looked up the various reasons why component scanning might not work, and none of them apply to my issue.
Try removing the @Repository annotation; that worked for me. This was reported as an issue on GitHub as well.
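Another thing worth trying, as a hedged sketch: enable repository scanning explicitly so the interface-based repository is picked up regardless of component scanning. The configuration class name below is hypothetical; the package comes from the log output above:
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(basePackages = "com.cisco.services.rpil.mongo.repository.oauth2")
public class MongoRepositoryConfig {
    // no beans needed; the annotation registers the repository interfaces
}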

Problem using Apache Camel with the Quartz Scheduler

I am new to Apache Camel and the Quartz scheduler. I am trying to use them together but have been unable to get it working. I found a short article in the "Camel in Action" book, but I couldn't get the program to run either. Here is my code:
package com.cockpitconfig.schedulars;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
public class TestScheduler {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.disableJMX();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                from("quartz://myTimer?trigger.repeatInterval=2000&trigger.repeatCount=-1")
                        .setBody().simple("Current time is ")
                        .to("stream:out");
            }
        });
        context.start();
        Thread.sleep(10000);
        context.stop();
    }
}
But it gives this error:
Exception in thread "main" java.lang.InstantiationError: org.quartz.SimpleTrigger
at org.apache.camel.component.quartz.QuartzComponent.createEndpoint(QuartzComponent.java:119)
at org.apache.camel.component.quartz.QuartzComponent.createEndpoint(QuartzComponent.java:54)
at org.apache.camel.impl.DefaultComponent.createEndpoint(DefaultComponent.java:75)
at org.apache.camel.impl.DefaultCamelContext.getEndpoint(DefaultCamelContext.java:419)
at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:47)
at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:189)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:110)
at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:116)
at org.apache.camel.model.FromDefinition.resolveEndpoint(FromDefinition.java:73)
at org.apache.camel.impl.DefaultRouteContext.getEndpoint(DefaultRouteContext.java:88)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:751)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:174)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:610)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:1514)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:1306)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:1213)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:65)
at org.apache.camel.impl.ServiceSupport.start(ServiceSupport.java:52)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:1191)
at com.cockpitconfig.schedulars.TestScheduler.main(TestScheduler.java:24)
Does anybody have a solution to this problem?
What version of Camel are you using? It worked fine for me using 2.8-SNAPSHOT. Just make sure you have the camel-core, camel-quartz and camel-stream dependencies on your classpath.
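In pom form, the dependencies mentioned above might look like the following sketch (2.8.0 is assumed here as a released version close to the 2.8-SNAPSHOT the answer refers to; camel-quartz at that version pulls in a compatible Quartz 1.x transitively):
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-core</artifactId>
    <version>2.8.0</version>
</dependency>
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-quartz</artifactId>
    <version>2.8.0</version>
</dependency>
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-stream</artifactId>
    <version>2.8.0</version>
</dependency>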
It's not obvious from the current camel-quartz plugin docs, but as of Camel 2.11.0 it is only compatible with Quartz 1.x and is NOT compatible with Quartz 2.x (which has been out for a couple of years now), because Quartz 2.x is not compatible with Spring 3.0 (and Camel has said it will maintain Spring 3.0 compatibility for now).
This is documented in this Camel JIRA issue.