Kafka - spring cloud stream - apache-kafka

I am trying to use spring-cloud-stream with Kafka; below is the sample code. But it does not seem to work: it always creates a topic called 'output', and the values are not published.
application.yaml
spring.cloud.stream:
  function:
    definition: streamSupplier
  bindings:
    streamSupplier-out-0:
      destination: numbers
My aim is to just produce values.
@SpringBootApplication
@EnableBinding(Source.class)
public class CloudStreamDemoApplication {

    private AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            System.out.println("Publishing : " + atomicInteger.incrementAndGet());
            return atomicInteger.get();
        };
    }
}
Dependencies (version 2.2.6.RELEASE):
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>

You need to remove @EnableBinding(Source.class) from the class. If that is present, the functional bindings will not take place.
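With that annotation removed, the rest of the class stays exactly as posted; a sketch of the corrected class:

@SpringBootApplication
public class CloudStreamDemoApplication {

    private final AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    // The binder now polls this Supplier (once per second by default) and
    // publishes each value to the 'numbers' destination via streamSupplier-out-0.
    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            int value = atomicInteger.incrementAndGet();
            System.out.println("Publishing : " + value);
            return value;
        };
    }
}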

The @EnableBinding annotation caused the issue, as explained above.
Read this excerpt from the Spring docs:
Unlike previous versions of spring-cloud-stream which relied on @EnableBinding and @StreamListener annotations, the above example looks no different than any vanilla spring-boot application. It defines a single bean of type Function and that is it. So, how does it become a spring-cloud-stream application? It becomes a spring-cloud-stream application simply based on the presence of spring-cloud-stream and binder dependencies and auto-configuration classes on the classpath, effectively setting the context for your boot application as a spring-cloud-stream application. And in this context beans of type Supplier, Function or Consumer are treated as de facto message handlers, triggering binding to destinations exposed by the provided binder following certain naming conventions and rules to avoid extra configuration.

Related

How to send trace ID through kafka

Microservice1 -> kafka -> Microservice2
How do I pass the trace ID when transferring data?
I'm using Spring Sleuth to create the trace ID, and my Kafka dependency is compile('org.springframework.kafka:spring-kafka:2.1.2.RELEASE').
Please read the docs https://cloud.spring.io/spring-cloud-static/Finchley.SR2/single/spring-cloud.html#_sleuth_with_zipkin_over_rabbitmq_or_kafka
48.3.3 Sleuth with Zipkin over RabbitMQ or Kafka

If you want to use RabbitMQ or Kafka instead of HTTP, add the spring-rabbit or spring-kafka dependency. The default destination name is zipkin.

If using Kafka, you must set the spring.zipkin.sender.type property accordingly:

spring.zipkin.sender.type: kafka

Caution: spring-cloud-sleuth-stream is deprecated and incompatible with these destinations.

If you want Sleuth over RabbitMQ, add the spring-cloud-starter-zipkin and spring-rabbit dependencies.

The following example shows how to do so for Maven:

<dependencyManagement> <!-- (1) -->
    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-dependencies</artifactId>
            <version>${release.train.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependency> <!-- (2) -->
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-zipkin</artifactId>
</dependency>
<dependency> <!-- (3) -->
    <groupId>org.springframework.amqp</groupId>
    <artifactId>spring-rabbit</artifactId>
</dependency>

(1) We recommend that you add the dependency management through the Spring BOM so that you need not manage versions yourself.
(2) Add the dependency to spring-cloud-starter-zipkin. That way, all nested dependencies get downloaded.
(3) To automatically configure RabbitMQ, add the spring-rabbit dependency.

And for Gradle:

dependencyManagement { // (1)
    imports {
        mavenBom "org.springframework.cloud:spring-cloud-dependencies:${releaseTrainVersion}"
    }
}

dependencies {
    compile "org.springframework.cloud:spring-cloud-starter-zipkin" // (2)
    compile "org.springframework.amqp:spring-rabbit" // (3)
}

(1) We recommend that you add the dependency management through the Spring BOM so that you need not manage versions yourself.
(2) Add the dependency to spring-cloud-starter-zipkin. That way, all nested dependencies get downloaded.
(3) To automatically configure RabbitMQ, add the spring-rabbit dependency.
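Applied to the Kafka case in the question, the analogous Gradle setup would be roughly this (a sketch; versions assumed to come from the Spring Cloud BOM):

dependencies {
    compile "org.springframework.cloud:spring-cloud-starter-zipkin"
    // spring-kafka instead of spring-rabbit, per the docs above
    compile "org.springframework.kafka:spring-kafka"
}

plus spring.zipkin.sender.type: kafka in your application properties, as the excerpt notes.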

Spring @DataNeo4jTest with Procedure Support

I'm writing Spring Data Neo4j repository tests with @DataNeo4jTest, and all is well until I write a test against a custom query that uses a procedure, for example apoc.coll.intersection. The error declares that procedure apoc.coll.intersection is unknown. I have the APOC JAR on the classpath, so am guessing I need to find a way to register the procedure with the embedded datasource/driver that @DataNeo4jTest uses.
Any help would be appreciated. Thanks.
Some background to understand the situation: the @DataNeo4jTest annotation provides you with the Spring Boot based auto-configuration. It will pick up your Neo4j connection configuration from your application.properties (either test or production, if no test properties are defined) and create Neo4j-OGM's SessionFactory with the matching configuration for you.
There are two ways to solve your problem:
Define the SessionFactory bean by yourself with embedded instance setup and configuration:
@Bean
public SessionFactory sessionFactory() {
    GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
            .newEmbeddedDatabaseBuilder(Paths.get("pathToDb").toFile()).newGraphDatabase();
    registerProcedure(graphDatabaseService, MyProcedure.class);
    EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
    return new SessionFactory(driver, "package");
}
Or during "runtime", with the already existing SessionFactory bean, e.g. in your test setup (make sure to do this just once):
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
registerProcedure(loadedDriver.getGraphDatabaseService(), MyProcedure.class);
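In a @DataNeo4jTest that could look roughly like the following sketch (JUnit 4; the test class and MyProcedure are hypothetical, and it reuses the registerProcedure helper shown below):

@RunWith(SpringRunner.class)
@DataNeo4jTest
public class MyRepositoryTest {

    // guard so the procedures are registered only once per test run
    private static boolean proceduresRegistered = false;

    @Autowired
    private SessionFactory sessionFactory;

    @Before
    public void registerProcedures() throws KernelException {
        if (proceduresRegistered) {
            return;
        }
        EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
        registerProcedure(loadedDriver.getGraphDatabaseService(), MyProcedure.class);
        proceduresRegistered = true;
    }
}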
Both approaches call a helper method like this:
public static void registerProcedure(GraphDatabaseService db, Class<?>... procedures) throws KernelException {
    Procedures proceduresService = ((GraphDatabaseAPI) db).getDependencyResolver().resolveDependency(Procedures.class);
    for (Class<?> procedure : procedures) {
        proceduresService.registerProcedure(procedure, true);
        proceduresService.registerFunction(procedure, true);
        proceduresService.registerAggregationFunction(procedure, true);
    }
}
Update: Added example and version definitions.
GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
        .newEmbeddedDatabaseBuilder(Paths.get("path/to/db").toFile()).newGraphDatabase();

// Option I
registerProcedure(graphDatabaseService, MyProcedure.class);
EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
SessionFactory sessionFactory = new SessionFactory(driver, "org.neo4j.ogmindex.domain");

// Option II: if the embedded driver is not directly accessible anymore
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
// register the apoc version function
registerProcedure(loadedDriver.getGraphDatabaseService(), Version.class);

// Test call to apoc.version
Session session = sessionFactory.openSession();
session.query("RETURN apoc.version()", emptyMap())
        .forEach(System.out::println); // outputs {apoc.version()=3.4.0.2}
pom.xml definition for the example above:
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness-enterprise</artifactId>
    <version>3.4.6</version>
</dependency>
<dependency>
    <groupId>org.neo4j.procedure</groupId>
    <artifactId>apoc</artifactId>
    <version>3.4.0.2</version>
</dependency>
After messing around with this for quite some time, trying different dependency versions and also playing with configuration code as suggested by @meistermeier, I found a solution, which was simply to use the correct version of the two Neo4j test JARs I was referencing. This is a Spring Boot project, so here are all the Neo4j dependencies in my Maven POM that solve the issue:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-neo4j</artifactId>
</dependency>
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j-ogm-embedded-driver</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
I set neo4j.ogm.version to match the version specified in the spring-data-neo4j-parent POM (which is brought in transitively).
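In POM terms that is just a property; the value below is a placeholder, to be replaced with whatever version spring-data-neo4j-parent declares:

<properties>
    <!-- placeholder: use the neo4j-ogm version from spring-data-neo4j-parent -->
    <neo4j.ogm.version>3.1.0</neo4j.ogm.version>
</properties>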

Spring Cloud - hystrix-dashboard is not working?

This is the Spring Cloud Hystrix circuit breaker pattern example. I have added the dependencies below, following https://howtodoinjava.com/spring/spring-cloud/spring-hystrix-circuit-breaker-tutorial/. The Spring Boot starter parent version is 1.5.13.BUILD-SNAPSHOT.
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-hystrix</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-hystrix-dashboard</artifactId>
</dependency>
When I open http://localhost:9098/hystrix, nothing comes up.
Could you please advise how to fix it?
Here is the code:
@SpringBootApplication
@EnableHystrixDashboard
@EnableCircuitBreaker
public class SpringHystrixSchoolServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringHystrixSchoolServiceApplication.class, args);
    }
}
Since you are using Spring Boot 1.x (1.5.13 here), you need to use http://localhost:9098/hystrix.html.
From Spring Boot 2.x on, the Hystrix dashboard URL moved to http://localhost:9098/hystrix.
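Note also that the dashboard only has something to chart once a Hystrix-wrapped call is actually executing; a minimal sketch of such a command (the service and downstream URL here are hypothetical):

@Service
public class SchoolService {

    // Any method wrapped in @HystrixCommand is recorded in the Hystrix metrics
    // stream; the fallback runs when the call fails or the circuit is open.
    @HystrixCommand(fallbackMethod = "getSchoolInfoFallback")
    public String getSchoolInfo() {
        return new RestTemplate()
                .getForObject("http://localhost:8098/getSchoolDetails", String.class);
    }

    public String getSchoolInfoFallback() {
        return "fallback: school service unavailable";
    }
}

Then point the dashboard at the app's metrics stream (http://localhost:9098/hystrix.stream on Boot 1.x; under /actuator on 2.x) to see the command.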

CQ5 QueryBuilder Reference in Sling Servlet

I am declaring a sling servlet like so
@Component(metatype = false)
@Service(Servlet.class)
@Properties({
    @Property(name = "sling.servlet.paths", value = "/bin/foo/bar"),
    @Property(name = "sling.servlet.methods", value = "POST") })
public class FooBarServlet extends SlingAllMethodsServlet {
    ...
}
I override doPost like so
@Override
protected void doPost(SlingHttpServletRequest request, SlingHttpServletResponse response) throws IOException {
    ...
}
And I am able to post from a client. Great!
I throw in the following
@Reference
private QueryBuilder queryBuilder;
As per the documentation, a reference to the QueryBuilder should be injected, but it does not seem to be. In the log I see this error:
bindQueryBuilder cannot be found (java.lang.VerifyError: ...
And when I try to post to the servlet I get this
javax.jcr.RepositoryException: org.apache.sling.api.resource.PersistenceException: Resource at '/bin/foo/bar' is not modifiable.
And in the OSGi console I see my bundle is installed, and this is what it has to say about my servlet
Service ID: 3075
Types: javax.servlet.Servlet
Service PID: com.myproject.FooBarServlet
Component Name: com.myproject.FooBarServlet
Component ID: 5526
Vendor: Adobe
Any suggestions as to what I am doing wrong?
I had been using this tutorial as a reference.
I came across some information about the Felix Service Component Runtime (SCR), and so I implemented the following:
protected void activate(ComponentContext context) {
    LOGGER.info("activating {}", this.getClass().getName());
}

protected void unbindQueryBuilder(QueryBuilder queryBuilder) {
    this.queryBuilder = null;
}

protected void bindQueryBuilder(QueryBuilder queryBuilder) {
    this.queryBuilder = queryBuilder;
}
and it worked! So upon closer investigation I learned that these bind/unbind methods are actually supposed to be generated by the maven-scr-plugin, of which I have version 1.6.0
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-scr-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>generate-scr-scrdescriptor</id>
            <goals>
                <goal>scr</goal>
            </goals>
            <configuration>
                <!-- Private service properties for all services. -->
                <properties>
                    <service.vendor>Adobe</service.vendor>
                </properties>
            </configuration>
        </execution>
    </executions>
    <dependencies>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.5.2</version>
        </dependency>
    </dependencies>
</plugin>
and for the annotations I have 1.4.0
<dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.scr.annotations</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
</dependency>
so although I am not sure why the bind/unbind methods are not getting generated, I know that they should be, so I generate them manually.
Update
I tried to update the maven-scr-plugin to version 1.20.0, which yielded the following error during the Maven build:
[ERROR] Project depends on org.apache.felix:org.apache.felix.scr.annotations:jar:1.4.0:provided
[ERROR] Minimum required version is 1.9.0
so... I updated org.apache.felix.scr.annotations to 1.9.0, and it works! My bind/unbind accessors are generated and all is great. However, I am concerned and do not know whether I should use version 1.9.0 of org.apache.felix.scr.annotations, because I am marking it as provided in the Maven dependency, and when I look at the OSGi bundles installed on the CQ instance I see the following:
Apache Felix Declarative Services (org.apache.felix.scr) : Version 1.6.3.R1409029
For the dependency injection to work, you should declare the member variable as public.
Try changing it to
@Reference
public QueryBuilder queryBuilder;

How to: SLF4J with multiple bindings in a GWT Maven multi-module project?

I have a GWT multi-module Maven project.
Layout:
pom.xml
client-module // client side: contains some base classes. I use the JDK's java.util.logging (JUL) so that it logs to Firebug's console
server-module // server side: extends some code from client-module (so it uses JUL) for common behaviour. I use log4j
client-module is shared code.
I want to use SLF4J globally, but it seems that I cannot use multiple bindings: that would mean multiple bindings in one project, while SLF4J allows only a single binding per classpath (i.e. per Maven module).
I would agree, but I want to use a specific SLF4J binding per Maven module.
So in my case that would be slf4j-jdk14 for client-module and slf4j-log4j12 for server-module.
Pieces of relevant code:
pom.xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.6.3</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-jdk14</artifactId>
    <version>1.6.3</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.3</version>
</dependency>
client-module/BaseFoo.java
public abstract class BaseFoo {
    private static final Logger logger = LoggerFactory.getLogger(BaseFoo.class.getName());
    // ... interesting stuff
}
server-module/Foo.java
public class Foo extends BaseFoo {
    private static final Logger logger = LoggerFactory.getLogger(Foo.class.getName());
    // ... more interesting stuff
}
I have already tried to configure it according to the documentation, but it doesn't work.
The problem is that SLF4J needs to be GWT-ready.
The error I get:
[ERROR] Line 23: No source code is available for type org.slf4j.Logger; did you forget to inherit a required module?
What this means is that GWT tries to compile the BaseFoo class to JavaScript (which is why it is in client-module, to run on the client side) but fails because the classes/source code in SLF4J are not emulated.
Do you have any workarounds?
Thanks!