spring boot - springfox-boot-starter integration giving 404 with no error - spring-data-jpa

Java Version: 19.0.1
Spring Boot: 3.0.1
Using this dependency:
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-boot-starter</artifactId>
    <version>3.0.0</version>
</dependency>
Swagger configuration class:
@Configuration
public class SwaggerConfiguration {

    @Bean
    public Docket api() {
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.any())
                .paths(PathSelectors.any())
                .build();
    }
}
There is no error and the application starts normally, but visiting
http://localhost:8060/swagger-ui returns a 404.
If I add @EnableSwagger2, the application fails at startup with an error saying the dependencies are not available.
I followed swagger-2-documentation-for-spring-rest-api; it seems the @EnableSwagger2 annotation is not required.

springfox-boot-starter 3.0.0 does not support Spring Boot 3. Consider switching to SpringDoc, whose 2.x line does support it.
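If you do switch, a minimal Maven sketch for a Spring Boot 3 / Spring MVC application could look like the following (the artifact and version are my assumption of the current springdoc 2.x starter; check the springdoc documentation for the latest release):
<dependency>
    <!-- springdoc 2.x starter for Spring MVC; the version here is only a placeholder -->
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-starter-webmvc-ui</artifactId>
    <version>2.0.2</version>
</dependency>
With that starter on the classpath, the UI is normally served at /swagger-ui/index.html and the OpenAPI JSON at /v3/api-docs, without any extra configuration class.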


Kafka - spring cloud stream

I am trying to use spring-cloud-stream with Kafka; below is the sample code. It does not seem to do anything: it always creates a topic called 'output', but the values are not published.
application.yaml:
spring.cloud.stream:
  function:
    definition: streamSupplier
  bindings:
    streamSupplier-out-0:
      destination: numbers
My aim is to just produce values.
@SpringBootApplication
@EnableBinding(Source.class)
public class CloudStreamDemoApplication {

    private AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            System.out.println("Publishing : " + atomicInteger.incrementAndGet());
            return atomicInteger.get();
        };
    }
}
Dependencies (2.2.6.RELEASE):
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
You need to remove @EnableBinding(Source.class) from the class. If that annotation is present, the functional bindings will not take place.
The @EnableBinding annotation caused the issue, as explained above.
Read this excerpt from the Spring docs:
Unlike previous versions of spring-cloud-stream, which relied on the @EnableBinding and @StreamListener annotations, the above example looks no different than any vanilla spring-boot application. It defines a single bean of type Function, and that is it. So, how does it become a spring-cloud-stream application? It becomes a spring-cloud-stream application simply based on the presence of spring-cloud-stream and binder dependencies and auto-configuration classes on the classpath, effectively setting the context for your boot application as a spring-cloud-stream application. In this context, beans of type Supplier, Function or Consumer are treated as de facto message handlers, triggering binding to destinations exposed by the provided binder following certain naming conventions and rules to avoid extra configuration.
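For reference, here is a sketch of the application class from the question with only the annotation removed (imports added for completeness; the spring.cloud.stream.poller.fixed-delay property mentioned in the comment is an assumption about how to tune the default one-second polling interval, not something from the question):
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class CloudStreamDemoApplication {

    private final AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    // Bound to the "numbers" destination through the streamSupplier-out-0 binding
    // from application.yaml; the framework polls the supplier periodically
    // (the interval can be tuned, e.g. via spring.cloud.stream.poller.fixed-delay).
    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            System.out.println("Publishing : " + atomicInteger.incrementAndGet());
            return atomicInteger.get();
        };
    }
}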

Spring #DataNeo4jTest with Procedure Support

I'm writing Spring Data Neo4j repository tests with @DataNeo4jTest and all is well until I write a test against a custom query that uses a procedure, for example apoc.coll.intersection. The error says procedure apoc.coll.intersection is unknown. I have the APOC JAR on the classpath, so I am guessing I need to find a way to register the procedure with the embedded datasource/driver that @DataNeo4jTest uses.
Any help would be appreciated. Thanks.
Some background to understand the situation: the @DataNeo4jTest annotation provides you with the Spring Boot based auto-configuration. It will pick up the Neo4j connection configuration from your application.properties (either test or production if no test properties are defined) and create Neo4j-OGM's SessionFactory with the matching configuration for you.
There are two ways to solve your problem:
Define the SessionFactory bean yourself, with an embedded instance setup and configuration:
@Bean
public SessionFactory sessionFactory() throws KernelException {
    GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
            .newEmbeddedDatabaseBuilder(Paths.get("pathToDb").toFile()).newGraphDatabase();
    registerProcedure(graphDatabaseService, MyProcedure.class);
    EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
    return new SessionFactory(driver, "package");
}
Or register the procedures at "runtime" with the already existing SessionFactory bean, e.g. in your test setup (make sure to do this just once):
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
registerProcedure(loadedDriver.getGraphDatabaseService(), MyProcedure.class);
Both approaches call a method like this:
public static void registerProcedure(GraphDatabaseService db, Class<?>... procedures) throws KernelException {
    Procedures proceduresService = ((GraphDatabaseAPI) db).getDependencyResolver().resolveDependency(Procedures.class);
    for (Class<?> procedure : procedures) {
        proceduresService.registerProcedure(procedure, true);
        proceduresService.registerFunction(procedure, true);
        proceduresService.registerAggregationFunction(procedure, true);
    }
}
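As a rough sketch of the second approach inside a test class (the test class name, the JUnit 4 setup and the apoc.coll.Coll class are my assumptions; registerProcedure is the helper shown above, assumed to be statically imported or copied into the test):
import org.junit.Before;
import org.junit.runner.RunWith;
import org.neo4j.ogm.drivers.embedded.driver.EmbeddedDriver;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@DataNeo4jTest
public class IntersectionRepositoryTest {

    private static boolean proceduresRegistered;

    @Autowired
    private SessionFactory sessionFactory;

    @Before
    public void registerApocProcedures() throws Exception {
        if (proceduresRegistered) {
            return; // register only once for the whole test run
        }
        EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
        // apoc.coll.Coll is assumed to be the APOC class declaring apoc.coll.intersection
        registerProcedure(loadedDriver.getGraphDatabaseService(), apoc.coll.Coll.class);
        proceduresRegistered = true;
    }

    // ... repository tests that use the custom query go here
}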
Update: Added example and version definitions.
GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
        .newEmbeddedDatabaseBuilder(Paths.get("path/to/db").toFile()).newGraphDatabase();

// Option I
registerProcedure(graphDatabaseService, MyProcedure.class);
EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
SessionFactory sessionFactory = new SessionFactory(driver, "org.neo4j.ogmindex.domain");

// Option II, if the embedded driver is not directly accessible anymore
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();

// register the apoc version function
registerProcedure(loadedDriver.getGraphDatabaseService(), Version.class);

// Test call to apoc.version
Session session = sessionFactory.openSession();
session.query("RETURN apoc.version()", emptyMap())
        .forEach(System.out::println); // outputs {apoc.version()=3.4.0.2}
pom.xml definition for the example above:
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness-enterprise</artifactId>
    <version>3.4.6</version>
</dependency>
<dependency>
    <groupId>org.neo4j.procedure</groupId>
    <artifactId>apoc</artifactId>
    <version>3.4.0.2</version>
</dependency>
After messing around with this for quite some time, trying different dependency versions and also playing with the configuration code suggested by @meistermeier, I found a solution, which was simply to use the correct versions of the two Neo4j test JARs I was referencing. This is a Spring Boot project, so here are all the Neo4j dependencies in my Maven POM that solve the issue:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-neo4j</artifactId>
</dependency>
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j-ogm-embedded-driver</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
I set neo4j.ogm.version to match the version specified in the spring-data-neo4j-parent POM (which is brought in transitively).
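For completeness, that property sits in the POM's <properties> section; the value below is only a placeholder and should be copied from the spring-data-neo4j-parent POM that your Spring Boot version pulls in:
<properties>
    <!-- placeholder value: use the OGM version declared by spring-data-neo4j-parent -->
    <neo4j.ogm.version>3.1.0</neo4j.ogm.version>
</properties>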

Spring Cloud - hystrix-dashboard is not working?

This is a Spring Cloud Hystrix circuit breaker pattern example. I have added the dependencies below, following https://howtodoinjava.com/spring/spring-cloud/spring-hystrix-circuit-breaker-tutorial/. The Spring Boot starter parent version is 1.5.13.BUILD-SNAPSHOT.
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-hystrix</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-hystrix-dashboard</artifactId>
</dependency>
When I open http://localhost:9098/hystrix, nothing comes up.
Could you please guide me on how to fix it?
Here is the code:
@SpringBootApplication
@EnableHystrixDashboard
@EnableCircuitBreaker
public class SpringHystrixSchoolServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringHystrixSchoolServiceApplication.class, args);
    }
}
Since you are using a Spring Boot 1.5.x parent, you need to use http://localhost:9098/hystrix.html.
From Spring Boot 2.x the Hystrix dashboard URL moved to http://localhost:9098/hystrix.
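Also note that the dashboard only shows data once it is pointed at a Hystrix metrics stream (for a Boot 1.5 application typically http://localhost:9098/hystrix.stream) and at least one Hystrix-wrapped call has been executed. A minimal hedged sketch of such a call, with made-up class, method and URL names:
import com.netflix.hystrix.contrib.javanica.annotation.HystrixCommand;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class SchoolService {

    private final RestTemplate restTemplate = new RestTemplate();

    // Wrapped in a Hystrix command; failures and timeouts trigger the fallback
    // and feed the metrics the dashboard displays.
    @HystrixCommand(fallbackMethod = "getSchoolInfoFallback")
    public String getSchoolInfo(String schoolName) {
        // hypothetical downstream URL, only for illustration
        return restTemplate.getForObject("http://localhost:8098/getSchoolDetails/" + schoolName, String.class);
    }

    public String getSchoolInfoFallback(String schoolName) {
        return "School details are not available right now, please try again later";
    }
}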

WSWS4104E: SOAP 1.2 protocol not supported by SAAJ 1.2

I have a JAXWS client in a standalone application that is throwing:
Caused by: java.lang.UnsupportedOperationException: WSWS4104E: SOAP 1.2 Protocol is not supported by SAAJ 1.2.
at com.ibm.ws.webservices.engine.xmlsoap.SOAPFactory.setSOAPConstants(SOAPFactory.java:143)
at com.ibm.ws.webservices.engine.xmlsoap.SOAPFactory.<init>(SOAPFactory.java:111)
at com.ibm.ws.webservices.engine.soap.SAAJMetaFactoryImpl.newSOAPFactory(SAAJMetaFactoryImpl.java:68)
at javax.xml.soap.SOAPFactory.newInstance(SOAPFactory.java:297)
at com.sun.xml.internal.ws.api.SOAPVersion.<init>(SOAPVersion.java:176)
at com.sun.xml.internal.ws.api.SOAPVersion.<clinit>(SOAPVersion.java:94)
I have added the following JAR, com.ibm.jaxws.thinclient_8.0.0.jar, but it still throws the same error.
Also tried adding these dependencies:
<dependency>
    <groupId>com.sun.xml.messaging.saaj</groupId>
    <artifactId>saaj-impl</artifactId>
    <version>1.3.25</version>
</dependency>
<dependency>
    <groupId>javax.xml.soap</groupId>
    <artifactId>saaj-api</artifactId>
    <version>1.3.5</version>
</dependency>
I have tried running under both Oracle's JDK 1.8 and IBM's JDK 1.7.
This is driving me crazy; any idea why it doesn't work?
After struggling with this I finally understood what was going on:
When using SOAP 1.2, the thin client tries to determine whether SAAJ 1.3 is available, in com.ibm.ws.webservices.engine.xmlsoap.Utils:
private static final boolean isSAAJ13Available = discoverSAAJ13Availability();
discoverSAAJ13Availability() ends up trying to load com.ibm.ws.webservices.engine.xmlsoap.saaj13only.SOAPDynamicConstants, which isn't on the classpath, and finally raises the exception.
To solve it you also have to add the JAR that contains that class: com.ibm.jaxws.thinclient_8.0.0.jar.
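If you want to verify up front whether that class is visible to your client, a quick throwaway check (the class name is taken from the analysis above; the checker class itself is just an illustration):
public class Saaj13AvailabilityCheck {

    public static void main(String[] args) {
        try {
            // The class the thin client probes for when deciding whether SAAJ 1.3 is available
            Class.forName("com.ibm.ws.webservices.engine.xmlsoap.saaj13only.SOAPDynamicConstants");
            System.out.println("SAAJ 1.3 support class is on the classpath - SOAP 1.2 should work");
        } catch (ClassNotFoundException e) {
            System.out.println("SAAJ 1.3 support class is missing - expect the WSWS4104E error");
        }
    }
}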

CQ5 QueryBuilder Reference in Sling Servlet

I am declaring a sling servlet like so
@Component(metatype = false)
@Service(Servlet.class)
@Properties({
    @Property(name = "sling.servlet.paths", value = "/bin/foo/bar"),
    @Property(name = "sling.servlet.methods", value = "POST") })
public class FooBarServlet extends SlingAllMethodsServlet {
    ...
}
I override doPost like so
@Override
protected void doPost(SlingHttpServletRequest request, SlingHttpServletResponse response) throws IOException {
    ...
}
And I am able to post from a client. Great!
I throw in the following
@Reference
private QueryBuilder queryBuilder;
As per the documentation, a reference to QueryBuilder should be injected, but it does not seem to be. In the log I see this error:
bindQueryBuilder cannot be found (java.lang.VerifyError: ...
And when I try to post to the servlet I get this
javax.jcr.RepositoryException: org.apache.sling.api.resource.PersistenceException: Resource at '/bin/foo/bar' is not modifiable.
And in the OSGi console I see my bundle is installed, and this is what it has to say about my servlet
Service ID: 3075
Types: javax.servlet.Servlet
Service PID: com.myproject.FooBarServlet
Component Name: com.myproject.FooBarServlet
Component ID: 5526
Vendor: Adobe
Any suggestions as to what I am doing wrong?
I had been using this tutorial as a reference.
I came across this about the Felix Service Component Runtime (SCR), and so I implemented the following:
protected void activate(ComponentContext context) {
    LOGGER.info("activating {}", this.getClass().getName());
}

protected void unbindQueryBuilder(QueryBuilder queryBuilder) {
    this.queryBuilder = null;
}

protected void bindQueryBuilder(QueryBuilder queryBuilder) {
    this.queryBuilder = queryBuilder;
}
And it worked! Upon closer investigation I learned that these bind/unbind methods are actually supposed to be generated by the maven-scr-plugin, of which I have version 1.6.0:
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-scr-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>generate-scr-scrdescriptor</id>
            <goals>
                <goal>scr</goal>
            </goals>
            <configuration>
                <!-- Private service properties for all services. -->
                <properties>
                    <service.vendor>Adobe</service.vendor>
                </properties>
            </configuration>
        </execution>
    </executions>
    <dependencies>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.5.2</version>
        </dependency>
    </dependencies>
</plugin>
and for the annotations I have 1.4.0
<dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.scr.annotations</artifactId>
    <version>1.4.0</version>
    <scope>provided</scope>
</dependency>
So although I am not sure why the bind/unbind methods are not getting generated, I know that they should be, so I wrote them manually.
Update
I tried to update the maven-scr-plugin to version 1.20.0, which yielded the following error during mvn build
[ERROR] Project depends on org.apache.felix:org.apache.felix.scr.annotations:jar:1.4.0:provided
[ERROR] Minimum required version is 1.9.0
So I updated org.apache.felix.scr.annotations to 1.9.0, and it works! My bind/unbind accessors are generated and all is great. However, I am not sure whether I should use version 1.9.0 of org.apache.felix.scr.annotations, because I mark it as provided in the Maven dependency, and when I look at the OSGi bundles installed on the CQ instance I see the following:
Apache Felix Declarative Services (org.apache.felix.scr) : Version 1.6.3.R1409029
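For reference, the working combination described in the update above would look roughly like this in the POM (only the version numbers change; everything else stays as shown earlier):
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-scr-plugin</artifactId>
    <version>1.20.0</version>
    <!-- executions and configuration unchanged from the snippet above -->
</plugin>

<dependency>
    <groupId>org.apache.felix</groupId>
    <artifactId>org.apache.felix.scr.annotations</artifactId>
    <version>1.9.0</version>
    <scope>provided</scope>
</dependency>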
For the dependency injection to work, you should declare the member variable as public.
Try changing it to:
@Reference
public QueryBuilder queryBuilder;