Spring @DataNeo4jTest with Procedure Support - spring-data

I'm writing Spring Data Neo4j repository tests with @DataNeo4jTest and all is well until I write a test against a custom query that uses a procedure, for example apoc.coll.intersection. The error says the procedure apoc.coll.intersection is unknown. I have the APOC JAR on the classpath, so I am guessing I need to find a way to register the procedure with the embedded datasource/driver that @DataNeo4jTest uses.
Any help would be appreciated. Thanks.

Some background to understand the situation: the @DataNeo4jTest annotation gives you Spring Boot's auto-configuration. It picks up the Neo4j connection configuration from your application.properties (the test properties, or the production ones if no test properties are defined) and creates Neo4j-OGM's SessionFactory with the matching configuration for you.
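For context, a test of this kind typically looks something like the sketch below (JUnit 5 and the injected Neo4j-OGM Session are my assumptions, not taken from the question):

import static java.util.Collections.emptyMap;

import org.junit.jupiter.api.Test;
import org.neo4j.ogm.session.Session;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest;

// Sketch of a @DataNeo4jTest slice test: the slice boots the embedded driver
// and builds the Neo4j-OGM SessionFactory/Session from your (test) properties.
@DataNeo4jTest
class ProcedureQueryTest {

    @Autowired
    private Session session;

    @Test
    void queryUsingAnApocFunction() {
        // This is the kind of query that fails with "unknown procedure/function"
        // until APOC is registered with the embedded database (see the two options below).
        session.query("RETURN apoc.coll.intersection([1,2,3], [2,3,4]) AS common", emptyMap())
               .forEach(System.out::println);
    }
}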
There are two ways to solve your problem:
Define the SessionFactory bean yourself, setting up and configuring the embedded instance on your own:
@Bean
public SessionFactory sessionFactory() throws KernelException {
    GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
            .newEmbeddedDatabaseBuilder(Paths.get("pathToDb").toFile())
            .newGraphDatabase();
    registerProcedure(graphDatabaseService, MyProcedure.class);
    EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
    return new SessionFactory(driver, "package");
}
Or do it at "runtime" with the already existing SessionFactory bean, e.g. in your test setup (make sure to do this just once):
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
registerProcedure(loadedDriver.getGraphDatabaseService(), MyProcedure.class);
Both will call a method like this:
public static void registerProcedure(GraphDatabaseService db, Class<?>... procedures) throws KernelException {
    Procedures proceduresService = ((GraphDatabaseAPI) db).getDependencyResolver().resolveDependency(Procedures.class);
    for (Class<?> procedure : procedures) {
        proceduresService.registerProcedure(procedure, true);
        proceduresService.registerFunction(procedure, true);
        proceduresService.registerAggregationFunction(procedure, true);
    }
}
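Wired into a test, option two could look roughly like the sketch below. This is only an illustration: JUnit 5, the once-only guard and the apoc.coll.Coll class name are my assumptions, so double-check which class ships apoc.coll.* in your APOC version.

import org.junit.jupiter.api.BeforeEach;
import org.neo4j.ogm.drivers.embedded.driver.EmbeddedDriver;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.neo4j.DataNeo4jTest;

@DataNeo4jTest
class ApocBackedQueryTest {

    // Only register once per test JVM; the embedded database is shared.
    private static boolean proceduresRegistered;

    @Autowired
    private SessionFactory sessionFactory;

    @BeforeEach
    void registerApocOnce() throws Exception {
        if (!proceduresRegistered) {
            EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();
            // registerProcedure(...) is the static helper shown above,
            // assumed to be copied into this class or a shared test utility.
            registerProcedure(loadedDriver.getGraphDatabaseService(), apoc.coll.Coll.class);
            proceduresRegistered = true;
        }
    }

    // ... tests whose custom queries use apoc.coll.intersection go here
}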
Update: Added example and version definitions.
GraphDatabaseService graphDatabaseService = new GraphDatabaseFactory()
        .newEmbeddedDatabaseBuilder(Paths.get("path/to/db").toFile())
        .newGraphDatabase();

// Option I
registerProcedure(graphDatabaseService, MyProcedure.class);
EmbeddedDriver driver = new EmbeddedDriver(graphDatabaseService);
SessionFactory sessionFactory = new SessionFactory(driver, "org.neo4j.ogmindex.domain");

// Option II if embedded driver is not directly accessible anymore
EmbeddedDriver loadedDriver = (EmbeddedDriver) sessionFactory.getDriver();

// register the apoc version function
registerProcedure(loadedDriver.getGraphDatabaseService(), Version.class);

// Test call to apoc.version
Session session = sessionFactory.openSession();
session.query("RETURN apoc.version()", emptyMap())
        .forEach(System.out::println); // outputs {apoc.version()=3.4.0.2}
pom.xml definition for the example above:
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness-enterprise</artifactId>
    <version>3.4.6</version>
</dependency>
<dependency>
    <groupId>org.neo4j.procedure</groupId>
    <artifactId>apoc</artifactId>
    <version>3.4.0.2</version>
</dependency>

After messing around with this for quite some time, trying different dependency versions and also playing with configuration code as suggested by @meistermeier, I found a solution, which was simply to use the correct versions of the two Neo4j test JARs I was referencing. This is a Spring Boot project, so here are all the Neo4j dependencies in my Maven POM that solve the issue:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-neo4j</artifactId>
</dependency>
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j-ogm-embedded-driver</artifactId>
    <version>${neo4j.ogm.version}</version>
    <scope>test</scope>
</dependency>
I set neo4j.ogm.version to match the version specified in the spring-data-neo4j-parent POM (which is brought in transitively).
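For completeness, the kind of custom repository query this makes testable is sketched below; the entity, property names and the Cypher itself are hypothetical and not taken from the question:

import java.util.List;

import org.springframework.data.neo4j.annotation.Query;
import org.springframework.data.neo4j.repository.Neo4jRepository;
import org.springframework.data.repository.query.Param;

// Hypothetical repository whose custom query relies on an APOC function.
public interface ThingRepository extends Neo4jRepository<Thing, Long> {

    // Uses apoc.coll.intersection to keep only nodes whose tags overlap the input.
    @Query("MATCH (t:Thing) " +
           "WITH t, apoc.coll.intersection(t.tags, $tags) AS common " +
           "WHERE size(common) > 0 " +
           "RETURN t")
    List<Thing> findByOverlappingTags(@Param("tags") List<String> tags);
}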

Related

java.lang.NoClassDefFoundError: javax/xml/transform/TransformerConfigurationException

I am using a dummy HTML string and trying to create a PDF from it.
As soon as it tries to create the ITextRenderer object, I get "java.lang.NoClassDefFoundError: javax/xml/transform/TransformerConfigurationException".
Document doc = Jsoup.parse("<html><head><title>Pdf Generation..!</title></head><body><p>Pdf generated using flying saucer pdf openpdf!!!!</p></body></html>", "UTF-8");
doc.outputSettings().syntax(Document.OutputSettings.Syntax.xml);
try (OutputStream os = new FileOutputStream("output.pdf")) {
    ITextRenderer renderer = new ITextRenderer();
    SharedContext cntxt = renderer.getSharedContext();
    cntxt.setPrint(true);
    cntxt.setInteractive(false);
    renderer.setDocumentFromString(doc.html(), "");
    renderer.layout();
    renderer.createPDF(os);
    logger.info("PDF Generation using OpenPDF Done Successfully!!!");
} catch (Exception ex) {
    ex.printStackTrace();
}
This is a Maven archetype project and the dependencies used for this are:
<dependency>
    <groupId>org.jsoup</groupId>
    <artifactId>jsoup</artifactId>
    <version>1.14.3</version>
</dependency>
<dependency>
    <groupId>org.xhtmlrenderer</groupId>
    <artifactId>flying-saucer-pdf-openpdf</artifactId>
    <version>9.1.20</version>
</dependency>
<dependency>
    <groupId>org.xhtmlrenderer</groupId>
    <artifactId>flying-saucer-core</artifactId>
    <version>9.1.20</version>
</dependency>
I have looked through some of the shared suggestions, but none of them resolved this.
Check the Import-Package directive in the BND Maven plugin. Either import javax.xml.transform explicitly or import everything (*)
javax.xml.transform.* is a provided API which is present in many bundle definitions. The correct Import-Package should include something like this:
javax.xml.transform,version=2.1.0 from org.apache.felix.framework (0)

Error while building a Hibernate app without Maven

I am learning Hibernate, so I made a small app for CRUD operations using MySQL as the database. However, I am getting some errors and I cannot find the solution anywhere. I am on SDK 17.0.2, I am not using Maven, and all Hibernate Final JAR files have been added.
My class:
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

import com.luv2code.hibernate.demo.entity.Student;

public class CreateStudentDemo {

    public static void main(String[] args) {

        // Create session factory
        SessionFactory factory = new Configuration()
                .configure("hibernate.cfg.xml")
                .addAnnotatedClass(Student.class)
                .buildSessionFactory();

        // Create session
        Session session = factory.getCurrentSession();

        try {
            // Create a Student object
            System.out.println("Creating new student");
            Student tempStudent = new Student("Pau;", "Wall", "paul@luv2code.com");

            // Start a transaction
            session.beginTransaction();

            // Save the student object
            session.save(tempStudent);

            // Commit transaction
            session.getTransaction().commit();
        } finally {
            factory.close();
        }
    }
}
Runtime error:
java.security.PrivilegedActionException: java.lang.NoSuchMethodException: sun.misc.Unsafe.defineClass(java.lang.String,[B,int,int,java.lang.ClassLoader,java.security.ProtectionDomain)
Caused by: java.lang.NoSuchMethodException: sun.misc.Unsafe.defineClass(java.lang.String,[B,int,int,java.lang.ClassLoader,java.security.ProtectionDomain)
Exception in thread "main" java.lang.NullPointerException: Cannot invoke "java.lang.reflect.Method.invoke(Object, Object[])" because "com.sun.xml.bind.v2.runtime.reflect.opt.Injector.defineClass" is null
Java removed java.xml.bind from Java 9 and higher versions.
You need to add these JAR files manually to your library.
Basically, you can do that in the following way.
If you use IntelliJ, this is the easiest way:
jaxb-impl-2.3.0.jar
<dependency>
    <groupId>com.sun.xml.bind</groupId>
    <artifactId>jaxb-impl</artifactId>
    <version>2.3.0</version>
</dependency>
jaxb-core-2.3.0.jar
<dependency>
    <groupId>com.sun.xml.bind</groupId>
    <artifactId>jaxb-core</artifactId>
    <version>2.3.0</version>
</dependency>
jaxb-api-2.3.1.jar
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.1</version>
</dependency>
javax.activation-api-1.2.0.jar
<dependency>
    <groupId>com.sun.activation</groupId>
    <artifactId>javax.activation</artifactId>
    <version>1.2.0</version>
</dependency>
If you use IntelliJ, here is a step-by-step way to get these files through the IDE:
Go to Project Structure.
Select Libraries and click the + sign.
Choose "From Maven...".
Paste these dependencies into the search; it will automatically find the exact JAR files to download. Just paste the coordinates, press Enter, and that is it.
You can also add these files to the classpath later, just in case.
Here is the Eclipse solution:
In the Eclipse IDE, download these JAR files and paste them inside the lib folder, then open the project properties and add the JARs to the build path.
Happy coding! :)

Kafka - spring cloud stream

I am trying to use spring-cloud-stream with Kafka. Below is the sample code, but it does not seem to do anything. It always creates a topic called 'output', but the values are not published.
application.yaml
spring.cloud.stream:
  function:
    definition: streamSupplier
  bindings:
    streamSupplier-out-0:
      destination: numbers
My aim is to just produce values.
@SpringBootApplication
@EnableBinding(Source.class)
public class CloudStreamDemoApplication {

    private AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            System.out.println("Publishing : " + atomicInteger.incrementAndGet());
            return atomicInteger.get();
        };
    }
}
Dependencies - 2.2.6.RELEASE:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
You need to remove @EnableBinding(Source.class) from the class. If that is present, the functional bindings will not take place.
The @EnableBinding annotation caused the issue, as explained above.
Read the excerpt below from the Spring docs:
Unlike previous versions of spring-cloud-stream which relied on #EnableBinding and #StreamListener annotations, the above example looks no different then any vanilla spring-boot application. It defines a single bean of type Function and that it is. So, how does it became spring-cloud-stream application? It becomes spring-cloud-stream application simply based on the presence of spring-cloud-stream and binder dependencies and auto-configuration classes on the classpath effectively setting the context for your boot application as spring-cloud-stream application. And in this context beans of type Supplier, Function or Consumer are treated as defacto message handlers triggering binding of to destinations exposed by the provided binder following certain naming conventions and rules to avoid extra configuration.
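In other words, keeping everything else from the question the same, the application reduces to the sketch below (imports added by me; otherwise it is the question's code with only @EnableBinding removed):

import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class CloudStreamDemoApplication {

    private final AtomicInteger atomicInteger = new AtomicInteger();

    public static void main(String[] args) {
        SpringApplication.run(CloudStreamDemoApplication.class, args);
    }

    // With no @EnableBinding present, this Supplier is bound to
    // streamSupplier-out-0 and polled periodically, publishing to the
    // 'numbers' destination configured in application.yaml.
    @Bean
    public Supplier<Integer> streamSupplier() {
        return () -> {
            System.out.println("Publishing : " + atomicInteger.incrementAndGet());
            return atomicInteger.get();
        };
    }
}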

How to run Neo4j with OSM and Neo4jSpatial?

Hello, I'm new to Neo4j and I would like to use OSM + Neo4j Spatial.
I have a Maven project and my Neo4j version is 2.3.0-M01.
I have some simple code just for importing an OSM file, but it shows errors on the imports: GraphDatabaseService, EmbeddedGraphDatabase and BatchInserter.
package testOSM;

import java.nio.charset.Charset;

import org.neo4j.gis.spatial.osm.OSMImporter;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.kernel.EmbeddedGraphDatabase;
import org.neo4j.kernel.impl.batchinsert.BatchInserter;

public class TestOsm {

    private static final String DB_PATH = "/community/data/graph.db";

    public static void main(final String[] args) {
        OSMImporter importer = new OSMImporter("clz_map.osm");
        importer.setCharset(Charset.forName("UTF-8"));

        BatchInserter batchInserter = BatchInserter.inserter(DB_PATH);
        try {
            importer.importFile(batchInserter, "clz_map.osm", false);
            GraphDatabaseService db = new EmbeddedGraphDatabase(DB_PATH);
            importer.reIndex(db);
            db.shutdown();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
        batchInserter.shutdown();
    }
}
Maybe my problem is with the versions, because I'm using Neo4j 2.3.0-M01, but I don't know exactly how I should set the versions, e.g. here:
<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j-graph-collections</artifactId>
    <version>0.7.1-neo4j-2.0.2-SNAPSHOT</version>
    <type>jar</type>
</dependency>
My pom.xml is based on https://github.com/neo4j-contrib/spatial/blob/master/pom.xml
Plus:
<repository>
    <id>neo4j</id>
    <url>http://m2.neo4j.org/content/repositories/releases/</url>
    <releases>
        <enabled>true</enabled>
    </releases>
</repository>

<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j</artifactId>
    <version>2.3.0-M01</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j-kernel</artifactId>
    <version>2.3.0-M01</version>
</dependency>
You can have a look at my Git repository: https://github.com/amhg/OSM
Thank you in advance!
For anyone else that runs into this, here is how to do it: https://github.com/maxdemarzi/OSM
Notice there are small differences between 2.2.x and 2.3 (as of 7/30, currently on M2).
It just needed the right dependencies.
There are API changes since the last released version. Looking at https://github.com/neo4j-contrib/spatial/blob/master/pom.xml#L4, it seems you can use Neo4j 2.2.3 if you build that project yourself with
mvn install
and then include version 0.15-neo4j-2.2.3 of the spatial plugin in your pom.xml from the local Maven repo.
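Regarding those API changes: the two imports that no longer resolve were moved or replaced in the 2.x line. Below is a rough sketch of the replacement wiring; the package locations are from memory for 2.2.x and the store path is hypothetical, so treat it as a starting point only, not the Spatial project's documented API.

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;   // replaces org.neo4j.kernel.EmbeddedGraphDatabase
import org.neo4j.unsafe.batchinsert.BatchInserter;        // moved out of org.neo4j.kernel.impl.batchinsert
import org.neo4j.unsafe.batchinsert.BatchInserters;

public class TestOsmSketch {

    public static void main(String[] args) throws Exception {
        String dbPath = "target/osm-graph.db";            // hypothetical path

        // Batch-insert phase (pass this inserter to OSMImporter.importFile as before).
        BatchInserter batchInserter = BatchInserters.inserter(dbPath);
        // ... importer.importFile(batchInserter, "clz_map.osm", false);
        batchInserter.shutdown();

        // Re-index phase against a normally opened embedded database.
        GraphDatabaseService db = new GraphDatabaseFactory().newEmbeddedDatabase(dbPath);
        // ... importer.reIndex(db);
        db.shutdown();
    }
}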
I took a look at your pom.xml and it looks like you copied the pom.xml from Neo4j Spatial. This is not what you want.
Since you are trying to write a new application that uses Neo4j Spatial, you should have a pom that is new and refers to neo4j-spatial as a dependency, not a pom that is in any way similar to the neo4j-spatial pom. There is a section in the README that describes how to add neo4j-spatial as a dependency to your own pom.
So I would suggest you do the following:
create a new pom.xml - https://maven.apache.org/guides/getting-started/index.html#How_do_I_make_my_first_Maven_project
then add the dependency as described in https://github.com/neo4j-contrib/spatial#using-neo4j-spatial-in-your-java-project-with-maven

Solr 4.0 testing EmbeddedSolrServer using Eclipse

I am unable to test the EmbeddedSolrServer and I run into the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/lucene/codecs/PostingFormat
    at org.apache.solr.core.SolrResourceLoader.reloadLuceneSPI(SolrResourceLoader.java:179)
Code -
System.setProperty("solr.solr.home", "c:/apps/solr4/example/solr");
CoreContainer.Initializer initializer = new CoreContainer.Initializer();
CoreContainer coreContainer = initializer.initialize();
EmbeddedSolrServer server = new EmbeddedSolrServer(coreContainer, "");
I believe I have all JARs on the classpath and the solr.solr.home setting is also updated. Please advise.
I solved the problem by adding the lucene-core dependency in test scope:
<dependency>
    <groupId>org.apache.lucene</groupId>
    <artifactId>lucene-core</artifactId>
    <version>4.3.0</version>
    <scope>test</scope>
</dependency>