Connecting Spring Batch to Remote Cassandra Database - spring-batch

I'm hoping to read data from a CSV file, process the data, and upload it to a remote database. I'm using Spring's starter repo as a base. Code is here
I tried putting this information in the properties file:
spring.data.cassandra.contact-points=IP, IP
spring.data.cassandra.port=PORT
spring.data.cassandra.keyspace-name=KEYSPACE_NAME
spring.data.cassandra.username=USER
spring.data.cassandra.password=PASS
spring.data.cassandra.ssl=TRUE
However, I think it keeps defaulting to pushing to some local Tomcat JDBC datasource. I'm not really sure where to start. Any help is appreciated! Thanks.

Your code doesn't have anything that uses Cassandra. It doesn't have any of the required dependencies, and the ItemWriter implementation is a JdbcBatchItemWriter, which I don't think will work for Cassandra. You need to configure your application to actually use Cassandra: add the Spring Data Cassandra starter as well as an ItemWriter implementation that can write to Cassandra.
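As a rough illustration (not the asker's code), a writer based on Spring Data Cassandra could look like the sketch below. It assumes the spring-boot-starter-data-cassandra dependency is added, a mapped Person entity exists, and the Spring Batch 4 ItemWriter signature is in use:

import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.data.cassandra.core.CassandraOperations;

// Hypothetical replacement for the JdbcBatchItemWriter: writes each chunk of
// processed items to the keyspace configured via the spring.data.cassandra.* properties.
public class CassandraPersonItemWriter implements ItemWriter<Person> {

    private final CassandraOperations cassandraOperations;

    public CassandraPersonItemWriter(CassandraOperations cassandraOperations) {
        this.cassandraOperations = cassandraOperations;
    }

    @Override
    public void write(List<? extends Person> items) throws Exception {
        // CassandraOperations is auto-configured by Spring Boot once the
        // Cassandra starter is on the classpath and the properties above are set.
        items.forEach(cassandraOperations::insert);
    }
}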

Spring batch integration using OutBoundGateway and ReplyingKafkaTemplate

My Goal
I need to read a file, turn each line into a message, and send it to Kafka from a Spring Batch project; another Spring Integration project will receive the messages and process them asynchronously. I want to return those messages to the batch project after processing and create 4 different files out of them.
Here I am trying to use an outbound gateway and ReplyingKafkaTemplate, but I am unable to configure it properly. Is there any example or reference guide for configuring this?
I have checked the Spring Batch integration samples GitHub repository; there is no sample for an outbound gateway or ReplyingKafkaTemplate.
Thanks in advance.
For the ReplyingKafkaTemplate logic, Spring Integration provides a dedicated KafkaProducerMessageHandler which can be configured with a ReplyingKafkaTemplate.
See more info in docs:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-outbound-gateway
And more about ReplyingKafkaTemplate:
https://docs.spring.io/spring-kafka/reference/html/#replying-template
On the other side, a KafkaInboundGateway probably has to be configured, respectively:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-inbound-gateway
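As a starting point, here is a hedged sketch of the outbound-gateway side using the Java DSL. It assumes Spring Integration 5.x with the spring-integration-kafka dependency on the classpath; the channel names, topic name and bean wiring are placeholders, not a complete solution:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

@Configuration
public class KafkaGatewayConfig {

    // The producer factory and the reply listener container (listening on the
    // reply topic) are assumed to be defined elsewhere in the application.
    @Bean
    public ReplyingKafkaTemplate<String, String, String> replyingKafkaTemplate(
            ProducerFactory<String, String> producerFactory,
            ConcurrentMessageListenerContainer<String, String> repliesContainer) {
        return new ReplyingKafkaTemplate<>(producerFactory, repliesContainer);
    }

    // Messages written to the "toKafka" channel by the batch job are sent to the
    // request topic; the correlated replies are routed to the "fromKafka" channel,
    // where the batch project can collect them and write the output files.
    @Bean
    public IntegrationFlow outboundGatewayFlow(
            ReplyingKafkaTemplate<String, String, String> template) {
        return IntegrationFlows.from("toKafka")
                .handle(Kafka.outboundGateway(template)
                        .topic("requests"))
                .channel("fromKafka")
                .get();
    }
}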

Apache NiFi - Move table content from Oracle to Mongo DB

I am very new to Apache NiFi. I am trying to migrate data from Oracle to MongoDB as per the screenshot in Apache NiFi. I am failing with the reported error. Please help.
Up to PutFile I think it is working fine, as I can see the JSON-format file below in my local directory.
This is a simple setup going directly from an Oracle database to MongoDB, without SSL or username and password (not recommended for production).
Just keep tinkering with the PutMongoRecord processor until you resolve all outstanding issues and the exclamation mark is cleared.
I first use an ExecuteSQL processor, which returns the dataset in Avro, but I need the final data in JSON. In the database connection pooling controller service, you need to create a controller with the credentials of your Oracle database. After that I use SplitAvro and then TransformXML to convert the data into JSON; in TransformXML you need to supply an XSLT file. Finally, I use the PutMongo processor to ingest the JSON, which gets automatically converted to BSON.

Getting an EntityManagerFactory without persistence.xml

I am developing a web app which connects to many similar databases.
The target databases are set by the end user in an administration GUI.
They run on different database engines.
I use JPA and EclipseLink 2.6.4 to query these databases.
Currently I have no other choice than writing a persistence.xml file on the fly and using it to create an EntityManagerFactory.
pros.setProperty(PersistenceUnitProperties.ECLIPSELINK_PERSISTENCE_XML, persistenceFilesPath + "/" + persistenceFileName);
Persistence.createEntityManagerFactory(envCode, pros);
I would like to bypass this step and get an EntityManagerFactory directly, without writing a persistence.xml file.
I've read some things about PersistenceUnitInfo and createContainerEntityManagerFactory but nothing really concrete.
I'm looking for ideas to reach my goal. I hope you will have some.
Thanks
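For context, the kind of approach I've been reading about looks roughly like the following untested sketch: implement PersistenceUnitInfo by hand and pass it to EclipseLink's PersistenceProvider.createContainerEntityManagerFactory together with the per-database JDBC properties (class names and property values below are placeholders).

import java.net.URL;
import java.util.Collections;
import java.util.List;
import java.util.Properties;

import javax.persistence.EntityManagerFactory;
import javax.persistence.SharedCacheMode;
import javax.persistence.ValidationMode;
import javax.persistence.spi.ClassTransformer;
import javax.persistence.spi.PersistenceUnitInfo;
import javax.persistence.spi.PersistenceUnitTransactionType;
import javax.sql.DataSource;

import org.eclipse.persistence.jpa.PersistenceProvider;

public class DynamicEmfFactory {

    // entityClassNames: fully qualified names of the mapped entities;
    // jdbcProps: javax.persistence.jdbc.url / user / password / driver taken
    // from whatever the user configured in the administration GUI.
    public static EntityManagerFactory create(final String unitName,
            final List<String> entityClassNames, Properties jdbcProps) {

        PersistenceUnitInfo info = new PersistenceUnitInfo() {
            public String getPersistenceUnitName() { return unitName; }
            public String getPersistenceProviderClassName() { return PersistenceProvider.class.getName(); }
            public PersistenceUnitTransactionType getTransactionType() { return PersistenceUnitTransactionType.RESOURCE_LOCAL; }
            public DataSource getJtaDataSource() { return null; }
            public DataSource getNonJtaDataSource() { return null; }
            public List<String> getMappingFileNames() { return Collections.emptyList(); }
            public List<URL> getJarFileUrls() { return Collections.emptyList(); }
            public URL getPersistenceUnitRootUrl() { return null; }
            public List<String> getManagedClassNames() { return entityClassNames; }
            public boolean excludeUnlistedClasses() { return true; }
            public SharedCacheMode getSharedCacheMode() { return SharedCacheMode.UNSPECIFIED; }
            public ValidationMode getValidationMode() { return ValidationMode.NONE; }
            public Properties getProperties() { return new Properties(); }
            public String getPersistenceXMLSchemaVersion() { return "2.1"; }
            public ClassLoader getClassLoader() { return Thread.currentThread().getContextClassLoader(); }
            public void addTransformer(ClassTransformer transformer) { }
            public ClassLoader getNewTempClassLoader() { return getClassLoader(); }
        };

        // No persistence.xml involved: the provider is called directly, the same
        // way a container would call it.
        return new PersistenceProvider().createContainerEntityManagerFactory(info, jdbcProps);
    }
}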

sftp channel outbound adapter with retry

I am using Spring Batch and Spring Integration where, once my batch job is completed, it creates text files that need to be uploaded to some FTP server. Sometimes we notice that those connections drop and the upload needs to be retried. Is there any way we can use the Spring Retry project to retry a few seconds later and see if it can upload those files? We want it to be configurable.
If so, is there any example out there?
Thanks
Yes, Spring Integration provides a retry component for you. It is called RequestHandlerRetryAdvice:
<int-sftp:outbound-channel-adapter>
    <int-sftp:request-handler-advice-chain>
        <bean class="org.springframework.integration.handler.advice.RequestHandlerRetryAdvice" />
    </int-sftp:request-handler-advice-chain>
</int-sftp:outbound-channel-adapter>
Please, find more info in the Reference Manual.
Consider using RequestHandlerCircuitBreakerAdvice as well for your "connection drops" cases.
And here is the sample.
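To keep the retry behaviour configurable (number of attempts, delay between them), the advice can be given its own RetryTemplate. A hedged Java-config sketch, assuming Spring Retry is on the classpath and the SFTP adapter's advice chain references this bean; the values are placeholders:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.handler.advice.RequestHandlerRetryAdvice;
import org.springframework.retry.backoff.FixedBackOffPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

@Configuration
public class SftpRetryConfig {

    @Bean
    public RequestHandlerRetryAdvice retryAdvice() {
        RetryTemplate retryTemplate = new RetryTemplate();

        // Retry the SFTP upload up to 3 times before giving up.
        retryTemplate.setRetryPolicy(new SimpleRetryPolicy(3));

        // Wait 5 seconds between attempts; both values could be externalized
        // to properties to keep them configurable.
        FixedBackOffPolicy backOff = new FixedBackOffPolicy();
        backOff.setBackOffPeriod(5000);
        retryTemplate.setBackOffPolicy(backOff);

        RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
        advice.setRetryTemplate(retryTemplate);
        return advice;
    }
}

The request-handler-advice-chain in the XML above would then reference this retryAdvice bean instead of the nested bean definition.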

WSO2 Data Services Server - Adding a new Database support

How do I add support for a new database (MongoDB) in version 2.6.3 of WSO2 Data Services Server?
You can use DSS (2.6.3) with any database type as long as the database connectivity is exposed via JDBC. In other words, if your preferred database type provides a JDBC driver/adapter for users to connect to it, then you can use DSS to expose the data stored in your data store as a web service. Similarly, if MongoDB has a JDBC adapter, you wouldn't have any (or too many :) ) issues integrating it with DSS. However, there are some exceptions when it comes to exposing flat files such as Google spreadsheets, Excel sheets and CSV files, as DSS uses the relevant client APIs (the Google gdata client API, Apache POI, etc.) to connect to those datasources and extract data. In the general case, though, you need an adapter or a similar mechanism to connect to your datasource via JDBC.
But in the upcoming version of DSS (v3.0.0), it is planned to introduce custom datasource support, so you can easily write an adapter for any datasource and use it with DSS.
Regards,
Prabath
I am not sure about this, but I suppose that if it is not supported by default you can always download the JAR library for MongoDB, put it in CARBON_HOME\repository\components\lib and restart. For example, for MySQL I have mysql-connector-java-5.1.7-bin.jar in that folder.
Hope this helps.