Database export to separate JSON files with Spring Batch - spring-batch

I want to export every entity from the database to a separate JSON file. Is that possible with Spring Batch?
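This is possible with a chunk-oriented step: read the entities with a standard reader (for example JpaPagingItemReader or JdbcCursorItemReader) and write each item to its own file from a custom ItemWriter. Below is a minimal sketch of such a writer; the Customer entity, its id field, and the export directory are illustrative assumptions, not part of any real schema.

import com.fasterxml.jackson.databind.ObjectMapper;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import org.springframework.batch.item.ItemWriter;

/* Hypothetical entity, shown only so the sketch is self-contained. */
class Customer {
    private Long id;
    private String name;
    public Long getId() { return id; }
    public String getName() { return name; }
}

/* Writes every entity handed to the step into its own JSON file. */
class JsonFilePerEntityWriter implements ItemWriter<Customer> {
    private final ObjectMapper mapper = new ObjectMapper();

    @Override /* Spring Batch 4.x signature; Spring Batch 5 passes a Chunk instead of a List */
    public void write(List<? extends Customer> items) throws Exception {
        Files.createDirectories(Path.of("export")); /* assumed output directory */
        for (Customer c : items) {
            /* one file per entity, named by primary key */
            Files.writeString(Path.of("export", "customer-" + c.getId() + ".json"),
                    mapper.writeValueAsString(c));
        }
    }
}

Wire the writer into an ordinary chunk-oriented step; the reader side needs nothing special, since any cursor or paging reader over the entity table will do.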

Related

How to send data from an S3 bucket to an FTP server with the help of Apache NiFi?

I have file names and file paths in SQL tables; there can be multiple files per row, and the files are stored in an S3 bucket. I want to send via FTP all the files whose names appear in the SQL rows. How can we achieve this with Apache NiFi?
I tried GetFile and ListS3 but was not able to reach a conclusion.
You need the components below to build your data ingestion pipeline:
Processors:
QueryDatabaseTableRecord: queries the database and returns the result set
SplitText: splits the query result line by line
ExtractText: pulls the FlowFile content into an attribute
UpdateAttribute: derives the S3 object key, bucket, FTP filename, and other attributes
FetchS3Object: retrieves the contents of an S3 Object and writes it to the content of a FlowFile
PutFTP: sends FlowFiles to an FTP Server
Controller Services:
DBCPConnectionPool: configure database instance connection, required by QueryDatabaseTableRecord
CSVRecordSetWriter: required by QueryDatabaseTableRecord to write the result set as CSV
AWSCredentialsProviderControllerService: configure AWS credentials, required for FetchS3Object
Flow Design:
Derive the S3 and FTP attributes in one go in UpdateAttribute
QueryDatabaseTableRecord -> SplitText -> ExtractText -> UpdateAttribute -> FetchS3Object -> PutFTP
All the processors are self-explanatory, so refer to the official documentation for property configuration; a sketch of the key settings follows.
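As an illustration of the ExtractText/UpdateAttribute step, here is one way the properties could be set. This is a hedged sketch: the attribute names, the bucket name, and the regex are assumptions, not required values.

ExtractText (dynamic property):
  s3.key = (?s)^(.*)$    (captures the whole split line into the s3.key attribute)
UpdateAttribute (dynamic properties):
  s3.bucket = my-bucket    (hypothetical bucket name)
  filename  = ${s3.key:substringAfterLast('/')}    (FTP filename derived from the object key)
FetchS3Object:
  Bucket     = ${s3.bucket}
  Object Key = ${s3.key}

PutFTP then picks up the filename attribute as the remote file name by default.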

Apache NiFi - Move table content from Oracle to MongoDB

I am very new to Apache NiFi. I am trying to migrate data from Oracle to MongoDB in Apache NiFi, as per the screenshot. I am failing with the reported error. Please help.
Up to PutFile I think it is working fine, as I can see the JSON-format file below in my local directory.
A simple setup goes directly from the Oracle database to MongoDB without SSL or a username and password (not recommended for production).
Just keep tinkering with the PutMongoRecord processor until you resolve all outstanding issues and the exclamation mark is cleared.
I first use an ExecuteSQL processor, which returns the result set in Avro; I need the final data in JSON. For the DBCPConnectionPool service, you need to create a controller with the credentials of your Oracle database. After that I use SplitAvro and then TransformXML to convert the data into JSON; in TransformXML you need to supply an XSLT file. Finally I use the PutMongo processor to ingest the JSON, which gets automatically converted to BSON. (A simplified alternative flow is sketched below.)
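If your NiFi version has it, the ConvertAvroToJSON processor can replace the XSLT detour; this is a swapped-in alternative, not what the answer above uses, and the flow below is only a sketch:

ExecuteSQL -> SplitAvro -> ConvertAvroToJSON -> PutMongo

DBCPConnectionPool still holds the Oracle credentials, and PutMongo's URI and collection point at the target database (all names are placeholders).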

OpenEdge read csv file with REST web service

Hi guys (and girls)...
I'm wondering if it is possible to read in a CSV file using a RESTful web service in Progress OpenEdge, and whether it can unzip the CSV file.
Thank you.
You could read the file into a JSON object and pass it to the REST service.
On the server side you'd receive the JSON as a LONGCHAR, parse it back into a JSON object, extract the file from it, write it to disk somewhere, and then unpack it.
You can add a CLOB to your REST service to pass in a file:
/* temp-table with a CLOB field to carry the uploaded file */
DEFINE TEMP-TABLE ttFile NO-UNDO
    FIELD FileData AS CLOB.
In the service, save the CLOB to a disk file:
/* write the CLOB out to a file on disk */
COPY-LOB FROM OBJECT ttFile.FileData TO FILE "datafile.csv".
To unzip it, install the WinZip command-line utilities; Wzunzip will unzip the file for you (a one-line sketch follows the link below).
http://kb.winzip.com/kb/entry/125/
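For the unzip step itself, a one-line ABL sketch (the file names and target folder are hypothetical, and wzunzip is assumed to be on the PATH):

/* shell out to WinZip's command-line extractor; the second argument is the target folder */
OS-COMMAND SILENT VALUE("wzunzip datafile.zip c:\temp\").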

Create/Update JIRA request based on results stored in CSV

I have a Jenkins job that runs the Selenium tests. The results are stored in a CSV file and then fed to Cassandra. My requirement is to create a JIRA request if a test fails, either by analyzing the CSV file or from Cassandra. Please suggest possible approaches.
Use the JIRA REST API together with a CSV reader, or the Cassandra API; a hedged sketch follows the link below.
https://docs.atlassian.com/jira/REST/latest/
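As a sketch of the CSV route: read the results file, and for every failed row POST to JIRA's documented issue-creation endpoint (POST /rest/api/2/issue). The CSV layout, project key, host, and credentials below are placeholders.

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class FailedTestsToJira {
    public static void main(String[] args) throws IOException, InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        String auth = Base64.getEncoder()
                .encodeToString("user:api-token".getBytes(StandardCharsets.UTF_8)); // placeholder credentials

        // assumed CSV layout: testName,status
        for (String line : Files.readAllLines(Path.of("results.csv"))) {
            String[] cols = line.split(",");
            if (cols.length < 2 || !"FAILED".equalsIgnoreCase(cols[1].trim())) {
                continue; // only failed tests become JIRA issues
            }
            String body = String.format(
                    "{\"fields\": {\"project\": {\"key\": \"QA\"},"
                    + " \"summary\": \"Selenium test failed: %s\","
                    + " \"issuetype\": {\"name\": \"Bug\"}}}", cols[0].trim());

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://jira.example.com/rest/api/2/issue")) // placeholder host
                    .header("Authorization", "Basic " + auth)
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("JIRA responded with status " + response.statusCode());
        }
    }
}

The Cassandra route is the same idea, with the CSV read replaced by a query against the results table.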

Configuring Spring Batch jobs via database instead of XML

I am new to Spring Batch and need a few clarifications regarding Spring Batch Admin.
Can I keep the job configuration information in a database instead of uploading XML-based configuration?