What is the fastest way to insert data using Spring Boot JDBC? (PostgreSQL)

I have a Spring Boot application where I batch insert hundreds of thousands of rows across two tables: I insert one row into the first table, return its id, and use it to insert one row into the second table. As it stands, it's simply not fast enough. I would like to know the fastest way to do this. Here is my code:
private void insertData(DataList request, long personId, String tableName) {
    try {
        String sql = """
                WITH inserted AS
                (INSERT INTO user_%s (username, password, email, phone, person_id) VALUES (?,?,?,?,?) RETURNING user_id)
                INSERT INTO info_%s (user_id, info, full_text_search) VALUES ((SELECT user_id FROM inserted), ?, to_tsvector(?));
                """.formatted(tableName, tableName);
        jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
            @Override
            public void setValues(@Nonnull PreparedStatement ps, int i) throws SQLException {
                Data data = request.getData(i);
                ps.setString(1, data.getUsername());
                ps.setString(2, data.getPassword());
                ps.setString(3, data.getEmail());
                ps.setString(4, data.getPhone());
                ps.setLong(5, personId);
                ps.setString(6, data.getInfo());
                ps.setString(7, data.getInfo());
            }

            @Override
            public int getBatchSize() {
                return request.getDataCount();
            }
        });
    } catch (Exception e) {
        log.error(e.getMessage(), e);
    }
}
I'm using PostgreSQL as my database.
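No answer is recorded here, but one common lever (my suggestion, not from the question) is to cap the size of each JDBC batch rather than submitting the whole request as a single batch, and, with PostgreSQL, to pair that with the driver's `reWriteBatchedInserts=true` connection property. A minimal, framework-free chunking helper might look like this:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchChunker {
    // Splits the full row list into fixed-size chunks so that each
    // jdbcTemplate.batchUpdate(...) call stays a manageable size.
    public static <T> List<List<T>> chunk(List<T> rows, int batchSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int start = 0; start < rows.size(); start += batchSize) {
            int end = Math.min(start + batchSize, rows.size());
            chunks.add(rows.subList(start, end));
        }
        return chunks;
    }
}
```

Each chunk would then go to its own `jdbcTemplate.batchUpdate` call; a batch size in the low thousands is a typical starting point, but the sweet spot depends on row width and network latency, so measure.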

Related

Why is my update so slow with JdbcBatchItemWriter?

This is the Step
@Bean
public Step processEodBatchUpdateActualTableStep() {
    log.debug("[processEodBatchJob] Start Update Process for Actual Table");
    return stepBuilderFactory.get(JobConfigurationConstants.PROCESS_EOD_FILE_UPDATE_STEP_NAME)
            .<ExtensionQRMerchantTrxHistEntity, TransactionHistoryExtEntity>chunk(1000)
            .reader(updateItemReader())
            .processor(new ExtensionToTrxnHistExtConverter(mapper))
            .writer(new UpdateActualTable(dataSource).updateActualTable())
            .build();
}
This is the reader
@Bean
public JdbcCursorItemReader<ExtensionQRMerchantTrxHistEntity> updateItemReader() {
    log.info("[UPDATE Reader] Read all records from temp table");
    JdbcCursorItemReader<ExtensionQRMerchantTrxHistEntity> reader = new JdbcCursorItemReader<>();
    reader.setSql("SELECT * FROM ext_qr_merchant_trx_hist eqmth " +
            "WHERE EXISTS " +
            "(SELECT 1 FROM t_trxn_detail_ext ttde WHERE eqmth.trx_ref_no = ttde.ref_no AND eqmth.trx_amt = ttde.amount " +
            "AND eqmth.trx_dt = ttde.trxn_date);");
    reader.setDataSource(dataSource);
    reader.setFetchSize(10);
    reader.setRowMapper(new RowMapper<ExtensionQRMerchantTrxHistEntity>() {
        @Override
        public ExtensionQRMerchantTrxHistEntity mapRow(@NonNull ResultSet rs, int rowNum) throws SQLException {
            ExtensionQRMerchantTrxHistEntity entity = new ExtensionQRMerchantTrxHistEntity();
            entity.setTransactionDate(rs.getTimestamp(1));
            entity.setTransactionRefNo(rs.getString(2));
            entity.setTransactionAmount(rs.getBigDecimal(3));
            entity.setQrString(rs.getString(4));
            return entity;
        }
    });
    return reader;
}
This is the processor
@Slf4j
@RequiredArgsConstructor
public class ExtensionToTrxnHistExtConverter implements ItemProcessor<ExtensionQRMerchantTrxHistEntity, TransactionHistoryExtEntity> {

    private final DuitNowRppDTOMapper mapper;

    @Override
    public TransactionHistoryExtEntity process(@NonNull ExtensionQRMerchantTrxHistEntity entity) throws Exception {
        log.info("[Processor] Setting ExtensionQRMerchantTrxHistEntity to TransactionHistoryExtEntity");
        return setTransactionHistory(entity);
    }

    private TransactionHistoryExtEntity setTransactionHistory(ExtensionQRMerchantTrxHistEntity tempEntity) {
        // Set output
        TransactionHistoryExtEntity outputEntity = new TransactionHistoryExtEntity();
        // Parse QR string
        DuitNowRppDTO dto = mapper.mapFromQRDestination(tempEntity.getQrString());
        // Set current date
        Date now = new Date();
        // Set fields for inserting a new record
        UUID uuid = UUID.randomUUID();
        outputEntity.setId(uuid);
        outputEntity.setCreateDate(now);
        outputEntity.setCreateBy(Constants.SYSTEM);
        // Set fields for updating a record
        outputEntity.setUpdateDate(now);
        outputEntity.setUpdateBy(Constants.SYSTEM);
        // Replace fields from the temp table
        outputEntity.setCurrencyCode(dto.getTransactionCurrencyCode());
        outputEntity.setTransactionDate(tempEntity.getTransactionDate());
        outputEntity.setReferenceNumber(tempEntity.getTransactionRefNo());
        outputEntity.setAmount(tempEntity.getTransactionAmount());
        return outputEntity;
    }
}
This is the writer
@Slf4j
@RequiredArgsConstructor
public class UpdateActualTable {

    private final DataSource dataSource;

    public JdbcBatchItemWriter<TransactionHistoryExtEntity> updateActualTable() {
        log.info("[Update] Using Batch Item Writer to UPDATE to Actual Table");
        JdbcBatchItemWriter<TransactionHistoryExtEntity> itemWriter = new JdbcBatchItemWriter<>();
        itemWriter.setDataSource(dataSource);
        itemWriter.setSql("UPDATE t_trxn_detail_ext " +
                "SET " +
                "update_by = ?, update_dt = ? " +
                "WHERE ref_no = ? AND amount = ? AND trxn_date = ?");
        itemWriter.setItemPreparedStatementSetter((entity, preparedStatement) -> {
            // SET clause
            preparedStatement.setString(1, entity.getUpdateBy());
            preparedStatement.setString(2, entity.getUpdateDate().toString());
            // WHERE clause
            preparedStatement.setString(3, entity.getReferenceNumber());
            preparedStatement.setBigDecimal(4, entity.getAmount());
            preparedStatement.setString(5, entity.getTransactionDate().toString());
        });
        return itemWriter;
    }
}
Updating 100k records is slow compared to inserting 100k records. I tried changing the UPDATE to an INSERT statement in the writer, and it managed to insert 100k records in 40-45 seconds. The UPDATE, however, processes only about 1k of the 100k records every 2 minutes. What is causing this?
Does the chunk size (1k in my case) affect the performance? I kept the chunk size constant throughout the testing of inserting and updating, using the same reader and processor.
I would first run the update query outside the Spring Batch job (in a SQL client, for example) to determine whether the performance hit comes from Spring Batch or from the database itself.
Typically, updates are slower than inserts (they require an extra lookup to find the row to update), but slowness at this scale is usually down to missing indexes on your table.
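Concretely, if `t_trxn_detail_ext` has no index covering the columns in the writer's WHERE clause, every UPDATE in the batch becomes a full table scan. A composite index over those columns (a sketch, assuming the column names shown in the writer; the index name is made up) would typically fix that:

```sql
CREATE INDEX idx_trxn_detail_ext_lookup
    ON t_trxn_detail_ext (ref_no, amount, trxn_date);
```

With the index in place, each UPDATE can locate its target row directly instead of scanning all 100k rows per statement.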

How do I update an entire column without a WHERE condition?

I'm using an SQLite database in Android.
I already have data in two columns, and I want to change the data in column 2 for every row, without a WHERE condition.
Everywhere I have searched, a WHERE condition is used.
This is how I store it:
public boolean updateData(String name, String pass) {
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues contentValues = new ContentValues();
    contentValues.put(col2, pass);
    db.update(TABLE_NAME, contentValues, col1 + " =?", new String[]{name});
    return true;
}
Pass null for the 3rd and 4th arguments:
public boolean updateData(String pass) {
    SQLiteDatabase db = this.getWritableDatabase();
    ContentValues contentValues = new ContentValues();
    contentValues.put(col2, pass);
    db.update(TABLE_NAME, contentValues, null, null);
    db.close();
    return true;
}
I guess you don't need the `name` parameter since there is no WHERE clause, so I removed it.

How to retrieve data stored in bytea format

I have the data in the form of JSON/XML, but in the database it is stored as bytea. How can I extract it back as JSON/XML?
Here is my query:
SELECT request_payload FROM unirate_incoming_request WHERE id='1224892672'
Output I am getting:
[
org.postgresql.jdbc42.Jdbc42ResultSet#119020fb
]
Here is my database query:
public ResultSet payload(String a) throws SQLException {
    ResultSet dboutput;
    String query = "SELECT request_payload FROM unirate_incoming_request WHERE id='1224892672'";
    System.out.println(query);
    DatabaseConnection dc = new DatabaseConnection(server, dbName, user, password);
    dboutput = dc.executeQuery1(query);
    System.out.println("hi");
    while (dboutput.next()) {
        System.out.print(dboutput.getBytes("request_payload") + " ");
    }
    dboutput.close();
    dc.close();
    return dboutput;
}
Here is my database connection:
public ResultSet executeQuery1(String sql) throws SQLException {
    logSql(sql);
    Statement stmt = createStatement();
    ResultSet rs = stmt.executeQuery(sql);
    return rs;
}
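No answer is recorded here, but since JDBC returns a `bytea` column as a `byte[]`, the usual approach (my suggestion, not from the question) is to decode those bytes into a `String` with an explicit charset instead of printing the array reference. In the loop above, that would be `new String(dboutput.getBytes("request_payload"), StandardCharsets.UTF_8)`. A self-contained sketch of the decoding step, using made-up payload data:

```java
import java.nio.charset.StandardCharsets;

public class ByteaDecode {
    // rs.getBytes("request_payload") hands back the raw bytea content;
    // decoding with an explicit charset recovers the JSON/XML text.
    public static String decode(byte[] payload) {
        return new String(payload, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] stored = "{\"rate\":1.25}".getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(stored)); // prints the original JSON text
    }
}
```

Alternatively, you can decode on the server side by selecting `convert_from(request_payload, 'UTF8')`, which makes PostgreSQL return the column as text, assuming the stored bytes are valid UTF-8.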

How to commit a transaction on multiple sql queries in Rx-java2 jdbc?

I am trying to insert records into multiple tables with rxjava2-jdbc. Please let me know how I can achieve that. I tried the steps below, but they were unsuccessful.
Case 1)
public class DatabaseRepository {
    private Database db;

    public DatabaseRepository() throws Exception {
        NonBlockingConnectionPool pool =
                Pools.nonBlocking()
                        .maxPoolSize(Runtime.getRuntime().availableProcessors() * 5)
                        .connectionProvider(ConnectionProvider.from("jdbc:oracle:thin:@//rcld19-scan.test.com:1522/TGCD01", "test", "testPassword"))
                        .build();
        this.db = Database.from(pool);
    }

    public Flowable<Integer> insertIntoMultipleTables() {
        Flowable<Integer> insertIntoEmployee = db.update("insert into employee(name, designation) values ('Employee_1','Manager')")
                .counts()
                .doOnError(e -> {
                    log.error("Exception while inserting record to employee table: {}", e.getMessage());
                });
        return db.update("insert into department(name, no_of_employees) values ('Management',1)")
                .dependsOn(insertIntoEmployee)
                .counts()
                .doOnError(e -> {
                    log.error("Exception while inserting record to department table: {}", e.getMessage());
                });
    }
}
I am trying to insert into multiple tables as part of a single transaction. In this case, a failure inserting into the department table will not roll back the row already inserted into the employee table.
Case 2)
public class DatabaseRepository {
    private Database db;

    public DatabaseRepository() throws Exception {
        NonBlockingConnectionPool pool =
                Pools.nonBlocking()
                        .maxPoolSize(Runtime.getRuntime().availableProcessors() * 5)
                        .connectionProvider(ConnectionProvider.from("jdbc:oracle:thin:@//rcld19-scan.test.com:1522/TGCD01", "test", "testPassword"))
                        .build();
        this.db = Database.from(pool);
    }

    public Flowable<Tx<Integer>> insertIntoMultipleTables() {
        return db.update("insert into employee(name, designation) values ('Employee_1','Manager')")
                .transacted()
                .counts()
                .flatMap(tx -> tx.update("insert into department(name, no_of_employees) values ('Management',1)")
                        .counts()
                        .doOnError(e -> log.error("Exception while inserting record to department table: {}",
                                e.getMessage())))
                .doOnError(e -> {
                    log.error("Exception while inserting record to employee table: {}", e.getMessage());
                });
    }
}
This code is not working as a transaction: an SQL error in one of the insertions does not roll back the records inserted into the other table.
My requirement is to insert records into multiple database tables in a single transaction using rxjava2-jdbc. I could not find any valid examples on GitHub. Please let me know if I need to do anything differently.

SQLite auto-increment id when inserting a new row

I have an SQLite database in Java (Eclipse) with the library sqlite-jdbc-3.16.1.jar.
I have 5 columns in table1: id (INTEGER PRIMARY KEY AUTOINCREMENT), name, row3, row4, row5.
I want to insert name, row3 and row4, and have the id increment itself.
public static void insertTest(String name, byte[] contentRow3, byte[] contentRow4) {
    String sql = "INSERT INTO table1(name, contentRow3, contentRow4) VALUES(?,?,?)";
    try (Connection conn = connect();
         PreparedStatement pstmt = conn.prepareStatement(sql)) {
        pstmt.setString(2, name);
        pstmt.setBytes(3, contentRow3);
        pstmt.setBytes(4, contentRow4);
        System.out.println("Added new Person to DB");
        pstmt.executeUpdate();
    } catch (SQLException e) {
        System.out.println(e.getMessage());
    }
}
Error : Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 3
What is the problem here?
Placeholders in Java prepared statements begin at index 1, not 2. I expect that the following corrected code should work:
try (Connection conn = connect();
     PreparedStatement pstmt = conn.prepareStatement(sql)) {
    pstmt.setString(1, name);
    pstmt.setBytes(2, contentRow3);
    pstmt.setBytes(3, contentRow4);
    System.out.println("Added new Person to DB");
    pstmt.executeUpdate();
} catch (SQLException e) {
    System.out.println(e.getMessage());
}
The exception you are getting complains that index position 3 is out of bounds. Your statement has only three placeholders, so under the hood the driver keeps a three-element parameter array; `pstmt.setBytes(4, contentRow4)` maps to that array's element 3 (zero-based), which is one past the end.