Using Spring Data MongoDB with MongoRepository. I have this bean
@Bean
public Jackson2RepositoryPopulatorFactoryBean repositoryPopulator() {
    Jackson2RepositoryPopulatorFactoryBean factory = new Jackson2RepositoryPopulatorFactoryBean();
    try {
        factory.setResources(resourceResolver.getResources("classpath:static/collections/*.json"));
    } catch (IOException e) {
        log.error("Could not load data", e);
    }
    return factory;
}
which works fine with Fongo (the database is dropped at the end of a test run) but not with real MongoDB. If I leave the bean as it is and switch to a real MongoDB instance, the database gets populated, but only on the first run; if I re-run the project (and its tests), it fails with a DuplicateKeyException because the data is already there.
How do I populate the database only when the repositories are empty?
Consider using data migration tools like Mongobee. This is basically Liquibase/Flyway for MongoDB.
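As a rough sketch of that approach (the URI, database name, package, and change-set names below are placeholders; this assumes Mongobee with Spring Data's MongoTemplate on the classpath), Mongobee records each executed change set, so re-running against an already populated database simply skips the seed step:
@Bean
public Mongobee mongobee() {
    // Mongobee runs pending change sets on startup and records executed ones
    // in its own changelog collection, so a second run will not re-insert data.
    Mongobee runner = new Mongobee("mongodb://localhost:27017/mydb");
    runner.setDbName("mydb");
    runner.setChangeLogsScanPackage("com.example.changelogs");
    return runner;
}

@ChangeLog
public class SeedChangelog {

    @ChangeSet(order = "001", id = "seedCollections", author = "dev")
    public void seedCollections(MongoTemplate mongoTemplate) {
        // insert the initial documents here, e.g.
        // mongoTemplate.insert(someSeedDocument, "movies");
    }
}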
Alternatively, you can filter out the resources whose collections already exist before handing them to the populator:
@Bean
public Jackson2RepositoryPopulatorFactoryBean repositoryPopulator() throws Exception {
    Jackson2RepositoryPopulatorFactoryBean factory = new Jackson2RepositoryPopulatorFactoryBean();
    try {
        Resource[] resources = resourceResolver.getResources("classpath:static/collections/*.json");
        // keep only the resources whose target collections do not exist yet
        List<Resource> resourcesToFill = new ArrayList<>();
        for (Resource r : resources) {
            // strip the ".json" extension to get the collection name
            String collection = r.getFilename().substring(0, r.getFilename().length() - 5);
            if (!mongoTemplate().collectionExists(collection)) {
                resourcesToFill.add(r);
            }
        }
        // setResources() expects an array
        factory.setResources(resourcesToFill.toArray(new Resource[0]));
    } catch (IOException e) {
        log.error("Could not load data", e);
    }
    return factory;
}
I am trying to persist multiple entities to the database, but I need to roll back all the inserts if one of them hits an exception. How can I do that?
Here is what I did:
public class RoleCreationApplyService extends AbstractEntityProxy implements EntityProxy {

    @Inject
    @Override
    public void setEntityManager(EntityManager em) {
        super.entityManager = em;
    }

    @Resource
    UserTransaction utx;

    public Object acceptAppliedRole(String applyId, Role parentRole, SecurityContext securityContext) throws Exception {
        utx.begin();
        try {
            FilterWrapper filter = FilterWrapper.createWrapperWithFilter("id", Filter.Operator._EQUAL, applyId);
            RoleCreationApply roleCreationApply = (RoleCreationApply) getByFilter(RoleCreationApply.class, filter);

            Role appliedRole = new Role();
            appliedRole.setRoleUniqueName(roleCreationApply.getRoleName());
            appliedRole.setRoleName(roleCreationApply.getRoleName());
            appliedRole.setRoleDescription(roleCreationApply.getRoleDescription());
            appliedRole.setRoleDisplayName(roleCreationApply.getRoleDisplayName());
            appliedRole.setCreationTime(new Date());
            appliedRole.setCreatedBy(securityContext.getUserPrincipal().getName());
            Role childRole = (Role) save(appliedRole);

            parentRole.setCreationTime(new Date());
            parentRole.setCreatedBy(securityContext.getUserPrincipal().getName());
            parentRole = (Role) save(parentRole);

            RoleRelation roleRelation = new RoleRelation();
            roleRelation.setParentRole(parentRole);
            roleRelation.setChildRole(childRole);
            RoleRelation savedRoleRelation = (RoleRelation) save(roleRelation);

            PostRoleRelation postRoleRelation = new PostRoleRelation();
            postRoleRelation.setPost(roleCreationApply.getPost());
            postRoleRelation.setRoleRelation(savedRoleRelation);
            ir.tamin.framework.domain.Resource result = save(postRoleRelation);

            utx.commit();
            return result;
        } catch (Exception e) {
            utx.rollback();
            throw new Exception(e.getMessage());
        }
    }
}
and this is the save method in the AbstractEntityProxy class:
@Override
@ProxyMethod
public Resource save(Resource clientObject) throws ProxyProcessingException {
    checkRelationShips((Entity) clientObject, Method.SAVE, OneToOne.class, ManyToOne.class);
    try {
        entityManager.persist(clientObject);
    } catch (PersistenceException e) {
        throw new ResourceAlreadyExistsException(e);
    }
    return clientObject;
}
But when an exception occurs, for example a unique constraint violation, and execution reaches the catch block, utx.rollback() complains that no transaction exists, so some entities remain persisted. I want everything to roll back if any single save fails.
PS: I don't want to use plain JDBC. What is the JPA approach?
In a Spring Batch project, I used JdbcCursorItemReader to read data and process it in parallel. I can run the batch locally without any problem.
I also heard that JdbcPagingItemReader is recommended over JdbcCursorItemReader for parallel processing, since a cursor reader holds the connection for the whole read while a paging reader releases the connection once a page has been read.
I then switched to JdbcPagingItemReader in step2, but to my surprise I got the exception below when running locally.
Caused by: java.sql.SQLTransientConnectionException: HikariPool-1 -
Connection is not available, request timed out after 300001ms.
However, it seems the exception occurs in step1, before the paging reader in step2 is even executed, and that reader is the only change I made. Please shed some light on why the exception is thrown and whether it is good practice to use a paging reader instead of a cursor reader for parallel processing. Your help is much appreciated!
The code snippet is pasted below:
@Bean
@StepScope
public Flow createParallelSubFlow() {
    List<Flow> subFlowList = new ArrayList<>();
    List<Stream> streamList = new ArrayList<>();
    try {
        streamList = dataSourceConfig.streamMapper()
                .getStreamListByStatus(Constants.PENDING_STATUS_CD);
    } catch (Exception e) {
        logger.error("Failed to load pending streams", e);
    }
    streamList.forEach(stream -> {
        long id = stream.getStreamId();
        String flowName = "stream" + id + "_flow";
        Flow subFlow = new FlowBuilder<Flow>(flowName)
                .start(step1(id))
                .next(step2(id))
                .end();
        subFlowList.add(subFlow);
    });
    return new FlowBuilder<Flow>("splitFlow")
            .split(new SimpleAsyncTaskExecutor())
            .add(subFlowList.toArray(new Flow[0]))
            .build();
}
public Step step1(long id) {
    return stepBuilderFactory.get("step1")
            .<Domain, Domain>chunk(100)
            .reader(reader1(id))
            .writer(writer1())
            .build();
}

//@StepScope
//@Bean
public Step step2(long id) {
    return stepBuilderFactory.get("step2")
            .<Domain, Domain>chunk(100)
            .reader(cursorReader2(id))
            .processor(processor2)
            .writer(writer2())
            .build();
}
public JdbcCursorItemReader<Domain> cursorReader2(Long id) {
    return new JdbcCursorItemReaderBuilder<Domain>()
            .dataSource(dataSourceConfig.dataSource())
            .name("cursorReader")
            .sql(Constants.QUERY_SQL)
            .preparedStatementSetter(new PreparedStatementSetter() {
                @Override
                public void setValues(PreparedStatement ps) throws SQLException {
                    ps.setLong(1, id);
                }
            })
            .rowMapper(new RowMapper())
            .build();
}
// Switch from cursorReader2 to pagingReader2 in step2
public JdbcPagingItemReader<Domain> pagingReader2(Long id) {
    return new JdbcPagingItemReaderBuilder<Domain>()
            .dataSource(dataSourceConfig.dataSource())
            .name("pagingReader")
            .queryProvider(queryProvider())
            .parameterValues(parameterValues(id))
            .rowMapper(new RowMapper())
            .pageSize(100)
            .build();
}
@Bean
public PagingQueryProvider queryProvider() {
    SqlPagingQueryProviderFactoryBean providerFactory = new SqlPagingQueryProviderFactoryBean();
    Map<String, Order> sortKeys = new HashMap<>(2);
    sortKeys.put("ID", Order.ASCENDING);
    providerFactory.setDataSource(dataSourceConfig.dataSource());
    providerFactory.setSelectClause("SELECT Clause");
    providerFactory.setFromClause("FROM Clause");
    providerFactory.setWhereClause("WHERE Clause");
    providerFactory.setSortKeys(sortKeys);
    PagingQueryProvider pagingQueryProvider = null;
    try {
        pagingQueryProvider = providerFactory.getObject();
    } catch (Exception e) {
        logger.error("Failed to get PagingQueryProvider", e);
        throw new RuntimeException("Failed to get PagingQueryProvider", e);
    }
    return pagingQueryProvider;
}
private Map<String, Object> parameterValues(Long id) {
    Map<String, Object> parameterValues = new HashMap<>();
    parameterValues.put("1", id);
    return parameterValues;
}
I have a list of data that needs to be saved. Before each save, the existing data has to be deleted and the new data saved.
If any single delete & save fails, only that transaction should be rolled back; the rest of the delete & save operations should continue.
public LabResResponse saveLabResult(List<LabResInvstResultDto> invstResults) {
    LabResResponse labResResponse = new LabResResponse();
    List<Long> relInvstid = new ArrayList<Long>();
    try {
        if (invstResults != null) {
            List<LabResInvstResult> labResInvstResults = mapper.mapAsList(invstResults, LabResInvstResult.class);
            for (LabResInvstResult dto : labResInvstResults) {
                if (dto != null) {
                    // delete all child records before save
                    deleteResult(dto, relInvstid);
                }
            }
        }
        labResResponse.setRelInvstids(relInvstid);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return labResResponse;
}
Here a new transaction is supposed to be started for each delete & save:
@Transactional(propagation = Propagation.REQUIRES_NEW, rollbackFor = { Exception.class })
private void deleteResult(LabResInvstResult dto, List<Long> relInvstid) {
    try {
        labResultRepo.deleteById(dto.getId());
        LabResInvstResult result = labResultRepo.save(dto);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
On delete it throws: "Caused by: javax.persistence.TransactionRequiredException: No EntityManager with actual transaction available for current thread - cannot reliably process 'remove' call".
I can solve this by adding @Transactional to the public saveLabResult(List<LabResInvstResultDto> invstResults) method.
But then my initial use case will not work, because a single failure will roll back the entire list in one transaction.
There are two problems here.
The first problem is that you call the "real" deleteResult method of the class. When Spring sees @Transactional, it creates a proxy object with transactional behavior. Unless you're using AspectJ, it won't change the class itself but will create a new proxy class. So when you autowire this bean you get the proxy, whose methods run the transaction-related logic. In your case, however, you're calling the method on the class directly, not on the proxy (self-invocation).
The second problem is that Spring (again, if AspectJ is not used) can't proxy non-public methods.
Summary: make the deleteResult method public and call it through the proxy. As a suggestion, move deleteResult into another component, as sketched below.
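A minimal sketch of that suggestion (the component name LabResultDeleteService, the method name deleteAndSave, and the injected labResultRepo field are placeholders; adapt them to your repository and DTO types):
@Component
public class LabResultDeleteService {

    @Autowired
    private LabResultRepo labResultRepo; // assumed repository from the question

    // Public method on a separate bean: the caller goes through the Spring proxy,
    // so each invocation runs in its own REQUIRES_NEW transaction and a failure
    // only rolls back this one delete & save.
    @Transactional(propagation = Propagation.REQUIRES_NEW, rollbackFor = Exception.class)
    public void deleteAndSave(LabResInvstResult dto) {
        labResultRepo.deleteById(dto.getId());
        labResultRepo.save(dto);
    }
}
The saveLabResult method would then inject LabResultDeleteService and call deleteAndSave(dto) inside the loop, catching the exception per item as shown in the next answer.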
You are catching the exception outside the for loop, while your requirement says you want to continue the loop for the other objects in the list.
Put your try/catch block within the loop; it should then work fine:
public LabResResponse saveLabResult(List<LabResInvstResultDto> invstResults) {
    LabResResponse labResResponse = new LabResResponse();
    List<Long> relInvstid = new ArrayList<Long>();
    try {
        if (invstResults != null) {
            List<LabResInvstResult> labResInvstResults = mapper.mapAsList(invstResults, LabResInvstResult.class);
            for (LabResInvstResult dto : labResInvstResults) {
                if (dto != null) {
                    // delete all child records before save
                    try {
                        deleteResult(dto, relInvstid);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        labResResponse.setRelInvstids(relInvstid);
    } catch (Exception e) {
        e.printStackTrace();
    }
    return labResResponse;
}
Is there a way to test the code below? Here I am connecting to the database via JNDI. I am new to Mockito and cannot figure out a way to test this.
@SuppressWarnings("unused")
public Connection getJNDIConnection() {
    Connection result = null;
    try {
        InitialContext initialContext = new InitialContext();
        if (initialContext == null) {
            LOGGER.info("JNDI problem. Cannot get InitialContext.");
        }
        DataSource datasource = (DataSource) initialContext.lookup(jndiName);
        if (datasource != null) {
            result = datasource.getConnection();
        } else {
            LOGGER.info("Failed to lookup datasource.");
        }
    } catch (NamingException ex) {
        LOGGER.error("Cannot get connection: " + ex);
    } catch (SQLException ex) {
        LOGGER.error("Cannot get connection: " + ex);
    }
    return result;
}
Of course you can do it, but I think you should read the documentation yourself. The main points are:
InitialContext initialContext = mock(InitialContext.class);
DataSource dataSource = mock(DataSource.class);
Connection expected = mock(Connection.class);

whenNew(InitialContext.class).withNoArguments().thenReturn(initialContext);
when(initialContext.lookup(jndiName)).thenReturn(dataSource);
when(dataSource.getConnection()).thenReturn(expected);

Connection result = instanceOfClass.getJNDIConnection(); // instance of the class under test

assertSame("Should be equals", expected, result);
Also, you should use PowerMock to mock constructors and static methods. To deal with the Logger, just add this code:
@BeforeClass
public static void setUpClass() {
    mockStatic(LoggerFactory.class);
    Logger logger = mock(Logger.class);
    when(LoggerFactory.getLogger(ApplySqlFileIfExistsChange.class)).thenReturn(logger);
}
Don't forget about annotations:
@RunWith(PowerMockRunner.class)
@PrepareForTest({LoggerFactory.class})
Try to read this doc http://site.mockito.org/mockito/docs/current/org/mockito/Mockito.html
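Putting the pieces together, a rough self-contained sketch could look like the following (the class under test is called JndiDao here and its setJndiName setter is assumed purely for the test; both are placeholders for your actual class). Note that the class that calls new InitialContext() must itself be listed in @PrepareForTest, otherwise whenNew() cannot intercept the constructor:
// static imports assumed: org.mockito.Mockito.{mock, when},
// org.powermock.api.mockito.PowerMockito.{mockStatic, whenNew}, org.junit.Assert.assertSame
@RunWith(PowerMockRunner.class)
@PrepareForTest({JndiDao.class, LoggerFactory.class})
public class JndiDaoTest {

    @BeforeClass
    public static void setUpClass() {
        // stub out the logger so the static LoggerFactory call does not hit a real binding
        mockStatic(LoggerFactory.class);
        when(LoggerFactory.getLogger(JndiDao.class)).thenReturn(mock(Logger.class));
    }

    @Test
    public void getJNDIConnectionReturnsConnectionFromLookedUpDataSource() throws Exception {
        InitialContext initialContext = mock(InitialContext.class);
        DataSource dataSource = mock(DataSource.class);
        Connection expected = mock(Connection.class);

        // intercept "new InitialContext()" inside getJNDIConnection()
        whenNew(InitialContext.class).withNoArguments().thenReturn(initialContext);
        when(initialContext.lookup("jdbc/myDataSource")).thenReturn(dataSource);
        when(dataSource.getConnection()).thenReturn(expected);

        JndiDao dao = new JndiDao();
        dao.setJndiName("jdbc/myDataSource"); // assumed setter for the jndiName field

        assertSame("Should be equals", expected, dao.getJNDIConnection());
    }
}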
I have the following code in UserController, my session-scoped bean:
public void addItemToBundle(ItemEntity item) {
    //System.out.println(item.getTitle());
    try {
        em.getTransaction().begin();
        UserEntity user = em.find(UserEntity.class, this.username);

        BundleEntity bundle = new BundleEntity();
        BundleEntityPK compositePk = new BundleEntityPK();
        compositePk.setCheckedOutDate(new Date());
        compositePk.setItemId(item.getItemId());
        compositePk.setUsername(user.getUsername());
        bundle.setId(compositePk);

        Set<BundleEntity> bundles = new HashSet<BundleEntity>();
        bundles.add(bundle);
        user.setBundleEntities(bundles);

        em.persist(user);
        em.flush();
        em.getTransaction().commit();
    } finally {
    }
}

public String addToBundle() {
    try {
        addItemToBundle(item);
    } catch (NullPointerException e) {
        e.getMessage();
    }
    return null;
}
This code uses private ItemEntity item; which gets passed in by the following JSF markup:
<p:commandLink action="#{itemController.item}">
    <f:setPropertyActionListener target="#{itemController.selectedItem}" value="#{movie}" />
</p:commandLink>
(I'm using PrimeFaces in this example.) The problem is that addItemToBundle never issues any SQL in the console (I have FINE logging enabled), and the bundle never gets created or added to the user. I also tried em.persist(user) and em.flush(), and setting cascadeType in my UserEntity, with no luck:
@OneToMany(mappedBy="userEntity", cascade=CascadeType.PERSIST)
private Set<BundleEntity> bundleEntities;
Thanks!
You know that this:
try {
    addItemToBundle(item);
} catch (NullPointerException e) {
    e.getMessage();
}
is very bad practice, right? Maybe that's the problem here: you run into an NPE and never notice it.
You should at least log the exception to know what's going on there (just for demo purposes I've written it to stderr; please replace this with your favorite logging framework):
try {
    addItemToBundle(item);
} catch (NullPointerException e) {
    System.err.println(e.getMessage()); // use a logger here
}