I have a Spring Boot project that mostly uses Repositories for all DB operations. However, I have one table that is not backed by an entity and thus doesn't use a Repository; instead, I use an EntityManager and run a native query.
This works well most of the time, but every so often (often enough to be noticeable) the query fails for seemingly no reason.
Here's my code:
@RequiredArgsConstructor
public class AddChangeRequest {

    @Qualifier("a-entity-manager")
    private final EntityManager entityManager;

    public void of(G g, P p) {
        entityManager.joinTransaction();
        entityManager.createNativeQuery(
                "INSERT INTO CHANGE_REQUESTS (P_ID, G_PK) VALUES (?,?) ON CONFLICT (P_ID) DO UPDATE SET LAST_CHANGE_DATE = NOW()")
            .setParameter(1, p.getPId())
            .setParameter(2, g.getGPk())
            .executeUpdate();
    }
}
One thing to note here is that I have two EntityManagers in my project.
One, a-entity-manager, is defined like so:
@Bean
@Qualifier("a-entity-manager")
public EntityManager entityManager(EntityManagerFactory entityManagerFactory) {
    return entityManagerFactory.createEntityManager();
}
And another which is brought in from another project via my pom file and by scanning the package it's in:
@SpringBootApplication(scanBasePackages=....
The error that I sometimes get typically looks something like this (usually there are a few different errors back to back):
o.h.engine.jdbc.spi.SqlExceptionHelper : This statement has been closed.
org.hibernate.exception.GenericJDBCException: could not execute statement
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:47)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:113)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:99)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.executeUpdate(ResultSetReturnImpl.java:200)
at org.hibernate.engine.query.spi.NativeSQLQueryPlan.performExecuteUpdate(NativeSQLQueryPlan.java:107)
at org.hibernate.internal.SessionImpl.executeNativeUpdate(SessionImpl.java:1491)
at org.hibernate.query.internal.NativeQueryImpl.doExecuteUpdate(NativeQueryImpl.java:295)
at org.hibernate.query.internal.AbstractProducedQuery.executeUpdate(AbstractProducedQuery.java:1605)
org.postgresql.util.PSQLException: This statement has been closed.
at org.postgresql.jdbc.PgStatement.checkClosed(PgStatement.java:694)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:447)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:143)
at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:120)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.executeUpdate(ResultSetReturnImpl.java:197)
at org.hibernate.engine.query.spi.NativeSQLQueryPlan.performExecuteUpdate(NativeSQLQueryPlan.java:107)
at org.hibernate.internal.SessionImpl.executeNativeUpdate(SessionImpl.java:1491)
at org.hibernate.query.internal.NativeQueryImpl.doExecuteUpdate(NativeQueryImpl.java:295)
at org.hibernate.query.internal.AbstractProducedQuery.executeUpdate(AbstractProducedQuery.java:1605)
The last thing to note is that this code runs as a result of a network call, and there are usually a fair number of requests made in quick succession, probably 10-50 calls on average.
Does anyone have any ideas what could be going on here? I believe it may have something to do with having multiple EntityManagers, because that is a fairly recent change and I don't think this happened before. Prior to that change this code just used the "default" EntityManager; I didn't define my own at all.
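For reference, before that change the class just relied on the EntityManager that Spring injects by default, roughly like this (a reconstruction, not the exact original code):

@PersistenceContext
private EntityManager entityManager; // shared, transaction-bound proxy managed by Spring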
Thanks!
Related
As a first step into Spring work, I've been tasked with doing just the simplest of things. I followed a small tutorial I found and copied it to have a starting point. But when I run the program, I get this error:
Warning: A system exception occurred during an invocation on EJB UsersFacade, method: public void Session.UsersFacade.Save(java.lang.String,java.lang.String,java.lang.String)
Warning: javax.ejb.EJBException
...
...
Caused by: java.lang.annotation.AnnotationFormatError: Duplicate annotation for class: interface javax.validation.constraints.NotNull: @javax.validation.constraints.NotNull(message={javax.validation.constraints.NotNull.message}, groups=[], payload=[])
This happens even though I followed the tutorial correctly. Do I have to do more steps because I use PostgreSQL, or is there something else I have to do that wasn't in the video?
Well, in the given error, it says
Duplicate annotation for class: interface javax.validation.constraints.NotNull
So in the Users class I decided to comment out the @NotNull and (after a second error) the @Size for all my strings. This lets me save things into my database without a problem.
To use @NotNull and @Size correctly though, I had to modify the @Column to be like this:
@Column(name = "name", length = 20, nullable = false)
The only thing left is to limit the textboxes to the size of the columns.
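Putting it together, the field declaration might look like this (a sketch; the field name and length are illustrative):

@NotNull
@Size(max = 20)                                        // bean-validation limit matching the column length
@Column(name = "name", length = 20, nullable = false)  // database-level constraint
private String name;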
First the problem statement:
I am using Spring Batch in my DEV environment without issue. When I move the code to a production environment, I run into a problem. In my DEV environment, Spring Batch is able to create its metadata tables in our DB2 database server without a problem. This is not an option when we go to PROD, as this is a read-only job.
Attempted solution:
Searching Stack Overflow, I found this posting:
Spring-Batch without persisting metadata to database?
Which sounded perfect, so I added
@Bean
public ResourcelessTransactionManager transactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public JobRepository jobRepository(ResourcelessTransactionManager transactionManager) throws Exception {
    MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean(transactionManager);
    mapJobRepositoryFactoryBean.setTransactionManager(transactionManager);
    return mapJobRepositoryFactoryBean.getObject();
}
I also added it to my Job by calling .repository(jobRepository).
But I get
Caused by: java.lang.NullPointerException: null
at org.springframework.batch.core.repository.dao.MapJobExecutionDao.synchronizeStatus(MapJobExecutionDao.java:158) ~[spring-batch-core-3.0.6.RELEASE.jar:3.0.6.RELEASE]
So I am not sure what to do here. I am new to Spring, so I am teaching myself as I go. I am open to other solutions, such as an in-memory database, but I have not been able to get them to work either. I do NOT need to save any state or session information between runs, but the database query I am running will return around a million or so rows, so I will need to fetch them in chunks.
Any suggestions or help would be greatly appreciated.
Add these beans to your AppClass:
@Bean
public PlatformTransactionManager transactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public JobExplorer jobExplorer() throws Exception {
    MapJobExplorerFactoryBean jobExplorerFactory =
            new MapJobExplorerFactoryBean(mapJobRepositoryFactoryBean());
    jobExplorerFactory.afterPropertiesSet();
    return jobExplorerFactory.getObject();
}

@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean() {
    // in-memory (map-based) job repository; no metadata tables are created in the database
    MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean();
    mapJobRepositoryFactoryBean.setTransactionManager(transactionManager());
    return mapJobRepositoryFactoryBean;
}

@Bean
public JobRepository jobRepository() throws Exception {
    return mapJobRepositoryFactoryBean().getObject();
}

@Bean
public JobLauncher jobLauncher() throws Exception {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository());
    return simpleJobLauncher;
}
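With these beans in place, a job can be launched roughly like this (a sketch; jobLauncher and job are assumed to come from the application context, and the run.id parameter is just an illustrative way to make each run unique):

public void runJob(JobLauncher jobLauncher, Job job) throws Exception {
    // unique parameter so the same job can be launched repeatedly
    JobParameters params = new JobParametersBuilder()
            .addLong("run.id", System.currentTimeMillis())
            .toJobParameters();
    jobLauncher.run(job, params);
}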
This doesn't directly answer your question, but that is not a good solution; the map-based repository is supposed to be used only for testing. It will grow in memory indefinitely.
I suggest you use an embedded database like SQLite. The main problem with using a separate database for job metadata is that you then have to coordinate transactions between the two databases (so that the state of the metadata matches that of the data), but since it seems you're not even writing to the main database, that probably won't be a problem for you.
You could use an in-memory database (for example H2 or HSQL) quite easily. You can find examples here: http://www.mkyong.com/spring/spring-embedded-database-examples/.
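For example, a minimal sketch of such an embedded datasource for the batch metadata (an assumption: H2 and spring-batch-core are on the classpath, which is where the schema script below comes from):

@Bean
public DataSource dataSource() {
    // in-memory H2 database that holds only the Spring Batch metadata tables
    // and disappears when the JVM exits
    return new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.H2)
            .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
            .build();
}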
As for the Map-backed job repository, it does provide a method to clear its contents:
public void clear()
Convenience method to clear all the map DAOs globally, removing all entities.
Be aware that a Map-based job repository is not fit for use in partitioned steps or other multi-threaded scenarios.
The following seems to have done the job for me:
@Bean
public DataSource dataSource() {
    EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
    EmbeddedDatabase db = builder
            .setType(EmbeddedDatabaseType.HSQL)
            .build();
    return db;
}
Now Spring is not creating tables in our production database, and since state is lost when the JVM exits, nothing seems to be hanging around.
UPDATE: The above code has caused concurrency errors for us. We have addressed this by abandoning the EmbeddedDatabaseBuilder and declaring the HSQLDB this way instead:
@Bean
public BasicDataSource dataSource() {
    BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName("org.hsqldb.jdbcDriver");
    dataSource.setUrl("jdbc:hsqldb:mem:testdb;sql.enforce_strict_size=true;hsqldb.tx=mvcc");
    dataSource.setUsername("sa");
    dataSource.setPassword("");
    return dataSource;
}
The primary difference is that we are able to specify mvcc (multiversion concurrency control) in the connection string, which resolves the issue.
We have a web application using Entity Framework 4.0. Unfortunately, when a large volume of users hits the application, EF throws an error:
The underlying provider failed on Open
Below is the code snippet:
//DAL
public IQueryable<EmployeeEntity> GetEmployeeDetail()
{
    DatabaseEntities ent = new DatabaseEntities(this._connectionString);
    IQueryable<EmployeeEntity> result = from employee in ent.EmployeeEntity
                                        select employee;
    return result;
}
Please note the above code returns IQueryable.
Is anything wrong with the above pattern that could cause the exception to occur?
When and how does Entity Framework decide to open/close the DB connection, and how long does it retain it?
In what scenario does the above error occur?
What's the maximum connection pool size for EF, and how do we configure it?
Do we need to explicitly specify open and close?
Is the code below the best way to resolve the above issue?
public IQueryable<EmployeeEntity> GetEmployeeDetail()
{
    using (DatabaseEntities ent = new DatabaseEntities(this._connectionString))
    {
        IQueryable<EmployeeEntity> result = from employee in ent.EmployeeEntity
                                            select employee;
        return result.ToList().AsQueryable();
    }
}
The ToList() call will cause the query to run against the database immediately, and as it is not filtered in any way, it will return every employee in your database. This is likely to cause performance issues.
However, you can't simply remove it in your case, because if you return the IQueryable directly the context will be disposed by the time you try to fetch the results.
You could either:
change the way it works so that the scope of ent does not end when the method returns and return the query without calling ToList(). You can then further filter the IQueryable before calling ToList().
call ToList() within the method but filter/limit the query first (e.g. pass some parameters into the method to specify this) to reduce the number of rows coming back from the database.
I'm trying to update a table using JPA
EntityManager em = JPA.em();
EntityTransaction entr = em.getTransaction();
try {
    if (!entr.isActive())
        entr.begin();
    Tblrecordtypefields updateTblrecordtypes = em.find(Tblrecordtypefields.class, 9);
    updateTblrecordtypes.setFieldlabel("JPATest");
    em.getTransaction().commit();
} finally {
    if (entr.isActive())
        entr.rollback();
}
I'm getting the error
NullPointerException occurred: null
at updateTblrecordtypes.setFieldlabel("JPATest");
What should I do?
I see some possible issues in there:
First, Play manages the transactions on its own. A transaction is created at the beginning of the request and committed (or rolled back on exception) at the end. You are trying to force your way into it, which is not recommended. To manage the entity, just call entity.save() to mark it as "to be saved", and don't call it if you want the changes ignored.
Second, if you are using the Model class in Play (as you should), you can use the find and findById methods provided by that class. This is recommended instead of using the EntityManager directly.
See the documentation for more information.
Basically, redo your code to follow the Play way, to avoid problems :)
EDIT: as a clarification, I'm not really answering your question on why you get the NPE, but I think that since you are forcing your way past the framework's conventions, you might (maybe not!) be seeing unexpected artifacts that will disappear once you fix your code to follow convention.
If after that you still have the error let us know :)
This means that there is no row with ID 9 in the database table mapped by the entity Tblrecordtypefields.
BTW: I find it very questionable to commit a transaction in a method which is not necessarily the one that started the transaction.
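A minimal sketch of guarding against that, since em.find returns null when no matching row exists:

Tblrecordtypefields record = em.find(Tblrecordtypefields.class, 9);
if (record == null) {
    // no row with ID 9; handle the missing record instead of dereferencing null
    throw new IllegalStateException("Tblrecordtypefields with id 9 not found");
}
record.setFieldlabel("JPATest");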
I have changed my code as below:
Tblrecordtypefields updateTblrecordtypeFields = Tblrecordtypefields.findById(9);
updateTblrecordtypeFields.setFieldlabel("Test");
validation.valid(updateTblrecordtypeFields);
if (validation.hasErrors())
{
    updateTblrecordtypeFields.refresh();
}
else
{
    updateTblrecordtypeFields.save();
}
In my model class:
public void setFieldlabel(String fieldlabel) {
    this.fieldlabel = fieldlabel;
}
Works fine...
I'm trying to do some unit testing with EF 4.1 Code First. I have my live DB (SQL Server) and my unit test DB (SQL CE). After fighting (and losing) with EF, SQL CE 4.0 and transaction support, I decided the simplest way to run my tests was to:
Create Db
Run Test
Delete Db
Rinse and repeat
I have my [SetUp] and [TearDown] functions:
[SetUp]
public void Init()
{
    System.Data.Entity.Database.SetInitializer(new MyTestContextInitializer());
    _dbContext = ContainerFactory.Container.GetInstance<IContext>();
    _testConnection = _dbContext.ConnectionString;
}

[TearDown]
public void Cleanup()
{
    _dbContext.Dispose();
    System.Data.Entity.Database.Delete(_testConnection);
}
The issue is that System.Data.Entity.Database.SetInitializer does not call MyTestContextInitializer after the first test.
Hence the second test fails with:
System.Data.EntityException : The underlying provider failed on Open.
----> System.Data.SqlServerCe.SqlCeException : The database file cannot be found. Check the path to the database
TIA for any pointers
I got around this by calling InitializeDatabase manually, like so:
[SetUp]
public void Init()
{
    var initializer = new MyTestContextInitializer();
    System.Data.Entity.Database.SetInitializer(initializer);
    _dbContext = ContainerFactory.Container.GetInstance<IContext>();
    initializer.InitializeDatabase((MyTestContext)_dbContext);
    _testConnection = _dbContext.ConnectionString;
}

[TearDown]
public void Cleanup()
{
    System.Data.Entity.Database.Delete(_testConnection);
    _dbContext.Dispose();
}
I think it may be a bug with EF 4.1 RC.
It's not a bug; the initializer set with
System.Data.Entity.Database.SetInitializer
is only called when the context is created for the first time in the AppDomain. Hence, since you're running all your tests in a single AppDomain, it's only called when the first test is run.
It took me almost a day to find out what caused my strange unit test behaviour: the database connection stayed open, or the database was not created with every new test. I searched everywhere for the root cause: MSTest (no admin rights, or were working copies of files somehow deleted?), SQL Server Express/CE (login failure?), Unity (objects not disposed?) or Entity Framework (no proper database initialization?). It turned out to be EF. Thanks a lot for the answer!