MyBatis query BigInteger error

A MyBatis query fails with a NumberFormatException on a big integer value. Here is the stack trace; there is no problem if the value is one digit shorter, e.g. 982544369348876697.
Cause: java.sql.SQLException: java.lang.NumberFormatException: 9825443693488766976
; uncategorized SQLException for SQL []; SQL state [HY000]; error code [1105]; java.lang.NumberFormatException: 9825443693488766976; nested exception is java.sql.SQLException: java.lang.NumberFormatException: 9825443693488766976
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84) ~[spring-jdbc-4.2.6.RELEASE.jar:4.2.6.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81) ~[spring-jdbc-4.2.6.RELEASE.jar:4.2.6.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81) ~[spring-jdbc-4.2.6.RELEASE.jar:4.2.6.RELEASE]
at org.mybatis.spring.MyBatisExceptionTranslator.translateExceptionIfPossible(MyBatisExceptionTranslator.java:73) ~[mybatis-spring-1.3.2.jar:1.3.2]
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:446) ~[mybatis-spring-1.3.2.jar:1.3.2]
at com.sun.proxy.$Proxy35.selectList(Unknown Source) ~[?:?]
at org.mybatis.spring.SqlSessionTemplate.selectList(SqlSessionTemplate.java:230) ~[mybatis-spring-1.3.2.jar:1.3.2]
at org.apache.ibatis.binding.MapperMethod.executeForMany(MapperMethod.java:128) ~[mybatis-3.4.0.jar:3.4.0]
at org.apache.ibatis.binding.MapperMethod.execute(MapperMethod.java:68) ~[mybatis-3.4.0.jar:3.4.0]
at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java:53) ~[mybatis-3.4.0.jar:3.4.0]
at com.sun.proxy.$Proxy69.selectJobTransactionEvent(Unknown Source) ~[?:?]

The problem is that you are trying to retrieve a big number from the database, and it cannot be stored in a Java Long (the type you declared in your VO).
You don't say what the column's data type is, but my guess is that it's something like an UNSIGNED BIGINT. Such columns can hold numbers larger than 9223372036854775807, which is the maximum value of a Java Long.
The solution? Use a java.math.BigInteger instead of Long in the Java value object. I've tried it and it works like a charm.
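For illustration, a minimal sketch of the change, assuming a hypothetical value object named JobTransactionEvent whose id column is a BIGINT UNSIGNED (the real class is not shown in the question):
import java.math.BigInteger;

public class JobTransactionEvent {
    // Was: private Long id; -- fails with NumberFormatException for
    // 9825443693488766976, which exceeds Long.MAX_VALUE (9223372036854775807).
    private BigInteger id;

    public BigInteger getId() { return id; }
    public void setId(BigInteger id) { this.id = id; }
}
The answer above reports this working without any further mapper configuration.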

Related

How to output nested Row from Beam SQL (SqlTransform)?

I want to get a Row with a nested Row in the output of Beam SQL (SqlTransform), but I am failing to do so.
Questions:
What is the proper way to output a Row with a nested Row from SqlTransform? (The Row type is described in the docs, so I believe it's supported.)
If this is a bug/missing feature, is the problem in Beam itself, or is it runner-dependent? (I'm currently using DirectRunner, but plan to use DataflowRunner in the future.)
Version info:
OS: macOS 10.15.7 (Catalina)
Java: 11.0.11 (AdoptOpenJDK)
Beam SDK: 2.32.0
Here's what I've tried, with no luck.
With Calcite dialect
SELECT ROW(foo, bar) as my_nested_row FROM PCOLLECTION
I was expecting this to output a row with the following schema:
Field{name=my_nested_row, description=, type=ROW<foo STRING NOT NULL, bar INT64 NOT NULL> NOT NULL, options={{}}}
but the row is actually split into scalar fields like:
Field{name=my_nested_row$$0, description=, type=STRING NOT NULL, options={{}}}
Field{name=my_nested_row$$1, description=, type=INT64 NOT NULL, options={{}}}
With ZetaSQL dialect
SELECT STRUCT(foo, bar) as my_nested_row FROM PCOLLECTION
I got an error
java.lang.UnsupportedOperationException: Does not support expr node kind RESOLVED_MAKE_STRUCT
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter.convertRexNodeFromResolvedExpr (ExpressionConverter.java:363)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter.convertRexNodeFromResolvedExpr (ExpressionConverter.java:323)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter.convertRexNodeFromComputedColumnWithFieldList (ExpressionConverter.java:375)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ExpressionConverter.retrieveRexNode (ExpressionConverter.java:203)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ProjectScanConverter.convert (ProjectScanConverter.java:45)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.ProjectScanConverter.convert (ProjectScanConverter.java:29)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convertNode (QueryStatementConverter.java:102)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convert (QueryStatementConverter.java:89)
at org.apache.beam.sdk.extensions.sql.zetasql.translation.QueryStatementConverter.convertRootQuery (QueryStatementConverter.java:55)
at org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLPlannerImpl.rel (ZetaSQLPlannerImpl.java:98)
at org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner.convertToBeamRelInternal (ZetaSQLQueryPlanner.java:197)
at org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner.convertToBeamRel (ZetaSQLQueryPlanner.java:185)
at org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv.parseQuery (BeamSqlEnv.java:111)
at org.apache.beam.sdk.extensions.sql.SqlTransform.expand (SqlTransform.java:171)
at org.apache.beam.sdk.extensions.sql.SqlTransform.expand (SqlTransform.java:109)
at org.apache.beam.sdk.Pipeline.applyInternal (Pipeline.java:548)
at org.apache.beam.sdk.Pipeline.applyTransform (Pipeline.java:482)
at org.apache.beam.sdk.values.PCollection.apply (PCollection.java:363)
at dev.tmshn.playbeam.Main.main (Main.java:29)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:566)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:829)
Unfortunately, Beam SQL does not yet support nested rows, mainly due to a lack of support in Calcite (and a corresponding lack of support in the ZetaSQL implementation). See this similar question focused on Dataflow.
On the bright side, the Jira issue tracking this support appears to be resolved for 2.34.0, so proper support is likely upcoming.
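Until then, a possible workaround is to build the nested row outside SQL. Below is a minimal sketch under these assumptions: the flat SqlTransform output is a PCollection<Row> named flatRows with fields foo and bar, as in the question.
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;
import org.apache.beam.sdk.values.TypeDescriptors;

// Schemas for the inner row and the outer row that wraps it.
Schema inner = Schema.builder().addStringField("foo").addInt64Field("bar").build();
Schema outer = Schema.builder().addRowField("my_nested_row", inner).build();

// Wrap each flat row in an outer row with a single nested-row field.
PCollection<Row> nested = flatRows.apply(
    MapElements.into(TypeDescriptors.rows())
        .via((Row r) -> Row.withSchema(outer)
            .addValue(Row.withSchema(inner)
                .addValues(r.getString("foo"), r.getInt64("bar"))
                .build())
            .build()))
    .setRowSchema(outer); // make the schema explicit for downstream transforms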

Spark Streaming, reading from Socket: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String

I am on Windows 10, trying to read multiple text lines, separated by '\n', from a TCP socket source (for test purposes so far) using Spark Streaming (Spark 2.4.4). Words should be counted and the current word count regularly displayed on the console. This is a standard Spark streaming test, found in several books and web posts, but it seems to fail with the socket source:
Text strings are sent from a Java program like:
serverOutSock = new ServerSocket(9999);
// Establish connection; wait for Spark to connect
sockOut = serverOutSock.accept();
// Set UTF-8 as format
sockOutput = new OutputStreamWriter(sockOut.getOutputStream(),"UTF-8");
// Multiple Java Strings are now written (thousands of them) like
sockOutput.write(string+'\n');
On the Spark receiving side, the Scala code looks like:
val spark = SparkSession.builder.master("local[*]").getOrCreate()
import spark.implicits._
val socketDF = spark.readStream.format("socket").option("host","localhost").option("port",9999).load
val words = socketDF.as[String].flatMap(_.split(" ")).coalesce(1)
val wordCounts = words.groupBy("value").count()
val query = wordCounts.writeStream
  .trigger(Trigger.Continuous("1 second"))
  .outputMode("complete")
  .format("console")
  .start
  .awaitTermination
So, I would like the current word count written out to the console once a second.
But I get an error:
java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
and nothing seems to be processed by Spark from the source (due to a cast exception on the source input?). At least nothing is written to the console. What can be the reason for this?
Full stack trace follows:
Exception in thread "null-4" java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$11$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$11$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at org.apache.spark.sql.execution.streaming.continuous.shuffle.RPCContinuousShuffleWriter.write(RPCContinuousShuffleWriter.scala:51)
at org.apache.spark.sql.execution.streaming.continuous.ContinuousCoalesceRDD$$anonfun$4$$anon$1.run(ContinuousCoalesceRDD.scala:112)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I have tried removing coalesce(1) and replacing the Continuous trigger with a ProcessingTime trigger. This makes the error go away, but the console printout becomes:
Batch: 0
+-----+-----+
|value|count|
+-----+-----+
+-----+-----+
That is, no output, even though many words were indeed injected into the socket. Also, this output is shown only once, and much later than after 1 second.

javax.mail throws IndexOutOfBoundsException when the index does exist

Package used:
com.sun.mail:javax.mail:1.5.6 (from Maven)
I wrote a Scala program that uses javax.mail to deal with emails. In the first part I collect message ids via message.getMessageNumber; later, when I try to retrieve the mails by these ids, an IndexOutOfBoundsException is thrown. Nothing changed on the mail server during the process.
Here is the code that collects the message ids:
val Final = new AndTerm(Subject, Size)
//val FinalTerm = new AndTerm(From)
val messages = inbox.search(Final).map { message =>
  val date = trim(message.getSubject)
  (date, message.getMessageNumber)
}.filter(_._1.isDefined).map(_._2)
inbox.close(true)
store.close
And here is the code where the exception is thrown:
// created another Store and Folder with the same name
val ContentType = messages.map(id => inbox.getMessage(id).getContentType())
inbox.close(true)
store.close
The Exception Message:
Exception in thread "main" java.lang.IndexOutOfBoundsException: 416 > 64
at com.sun.mail.imap.IMAPFolder.checkRange(IMAPFolder.java:513)
at com.sun.mail.imap.IMAPFolder.getMessage(IMAPFolder.java:1770)
at EmailReader.MessageByNumber(EmailReader.scala:67)
at Main$$anonfun$main$1.apply(Main.scala:43)
at Main$$anonfun$main$1.apply(Main.scala:41)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at Main$.main(Main.scala:40)
at Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
The 416 > 64 gives me a hint that maybe there is some server-side limitation. Is that true?
It looks like you're closing the Folder after fetching the Message objects. Message numbers (and Message objects) are only valid while the folder is open.
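A minimal sketch of the fix in Java (the helper name and parameters are hypothetical): read everything you need while the folder is still open, and close it only afterwards.
import java.util.ArrayList;
import java.util.List;
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.search.SearchTerm;

// Hypothetical helper: collect the content types while the folder is open.
static List<String> contentTypes(Folder inbox, SearchTerm term) throws MessagingException {
    List<String> result = new ArrayList<>();
    for (Message message : inbox.search(term)) {
        // Message objects (and their message numbers) are only valid while
        // the folder that produced them remains open, so read them here.
        result.add(message.getContentType());
    }
    return result; // the caller may close the folder and store afterwards
}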
I believe the numbers mean that you passed an id of 416 while the collection only contains 64 messages. getMessage(n) retrieves the message at position n in the folder (message numbers are 1-based, so getMessage(1) is the first in the collection). Instead, the code is passing a message number obtained from the earlier session, which is not directly translatable to a position in the reopened folder.
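If you need identifiers that stay valid across folder sessions, IMAP UIDs are the durable alternative to message numbers. A sketch of the idea, assuming an IMAP server (so the folder implements javax.mail.UIDFolder):
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.MessagingException;
import javax.mail.UIDFolder;

// First session: remember the UID instead of the message number.
static long rememberUid(Folder inbox, Message message) throws MessagingException {
    return ((UIDFolder) inbox).getUID(message);
}

// Later session: UIDs still resolve after the folder has been reopened
// (as long as the folder's UIDVALIDITY has not changed).
static Message fetchByUid(Folder inbox, long uid) throws MessagingException {
    return ((UIDFolder) inbox).getMessageByUID(uid);
}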

Openjpa-2.2.2-r422266:1468616 nonfatal user error caused by ArgumentException and InvalidDataAccessApiUsageException

We have a web application running on Tomcat that uses the code below to find a schema entity.
@Override
public Schema findSchemaByCategoryAndDomainId(String category, Integer domainId) throws Exception
{
    return schemaDao.findByCategoryAndDomainId(category, domainId);
}
The database and ORM we use are as follows:
DataBase: PostgreSQL v9.4 (Windows version)
OpenJPA: version 2.2.2
SpringDataJPA: version 1.3.0.RELEASE
It works fine at the beginning, but after a large number of queries have been made, about 112,000 in 4 hours, it starts failing and throws the exception below:
[ERROR][datacollection.service.DataCollectionServiceImpl.postProbeData():711][16/06/15 11:53:27.147]
Exception:
org.springframework.dao.InvalidDataAccessApiUsageException: Parameter ParameterExpression<Integer> for query "null" exceeds the number of 2 bound parameters with following values "{ParameterExpression<Integer>=0, ParameterExpression<String>=PROBE_DATA}". This can happen if you have declared but missed to bind values for one or more parameters.; nested exception is <openjpa-2.2.2-r422266:1468616 nonfatal user error> org.apache.openjpa.persistence.ArgumentException: Parameter ParameterExpression<Integer> for query "null" exceeds the number of 2 bound parameters with following values "{ParameterExpression<Integer>=0, ParameterExpression<String>=PROBE_DATA}". This can happen if you have declared but missed to bind values for one or more parameters.
at org.springframework.orm.jpa.EntityManagerFactoryUtils.convertJpaAccessExceptionIfPossible(EntityManagerFactoryUtils.java:384)
at org.springframework.orm.jpa.DefaultJpaDialect.translateExceptionIfPossible(DefaultJpaDialect.java:122)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.translateExceptionIfPossible(AbstractEntityManagerFactoryBean.java:417)
at org.springframework.dao.support.ChainedPersistenceExceptionTranslator.translateExceptionIfPossible(ChainedPersistenceExceptionTranslator.java:59)
at org.springframework.dao.support.DataAccessUtils.translateIfNecessary(DataAccessUtils.java:213)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:147)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.jpa.repository.support.LockModeRepositoryPostProcessor$LockModePopulatingMethodIntercceptor.invoke(LockModeRepositoryPostProcessor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy104.findByCategoryAndDomainId(Unknown Source)
at devicemanage.service.appdeploy.AdminServiceImpl.findSchemaByCategoryAndDomainId(AdminServiceImpl.java:2080)
at devicemanage.service.appdeploy.AdminServiceImpl.findSchemaOIdByCategoryAndDomainId(AdminServiceImpl.java:2086)
at devicemanage.service.appdeploy.AdminServiceImpl$$FastClassBySpringCGLIB$$76519d18.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:649)
at devicemanage.service.appdeploy.AdminServiceImpl$$EnhancerBySpringCGLIB$$7f8fa940.findSchemaOIdByCategoryAndDomainId(<generated>)
at datacollection.service.DataCollectionServiceImpl.postProbeData(DataCollectionServiceImpl.java:683)
...
Caused by: <openjpa-2.2.2-r422266:1468616 nonfatal user error> org.apache.openjpa.persistence.ArgumentException: Parameter ParameterExpression<Integer> for query "null" exceeds the number of 2 bound parameters with following values "{ParameterExpression<Integer>=0, ParameterExpression<String>=PROBE_DATA}". This can happen if you have declared but missed to bind values for one or more parameters.
at org.apache.openjpa.kernel.ExpressionStoreQuery$AbstractExpressionExecutor.toParameterArray(ExpressionStoreQuery.java:423)
at org.apache.openjpa.datacache.QueryCacheStoreQuery$QueryCacheExecutor.toParameterArray(QueryCacheStoreQuery.java:481)
at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:857)
at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:794)
at org.apache.openjpa.kernel.DelegatingQuery.execute(DelegatingQuery.java:542)
at org.apache.openjpa.persistence.QueryImpl.execute(QueryImpl.java:286)
at org.apache.openjpa.persistence.QueryImpl.getResultList(QueryImpl.java:302)
at org.apache.openjpa.persistence.QueryImpl.getSingleResult(QueryImpl.java:330)
at org.springframework.data.jpa.repository.query.JpaQueryExecution$SingleEntityExecution.doExecute(JpaQueryExecution.java:123)
at org.springframework.data.jpa.repository.query.JpaQueryExecution.execute(JpaQueryExecution.java:55)
at org.springframework.data.jpa.repository.query.AbstractJpaQuery.doExecute(AbstractJpaQuery.java:95)
at org.springframework.data.jpa.repository.query.AbstractJpaQuery.execute(AbstractJpaQuery.java:85)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:312)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:266)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:136)
... 52 more
We have debugged it, and the parameters (String category and Integer domainId) both look correct. After restarting Tomcat, it works fine again. Is this a JPA issue, or do we need to tune the JPA or database settings?
After we upgraded the spring-data-jpa library (org.springframework.data:spring-data-jpa) to version 1.9.0.RELEASE or later, the issue was solved. We verified this under an asynchronous multi-threaded query test: 8 threads, each issuing one query every 200 ms.

Database errors in Mirth channel

I want to use Mirth to connect to a database, then write a record to a table in that database.
The record contains a field "file_name", and this file name contains a date value, so a new file name would look like this:
temp_2015-08-10
This is what I passed to Mirth Destination SQL field:
INSERT INTO statutory_reports (str_est_id, str_type, str_create_date, str_created, str_record_status, str_file_path, str_file_name, str_created_by) VALUES (2, 'temp', CURDATE(), NOW(),'approved', 'C:/application/reports/temp reports/gumcad/', 'temp'+ ${date.get('yyyy-M-d hh:MM:ss')}, 'SHEP');
The problem is I get an error:
Database Writer error
ERROR MESSAGE: Failed to write to database
com.mirth.connect.connectors.jdbc.DatabaseDispatcherException: Failed to write to database
at com.mirth.connect.connectors.jdbc.DatabaseDispatcherQuery.send(DatabaseDispatcherQuery.java:143)
at com.mirth.connect.connectors.jdbc.DatabaseDispatcher.send(DatabaseDispatcher.java:103)
at com.mirth.connect.donkey.server.channel.DestinationConnector.handleSend(DestinationConnector.java:738)
at com.mirth.connect.donkey.server.channel.DestinationConnector.process(DestinationConnector.java:436)
at com.mirth.connect.donkey.server.channel.DestinationChain.call(DestinationChain.java:155)
at com.mirth.connect.donkey.server.channel.Channel.process(Channel.java:1656)
at com.mirth.connect.donkey.server.channel.Channel.dispatchRawMessage(Channel.java:1155)
at com.mirth.connect.donkey.server.channel.SourceConnector.dispatchRawMessage(SourceConnector.java:191)
at com.mirth.connect.donkey.server.channel.SourceConnector.dispatchRawMessage(SourceConnector.java:169)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.processRecord(DatabaseReceiver.java:200)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.processResultSet(DatabaseReceiver.java:160)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.poll(DatabaseReceiver.java:117)
at com.mirth.connect.donkey.server.channel.PollConnector$PollConnectorTask.run(PollConnector.java:131)
at java.util.TimerThread.mainLoop(Unknown Source)
at java.util.TimerThread.run(Unknown Source)
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Truncated incorrect DOUBLE value: '2015-8-10 09:08:44'
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4206)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4140)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2597)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2758)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2826)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2082)
at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1302)
at com.mirth.connect.connectors.jdbc.DatabaseDispatcherQuery.send(DatabaseDispatcherQuery.java:130)
The problem is that the database expects yyyy-MM-dd for a DATE, and you are providing yyyy-M-d hh:MM:ss (note the single-digit month; also, in these patterns MM means month and mm means minutes, so 'hh:MM:ss' repeats the month where the minutes should be).
Format your date correctly with a two-digit month and remove the time part. If you want to provide the time, your database type should be DATETIME.
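For illustration, a corrected version of the statement. One note beyond the answer above: MySQL concatenates strings with CONCAT() rather than +, so the file-name expression is rewritten here as well, using the underscore from the example name temp_2015-08-10.
INSERT INTO statutory_reports (str_est_id, str_type, str_create_date, str_created, str_record_status, str_file_path, str_file_name, str_created_by)
VALUES (2, 'temp', CURDATE(), NOW(), 'approved', 'C:/application/reports/temp reports/gumcad/', CONCAT('temp_', ${date.get('yyyy-MM-dd')}), 'SHEP');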
It's pretty descriptive: Truncated incorrect DOUBLE value: '2015-8-10 09:08:44'.
The value (a date) is not of type double.
Either str_create_date or str_created is defined as DOUBLE in your DB, but you are writing a date value to it, which does not match.
If this is not the case, can you post your DB schema here for validation?
Edit /opt/mirthconnect/conf/mirth.properties (e.g. with vim) and set the database URL to:
jdbc:mysql://localhost/mirthdb?useUnicode=true&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=UTC
The newer MySQL JDBC driver seems to have this as a requirement, I think for security reasons.