JPA Date is output as the wrong date/time when running on WebLogic server, but displays correctly when running in local embedded Tomcat

I have a Spring Boot application using Spring Data to retrieve data from an Oracle database. When I run the code locally in embedded Tomcat, the date appears correctly. However, the same code deployed to a WebLogic server gives a completely different date/time result. I have confirmed that the timezone on the WebLogic server is the same as my local timezone (US/Eastern).
What is odd is that the minutes are stripped out and always set to 00, and the difference between the correct date and the displayed date is unpredictable (one example is 16 hours behind; in a different example, it's 19 hours).
JPA mapping:
import java.util.Date;
...
@Temporal(TemporalType.TIMESTAMP)
@Column(name = "INITIAL_CREATE_DATE")
private Date initialCreateDate;
controller log statement:
SimpleDateFormat sdf = new SimpleDateFormat("MM/dd/yyyy hh:mm z");
...
log.info("~~create date: " + medley.getInitialCreateDate() + " sdf " + sdf.format(medley.getInitialCreateDate()));
output from WebLogic (WRONG):
~~create date: Thu Oct 26 20:00:00 EDT 2017 sdf 10/26/2017 08:00 EDT
output from local Tomcat and Spring Data unit tests (CORRECT):
~~create date: 2017-10-27 11:57:53.0 sdf 10/27/2017 11:57 EDT
Two things stand out to me (besides the date/time being totally wrong):
1. The time is missing the minutes
2. The .toString() output format for the date is different
Any help or ideas how I can further troubleshoot this problem are very much appreciated!

I figured my problem out - posting my answer, maybe it will help someone else.
My "medley" object was not retrieved directly from the database here. It was retrieved by a job that then posted this medley object as JSON to this method here that outputs the date. While I still don't understand the odd date/time behavior, at least now I know the fix.
My previous version of the code was using @Temporal(TemporalType.DATE)
and serializing the JSON as this...
"initialCreateDate": "2017-10-27",
I don't fully understand why this was being converted to Thu Oct 26 20:00:00 (presumably the bare date was read back as midnight UTC, which is 8:00 PM EDT the previous day), but at least I know that applying @Temporal(TemporalType.TIMESTAMP) in my job fixed the problem (i.e. updating the version of the repository code in pom.xml to the same version used by the service).
This is the correct serialization of the date/time:
"initialCreateDate":1509119873000

Related

FileNet solution deployment takes too much time

Deploying a solution with IBM Case Manager Builder takes too much time. When I look at the "Detail Deployment Log" file, I can see the following steps take too much time.
11/5/18 8:55:55 PM GMT+05:30 FNRPA0120I The solution X001 pages are being deployed.
11/5/18 9:08:24 PM GMT+05:30 FNRPA0292I The following choice list is being deployed: X001_Category to.
and
11/5/18 9:14:22 PM GMT+05:30 FNRPA0116I The latest version of the Process Engine configuration document with a version series ID of {XXX-0000-C9B2-81BD-YYY} is being deployed.
11/5/18 9:22:46 PM GMT+05:30 FNRPA0118I The Process Engine XPDL document with an ID of {ZZZ-0100-C854-A789-HHH} is being deployed.
I feel like it's because of too many versions. Without copying the solution to a new solution ID, what can I do? Are there any best practices I can use to avoid this?

Kafka Connect JSON format

I have a topic titled newtest in Kafka with three messages:
Hello
Is anybody out there
Can you hear me
...and I have the following config for a connect job:
{
  "name": "connect-test-9",
  "config": {
    "connector.class": "FileStreamSink",
    "file": "connector-test",
    "topics": "newtest",
    "name": "connect-test-9",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter.schemas.enable": "false",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "transforms": "Hoist, AddTimestamp",
    "transforms.Hoist.type": "org.apache.kafka.connect.transforms.HoistField$Value",
    "transforms.Hoist.field": "line",
    "transforms.AddTimestamp.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.AddTimestamp.timestamp.field": "Timestamp"
  }
}
I'm getting the following output in the file connector-test:
Struct{line=Hello,Timestamp=Mon Mar 12 14:50:34 PDT 2018}
Struct{line=Is anybody out there,Timestamp=Mon Mar 12 14:50:44 PDT 2018}
Struct{line=Can you hear me,Timestamp=Mon Mar 12 14:50:52 PDT 2018}
I would like to get this:
{"line":"Hello","Timestamp":"Mon Mar 12 14:50:34 PDT 2018"}
{"line":"Is anybody out there","Timestamp":"Mon Mar 12 14:50:44 PDT 2018"}
{"line":"Can you hear me","Timestamp":"Mon Mar 12 14:50:52 PDT 2018"}
I've tried changing the value.converter, with no luck (parse exception). I also have another topic where the messages are already JSON; the parse succeeds there, and I can add a Timestamp without Hoist, but the output is in the same non-JSON format {key1=value1,key2=value2}.
Any way I can get the output in proper JSON?
This is the parse exception that I see:
com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'Can': was expecting ('true', 'false', or 'null')
To get JSON output, you need to use the JsonConverter rather than the StringConverter. The conversion happens after the consumer reads the raw bytes and before the sink task receives the record, so the converter determines the format the sink sees.
Your data is already in Kafka; you used a source connector, perhaps with a StringConverter, to ingest it, and the transforms convert each record to the internal Struct. A sink connector can then be set up with a different converter type to write it back out.
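As a sketch, the sink-side overrides would look like this (using the stock org.apache.kafka.connect.json.JsonConverter that ships with Connect, and assuming a topic whose messages are valid JSON):

"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false",

Note that pointing the JsonConverter at the plain-string newtest topic is exactly what produces the JsonParseException quoted above: "Can you hear me" is not valid JSON, so the converter fails on the token 'Can'.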

How to format the WildFly/JBoss startup time in the log?

The WildFly log shows the startup time in a format like "started in 100000ms". The format I expected is "started in x minutes y seconds", because a raw millisecond count is too long to read for an enterprise application. How can I configure that?

Fail2Ban not working on Ubuntu 16.04 (Date issues)

I have a problem with Fail2Ban:
2018-02-23 18:23:48,727 fail2ban.datedetector [4859]: DEBUG Matched time template (?:DAY )?MON Day 24hour:Minute:Second(?:\.Microseconds)?(?: Year)?
2018-02-23 18:23:48,727 fail2ban.datedetector [4859]: DEBUG Got time 1519352628.000000 for "'Feb 23 10:23:48'" using template (?:DAY )?MON Day 24hour:Minute:Second(?:\.Microseconds)?(?: Year)?
2018-02-23 18:23:48,727 fail2ban.filter [4859]: DEBUG Processing line with time:1519352628.0 and ip:158.140.140.217
2018-02-23 18:23:48,727 fail2ban.filter [4859]: DEBUG Ignore line since time 1519352628.0 < 1519381428.727771 - 600
It says "ignoring Line" because the time skew is greater than the inspection period. However, this is not the case.
If indeed 1519352628.0 is derived from Feb 23, 10:23:48, then the other date: 1519381428.727771 must be wrong.
I have run tests for 'invalid user' hitting this repeatedly. But Fail2ban is always ignoring the line.
I am positive I am getting Filter Matches within 1 second.
This is Ubuntu 16.04 and Fail2ban 0.9.3
Thanks for any help you might have!
Looks like there is a time zone issue on your machine that might cause the confusion. Try to set the correct time zone and restart both rsyslogd and fail2ban.
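For example, on Ubuntu 16.04 (a sketch; US/Pacific here is an assumption based on the -08:00 offset discussed below, so substitute the zone your logs should use):

sudo timedatectl set-timezone US/Pacific
sudo service rsyslog restart
sudo service fail2ban restart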
Regarding your debug log:
1519352628.0 = Feb 23 02:23:48 (UTC)
-> timestamp parsed from the log line reading Feb 23 10:23:48, with an 08:00 time zone offset subtracted!
1519381428.727771 = Feb 23 10:23:48 (UTC)
-> timestamp of the current time when fail2ban processed the log.
Coincidentally, this is the same as the time in the log file. That's what makes it so confusing in this case.
1519381428.727771 - 600 = Feb 23 10:13:48 (UTC)
-> the limit for how far to look backwards in time in the log file, since you've set findtime = 10m in jail.conf.
Fail2ban 'correctly' ignores the log entry, which appears to be older than 10 minutes because of the -08:00 time zone offset.
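To make the eight-hour gap explicit, here is a quick java.time check of the two epoch values from the debug log (just a sketch to verify the arithmetic; fail2ban itself does this in Python):

import java.time.Instant;
import java.time.ZoneOffset;

public class Fail2banTimeCheck {
    public static void main(String[] args) {
        // Epoch seconds fail2ban parsed from the log line "Feb 23 10:23:48"
        Instant parsed = Instant.ofEpochSecond(1519352628L);
        // Epoch seconds of "now", when fail2ban processed the line
        Instant now = Instant.ofEpochSecond(1519381428L);

        System.out.println(parsed.atOffset(ZoneOffset.UTC)); // 2018-02-23T02:23:48Z
        System.out.println(now.atOffset(ZoneOffset.UTC));    // 2018-02-23T10:23:48Z

        // The gap is exactly eight hours (28800 s), far more than findtime = 600 s
        System.out.println(now.getEpochSecond() - parsed.getEpochSecond()); // 28800
    }
}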
btw:
If you need IPv6 support for banning, consider upgrading fail2ban to v0.10.x.
And there is also a brand new fail2ban version v0.11 (not yet marked stable, but running without issue for over a month on my machines) that has a wonderful new auto-increment bantime feature, enabled as sketched below.
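If you try v0.11, the incremental ban time is enabled in jail.local with settings along these lines (option names as in the 0.11 jail.conf; treat this as a sketch):

[DEFAULT]
# grow the ban time exponentially for repeat offenders
bantime.increment = true
bantime.factor = 1
# cap the ban time at five weeks
bantime.maxtime = 5w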

OpsHub Visual Studio Online Migration Utility - Error saving configuration OpsHub-001105

In the OpsHub Visual Studio Online Migration Utility, we get an error when pressing "Finish" on a new migration.
The error message says:
Configuration failed due to following reason(s):
com.opshub.exceptions.DataValidationException: OpsHub-001105: Can not parse date "Thu Jan 01 1970 01:00:00". Expected format EEE MMM d yyyy H:m:s
We have tried changing the regional settings to "English (United States)" and restarting the server, but it didn't help.
Last entry in the OpsHub.log:
06/23/2017 10:50:06,644 ERROR [http-nio-8989-exec-7] (com.opshub.eai.config.service.ConfigServiceImpl) - OpsHub-001105: Can not parse date "Thu Jan 01 1970 01:00:00". Expected format EEE MMM d yyyy H:m:s
com.opshub.exceptions.DataValidationException: OpsHub-001105: Can not parse date "Thu Jan 01 1970 01:00:00". Expected format EEE MMM d yyyy H:m:s
at com.opshub.utils.DateUtils.convertStringToCalendar(DateUtils.java:235)
at com.opshub.utils.DateUtils.convertDateStringToTimeStamp(DateUtils.java:218)
at com.opshub.eai.business.EaiConfigBO.insertPollingTimeKeyForAudit(EaiConfigBO.java:1072)
at com.opshub.eai.business.EaiConfigBO.createOrUpdateEAIConnector(EaiConfigBO.java:382)
at com.opshub.eai.config.business.ConfigServiceBusiness.createIntegration(ConfigServiceBusiness.java:1320)
at com.opshub.eai.config.business.ConfigServiceBusiness.generateIntegrationsAndMappings(ConfigServiceBusiness.java:1049)
at com.opshub.eai.config.business.ConfigServiceBusiness.generateIntegrationAndMappings(ConfigServiceBusiness.java:504)
at com.opshub.eai.config.service.ConfigServiceImpl.generateIntegration(ConfigServiceImpl.java:197)
at com.opshub.eai.config.service.ConfigServiceImpl.generateIntegrations(ConfigServiceImpl.java:159)
at com.opshub.eai.config.service.ConfigServiceImpl$$EnhancerByCGLIB$$93f9f889.CGLIB$generateIntegrations$11(<generated>)
at com.opshub.eai.config.service.ConfigServiceImpl$$EnhancerByCGLIB$$93f9f889$$FastClassByCGLIB$$d590bcb9.invoke(<generated>)
...
This is something that is being fixed in the tool. Until then, the workaround for this issue is to change the machine locale to 'US'.
Once you change the locale (while logged in as some machine user), you'll have to configure your OS to run the tool as that user (since by default 'Local System' runs most services). So, open services.msc, find the service named 'OpsHub Visual Studio Online Migration Utility' and change its log-on user to the one whose locale you changed.
Now, restart the utility and the workaround should help you avoid the error.
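The likely root cause is locale-sensitive date parsing: SimpleDateFormat only matches "Thu" and "Jan" when the parsing locale uses English day and month names, which is why forcing the US locale helps. A minimal sketch of the effect (hypothetical class name; the pattern is taken from the error message):

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Locale;

public class LocaleParseDemo {
    public static void main(String[] args) throws ParseException {
        String input = "Thu Jan 01 1970 01:00:00";
        String pattern = "EEE MMM d yyyy H:m:s";

        // Parses fine when the locale supplies English day/month names
        SimpleDateFormat us = new SimpleDateFormat(pattern, Locale.US);
        System.out.println(us.parse(input)); // Thu Jan 01 01:00:00 ... 1970

        // Throws ParseException under a locale with different names
        // ("Do." for Thursday in German, for example)
        SimpleDateFormat de = new SimpleDateFormat(pattern, Locale.GERMAN);
        de.parse(input); // java.text.ParseException: Unparseable date: "Thu Jan 01 1970 01:00:00"
    }
}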