Parsing a log4j log with Perl [duplicate]

We have several applications that use log4j for logging. I need to get a log4j parser working so we can combine multiple log files and run automated analysis on them. I'm not looking to reinvent the wheel, so can someone point me to a decent pre-existing parser? I do have the log4j conversion pattern if that helps.
If not, we'll have to roll our own.

I didn't realize that Log4j ships with an XML layout (XMLLayout).
The solution was: specify an appender with XMLLayout in the logging configuration file, include that output XML file as an external entity in a well-formed wrapper XML file, and then parse the XML using your favorite technique.
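For illustration, a minimal sketch of the two pieces, assuming log4j 1.x (the appender name and file paths are placeholders). First, the appender using XMLLayout:
<appender name="XML" class="org.apache.log4j.FileAppender">
  <param name="File" value="app.log.xml"/>
  <layout class="org.apache.log4j.xml.XMLLayout"/>
</appender>
Then a wrapper document that pulls the appender's output in as an external entity, since XMLLayout output is a fragment rather than a well-formed document on its own:
<?xml version="1.0"?>
<!DOCTYPE log4j:eventSet [<!ENTITY data SYSTEM "app.log.xml">]>
<log4j:eventSet xmlns:log4j="http://jakarta.apache.org/log4j/">
  &data;
</log4j:eventSet>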
The other methods had the following limitations:
Apache Chainsaw - not automated enough
JDBC - poor performance in a high-performance distributed app

You can use OtrosLogViewer with batch processing. You have to:
Define your log format; you can use the Log4j pattern layout parser or Log4j XMLLayout.
Create a Java class that implements LogDataParsedListener. The method public void logDataParsed(LogData data, BatchProcessingContext context) will be called for every parsed log event (see the sketch after this list).
Package it as a JAR.
Run OtrosLogViewer, specifying your log-processing JAR, the LogDataParsedListener implementation, and the log files.
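A minimal sketch of step 2, using the LogData and BatchProcessingContext types named above (the package imports and the getLevel() accessor are assumptions; check them against the OtrosLogViewer version you use):
// Import paths are assumptions; adjust to your OtrosLogViewer jar.
import pl.otros.logview.LogData;
import pl.otros.logview.batch.BatchProcessingContext;
import pl.otros.logview.batch.LogDataParsedListener;

public class ErrorCountingListener implements LogDataParsedListener {

    private int errorCount = 0;

    // Called once for every parsed log event.
    public void logDataParsed(LogData data, BatchProcessingContext context) {
        // getLevel() is assumed to expose the event's severity.
        if ("ERROR".equals(String.valueOf(data.getLevel()))) {
            errorCount++;
        }
    }
}
Compile this against the OtrosLogViewer jar, package it, and pass the jar and the listener class name to OtrosLogViewer as described above.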

What you are looking for is called SawMill, or something like it.

Log4j log files aren't really suitable for parsing; they're too complex and unstructured. There are third-party tools that can do it, I believe (e.g. Sawmill).
If you need to perform automated, custom analysis of the logs, you should consider logging to a database and analysing that. Log4j ships with the JDBCAppender, which writes all messages to a database of your choice, but it has performance implications and it's a bit flaky. There are other, similar alternatives on the interweb, though (like this one).
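If you do go the database route despite the caveats, a minimal log4j 1.x configuration sketch looks like this (the driver, connection details, and table layout are placeholders to adapt):
<appender name="DB" class="org.apache.log4j.jdbc.JDBCAppender">
  <param name="URL" value="jdbc:mysql://localhost/logdb"/>
  <param name="driver" value="com.mysql.jdbc.Driver"/>
  <param name="user" value="logger"/>
  <param name="password" value="secret"/>
  <param name="sql" value="INSERT INTO log_events (created, level, logger, message) VALUES ('%d', '%p', '%c', '%m')"/>
</appender>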

You -can- use Log4j's Chainsaw V2 to process the various log files and collect them into one table, and either output those events as XML or use Chainsaw's built-in expression-based filtering, searching & colorizing support to slice & dice the logs.
Steps:
- Start Chainsaw V2
- Create a chainsaw configuration file by copying the example configuration file available from the Welcome tab - define one LogFilePatternReceiver 'plugin' entry for each log file that you want to process
- Start Chainsaw with that configuration
- Each log file will end up as a separate tab in the UI
- Pause the chainsaw-log tab and clear the events from that tab
- Create a new tab which aggregates the events from the various tabs by going to the 'View, Create custom expression LogPanel' menu item and entering 'level >= DEBUG' in the box. It will create a new tab containing events from all of the tabs with level >= DEBUG (which is why you cleared the chainsaw-log tab).
You can get an overview of the expression syntax used to filter, colorize and search from the tutorial (available from the Help menu).
If you don't want to use Chainsaw, you can do something similar: start a simple app that doesn't log but loads a log4j.xml config file with the 'plugin' receiver entries you defined for the Chainsaw configuration, and also define a FileAppender with an XMLLayout; all of the events received by the receivers will be sent to that single appender.
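A sketch of what that log4j.xml might contain; the receiver class path, logFormat, and file locations are assumptions to adapt to your own logs:
<plugin name="app1Receiver" class="org.apache.log4j.varia.LogFilePatternReceiver">
  <param name="fileURL" value="file:///logs/app1.log"/>
  <param name="logFormat" value="TIMESTAMP LEVEL [LOGGER] MESSAGE"/>
  <param name="timestampFormat" value="yyyy-MM-dd HH:mm:ss,SSS"/>
</plugin>
<appender name="AGGREGATE" class="org.apache.log4j.FileAppender">
  <param name="File" value="combined-events.xml"/>
  <layout class="org.apache.log4j.xml.XMLLayout"/>
</appender>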

Related

GSettings, glib-compile-schemas and Eclipse

I am building this Gtkmm3 application in Ubuntu and wanted to explore GSettings. All was going well while following the instructions on the 'Using GSettings' page, and then it was time to configure the make files. I use the Eclipse 2019-12 IDE with CDT (v9.10) and 'GNU Make Builder' as the builder. I'm totally perplexed as to how to introduce the macros listed on the GNOME page into the make files. I even tried changing the project to a 'C/C++ Autotools Project' using Eclipse, but the make files needed to add the macros were still missing. Creating a new project with GNU Autotools does create the necessary make files, but I was not able to get pkg-config to work with it.
Can anyone point me to some resource which explains how to compile the schema and how and where to load the resultant binary file (externally if necessary)? I'll consider myself blessed if someone has already made a Gtkmm3 C++ application with GSettings support using the Eclipse IDE on Linux and can share the details.
Finally I figured it out. I thought I'd share my findings here. Someone out there had already explained this for Python (link below).
Using GSettings with Python/PyGObject
Creating the schema
For the developer the work starts with defining a schema for the settings. A schema is an XML file that looks something like this.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE schemalist SYSTEM "gio_gschema.dtd">
<schemalist>
  <schema id="org.gtk.skanray.emlibrary" path="/org/skanray/emlibrary/" gettext-domain="emlibrary">
    <key name="wave-pressure-ptrach-visible" type="b">
      <default>true</default>
      <summary>Set visibility of 'Ptrach' trace in pressure waveform.</summary>
      <description>The pressure waveform shows multiple traces where 'PAW' is always enabled and additionally 'Ptrach' can be displayed. This setting affects the visibility of the tracheal pressure trace shown in this waveform channel.</description>
    </key>
  </schema>
</schemalist>
The file name must end with a '.gschema.xml' suffix. The schema file is kept in the project path only so that it gets pushed to SVN.
It is best to use an XML editor (e.g. Eclipse) that supports designing XML files from a DTD file. Use the following DTD file.
gschema.dtd
It is possible to store anything derived from GVariant in GSettings. Refer to the following page to understand the basic types and the 'type' attribute to be used in the schema; for example, 'b' is boolean (as used above), 'i' is int32, 'd' is double, and 's' is string.
GVariant Format Strings
Compiling the schema
With the schema ready, (sudo) copy it into /usr/share/glib-2.0/schemas/ then run,
> sudo glib-compile-schemas /usr/share/glib-2.0/schemas/
At this point, the newly added settings can be seen / modified using dconf editor.
Accessing GSettings from the application
Coming to the main event of the show, this is how an application can read (and/or write) settings. It is not necessary to bind a property of an object to a 'key' in GSettings; the value may simply be queried and used as well. Refer to the GSettings API reference for details.
Glib::RefPtr<Gio::Settings> refSettings = Gio::Settings::create("org.gtk.skanray.emlibrary");
CLineTrace * pTrace = NULL; // CLineTrace is derived from Gtk::Widget
…
pTrace = …
…
if(refSettings)
{
    // Bind the widget's "visible" property to the GSettings key.
    refSettings->bind("wave-pressure-ptrach-visible",
        pTrace,
        "visible",
        Gio::SETTINGS_BIND_DEFAULT);
}
Now you can fire up dconf editor and test the settings.
NOTE
Bindings are usually made in class constructors. However, binding to the 'visible' property of a widget can be a bit tricky. Typically the top-level window calls show_all() as the last line of its constructor, by which time the constructors of the top-level window's children have finished executing, including making the bindings. If a setting had stored 'visibility' as false, the top-level window's call to show_all() would clobber that setting. In such cases it is advisable to perform the bind once in the on_map() handler of the respective class.

Eclipse plugin Incubator's "Web Templates (Advanced)" plugin (with secured Redmine): Failed to parse RSS feed / invalid xml

Trying to connect a restricted Redmine instance to our Eclipse Mylyn environment: it worked in the beginning, but re-imports failed with the error "Failed to parse RSS feed".
I stumbled across Eclipse Mylyn ticket #246440, where the suggested workaround was to recreate the Task Repository, including the Task List Queries, by hand.
But this is not a nice solution.
So I played around a bit more and found the following, which solved our import issues:
most likely for your needs: remove the key value (or other security-relevant data) from the exported <task list query>.xml.zip / tasklist.xml, since the queries contain user-dependent authentication data (e.g. if shared with other users)
it should be configured on your related Task Repository for all dependent queries anyway, and will be restored automatically on a later import
make sure (e.g. where a formatter, Ctrl+F, or manual formatting was used) that there is no whitespace in text-value XML nodes, because otherwise the queries may stop working after import:
e.g.
<Attribute Key="Regexp">^({Id}\d+);({Type}[^;]*);...$
</Attribute>
should be:
<Attribute Key="Regexp">^({Id}\d+);({Type}[^;]*);...$</Attribute>
go to Task List -> <your imported query> -> right click -> Properties -> Finish, so some internal magic "fixes" your query
Another debugging hint: you can always check the retrieved files (and the Query Pattern regexp, using the Preview button) via the <your query> -> Properties -> Advanced Configuration -> Open button, which should put the unparsed query result in e.g. c:\Users\<loginname>\AppData\Local\Temp\mylyn-web-connector4155864524987884464.html.
By the way (if you are at the above point, this may well be useful for you or your team): using the web connector, I found the integration via the API key in combination with the .../issues.csv... format much more useful and configurable than the .../issues.xml... variant.
We used something like this for parsing the CSV (and generated the params, their order etc. via the normal filter dialogs): ^({Id}\d+);({Type}[^;]*);({Status}[^;]*);"?({Owner}[^";]*)"?;({Description}[^;]*)$.
The advantages are: an easier regexp, concatenable data for Description via column ordering, and fetching all data without paging (so we could skip page, per_page, limit, offset).

Log timestamp of an exception in Play?

I have some Play code that calls some Spark functions. Sometimes things break, and I want a timestamp associated with the event. I currently get the error messages printed to STDOUT without a timestamp and am wondering if there's a way to configure logger.xml to associate timestamps with these.
Providing your logger.xml file would have been helpful, but anyway: look for the "pattern" element for your STDOUT appender in the logger.xml file and change it by prepending %d{ISO8601}.
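For example, the STDOUT appender section in logger.xml might then look like this (everything in the pattern other than %d{ISO8601} is a placeholder to adapt):
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
  <encoder>
    <pattern>%d{ISO8601} %level %logger{15} - %message%n%xException</pattern>
  </encoder>
</appender>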
Check the logback configuration documentation for more information.
Logback Configuration

How to call java code in a talend job

I'm new to Talend. I have Java code which needs to get data from files, and I want to use it in a Talend job. I'm now facing the problem of how to use this Java code in Talend: I created a routine, but I'm having trouble creating the JAR files, and I also don't know how I should use the routine in my job.
I don't know exactly what you want to do, but normally you would use the built-in Talend components for reading a file.
Depending on the file you are going to read, you can use:
tFileInputRaw - for reading a file line by line
tFileInputDelimited - for reading CSV files (getting a set of columns)
tFileInputExcel - for XLS/XLSX files (getting a set of columns)
If you want to use your code anyway, you have to make your routine available to your job. To do that, close your job, right-click the job, and choose "setup routine dependencies". You should be able to add a routine by clicking the green "+" button.
After that you are able to use your functions in a tJava or tJavaRow component with routines.ClassName.functionName().
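As a minimal sketch (the class and method names here are hypothetical), a routine is just a class in the routines package exposing static methods:
package routines;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class FileHelper {

    // Reads and returns the first line of the given file.
    public static String firstLine(String path) throws IOException {
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            return reader.readLine();
        }
    }
}
In a tJava component you could then call it as routines.FileHelper.firstLine("/data/input.txt").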

CruiseControl.NET: Changing log file appearance

I would like to change the appearance of the log file generated by CCNet. It is useful that the error messages are separated from the original log messages, but for debugging it is tricky to see when an error actually happened. Our PowerShell script runs for 6-8 hours and creates about 38k lines in the log file, so I would really appreciate a solution for listing the errors inline with the other lines in the log file. Additionally, it would be nice if all the errors still appeared separately.
So far I have not found much documentation explaining how to change the log file output...
Simon
Not sure how this is logged, but in the end the logs produced during the build are put into the build log file, which you will find in the artifacts folder.
These logs are then transformed into HTML output using XSL transforms. If none of the built-in reports is useful to you, you can create a custom XSL file and plug it in; see the dashboard.config file. The following section allows adding additional XSL transforms:
<buildReportBuildPlugin>
  <xslReportBuildPlugin description="MSBuild Log" actionName="MSBuildBuildReport" xslFileName="xsl\MSBuild4Log.xsl"/>
  ...
If you know what the error messages are going to be, you can parse them with an XSL file and generate some HTML that will show up in the build emails. The following goes in ccservice.exe.config:
<xslFiles>
<file name="c:\path\to\custom_errors.xsl"/>
</xslFiles>
custom_errors.xsl is an XSL file that finds the error messages in the raw build log XML and then generates HTML from them; this HTML will show up in the build emails. You have to create custom_errors.xsl yourself. It's a significant amount of work to get it working the first time, especially if you're new to XML/XSL/HTML/CSS. If you undertake this, I suggest doing all the testing outside of CCNet using an XSL transformer and a sample CCNet build log as input. CCNet uses a CSS file to style the HTML, so be aware of that; you can edit this too.
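As a starting point, a minimal custom_errors.xsl might look like the following; the //error select is an assumption, since the element names depend on how errors actually appear in your raw build log XML:
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html"/>
  <xsl:template match="/">
    <ul class="error-list">
      <xsl:for-each select="//error">
        <li><xsl:value-of select="."/></li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>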
Note you have to restart the ccnet service after editing ccservice.exe.config.