How to get application DEBUG and INFO messages properly logged in Play 2.4?

From the Play 2.4 documentation, the default logging level for the application logger should be DEBUG, right?
<logger name="play" level="INFO" />
<logger name="application" level="DEBUG" />
However, in my logs I only get WARN and ERROR level messages.
For example this code:
import play.api.Logger
import play.api.mvc._

class Application extends Controller {
  val log = Logger(this.getClass)

  def index = Action {
    log.debug("debug")
    log.info("info")
    log.warn("warn!")
    log.error("ERROR")
    Ok("ok")
  }
}
...only yields this in stdout (ditto in logs/application.log):
[warn] c.Application - warn!
[error] c.Application - ERROR
How to get application DEBUG and INFO messages properly logged?
Using Play 2.4.3, with basically default configs, and no conf/logback.xml at all. (SBT-based project setup, no Typesafe Activator.)
To clarify, I know I can create a custom config file (conf/logback.xml) for Logback. That is obvious from the documentation I linked to in the very first sentence.
The point here was: if my needs are extremely ordinary (getting my app's messages logged, including DEBUG and INFO), do I really need to create a lengthy custom configuration file? One would assume a thing as basic as this would work by default, or with some minimal config option. After all, Play Framework is touted as having a good developer experience, and many things in it follow the "convention over configuration" principle.

What I learned from a colleague in our backend chat:
Your Application controller probably resides in the controllers package, right? When you do Logger(getClass), getClass is used to look up the package path of Application, which would then be controllers.Application. So you can, for example, add a line <logger name="controllers" level="DEBUG" /> to get debug output from classes in the controllers package.
There is one way without custom configs (which worked for INFO but not DEBUG in a quick experiment). But it has significant drawbacks compared to using more granular loggers (as in my question).
The "application" logger is the default logger name that's used if you
use the Logger object directly, like Logger.info("Hello, world!"), as
opposed to creating your own instances
[...]
But that will clearly backfire quickly since then you lose granular
configuration of your logs and can only filter the logging "globally",
so I never use that.
Also, your logs will not disclose where the log statement is
made but instead just print that it's in "application"
I don't want those drawbacks, so I did create conf/logback.xml (starting with a copy of the default) and added custom loggers along the lines of:
<logger name="controllers" level="DEBUG" />
<logger name="services" level="DEBUG" />
<logger name="repositories" level="DEBUG" />
So now my val log = Logger(this.getClass) approach works.
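For reference, I started from a copy of Play's default logback config; stripped to the essentials, the file looks roughly like this (a sketch only, your appender and pattern details may differ):
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date [%level] %logger{15} - %message%n%xException</pattern>
    </encoder>
  </appender>

  <logger name="play" level="INFO" />
  <logger name="controllers" level="DEBUG" />
  <logger name="services" level="DEBUG" />
  <logger name="repositories" level="DEBUG" />

  <root level="WARN">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>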
But I fail to see how requiring 30-40 lines of custom XML for pretty much the most basic thing imaginable is good developer experience. If some Play advocate or developer could justify why this doesn't work out of the box with default config, I'd be interested in hearing that.

You have no .xml file at all inside the conf folder?
Adding this line to conf/logback.xml should fix it for you:
<logger name="controllers" level="DEBUG" />
You can also override it in the application.conf file, although this will be deprecated in the future:
logger.controllers=DEBUG

Related

GSettings, glib-compile-schemas and Eclipse

I am building this Gtkmm3 application in Ubuntu and wanted to explore GSettings. All was going well while following the instructions at the 'Using GSettings' page and then it was time to configure the make files. I use Eclipse 2019-12 IDE with CDT (V9.10) and 'GNU Make Builder' as the builder. I'm totally perplexed as to how to introduce the macros listed in the GNOME page into the make files. I even tried changing the project to a 'C/C++ Autotools Project' using Eclipse but still the necessary make files to add the macros were missing. Creating a new project with GNU Autotools does create the necessary make files but I was not able to get pkg-config to work with it.
Can anyone point me to some resource which explains how to compile the schema and how & where to load the resultant binary file (externally if necessary). I'll consider myself blessed if someone has already made a Gtkmm3 C++ application with GSettings support using Eclipse IDE in Linux and can share the details.
Finally I figured it out. Thought I'd share my findings here. Someone out there had actually explained this for Python (link below).
Using GSettings with Python/PyGObject
Creating the schema
For the developer the work starts with defining a schema for the settings. A schema is an XML file that looks something like this.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE schemalist SYSTEM "gio_gschema.dtd">
<schemalist>
  <schema id="org.gtk.skanray.emlibrary" path="/org/skanray/emlibrary/" gettext-domain="emlibrary">
    <key name="wave-pressure-ptrach-visible" type="b">
      <default>true</default>
      <summary>Set visibility of 'Ptrach' trace in pressure waveform.</summary>
      <description>The pressure waveform shows multiple traces where 'PAW' is always enabled and additionally 'Ptrach' can be displayed. This setting affects the visibility of the tracheal pressure trace shown in this waveform channel.</description>
    </key>
  </schema>
</schemalist>
The file name has to have a '.gschema.xml' suffix. The schema file should be in the project path only so that it gets pushed to SVN; at runtime the compiled schema is read from the system schema directory (see below).
It is best to use an XML editor (e.g. Eclipse) that supports designing XML files from a DTD file. Use the following DTD file:
gschema.dtd
It is possible to store anything derived from GVariant in GSettings. Refer to the following page to understand the basic types and the 'type' attribute to be used in the schema:
GVariant Format Strings
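For example, type="b" as used above denotes a boolean; other common type codes are "i" (int32), "d" (double), "s" (string), and "as" (array of strings).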
Compiling the schema
With the schema ready, (sudo) copy it into /usr/share/glib-2.0/schemas/ and then run:
> sudo glib-compile-schemas /usr/share/glib-2.0/schemas/
At this point, the newly added settings can be seen / modified using dconf editor.
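You can also inspect or change the value from a terminal using the gsettings CLI (schema id and key from the example above):
> gsettings get org.gtk.skanray.emlibrary wave-pressure-ptrach-visible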
Accessing GSettings from the application
Coming to the main event of the show, this is how an application can read (and/or write) settings. It is not necessary to bind a property of an object to a 'key' in GSettings; a value can also be queried and used directly. Refer to the GSettings API reference for details.
Glib::RefPtr<Gio::Settings> refSettings = Gio::Settings::create("org.gtk.skanray.emlibrary");
CLineTrace * pTrace = NULL; // CLineTrace is derived from Gtk::Widget
// ...
pTrace = /* ... */;
// ...
if (refSettings)
{
    refSettings->bind("wave-pressure-ptrach-visible",
                      pTrace,
                      "visible",
                      Gio::SETTINGS_BIND_DEFAULT);
}
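Binding is optional; you can also read or write the key directly, for example (same schema and key as above):
bool bVisible = refSettings->get_boolean("wave-pressure-ptrach-visible"); // read current value
refSettings->set_boolean("wave-pressure-ptrach-visible", false);          // write a new value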
Now you can fire up dconf editor and test the settings.
NOTE
Bindings are usually preferred to be made in class constructors. However, binding to the 'visible' property of a widget can be a bit tricky. Typically the top-level window calls show_all() as the last line in its constructor, but by that point the constructors of the top-level window's children have already finished executing, including making the bindings. If a setting had stored 'visibility' as false, the top-level window's call to show_all() would mess up that setting. In such cases it is advised to perform the bind one time in the on_map() handler of the respective class.
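A minimal sketch of that approach (class and member names are illustrative, following the example above):
void CLineTrace::on_map()
{
  Gtk::Widget::on_map(); // chain up to the base class first

  if (!m_bBound) // bind only once; on_map() can be called many times
  {
    m_refSettings->bind("wave-pressure-ptrach-visible",
                        this, "visible",
                        Gio::SETTINGS_BIND_DEFAULT);
    m_bBound = true;
  }
}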

Log timestamp of an exception in Play?

I have some Play code that calls some Spark functions. Sometimes things break, and I want a timestamp associated with the event. I currently get the error messages printed to STDOUT without a timestamp, and am wondering if there's a way to configure the logger.xml to associate timestamps with these?
Providing your logger.xml file would have been helpful, but in any case: look for the "pattern" element of your STDOUT appender in the logger.xml file and change it by prepending %d{ISO8601}.
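For example, assuming a typical console appender in logger.xml (the appender name and the rest of the pattern are illustrative):
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
  <encoder>
    <pattern>%d{ISO8601} %-5level %logger{36} - %message%n</pattern>
  </encoder>
</appender>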
Check the logback configuration documentation for more information.
Logback Configuration

Vertica-Tableau error Multiple commands cannot be active

We have a dataset in Vertica, and Tableau queries the data (4 billion records) from Vertica for a dashboard as shown below:
All lists and graphs are separate worksheets in Tableau, using the same connection to the Vertica DB. Each list is a column in the DB, in descending order of item count in the dataset's respective column. Graphs are the same as lists but calculated in a slightly different manner. Start Date and End Date set the date range for the data to be queried, like a data-connection filter that restricts the query to a fixed amount of data, for example the past week or last month.
But I get this ERROR:
[Vertica][VerticaDSII] (10) An error occurred during query preparation: Multiple commands cannot be active on the same connection. Consider increasing ResultBufferSize or fetching all results before initiating another command.
Is there any workaround for this issue, or any better way to do this?
You'll need a TDC file which specifies a particular ODBC connection string option to get around the issue.
The guidance from Vertica was to add an ODBC Connect String parameter with the value "ResultBufferSize=0". This apparently forces the result buffer to be unlimited, preventing the error. This is simple enough to accomplish when building a connection string manually or working with a DSN, but Vertica is one of Tableau’s native connectors. So how do you tell the native connector to do something else with its connection?
Native Connections in Tableau can be customized using TDC files
“Native connectors” still connect through the vendor’s ODBC drivers, and can be customized just the same as an “Other Databases” / ODBC connection. In the TDC files themselves, “ODBC” connections are referred to as “Generic ODBC”, which is a much more accurate way to think about the difference.
The full guide to TDC customizations, with all of the options, is available here although it is pretty dense reading. One thing that isn’t provided is an example of customizing a “native connector”. The basic structure of a TDC file is this
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='genericodbc' enabled='true' version='7.7'>
<vendor name='' />
<driver name='' />
<customizations>
</customizations>
</connection-customization>
When using “Generic ODBC”, the class is “genericodbc” and then the vendor and driver name must be specified so that Tableau can know when the TDC file should be applied. It’s much simpler for a native connector — you just use the native connector name in all three places. The big list of native connector names is at the end of this article. Luckily for us, Vertica is simply referred to as “vertica”. So our Vertica TDC framework will look like:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
<vendor name='vertica' />
<driver name='vertica' />
<customizations>
</customizations>
</connection-customization>
This is a good start, but we need some actual customization tags to cause anything to happen. Per the documentation, to add additional elements to the ODBC connection string, we use a tag named 'odbc-connect-string-extras'. This would look like:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;' />
One important thing we discovered was that all ODBC connection extras need to go into this single tag. Because we wanted to turn on load balancing in the Vertica cluster, there was a second recommended parameter: ConnectionLoadBalance=1. To get both of these parameters in place, the correct method is:
<customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
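Putting the pieces together, the complete TDC file for this fix looks like:
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='vertica' enabled='true' version='7.7'>
  <vendor name='vertica' />
  <driver name='vertica' />
  <customizations>
    <customization name='odbc-connect-string-extras' value='ResultBufferSize=0;ConnectionLoadBalance=1;' />
  </customizations>
</connection-customization>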
There are a whole set of other customizations you can put in place to see how they affect performance. Make sure you understand the way each customization option is worded: if it starts with 'SUPPRESS', then giving a 'yes' value will turn off the feature; other times you want to set the value to 'no' to turn the feature off. Some of the other ones we tried were:
<customization name='CAP_SUPPRESS_DISCOVERY_QUERIES' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_PREPARED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_SELECT_STAR' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPPRESS_EXECUTED_QUERY' value='yes' />
<customization name='CAP_ODBC_METADATA_SUPRESS_SQLSTATISTICS_API' value='yes' />
<customization name= 'CAP_CREATE_TEMP_TABLES' value='no' />
<customization name= 'CAP_SELECT_INTO' value='no' />
<customization name= 'CAP_SELECT_TOP_INTO' value='no' />
The first set were mostly about reducing the number of queries for metadata detection, while the second set tell Tableau not to use TEMP tables.
The best way to see the results of these customizations is to change the TDC file and restart Tableau Desktop. Once you are satisfied with the changes, move the TDC file to your Tableau Server and restart it.
Where to put the TDC files
Per the documentation:
"For Tableau Desktop on Windows: Documents\My Tableau Repository\Datasources
For Tableau Server: Program Files\Tableau\Tableau Server\\bin
Note: The file must be saved using a .tdc extension, but the name does not matter."
If you are running a Tableau Server cluster, the .tdc file must be placed on every worker node in the bin folder so that the vizqlserver process can find it. I’ve also highlighted the biggest issue of all — you should edit these using a real text editor like Notepad++ or SublimeText rather than Notepad, because Notepad likes to save things with a hidden .TXT ending, and the TDC file will only be recognized if the ending is really .tdc, not .tdc.txt.
Restarting Tableau resolved my issue, which was giving the same error.

Perl parsing a log4j log [duplicate]

We have several applications that use log4j for logging. I need to get a log4j parser working so we can combine multiple log files and run automated analysis on them. I'm not looking to reinvent the wheel, so can someone point me to a decent pre-existing parser? I do have the log4j conversion pattern if that helps.
If not, we'll have to roll our own.
I didn't realize that Log4J ships with an XML appender.
The solution was: specify an XML appender in the logging configuration file, include that output XML file as an entity in a well-formed XML file, then parse the XML using your favorite technique.
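As a sketch, the appender part of that setup in a log4j 1.x XML config could look like this (the file name is illustrative):
<appender name="XMLFILE" class="org.apache.log4j.FileAppender">
  <param name="File" value="logs/app-events.xml" />
  <layout class="org.apache.log4j.xml.XMLLayout" />
</appender>
Note that XMLLayout emits a stream of <log4j:event> elements without a root element, which is why the output has to be included as an entity in a well-formed wrapper document before parsing.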
The other methods had the following limitations:
Apache Chainsaw - not automated enough
jdbc - poor performance in a high performance distributed app
You can use OtrosLogViewer with batch processing. You have to:
Define your log format; you can use the Log4j pattern layout parser or Log4j XmlLayout
Create a Java class that implements LogDataParsedListener (see the sketch after this list); the method public void logDataParsed(LogData data, BatchProcessingContext context) will be called on every parsed log event
Create a jar
Run OtrosLogViewer, specifying your log-processing jar, LogDataParsedListener implementation, and log files
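A minimal sketch of such a listener class (the interface and method signature are quoted in the steps above; the import paths are assumptions and may differ between OtrosLogViewer versions):
import pl.otros.logview.LogData;
import pl.otros.logview.batch.BatchProcessingContext;
import pl.otros.logview.batch.LogDataParsedListener;

public class CountingListener implements LogDataParsedListener {
    private int count = 0;

    // Called once for every parsed log event
    public void logDataParsed(LogData data, BatchProcessingContext context) {
        count++;
        // ...run your own analysis on 'data' here...
    }
}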
What you are looking for is called SawMill, or something like it.
Log4j log files aren't really suitable for parsing, they're too complex and unstructured. There are third party tools that can do it, I believe (e.g. Sawmill).
If you need to perform automated, custom analysis of the logs, you should consider logging to a database and analysing that. Log4j ships with the JDBCAppender, which appends all messages to a database of your choice, but it has performance implications, and it's a bit flaky. There are other, similar alternatives on the interweb, though (like this one).
You -can- use Log4j's Chainsaw V2 to process the various log files and collect them into one table, and either output those events as xml or use Chainsaw's built-in expression-based filtering, searching & colorizing support to slice & dice the logs.
Steps:
- Start Chainsaw V2
- Create a chainsaw configuration file by copying the example configuration file available from the Welcome tab - define one LogFilePatternReceiver 'plugin' entry for each log file that you want to process
- Start Chainsaw with that configuration
- Each log file will end up as a separate tab in the UI
- Pause the chainsaw-log tab and clear the events from that tab
- Create a new tab which aggregates the events from the various tabs by going to the 'View, Create custom expression LogPanel' menu item and entering 'level >= DEBUG' in the box. It will create a new tab containing events from all of the tabs with level >= DEBUG (which is why you cleared the chainsaw-log tab).
You can get an overview of the expression syntax used to filter, colorize and search from the tutorial (available from the Help menu).
If you don't want to use Chainsaw, you can do something similar - start a simple app that doesn't log but loads a log4j.xml config file with the 'plugin' entries you defined for the Chainsaw configuration, but also define a FileAppender with an xmllayout - all of the events received by the 'receivers' will be sent to the single appender.

logback using external properties file not working correctly

I think I hit a playframework-logback specific bug, perhaps not; any assistance would be great!
I needed an email appender to send me emails when an error occurs.
So I created a simple one with all the properties defined within:
<appender name="EMAIL" class="ch.qos.logback.classic.net.SMTPAppender">
<smtpHost>myhost</smtpHost>
<smtpPort>25</smtpPort>
<STARTTLS>true</STARTTLS>
<username>username</username>
<password>pass</password>
<to>to</to>
<from>from</from>
<subject>Error: %logger{20} - %m</subject>
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%date %-5level %logger{35} - %message%n</pattern>
</layout>
</appender>
This worked fine.
Then I wanted to refactor out all of the properties, because obviously I don't want to commit passwords and the like to source control.
So I replaced all the property values with ${propname} placeholders and added this at the beginning:
<property file="conf/mail.properties"/>
Now emails are not sent anymore.
I debugged, printed out the properties, and saw that the problem has to do with quotes.
My to/from values are email addresses; when I put them in my properties file I need to surround them with quotes, otherwise I get an error like this:
Caused by: com.typesafe.config.ConfigException$Parse: /myconfdir../mail.properties: 19: Reserved character '#' is not allowed outside quotes
Also, my password has a "+" symbol in it, and if it's unquoted in the properties file I get the same error.
When debugging with these quoted values, I see that the quotes are included in them.
So an email is attempted to be sent to "a#b.com" and not to a#b.com.
Same for the from field and for the password.
So obviously it doesn't work.
Any idea how to prevent this?
I have done the same task like this:
In the logback.xml file, add this:
<property resource="filename.properties"/>
<username>${username}</username>
<password>${password}</password>
<to>${email1}</to>
<from>${from}</from>
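Logback reads that file itself as plain java.util.Properties (from the classpath when using resource=), so no HOCON-style quoting is needed for the values. A sketch of the properties file, with illustrative values:
# filename.properties (on the classpath)
username=smtpuser
password=secret+password
email1=alerts@example.com
from=noreply@example.com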