Change flink job pods names - kubernetes

I am using Ververica Platform, Community Edition, and I am trying to configure the pods' names. At the moment they look something like this:
job-1599c2bd-77b0-4f39-8fd9-029548079d31-jobmanager--1-lkk6v
job-1599c2bd-77b0-4f39-8fd9-029548079d31-taskmanager-5656db5kph
job-1599c2bd-77b0-4f39-8fd9-029548079d31-taskmanager-5656dd6988
I would like to have something like this:
job-top-speed-windowing-jobmanager-1
job-top-speed-windowing-taskmanager-1
job-top-speed-windowing-taskmanager-2
I have played around with what is specified in the documentation; this is the last YAML file I tried:
apiVersion: v1
kind: Deployment
metadata:
  annotations:
    com.dataartisans.appmanager.controller.deployment.spec.version: '265'
    com.dataartisans.appmanager.controller.deployment.transitioning: 'true'
    com.dataartisans.appmanager.controller.deployment.transitioning.since: '1642104082812'
  createdAt: '2022-01-13T20:01:05.410547Z'
  displayName: TopSpeedWindowing
  id: b495303e-5df6-4a65-9b24-7345da225979
  labels: {}
  modifiedAt: '2022-01-13T20:01:22.812754Z'
  name: top-speed-windowing
  namespace: default
  resourceVersion: 16
spec:
  deploymentTargetName: LoggingTest
  maxJobCreationAttempts: 4
  maxSavepointCreationAttempts: 4
  restoreStrategy:
    allowNonRestoredState: false
    kind: LATEST_STATE
  state: RUNNING
  template:
    metadata:
      annotations:
        flink.queryable-state.enabled: 'false'
        flink.security.ssl.enabled: 'false'
    spec:
      artifact:
        flinkImageRegistry: registry.ververica.com/v2.6
        flinkImageRepository: flink
        flinkImageTag: 1.14.2-stream1-scala_2.12-java8
        flinkVersion: '1.14'
        jarUri: >-
          https://repo1.maven.org/maven2/org/apache/flink/flink-examples-streaming_2.12/1.14.2/flink-examples-streaming_2.12-1.14.2-TopSpeedWindowing.jar
        kind: JAR
      flinkConfiguration:
        execution.checkpointing.externalized-checkpoint-retention: RETAIN_ON_CANCELLATION
        execution.checkpointing.interval: 10s
        execution.checkpointing.min-pause: 10s
        high-availability: vvp-kubernetes
        metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
        state.backend: filesystem
        taskmanager.memory.managed.fraction: '0.0'
      logging:
        log4j2ConfigurationTemplate: >-
          <?xml version="1.0" encoding="UTF-8" standalone="no"?>
          <Configuration xmlns="http://logging.apache.org/log4j/2.0/config" strict="true">
            <Appenders>
              <Appender name="StdOut" type="Console">
                <Layout pattern="%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n" type="PatternLayout"/>
              </Appender>
              <Appender name="RollingFile" type="RollingFile" fileName="${sys:log.file}" filePattern="${sys:log.file}-%d{yyyy-MM-dd}.log">
                <Layout pattern="%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n" type="PatternLayout"/>
                <Policies>
                  <TimeBasedTriggeringPolicy interval="6" modulate="true"/>
                  <SizeBasedTriggeringPolicy size="100 MB"/>
                </Policies>
                <DefaultRolloverStrategy max="2"/>
              </Appender>
            </Appenders>
            <Loggers>
              <Logger level="INFO" name="org.apache.hadoop"/>
              <Logger level="INFO" name="org.apache.kafka"/>
              <Logger level="INFO" name="org.apache.zookeeper"/>
              <Logger level="INFO" name="akka"/>
              <Logger level="ERROR" name="org.jboss.netty.channel.DefaultChannelPipeline"/>
              <Logger level="OFF" name="org.apache.flink.runtime.rest.handler.job.JobDetailsHandler"/>
              {%- for name, level in userConfiguredLoggers -%}
                <Logger level="{{ level }}" name="{{ name }}"/>
              {%- endfor -%}
              <Root level="{{ rootLoggerLogLevel }}">
                <AppenderRef ref="StdOut"/>
                <AppenderRef ref="RollingFile"/>
              </Root>
            </Loggers>
          </Configuration>
        log4jLoggers:
          '': INFO
      parallelism: 2
      resources:
        jobmanager:
          cpu: 1
          memory: 1G
        taskmanager:
          cpu: 1
          memory: 2G
  upgradeStrategy:
    kind: STATEFUL
Is that possible?
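For context, the suffixes in those names come from Kubernetes itself rather than from the Ververica spec: pods owned by a Kubernetes Deployment are always named <owner>-<replicaset-hash>-<random>, so only the prefix can be influenced by the owning resource's name, while ordinal suffixes such as -1 and -2 are what a StatefulSet produces. A minimal plain-Kubernetes sketch (not a Ververica Platform resource; all names below are made up for illustration):

# Hypothetical, plain Kubernetes (not a VVP spec): only the prefix of a pod
# name comes from the owning resource; the trailing hash/random suffix is
# added by the ReplicaSet controller and cannot be configured away.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: job-top-speed-windowing-taskmanager   # pods: job-top-speed-windowing-taskmanager-<hash>-<random>
spec:
  replicas: 2
  selector:
    matchLabels:
      app: top-speed-windowing
  template:
    metadata:
      labels:
        app: top-speed-windowing
    spec:
      containers:
        - name: taskmanager
          image: flink:1.14.2-scala_2.12-java8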

LOG4Net: Cannot find Property [additivity] to set object on [log4net.Appender.RollingFileAppender]

Any help would be appreciated.
I am using GCP Kubernetes Engine and I am getting the following error in the pod:
log4net:ERROR XmlHierarchyConfigurator: Cannot find Property [additivity] to set object on [log4net.Appender.RollingFileAppender]
Here is my configuration file:
apiVersion: v1
kind: ConfigMap
metadata:
  name: log4net-config
  namespace: bold-services
data:
  Log4Net.config: |
    <?xml version="1.0" encoding="utf-8" ?>
    <configuration>
      <log4net threshold="ALL" debug="true">
        <root>
          <level value="ALL" />
          <!-- <appender-ref ref="ConsoleAppender" /> -->
          <appender-ref ref="FILE_DEBUG_APPENDER" />
          <appender-ref ref="FILE_ERROR_APPENDER" />
        </root>
        <!-- === File Appender for NON-ERROR messages file === -->
        <appender name="FILE_DEBUG_APPENDER" type="log4net.Appender.RollingFileAppender" class="ch.qos.logback.classic.AsyncAppender">
          <file type="log4net.Util.PatternString" value="%property{AppDataPath}/logs/%property{loggername}/debug-info-%env{HOSTNAME}.txt" />
          <filter type="log4net.Filter.LevelMatchFilter">
            <levelToMatch value="INFO" />
          </filter>
          <filter type="log4net.Filter.DenyAllFilter" />
          <additivity value="true" />
          <appendToFile value="true" />
          <maxSizeRollBackups value="1" />
          <maximumFileSize value="300KB" />
          <rollingStyle value="Size" />
          <staticLogFileName value="true" />
          <layout type="log4net.Layout.PatternLayout">
            <header type="log4net.Util.PatternString" value="#Software: %property{loggername} %newline#Date: %date %newline#Fields: date thread namespace methodname message %newline" />
            <conversionPattern value="%date [%thread] %message%newline" />
          </layout>
        </appender>
        <!-- === File Appender for ERROR messages file === -->
        <appender name="FILE_ERROR_APPENDER" type="log4net.Appender.RollingFileAppender" class="ch.qos.logback.classic.AsyncAppender">
          <file type="log4net.Util.PatternString" value="%property{AppDataPath}/logs/%property{loggername}/errors-%env{HOSTNAME}.txt" />
          <filter type="log4net.Filter.LevelMatchFilter">
            <levelToMatch value="ERROR" />
          </filter>
          <filter type="log4net.Filter.DenyAllFilter" />
          <additivity value="true" />
          <appendToFile value="true" />
          <maxSizeRollBackups value="10" />
          <maximumFileSize value="100KB" />
          <rollingStyle value="Size" />
          <staticLogFileName value="true" />
          <layout type="log4net.Layout.PatternLayout">
            <header type="log4net.Util.PatternString" value="#Software: %property{loggername} %newline#Date: %date %newline#Fields: date thread namespace methodname message %newline" />
            <conversionPattern value="%date [%thread] %-5level %message%newline" />
          </layout>
        </appender>
        <!-- === Console Appender to use in BufferingForwardingAppender === -->
        <appender name="ConsoleAppender" type="log4net.Appender.ConsoleAppender">
          <layout type="log4net.Layout.PatternLayout">
            <conversionPattern type="log4net.Util.PatternString" value="%newline%%-5level %property{loggername} %env{HOSTNAME} %%date [%%thread] %%message%newline" />
          </layout>
        </appender>
      </log4net>
    </configuration>
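For what it's worth, the error is log4net reporting that additivity is not a property of RollingFileAppender: in log4net, additivity is an attribute of <logger> (or <root>) elements, not something you set inside an appender, and the class="ch.qos.logback.classic.AsyncAppender" attribute looks like a leftover from a logback configuration rather than a log4net concept. A trimmed, hedged sketch of the debug appender with the offending element removed (the Some.Namespace logger is hypothetical and only there to show where additivity does belong):

<!-- Sketch only: same appender as above, minus <additivity .../> -->
<log4net threshold="ALL" debug="true">
  <root>
    <level value="ALL" />
    <appender-ref ref="FILE_DEBUG_APPENDER" />
  </root>
  <!-- additivity is a logger/root attribute, e.g.: -->
  <logger name="Some.Namespace" additivity="false">
    <level value="INFO" />
    <appender-ref ref="FILE_DEBUG_APPENDER" />
  </logger>
  <appender name="FILE_DEBUG_APPENDER" type="log4net.Appender.RollingFileAppender">
    <file type="log4net.Util.PatternString" value="%property{AppDataPath}/logs/%property{loggername}/debug-info-%env{HOSTNAME}.txt" />
    <appendToFile value="true" />
    <rollingStyle value="Size" />
    <maxSizeRollBackups value="1" />
    <maximumFileSize value="300KB" />
    <staticLogFileName value="true" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date [%thread] %message%newline" />
    </layout>
  </appender>
</log4net>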

Scala Slick Driver Logging

I am using the Slick driver with the following versions:
"com.typesafe.slick" %% "slick" % "3.3.1",
"com.typesafe.slick" %% "slick-hikaricp" % "3.3.1"
I have these imports in my class AnimalCounter.scala:
import slick.jdbc.PostgresProfile.api._
import slick.jdbc.GetResult
And, I have the following class structure ...
class AnimalCounter {
  val db = Database.forConfig("animaldb")

  def get(a: Animal): Future[Option[Animal]] =
    db.run(....do something......)

  def getOrCreate(a: Animal): Future[Option[Animal]] =
    db.run(....do something......)
}
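(As an aside, Database.forConfig("animaldb") reads a block with that name from application.conf; a rough sketch of what such a block might look like, where the driver, URL, credentials and thread count are placeholders rather than anything from the original post:)

# Assumed shape of the "animaldb" block read by Database.forConfig;
# every value below is a placeholder.
animaldb {
  connectionPool = "HikariCP"
  driver = "org.postgresql.Driver"
  url = "jdbc:postgresql://localhost:5432/animals"
  user = "animals"
  password = "changeme"
  numThreads = 10
}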
So, how do I set up Slick's built-in logging to log all the DB operations that are happening behind the scenes?
Here's how I got it working. I think @vamsi's approach (mentioned in the other answer) would work too. However, the important thing to remember in my case was that I needed to remove the following dependency, since it silently discards all SLF4J output (including Slick's). So please remove this dependency if you already have it:
"org.slf4j" % "slf4j-nop" % "1.7.26"
After removing the above dependency, configure your logback.xml like this:
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>[%level] [%date{MM/dd/yyyy HH:mm:ss.SSS}] [SLICK] [%logger{1000}] %green(%X{debugId}) %msg%n</pattern>
    </encoder>
  </appender>

  <root level="${SLICK_LOG_LEVEL:-INFO}">
    <appender-ref ref="STDOUT" />
  </root>

  <logger name="slick.basic.BasicBackend.action" level="${log_action:-inherited}" />
  <logger name="slick.basic.BasicBackend.stream" level="${log_stream:-inherited}" />
  <logger name="slick.compiler" level="OFF" />
  <logger name="slick.compiler.QueryCompiler" level="OFF" />
  <logger name="slick.compiler.QueryCompilerBenchmark" level="OFF" />
  <logger name="slick.jdbc.DriverDataSource" level="${log_jdbc_driver:-inherited}" />
  <logger name="slick.jdbc.JdbcBackend.statement" level="${log_jdbc_statement:-inherited}" />
  <logger name="slick.jdbc.JdbcBackend.parameter" level="${log_jdbc_parameter:-inherited}" />
  <logger name="slick.jdbc.JdbcBackend.benchmark" level="${log_jdbc_bench:-inherited}" />
  <logger name="slick.jdbc.StatementInvoker.result" level="${log_jdbc_result:-inherited}" />
  <logger name="slick.jdbc.JdbcModelBuilder" level="${log_createModel:-inherited}" />
  <logger name="slick.memory.HeapBackend" level="${log_heap:-inherited}" />
  <logger name="slick.memory.QueryInterpreter" level="${log_interpreter:-inherited}" />
  <logger name="slick.relational.ResultConverterCompiler" level="${log_resultConverter:-inherited}" />
  <logger name="slick.util.AsyncExecutor" level="${log_asyncExecutor:-inherited}" />
</configuration>
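One follow-up worth noting (an assumption about the build, not something from the original answer): logback itself has to be on the classpath for this logback.xml to be picked up, so the sbt dependencies would look roughly like this, with slf4j-nop gone and logback-classic added (the 1.2.3 version number is only an example):

// build.sbt sketch: replace slf4j-nop with a real SLF4J backend (logback)
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick"           % "3.3.1",
  "com.typesafe.slick" %% "slick-hikaricp"  % "3.3.1",
  "ch.qos.logback"      % "logback-classic" % "1.2.3"  // example version
)
// and make sure "org.slf4j" % "slf4j-nop" is no longer listed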

how to "filter" Akka INFO logging output to different files?

I am logging INFO on "orders" and INFO on "transactions" generated by a trading application that I am developing using Scala/Akka. Currently all of the output is going into a single log file, but I would like the information to go to separate files.
I have specified my application.conf as follows...
akka {
  loggers = ["akka.event.slf4j.Slf4jLogger"]
  loglevel = "INFO"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
}
...and here is my logback.xml file...
<configuration>
  <appender name="TRANSACTIONS" class="ch.qos.logback.core.FileAppender">
    <file>log/transactions.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="TRANSACTIONS" />
  </root>
  <appender name="ORDERS" class="ch.qos.logback.core.FileAppender">
    <file>log/orders.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="ORDERS" />
  </root>
</configuration>
I have tried modifying my logback.xml file to include filters as follows
<configuration>
  <appender name="TRANSACTIONS" class="ch.qos.logback.core.FileAppender">
    <filter class="domain.trading.DoubleAuctionMarket"/>
    <file>log/transactions.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="TRANSACTIONS" />
  </root>
  <appender name="ORDERS" class="ch.qos.logback.core.FileAppender">
    <filter class="domain.trading.Exchange"/>
    <file>log/orders.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="ORDERS" />
  </root>
</configuration>
However, this generates runtime errors...
14:11:38,361 |-ERROR in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Could not create component [filter] of type [domain.trading.DoubleAuctionMarket] java.lang.ClassNotFoundException: domain.trading.DoubleAuctionMarket
at java.lang.ClassNotFoundException: domain.trading.DoubleAuctionMarket
and
14:11:38,417 |-ERROR in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Could not create component [filter] of type [domain.trading.Exchange] java.lang.ClassNotFoundException: domain.trading.Exchange
at java.lang.ClassNotFoundException: domain.trading.Exchange
Thoughts?
[EDIT: Solution] For completeness, I created two filters, TransactionsFilter and OrdersFilter, per the link provided in the answer below, and put them in the src/main/java/ directory of my project. An example filter looks like this...
package filters;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;

public class OrdersFilter extends Filter<ILoggingEvent> {

    @Override
    public FilterReply decide(ILoggingEvent event) {
        boolean isAskOrder = event.getFormattedMessage().contains("AskOrder");
        boolean isBidOrder = event.getFormattedMessage().contains("BidOrder");
        if (isAskOrder || isBidOrder) {
            return FilterReply.ACCEPT;
        } else {
            return FilterReply.DENY;
        }
    }
}
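The TransactionsFilter itself is not shown in the post; a hedged sketch of what it might look like, assuming (purely as an illustration) that transaction log lines contain the word "Transaction":

package filters;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;

// Hypothetical companion to OrdersFilter; the "Transaction" marker string is
// an assumption about the application's log messages, not taken from the post.
public class TransactionsFilter extends Filter<ILoggingEvent> {

    @Override
    public FilterReply decide(ILoggingEvent event) {
        if (event.getFormattedMessage().contains("Transaction")) {
            return FilterReply.ACCEPT;
        }
        return FilterReply.DENY;
    }
}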
I then edited my logback.xml file to reference the newly created filters...
<configuration>
  <appender name="TRANSACTIONS" class="ch.qos.logback.core.FileAppender">
    <filter class="filters.TransactionsFilter"/>
    <file>log/transactions.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="TRANSACTIONS" />
  </root>
  <appender name="ORDERS" class="ch.qos.logback.core.FileAppender">
    <filter class="filters.OrdersFilter"/>
    <file>log/orders.log</file>
    <append>true</append>
    <encoder>
      <pattern>%date{yyyy-MM-dd} %X{akkaTimestamp} %-5level[%thread] %logger{1} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="ORDERS" />
  </root>
</configuration>
The class you specify in the filter tag is not the type of file whose logs you want to accept, but the fully-qualified class name of a filter class (possibly a custom one) that you want to use to filter your logs.
So I guess you get that error because your classes do not implement a logback filter.
You should build one filter class per appender, as described in the linked documentation, and then reference those.
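A lighter-weight alternative that may also work here: since akka.event.slf4j.Slf4jLogger emits events under the logging class's name, you can often route by logger name instead of writing message-content filters. A rough sketch, reusing the appender names and the class-to-file mapping from the question (whether those classes really are the log sources is an assumption):

<!-- Sketch only: add inside the existing <configuration>, routing by logger
     name with additivity="false" so each class's output goes to one file. -->
<logger name="domain.trading.DoubleAuctionMarket" level="INFO" additivity="false">
  <appender-ref ref="TRANSACTIONS" />
</logger>
<logger name="domain.trading.Exchange" level="INFO" additivity="false">
  <appender-ref ref="ORDERS" />
</logger>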

How to store JavaScript logging statements (e.g. console.log("some statements")) in a separate file in a Play Framework application?

I have an Activator project; for the controllers I have written, I am able to log into a separate file. Now I have some JS files that contain logging statements too, which I want to store in the same log file.
Application.java sample
public static Result index() throws Exception {
    Logger.of(LoggerConstants.OTNlogger).debug(LoggerConstants.methodEntry);
    // CommonUtils.createDBConnection();
    if (isSessionExist()) {
        Logger.of(LoggerConstants.OTNlogger).debug(
                LoggerConstants.returnObj + LoggerConstants.loggerSpace);
        Logger.of(LoggerConstants.OTNlogger).debug(
                LoggerConstants.methodExit);
        return ok(tool.render());
    } else {
        Logger.of(LoggerConstants.OTNlogger).debug(LoggerConstants.methodExit);
        return badRequest(main.render());
    }
}
Logger.xml
<configuration debug="true">
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- encoders are assigned the type ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="FILE"
            class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>logFile.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <!-- daily rollover -->
      <fileNamePattern>logFile.%d{yyyy-MM-dd}.log</fileNamePattern>
      <!-- keep 5 days' worth of history -->
      <maxHistory>5</maxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%caller{2} : %r] %p %logger{0} - %msg %n</pattern>
      <!-- time, [caller level : execution time of method], level, loggerName, - message -->
    </encoder>
  </appender>
  <logger name="ODE" level="TRACE" additivity="false">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="FILE" />
  </logger>
  <logger name="play" level="INFO" additivity="false">
    <appender-ref ref="STDOUT" />
  </logger>
  <root level="info" additivity="false">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
LogFile.log sample
15:58:22.491 [Caller+0 at play.Logger$ALogger.debug(Logger.java:332)
Caller+1 at com.egnaro.utils.CommonUtils.createDBConnection(CommonUtils.java:30)
: 81754] DEBUG ODE - Entered into method
15:58:22.512 [Caller+0 at play.Logger$ALogger.info(Logger.java:361)
Caller+1 at com.egnaro.utils.CommonUtils.createDBConnection(CommonUtils.java:33)
: 81775] INFO ODE - Morphia Object Intilizing
15:58:22.927 [Caller+0 at play.Logger$ALogger.info(Logger.java:361)
Caller+1 at com.egnaro.utils.CommonUtils.createDBConnection(CommonUtils.java:46)
: 82190] INFO ODE - Databse created
sample.js
function inputEditor(data, event) {
    var treeView = $("#tree").data("kendoTreeView");
    var selectedNode = treeView.select(),
        dataItem = treeView.dataItem(selectedNode);
    //var span = $(this).find('.k-sprite').hide();
    //var span = $(this).find('.k-sprite').detach();
    $target = $(event.target);
    $target.editable(function (value, settings) {
        console.log(this);
        return value;
    },
    {
        submit: 'ok',
        event: 'dblclick',
        cssclass: 'editableTree'
    });
    //return;
    //$(this).find('.k-sprite').append(span);
    $target.trigger('dblclick', [event]);
    //$(this).find('.k-sprite').append(span);
}
I want to store the output of console.log() in my log file. Please give a step-by-step explanation; I'm new to Play Framework.
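As a sketch of one possible approach (not from the original thread): console.log() runs in the browser, so its output never reaches the Play server by itself. The page has to send the messages to the server, for example via an AJAX POST to a small endpoint that forwards them to the same logger; the /jslog route name and the JSON shape below are assumptions made for this sketch.

// Hypothetical Play (Java) action; wire it up in conf/routes, e.g.:
//   POST  /jslog   controllers.Application.jsLog()
// The browser side would wrap or replace console.log and POST
// {"message": "..."} to this endpoint.
public static Result jsLog() {
    com.fasterxml.jackson.databind.JsonNode json = request().body().asJson();
    if (json == null) {
        return badRequest("Expected a JSON body");
    }
    // Reuse the same application logger so the message lands in logFile.log
    Logger.of(LoggerConstants.OTNlogger).debug("JS: " + json.findPath("message").asText());
    return ok();
}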

Append message to JBoss Console using jboss-log4j.xml

I'm unable to append messages from my application to the JBoss console. The following are the changes that I made to the jboss-log4j.xml configuration file:
<category name="com.tricubes">
  <priority value="INFO"/>
  <appender-ref ref="CONSOLE"/>
</category>
Here is my code:
public class OneToOneValidation2 {

    private static final Logger logger = Logger.getLogger("com.tricubes");

    public boolean validate(byte[] fpImage, byte[] fpTemplate, String desc, String ticket) {
        ...
        logger.info("BES INFO: SOCKET MSG SENT " + intToByteArray(x));
        ...
        return b;
    }
}
What am I missing?
TIA!
Edit: here is the console appender (it is also the default appender used by JBoss):
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
  <errorHandler class="org.jboss.logging.util.OnlyOnceErrorHandler"/>
  <param name="Target" value="System.out"/>
  <param name="Threshold" value="INFO"/>
  <layout class="org.apache.log4j.PatternLayout">
    <!-- The default pattern: Date Priority [Category] Message\n -->
    <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p [%c{1}] %m%n"/>
  </layout>
</appender>
I have tried with both org.jboss.logging.Logger and org.apache.log4j.Logger.
Category is deprecated (use Logger), and Priority is not recommended (use Level). So your config block should be:
<logger name="com.tricubes">
  <level value="INFO"/>
  <appender-ref ref="CONSOLE"/>
</logger>
Also, what is your CONSOLE appender defined as? If it's not pointing at the JBoss console, it won't log there.
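Putting the pieces from this thread together, the relevant fragment of jboss-log4j.xml would look roughly like this (a sketch assembled from the snippets above, not a verified full file); note that the logger name has to match the string passed to Logger.getLogger("com.tricubes") in the code:

<!-- Sketch: CONSOLE appender (as posted) plus the corrected logger element -->
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
  <errorHandler class="org.jboss.logging.util.OnlyOnceErrorHandler"/>
  <param name="Target" value="System.out"/>
  <param name="Threshold" value="INFO"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p [%c{1}] %m%n"/>
  </layout>
</appender>

<logger name="com.tricubes">
  <level value="INFO"/>
  <appender-ref ref="CONSOLE"/>
</logger>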