Simple logging example with slf4j/logback in Scala not working

Logback classic 1.2.3 works, but if I use logback 1.3.0-alpha (which uses slf4j 1.8) I get this error:
SLF4J: No SLF4J providers were found.
This happens only if I assemble the Scala file, create a jar, and execute it. If I run it from the IntelliJ IDE it works fine.
My sbt dependency is:
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.3.0-alpha4"
And my scala code is:
import org.slf4j.LoggerFactory

object Hello extends App {
  print("Hi!!!")
  val logback = LoggerFactory.getLogger("CloudSim+")
  logback.info(" --- Welcome to cloudsim+ simulator --- ")
  logback.info("Press 1 to start Load balancing simulator")
  logback.info("Press 2 to start Network simulator")
}
logback.xml in the resources folder has the content below:
<configuration>
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>[%date{HH:mm:ss}] %-5level %logger{0} {%class %method} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="file" class="ch.qos.logback.core.FileAppender">
    <file>${log-file:-scala-logging.log}</file>
    <encoder>
      <pattern>[%date{HH:mm:ss}] %-5level %logger{0} {%class %method} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="console"/>
    <appender-ref ref="file"/>
  </root>
</configuration>
I'm stuck; I have tried various things to get the log statements to appear even when I run the jar file.
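One likely cause (an assumption based on the symptom, not confirmed in the question): slf4j 1.8+ discovers its backend through java.util.ServiceLoader, so the assembled jar must retain the META-INF/services/org.slf4j.spi.SLF4JServiceProvider entry from logback-classic 1.3.x. sbt-assembly's default merge strategy discards most META-INF files, which would break this discovery. A minimal build.sbt sketch (assuming the sbt-assembly plugin) that keeps service files:

```scala
// Sketch, not a verified fix: concatenate ServiceLoader registration files
// so slf4j 1.8+ can find the logback 1.3.x provider in the fat jar.
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", "services", _*) => MergeStrategy.concat
  case PathList("META-INF", _*)             => MergeStrategy.discard
  case _                                    => MergeStrategy.first
}
```

After adding this, reassemble and check that the jar contains META-INF/services/org.slf4j.spi.SLF4JServiceProvider.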

Related

No evaluator set for filter null, using logback-spring.xml in Spring Boot Application

I am writing a logback file for my Spring Boot application but am facing this issue, and I found no solution on the internet. It would be really helpful if someone could help.
Logback.xml
<appender class="ch.qos.logback.core.rolling.RollingFileAppender" name="INFO">
  <File>${log.dir}/default.log</File>
  <append>true</append>
  <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
    <fileNamePattern>${log.dir}/default.log.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
    <maxFileSize>${log.default.maxFileSize}</maxFileSize>
    <maxHistory>${log.maxHistory}</maxHistory>
  </rollingPolicy>
  <encoder>
    <pattern>${log.pattern}</pattern>
  </encoder>
</appender>
<appender class="ch.qos.logback.classic.AsyncAppender" name="ASYNC-INFO">
  <discardingThreshold>${async.discardingThreshold}</discardingThreshold>
  <queueSize>${async.queueSize}</queueSize>
  <neverBlock>${neverBlock}</neverBlock>
  <filter class="ch.qos.logback.core.filter.EvaluatorFilter">
    <OnMismatch>DENY</OnMismatch>
    <OnMatch>NEUTRAL</OnMatch>
  </filter>
  <appender-ref ref="INFO"/>
</appender>
<root level="INFO">
  <appender-ref ref="ASYNC-INFO"/>
</root>
ERROR
ERROR in ch.qos.logback.core.filter.EvaluatorFilter#13fd2ccd - No evaluator set for filter null
Note
I don't have much idea about the code.
One More Question
I have multiple logback.xml files based on environments (stage, production, etc.). How can I pass them while running a jar?
I tried: java -jar jarfile.jar --spring.config.location=application.yml,logback-dev.xml
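A likely explanation of the error (inferred from the message, not confirmed by the poster): an EvaluatorFilter requires an <evaluator> child element, and the filter in the config above has none, hence "No evaluator set for filter null". A minimal sketch that supplies one; the expression is a hypothetical example, and JaninoEventEvaluator requires the Janino library on the classpath:

```xml
<!-- Sketch: EvaluatorFilter must contain an <evaluator>.
     The expression below is a hypothetical placeholder. -->
<filter class="ch.qos.logback.core.filter.EvaluatorFilter">
  <evaluator class="ch.qos.logback.classic.boolex.JaninoEventEvaluator">
    <expression>return message.contains("billing");</expression>
  </evaluator>
  <OnMismatch>DENY</OnMismatch>
  <OnMatch>NEUTRAL</OnMatch>
</filter>
```

If you don't actually need to filter events, removing the <filter> block entirely also clears the error. As for the second question: Spring Boot reads the Logback config location from the logging.config property rather than spring.config.location, e.g. java -jar jarfile.jar --logging.config=logback-dev.xml (file name taken from the question).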

How to select Logback appender based on property file or environment variable

I have configured logback xml for a spring boot project.
I want to configure another appender based on a configured property. We want to create an appender for either JSON logs or text logs; this will be decided either by a property file or by an environment variable.
So I am thinking about the best approach to do this.
Using filters to print logs to one of the files (either JSON or text): this would create both appenders, and I want to create only one.
Using "if else" blocks in the logback XML file: putting if-else around appenders and loggers seems untidy and error-prone, so I would like to avoid it as much as possible.
So now I am exploring options for adding an appender at runtime.
I want to know whether it is possible to add an appender at runtime, and whether it must be added before Spring boots up or can be done at any point in the project.
What would be the best approach for this scenario?
As you're already using Spring, I suggest using Spring Profiles; it is a lot cleaner than trying to do the same programmatically. This approach is also outlined in the Spring Boot docs.
You can set an active profile from a property file:
spring.profiles.active=jsonlogs
or from an environment variable:
spring_profiles_active=jsonlogs
or from a startup parameter:
-Dspring.profiles.active=jsonlogs
Then have separate configurations per profile:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="stdout-classic" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{dd-MM-yyyy HH:mm:ss.SSS} %magenta([%thread]) %highlight(%-5level) %logger{36}.%M - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="stdout-json" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
      <layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
        <timestampFormat>yyyy-MM-dd'T'HH:mm:ss.SSSX</timestampFormat>
        <timestampFormatTimezoneId>Etc/UTC</timestampFormatTimezoneId>
        <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
          <prettyPrint>true</prettyPrint>
        </jsonFormatter>
      </layout>
    </encoder>
  </appender>
  <!-- begin profile-specific stuff -->
  <springProfile name="jsonlogs">
    <root level="info">
      <appender-ref ref="stdout-json" />
    </root>
  </springProfile>
  <springProfile name="classiclogs">
    <root level="info">
      <appender-ref ref="stdout-classic" />
    </root>
  </springProfile>
</configuration>
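One caveat worth noting (per Spring Boot's logging documentation, assuming default setup): the <springProfile> tags are only processed when Spring Boot controls Logback initialization, which conventionally means naming the file logback-spring.xml rather than logback.xml. Activating a profile could then look like this (app.jar is a placeholder name):

```shell
# Assumes the config file is named logback-spring.xml so that
# Spring Boot honors the <springProfile> sections.
java -Dspring.profiles.active=jsonlogs -jar app.jar
# or as a Spring Boot command-line argument:
java -jar app.jar --spring.profiles.active=jsonlogs
```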
As the previous answer states, you can set different appenders based on Spring Profiles.
However, if you do not want to rely on that feature, you can use environment variables as described in the Logback manual, i.e.:
<appender name="json" class="ch.qos.logback.core.ConsoleAppender">
  <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
    <layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
      <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
        <prettyPrint>true</prettyPrint>
      </jsonFormatter>
      <appendLineSeparator>true</appendLineSeparator>
    </layout>
  </encoder>
</appender>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
  <encoder>
    <pattern>
      %cyan(%d{HH:mm:ss.SSS}) %gray([%thread]) %highlight(%-5level) %magenta(%logger{36}) - %msg%n
    </pattern>
  </encoder>
</appender>
<root level="info">
  <!--
  ! Use the content of the LOGBACK_APPENDER environment variable, falling back
  ! to 'json' if it is not defined
  -->
  <appender-ref ref="${LOGBACK_APPENDER:-json}"/>
</root>

Logging using Logback on Spark StandAlone

We are using Spark Standalone 2.3.2 and logback-core/logback-classic 1.2.3.
We have a very simple Logback configuration file which lets us log the data to a specific directory, and locally I can pass the VM parameters from the editor:
-Dlogback.configurationFile="C:\path\logback-local.xml"
and it works and logs properly.
On Spark Standalone I am trying to pass the arguments via spark-submit:
spark-submit
--master spark://127.0.0.1:7077
--driver-java-options "-Dlog4j.configuration=file:/path/logback.xml"
--conf "spark.executor.extraJavaOptions=-Dlogback.configurationFile=file:/path/logback.xml"
Here is the config file (somewhat Ansible-templated); I have verified that the actual paths exist. Any idea what the issue on the cluster could be? I have checked the environment variables on the Spark UI, and they reflect the same driver and executor options.
Are there any potential issues with Logback and Spark Standalone together?
There is nothing specific to the configuration file here; it just separates JSON-formatted logs from plain file logs for better visualization on the log server:
<configuration>
  <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>{{ app_log_file_path }}</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
      <!--daily-->
      <fileNamePattern>{{ app_log_dir }}/{{ app_name }}.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
      <maxFileSize>100MB</maxFileSize>
      <maxHistory>90</maxHistory>
      <totalSizeCap>10GB</totalSizeCap>
    </rollingPolicy>
    <encoder>
      <pattern>%d [%thread] %-5level %logger{36} %X{user} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="FILE_JSON" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <filter class="ch.qos.logback.core.filter.EvaluatorFilter">
      <evaluator>
        <expression>
          return message.contains("timeStamp") &&
                 message.contains("logLevel") &&
                 message.contains("sourceLocation") &&
                 message.contains("exception");
        </expression>
      </evaluator>
      <OnMismatch>DENY</OnMismatch>
      <OnMatch>NEUTRAL</OnMatch>
    </filter>
    <file>{{ app_json_log_file_path }}</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
      <!--daily-->
      <fileNamePattern>{{ app_log_dir }}/{{ app_name }}_json.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
      <maxFileSize>100MB</maxFileSize>
      <maxHistory>90</maxHistory>
      <totalSizeCap>10GB</totalSizeCap>
    </rollingPolicy>
    <encoder>
      <pattern>%msg%n</pattern>
    </encoder>
  </appender>
  <logger name="com.baml.ctrltech.greensheet.logging.GSJsonLogging" level="info" additivity="false">
    <appender-ref ref="FILE_JSON" />
  </logger>
  <root level="INFO">
    <appender-ref ref="FILE" />
    <appender-ref ref="FILE_JSON"/>
  </root>
</configuration>
We couldn't get Logback to work with Spark: Spark uses Log4j internally, so we had to switch to the same.
I fixed it by adding one dependency for logback and excluding the transitive dependency from Spark in sbt:
val sparkV = "3.3.1"
val slf4jLogbackV = "2.1.4"
val slf4jLogback = "com.networknt" % "slf4j-logback" % slf4jLogbackV
val sparkSql = ("org.apache.spark" %% "spark-sql" % sparkV)
.cross(CrossVersion.for3Use2_13)
.exclude("org.apache.logging.log4j", "log4j-slf4j-impl")
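An alternative sketch (an assumption, not verified on this cluster; paths and app.jar are placeholders): ship the config file to the executors with --files, point the driver at the local path, and point the executors at the shipped copy by its relative name:

```shell
# Sketch: --files distributes logback.xml to each executor's working
# directory, so the executor-side path is relative.
spark-submit \
  --master spark://127.0.0.1:7077 \
  --files /path/logback.xml \
  --driver-java-options "-Dlogback.configurationFile=/path/logback.xml" \
  --conf "spark.executor.extraJavaOptions=-Dlogback.configurationFile=logback.xml" \
  app.jar
```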

Generated One extra blank log file For one Application Run

I configured a logger for my application and gave the log file name as the current timestamp, so I expected it to create one log file named with the current timestamp. But instead it creates one log file with the current timestamp and another file which is blank. I can't figure out why it is creating the extra file.
I am using the Logback logger in my application, which is a simple core Java application where I use the logger to log statements. Here is what my logback.xml looks like:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <timestamp key="byDay" datePattern="yyyy'-'MM'-'dd'''AT'''HH'-'mm'-'ss"/>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file> Myapplication${byDay}.txt </file>
    <append>FALSE</append>
    <encoder>
      <pattern>%-4relative [%thread] %-5level %logger{35} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="FILE" />
  </root>
</configuration>

Read the logback.xml from a system variable value

I have a project that I built using the Maven JAR plugin. For logging I am using Logback, and I want my logback.xml to live outside the executable jar. To locate the logback.xml, I have set a new environment variable whose value is the directory where I put it. How can I tell my project to use this variable to find the config file?
Look at this manual:
https://logback.qos.ch/manual/configuration.html#variableSubstitution
You can use command like below:
java -Dlogback.configurationFile=/path/to/config.xml chapters.configuration.MyApp1
You can do something like this. In your jar (src/resources/) add a skeleton logback.xml like so:
<configuration>
  <include file="${LOG_CONFIG_DIR}/logback-app.xml"/>
</configuration>
Then at startup you can use a logback-app.xml outside the jar which looks like this
<included>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <layout class="ch.qos.logback.classic.PatternLayout">
      <Pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{60} %mdc{applicationName} %mdc{environment} %mdc{gearid} - %msg%n</Pattern>
    </layout>
  </appender>
  <logger name="org.apache" level="ERROR"/>
  <root level="trace">
    <appender-ref ref="STDOUT"/>
  </root>
</included>
Then at startup you can wire in the external file like this:
java ... -DLOG_CONFIG_DIR=/directory....
