Add custom value to every log message - java

Let's say that I have a REST API with the endpoint /user/{user_id}/foo.
When it is called, I would like all logs produced while handling the request to contain the {user_id}. Is it possible to achieve that without passing {user_id} to every method?
I'm using SLF4j for logging, my application is based on Spring Boot.

You could also use MDC for this. It's essentially a map: you put your contextual information (e.g. the user id) into it and then you can use it in your log layout. Be aware that this only works with certain underlying frameworks, like logback, where a sample layout pattern would look like this:
<Pattern>%X{user_id} %m%n</Pattern>
Check the logback manual for more details on this.

You can use Logback's Mapped Diagnostic Context (MDC) to propagate the {user_id} to every log message.
There are two parts to this:
Push your {user_id} into the MDC, e.g. MDC.put("user_id", "Pawel");
Include the MDC entry in your log statements. You do this by specifying it in your logging pattern. So, if you store the user id in an MDC entry named "user_id" then you would set logging.pattern.level=user_id:%X{user_id} %5p to include the value of that entry in every log event.
More details in the docs
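To avoid calling MDC.put() in every controller method, the value can be set once per request. Here is a minimal sketch using a Spring HandlerInterceptor; the class name UserIdLoggingInterceptor is made up for illustration, and it assumes Spring 5+ (where HandlerInterceptor has default methods):

import java.util.Map;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.web.servlet.HandlerInterceptor;
import org.springframework.web.servlet.HandlerMapping;

public class UserIdLoggingInterceptor implements HandlerInterceptor {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) {
        // Spring stores the resolved path variables as a request attribute.
        @SuppressWarnings("unchecked")
        Map<String, String> pathVars =
                (Map<String, String>) request.getAttribute(HandlerMapping.URI_TEMPLATE_VARIABLES_ATTRIBUTE);
        if (pathVars != null && pathVars.containsKey("user_id")) {
            MDC.put("user_id", pathVars.get("user_id"));
        }
        return true;
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex) {
        MDC.remove("user_id"); // servlet threads are pooled, so always clean up
    }
}

Register the interceptor via a WebMvcConfigurer, and every log statement issued while the request is being handled will carry the user id with no change to the individual methods.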

Add trace.id and transaction.id Springboot

I have a Spring Boot micro-service. For logging I'm using the Elastic Common Schema, implemented using ecs-logging-java.
I want to set the trace.ID and a transaction.ID, but I'm not sure how.
Bonus question: am I right in thinking trace.ID should be the ID for following the request through multiple systems, while transaction.ID is just for within the service?
Configure your logging pattern as below:
<pattern> %d{yyyy-MM-dd HH:mm:ss.SSS} %thread [%X{trace-id}] [%-5level] %class{0} - %msg%n </pattern>
Put the trace id in MDC (MDC belongs to the particular thread's context):
MDC.put("trace-id", "traceid1");
So whenever your log prints a message, it will include the trace id.
For more details, follow the article at http://logback.qos.ch/manual/mdc.html
Step 1: Add trace id in the thread context.
This can be done using MDC (manages contextual information on a per-thread basis).
Add the line below at the start of any method from which you want to trace logs.
MDC.put("TRACE_ID", UUID.randomUUID().toString());
Step 2: Add trace id in log format
Java logs do not include a trace id by default, so we add the trace id we previously put in the thread context to the log format.
This can be done in application.properties; I have added [%X{TRACE_ID}] to the default console log pattern:
logging.pattern.console=%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} [%X{TRACE_ID}] %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}
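Rather than calling MDC.put() at the start of every method, the same thing can be done once per request with a servlet filter. A minimal sketch, assuming a Spring Boot web application; the class name TraceIdFilter is made up for illustration:

import java.io.IOException;
import java.util.UUID;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

@Component
public class TraceIdFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain chain)
            throws ServletException, IOException {
        // One trace id per incoming request, visible to every log statement below this point.
        MDC.put("TRACE_ID", UUID.randomUUID().toString());
        try {
            chain.doFilter(request, response);
        } finally {
            MDC.remove("TRACE_ID"); // request threads are reused, so clear the id
        }
    }
}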
I thought I had documented this but the closest I could come is Log4j Audit's RequestContext. I guess I need to add a new entry to my blog. The short answer to this is that you use Log4j 2's ThreadContextMap. First, when a user logs in, create a session map that contains the data you want to capture in each request, such as the user's IP address and loginId. Then create a servlet Filter or Spring Interceptor to add that data, as well as a unique request id, to Log4j 2's Thread Context Map.
All Log Events will include the data in the ThreadContext. The ECSLayout automatically includes all the fields in the ThreadContextMap.
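A minimal sketch of such a Filter, assuming the javax.servlet API (Servlet 4.0+, where init() and destroy() have default implementations); the class name RequestContextFilter and the attribute keys are illustrative, not Log4j Audit's actual implementation:

import java.io.IOException;
import java.util.UUID;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import org.apache.logging.log4j.ThreadContext;

public class RequestContextFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        // Unique id per request, plus per-request data such as the caller's address.
        ThreadContext.put("requestId", UUID.randomUUID().toString());
        ThreadContext.put("ipAddress", request.getRemoteAddr());
        try {
            chain.doFilter(request, response);
        } finally {
            ThreadContext.clearMap(); // don't leak values to the next request on this thread
        }
    }
}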
Lastly, you need to propagate the RequestContext to downstream services. You do that by creating a Spring Interceptor that gets wired into the RestTemplate which converts the RequestContext fields into HTTP headers. The downstream service then has a Filter or Spring Interceptor that converts the headers back into RequestContext attributes. Log4j Audit (referenced above) has examples and implementations of all these components.
I should add that the method described above does not implement tracing as described by the W3C Trace Context spec, so it is also not compatible with Elasticsearch's distributed tracing support. It is worth noting, however, that if one were to include both Elasticsearch's and New Relic's distributed tracing support, they would step on each other.

how to use log4j2 properties inside mulesoft expression component

We use logging heavily in most of the application via the log component inside Mule flows. But when I use an expression component and manipulate the payload for the destination system, I sometimes need to validate data, and for that I always have to write System.out.println inside the expression component. Is there any way to invoke or use log4j2 properties like we do inside a Java component?
Here is the sample code I am looking for
// mulesoft payload
additionalfields = payload.additionalfields;
if (org.apache.commons.collections.MapUtils.isEmpty(additionalfields)) {
    // System.out.println("we have no data " + additionalfields);
}
You can use the Groovy script component's built-in log, like log.info("we have no data " + additionalfields),
or obtain a logger yourself. Note that java.util.logging.Logger has no debug() method; since you are on log4j2 anyway, use its API:
import org.apache.logging.log4j.LogManager
import org.apache.logging.log4j.Logger

Logger logger = LogManager.getLogger("expression-logger") // any logger name configured in log4j2 works here
logger.info("we have no data " + additionalfields)
logger.debug("we have no data " + additionalfields)
Hope this helps.

Play framework route parameter authorization

I have a REST API on my page, and for authentication I use the Play session.
The problem is with authorization. I have tens of endpoints looking like this:
GET /api/domains/:domainId/properties/:propertyId/reports
I could add an if statement to each controller method to check whether the user has permission for that domain or property, but can I handle it somehow globally?
I found this module, but it does not seem to handle parameters; it just checks whether the user is in some group/role or not: https://www.playframework.com/documentation/1.0.2.1/secure
I solved this using a custom RequestHandler. There you can extract parameters from the path and validate them. (In Scala I could even modify the request route to avoid repeating these parameters in all routes; I don't know whether you can do that in Java too.)
(See: https://www.playframework.com/documentation/2.4.x/JavaHttpRequestHandlers)
You can use the Security.Authenticated annotation, as detailed in the Play documentation. For more specific permissions, I recommend Deadbolt.
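A minimal sketch of the annotation approach in Play 2 Java; the session key "email" is an assumption about how you store the login:

import play.mvc.Http.Context;
import play.mvc.Result;
import play.mvc.Security;

public class Secured extends Security.Authenticator {

    @Override
    public String getUsername(Context ctx) {
        return ctx.session().get("email"); // null means "not authenticated"
    }

    @Override
    public Result onUnauthorized(Context ctx) {
        return unauthorized("You must be logged in");
    }
}

Annotating a controller or action with @Security.Authenticated(Secured.class) then rejects unauthenticated requests globally; per-parameter checks (does this user own :domainId?) still need something like the RequestHandler above or Deadbolt's dynamic constraints.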

Is there a way to dynamically name log files in Java?

I'm using logback as my logging framework and have a couple of jobs that run the same main function with different parameters and would like to create a log file for each job and name the log file with the job's name.
For example, if I had jobs a,b,c that all run MyClass.main() but with different parameters, then I'd like to see a-{date}.log, b-{date}.log, c-{date}.log.
I can achieve the {date} part by specifying a <fileNamePattern>myjob-%d{yyyy-MM-dd}.log</fileNamePattern> in my logback.xml, but I'm not sure how to (or if it is even possible) create the prefix of the file names dynamically (to be the job's name).
Is there a way to dynamically name logfiles in logback? Is there another logging framework that makes this possible?
As a follow up question, am I just taking a bad approach for having multiple jobs that call the same main function with different parameters and wanting a log file named after each job? If so is there a standard/best practice solution for this case?
EDIT: The reason why I want to name each log file after the name of the job is that each job naturally defines a "unit of work" and it is easier for me to find the appropriate log file in case one of the job fails. I could simply use a rolling log file for jobs a,b,c but I found it harder for me to look through the logs and pinpoint where each job started and ended.
I would use your own logging:
public static PrintWriter getLoggerFor(String prefix) throws IOException {
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
    String filename = prefix + "-" + sdf.format(new Date()) + ".log"; // e.g. a-2020-01-01.log
    // PrintWriter has no (String, boolean) constructor; wrap a FileWriter
    // opened in append mode and enable auto-flush.
    return new PrintWriter(new FileWriter(filename, true), true);
}
You can write a simple LRU cache e.g. with LinkedHashMap to reuse the PrintWriters.
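A minimal sketch of such a cache, using LinkedHashMap's access-order mode; MAX_OPEN_FILES and the class name are arbitrary choices for illustration:

import java.io.PrintWriter;
import java.util.LinkedHashMap;
import java.util.Map;

public class WriterCache extends LinkedHashMap<String, PrintWriter> {
    private static final int MAX_OPEN_FILES = 16;

    public WriterCache() {
        super(MAX_OPEN_FILES, 0.75f, true); // access-order gives LRU eviction
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, PrintWriter> eldest) {
        boolean evict = size() > MAX_OPEN_FILES;
        if (evict) {
            eldest.getValue().close(); // flush and release the file handle before dropping it
        }
        return evict;
    }
}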
Is there a way to dynamically name logfiles in logback? Is there another logging framework that makes this possible?
I don't believe this is possible using the out of the box appenders (File, RollingFile etc) configured by a standard logback.xml file. To do what you want, you would need to dynamically create appenders on the fly and assign loggers to different appenders. Or you would need to invent a new appender that was smart enough to write to multiple files at the same time, based on the logger name.
am I just taking a bad approach for having multiple jobs that call the same main function with different parameters and wanting a log file named after each job?
The authors of logback address this issue and slightly discourage it in the section on Mapped Diagnostic Context
A possible but slightly discouraged approach to differentiate the logging output of one client from another consists of instantiating a new and separate logger for each client. This technique promotes the proliferation of loggers and may increase their management overhead. ... A lighter technique consists of uniquely stamping each log request servicing a given client.
Then they go on to discuss mapped diagnostic contexts as a solution to this problem. They give an example of a NumberCruncherServer which is crunching numbers, for various clients in various threads simultaneously. By setting the mapped diagnostic context and an appropriate logging pattern it becomes easy to determine which log events originated from which client. Then you could simply use a grep tool to separate logging events of interest into a separate file for detailed analysis.
Yes you can.
First you have to familiarize yourself with two concepts: Logger and Appender. Generally speaking, your code obtains a Logger and invokes logging methods such as debug(), warn(), info() etc. A Logger has Appenders attached to it, and each Appender presents the logging output according to its configuration.
Once you're familiar with those, what you need to do is dynamically create a FileAppender with a different file name for each job type and attach it to your Logger.
I suggest you spend some time with the logback manual if none of the above makes sense.
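A minimal sketch of that idea, assuming logback-classic behind SLF4J; the class and method names here are illustrative, not a standard API:

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.FileAppender;
import org.slf4j.LoggerFactory;

public class JobLogging {

    public static void attachJobAppender(String jobName, String date) {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(context);
        encoder.setPattern("%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n");
        encoder.start();

        FileAppender<ILoggingEvent> appender = new FileAppender<>();
        appender.setContext(context);
        appender.setFile(jobName + "-" + date + ".log"); // e.g. a-2020-01-01.log
        appender.setEncoder(encoder);
        appender.start();

        // Attach to the root logger so everything this JVM run logs lands in the job's file.
        context.getLogger(org.slf4j.Logger.ROOT_LOGGER_NAME).addAppender(appender);
    }
}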
You can make use of logback discriminators, since a discriminator's key can be used in the <FileNamePattern> tag. I can think of two options:
Option One:
You can use the Mapped Diagnostic Context discriminator to implement your logging separation; you'll need to set a distinct value for each job using MDC.put() (see the sketch after the configuration below).
Once you've done that, your appender in the logback configuration would look something like:
<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator class="ch.qos.logback.classic.sift.MDCBasedDiscriminator">
<key>jobName</key> <!-- the key you used with MDC.put() -->
<defaultValue>none</defaultValue>
</discriminator>
<sift>
<appender name="jobsLogs-${jobName}" class="ch.qos.logback.core.rolling.RollingFileAppender">
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<FileNamePattern>${jobName}.%d{dd-MM-yyyy}.log.zip</FileNamePattern>
.
.
.
</rollingPolicy>
<layout class="ch.qos.logback.classic.PatternLayout">
<Pattern>...</Pattern>
</layout>
</appender>
</sift>
</appender>
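On the job side, all that is needed for this option is to set the key before the work starts. A sketch, assuming the job name arrives as the first program argument; MyClass and runJob are placeholders for your actual entry point:

import org.slf4j.MDC;

public class MyClass {

    public static void main(String[] args) {
        MDC.put("jobName", args[0]); // e.g. "a", "b" or "c"; must match the <key> above
        try {
            runJob(args); // hypothetical entry point for the actual work
        } finally {
            MDC.remove("jobName");
        }
    }

    private static void runJob(String[] args) {
        // job logic goes here
    }
}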
Option Two:
Implement your own discriminator (implementing ch.qos.logback.core.sift.Discriminator) to discriminate based on the thread name. It would look something like this:
public class ThreadNameDiscriminator implements Discriminator<ILoggingEvent> {
    private static final String THREAD_NAME_KEY = "threadName";

    @Override
    public String getDiscriminatingValue(ILoggingEvent event) {
        // Use the thread name recorded on the event itself; this stays correct
        // even if the appender runs on a different (e.g. async) thread.
        return event.getThreadName();
    }

    @Override
    public String getKey() {
        return THREAD_NAME_KEY;
    }

    // implementation for more methods (start, stop, isStarted from LifeCycle)
    ...
}
The logging appender would look like option one, with the discriminator class being ThreadNameDiscriminator and the key being threadName. In this option there is no need to set a value in the MDC from your jobs, so no modification to them is required.

Playframework routes question

I have this in my application routes file:
GET /new Tweets.create
POST /new Tweets.save
And in my view I'm creating a form like this:
#{form #save()}
...
#{/form}
But once I submit the form, it sends me to /tweets/save and not to /new. Any ideas how I can fix this? Thanks!
If you have already tried the route below (which is the correct way to use routes)
#{form #Tweets.save()}
and this did not work, I think you may have put your route in the wrong place. Make sure it is at the top of the routes file, and not after the catch-all route. The routes file is processed in order, so if the catch-all is found, this is used first and your other route is ignored. The catch-all looks like
* /{controller}/{action} {controller}.{action}
Try using
#{form #Tweets.save()}
I think it is recommended to use the class name together with the method name.
EDIT:
The way the play framework routing works is you define some route as
GET /clients Clients.index
If a request with URI /clients is encountered, it will be routed to Clients.index(). If you have another route such as
GET /clients Clients.save
Then the framework ignores this route because /clients already has a mapping. (Most probably it reports some error in the console or logging stream; check your logs.)
Therefore you can't make it work like that. I see you want a reverse mapping that returns the same URI for different methods. However, the framework aims to intercept requests, so it will simply ignore your second route.
Try to separate the pages. Most probably what you want is to render the same view from two actions. You can do that without redirecting them to the same URI.
I think (if I did not misread) that the issue is that you are expecting the wrong behavior.
As I understand it, you expect the submit to go to Tweets.save() (POST method) and then back to Tweets.create() (GET method), as both share the same path (/new).
In reality Play is calling Tweets.save() and expects a render at the end of Tweets.save() to display some result. If you want to redirect to Tweets.create(), you can call that method at the end of the implementation of Tweets.save(), with either:
create(<params>);
or
render("#create", <params>);
and that should redirect (via 302) to the GET version.
