Spring Integration: save integration flow logs to the database - java

I am required to log the integration flow requests and save those logs to the database. Logging the request works fine, but my question is: how can I store the logs produced by logAndReply() in the database?
return IntegrationFlows.from(Http.inboundGateway("/foo")
                .requestMapping(m -> m.methods(HttpMethod.POST))
                .validator(new Validator()).errorChannel("errorHandler.input"))
        .transform(Transformers.objectToString())
        .transform(new TransformingMethod())
        .logAndReply(LoggingHandler.Level.INFO);

This is the job of your logging framework; in most frameworks the relevant component is called a JdbcAppender. For example, with Log4j2:
https://howtodoinjava.com/log4j2/jdbcappender-example/
log4j2 JDBCAppender with Spring
http://smasue.github.io/log4j2-spring-database-appender
On the other hand, instead of relying on the logging framework, you can do it yourself and use a JdbcMessageHandler: https://docs.spring.io/spring-integration/docs/current/reference/html/jdbc.html#jdbc-outbound-channel-adapter
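On the JdbcMessageHandler route, a minimal sketch (the bean name, table, and column names below are assumptions; Validator and TransformingMethod are the classes from your flow) is to wire-tap the flow, so the insert happens on a copy of each message and the reply behavior is unchanged:

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.http.HttpMethod;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Transformers;
import org.springframework.integration.handler.LoggingHandler;
import org.springframework.integration.http.dsl.Http;
import org.springframework.integration.jdbc.JdbcMessageHandler;

@Bean
public IntegrationFlow fooFlow(DataSource dataSource) {
    // :payload is resolved against the message by the handler's default
    // SqlParameterSourceFactory; REQUEST_LOGS and its columns are placeholders
    JdbcMessageHandler saveLog = new JdbcMessageHandler(dataSource,
            "INSERT INTO REQUEST_LOGS (CREATED_AT, PAYLOAD) VALUES (CURRENT_TIMESTAMP, :payload)");
    return IntegrationFlows.from(Http.inboundGateway("/foo")
                    .requestMapping(m -> m.methods(HttpMethod.POST))
                    .validator(new Validator()).errorChannel("errorHandler.input"))
            .transform(Transformers.objectToString())
            .transform(new TransformingMethod())
            // the wire tap sends a copy of each message to the JDBC side flow,
            // so the main flow still logs and replies as before
            .wireTap(flow -> flow.handle(saveLog))
            .logAndReply(LoggingHandler.Level.INFO);
}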

Related

Implement tracing in a Spring Boot application

I have a Spring Boot application and I want to configure log tracing for the whole application. I added the setup for the Datadog Agent as described in the documentation: https://docs.datadoghq.com/tracing/setup_overview/setup/java/?tabs=containers and I can see trace IDs and span IDs associated with each request. There are other options, such as Spring Cloud Sleuth, to add trace and span IDs as well:
https://spring.io/projects/spring-cloud-sleuth
and the Datadog documentation also describes a way to connect Java logs and traces: https://docs.datadoghq.com/tracing/connect_logs_and_traces/java/?tabs=log4j2
I am confused about these three approaches and which one I should use for tracing. With Spring Sleuth I know it is easier to generate span and trace IDs, but I am not sure whether they are sent to Datadog or whether I need to configure something additional.
Also, the difference between connecting Java logs and traces versus the Datadog Agent is not clear to me.
I am new to this topic, and it is not clear to me how to implement tracing for the whole process involved in a request.

Is it possible to create custom fields in a Kibana dashboard?

I am using a Java micro-service architecture in my application and generating separate log files for each micro-service.
I am using the ELK stack to visualize the logs in Kibana, but the problem is that the fields I get from Elasticsearch are all server-log fields; some example fields are @timestamp, @version, @path, @version.keyword, @host.
I want to extend these fields by adding some of my own, such as customerId, txnId, and mobileNo, so that we can analyze the data easily.
I'm using org.apache.logging.log4j2 to write the logs. Can I add the above fields (customerId, txnId, mobileNo) to the log files? Elasticsearch would then store these fields alongside the default ones, and the custom fields would be available in a Kibana dashboard. Is this possible?
It's definitely possible to do that. I haven't done it with the Log4j2 stack (I have with SLF4J/Logback), but the basic approach is:
- set those fields in the Mapped Diagnostic Context (I'm fairly sure Log4j2 supports that; see the sketch after this list)
- use a log appender that writes logstash-structured JSON
- configure Filebeat to ship the JSON logs
- if Filebeat is shipping to Logstash, you'll need to configure Logstash to pass those preformatted JSON logs directly to Elasticsearch
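A minimal sketch of the first step, assuming Log4j2's ThreadContext (its MDC implementation) and the field names from the question; the class, method, and message are illustrative:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class OrderService {
    private static final Logger LOG = LogManager.getLogger(OrderService.class);

    public void processOrder(String customerId, String txnId, String mobileNo) {
        // put the custom fields into the per-thread context; a structured
        // layout (e.g. JsonTemplateLayout) emits them as separate fields
        ThreadContext.put("customerId", customerId);
        ThreadContext.put("txnId", txnId);
        ThreadContext.put("mobileNo", mobileNo);
        try {
            LOG.info("order processed");
        } finally {
            // clear the context so the fields do not leak across requests
            ThreadContext.clearMap();
        }
    }
}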
It is definitely possible. I am doing that now with my applications. However, the output looks a bit different from yours. The basic guide for doing this can be found at Logging in the Cloud on the Log4j2 web site.
The "normal" log view looks very similar to what you would see when logging to a file.
However, if you select a message you can see the individual fields.
The Log4j2 configuration uses a TCP Socket appender that is configured to write to a cluster of Logstash servers behind a single DNS entry, using the Gelf layout.
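A sketch of such a configuration (the appender name, host, and port are placeholders, not the author's actual values):

<Configuration>
  <Appenders>
    <!-- GelfLayout turns each event, including ThreadContext fields,
         into a GELF JSON document sent over TCP -->
    <Socket name="Logstash" host="logstash.example.com" port="12222" protocol="TCP">
      <GelfLayout includeThreadContext="true" compressionType="OFF"/>
    </Socket>
  </Appenders>
  <Loggers>
    <Root level="INFO">
      <AppenderRef ref="Logstash"/>
    </Root>
  </Loggers>
</Configuration>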
You can also use MapMessages to capture individual data elements and log them. While this currently works, it is slightly cumbersome, so I have recently committed improvements that will be available in Log4j 2.15.0.
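A minimal sketch of the MapMessage approach, using Log4j2's StringMapMessage; the class and field names are illustrative:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.message.StringMapMessage;

public class PaymentAudit {
    private static final Logger LOG = LogManager.getLogger(PaymentAudit.class);

    public void recordPayment(String customerId, String txnId) {
        // each with() entry becomes a separate field when rendered by a
        // structured layout such as GelfLayout or JsonTemplateLayout
        LOG.info(new StringMapMessage()
                .with("message", "payment recorded")
                .with("customerId", customerId)
                .with("txnId", txnId));
    }
}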
It is important to note that the Logging in the Cloud page briefly mentions storing your logging configuration in Spring Cloud Config. If you want a common base configuration while allowing apps to do some customization, this works very, very well. However, the Gelf layout, JSON Template Layout, and TCP appender are all independent of that and can be used without Spring Boot.

Spring Webflux - logging connection ID and new Connection log not displayed when using Webflux webclient

Many tutorials online are pointing out the importance of having connection ID in a Spring Webflux application.
For instance, see this screenshot taken from a presentation in a conference.
However, I am not getting those IDs. I can only see the part with the time, up until the [ctor-http-nio5].
I cannot see the connection ID, and I cannot see the statement "New http connection".
What could be the root cause of my not being able to display these? I would really like to see those logs.
Thank you for your help.
This ID is unique for each connection made to your server. You can extract it by calling the getLogPrefix() method of the ServerWebExchange class and append it to your logs directly or by putting it into the MDC.
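A minimal sketch of the first option, assuming a custom WebFilter (the class name and log message are illustrative); it prepends the framework's own prefix to an application log line rather than going through the MDC:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;

@Component
public class LogPrefixFilter implements WebFilter {
    private static final Logger LOG = LoggerFactory.getLogger(LogPrefixFilter.class);

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        // getLogPrefix() returns the same "[...]" prefix the framework uses,
        // so application logs can be correlated with the framework's logs
        LOG.info("{}handling {} {}", exchange.getLogPrefix(),
                exchange.getRequest().getMethod(), exchange.getRequest().getPath());
        return chain.filter(exchange);
    }
}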
By default, the Spring framework logs this ID in its own framework logs.
Those framework logs are at the DEBUG level, so you need to enable that level to see the logs with the IDs.
Adding the following configuration to application.properties makes them visible:
logging.level.org.springframework=DEBUG

How to check logging implementation details in Spring Boot

I am using Spring Boot.
I want to check which logging implementation is printing the messages. I know that with Spring Boot the default is Logback, and I have excluded it as mentioned in this post, so Logback should not be printing the messages, but I want to show as proof that the Logback implementation is not printing and that Log4j presumably is.
Basically, I need an API that I can call to get the details of which logging implementation is in use, the same way we can query the Java version, etc.
You can force Spring Boot to use a certain implementation by setting this property:
org.springframework.boot.logging.LoggingSystem
with any of:
org.springframework.boot.logging.logback.LogbackLoggingSystem
org.springframework.boot.logging.log4j2.Log4J2LoggingSystem
org.springframework.boot.logging.java.JavaLoggingSystem
none (to switch logging off completely)
This explicit configuration would be your proof.
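As a sketch (MyApp is a placeholder), the property must be set before the logging system initializes, and SLF4J can then report which backend is actually bound:

import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class MyApp {
    public static void main(String[] args) {
        // must happen before SpringApplication.run(), i.e. before logging starts
        System.setProperty("org.springframework.boot.logging.LoggingSystem",
                "org.springframework.boot.logging.log4j2.Log4J2LoggingSystem");
        SpringApplication.run(MyApp.class, args);
        // prints e.g. org.apache.logging.slf4j.Log4jLoggerFactory for Log4j2,
        // or ch.qos.logback.classic.LoggerContext for Logback
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
    }
}

The getILoggerFactory() call doubles as the "API" asked about: the class of the returned factory identifies the implementation bound behind SLF4J.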
To check the configuration you can add the Spring Boot Actuator; all config params can be queried through its web endpoints.
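For example, assuming the Actuator starter is on the classpath (the env endpoint is not exposed over the web by default):

management.endpoints.web.exposure.include=env

A GET on /actuator/env then lists all property sources, including an org.springframework.boot.logging.LoggingSystem override if one is set.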

Dynamic | Use-case-based logging

Is there any logging framework that helps me change logging levels dynamically based on the request parameters received?
Only if the request has a parameter with debug set to true should it log; otherwise not.
Does Spring Sleuth provide this feature in a cloud environment?
You can use Spring Boot & Spring Cloud Config and the standard SLF4J logging mechanism. You can check out this answer for more information - Managing logging.level using ConfigServer
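As a sketch of the idea behind that answer (the logger name is illustrative): let the Config Server serve logging.level.* properties, e.g.

logging.level.com.example.payments=DEBUG

and trigger a refresh on the client (e.g. POST /actuator/refresh); Spring Cloud re-applies the changed levels to the running logging system without a restart.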
If you just want conditional logging, you would use an NDC/MDC plus a filter, using one of the frameworks that support that feature; a Log4j2 sketch follows.
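As a sketch of that approach with Log4j2's DynamicThresholdFilter, assuming a servlet filter has already copied the request parameter into the context with ThreadContext.put("debug", request.getParameter("debug")); the key name and appender are illustrative:

<Configuration>
  <!-- when the ThreadContext key "debug" equals "true", the effective
       threshold for that thread drops from INFO to DEBUG -->
  <DynamicThresholdFilter key="debug" defaultThreshold="INFO"
                          onMatch="ACCEPT" onMismatch="DENY">
    <KeyValuePair key="true" value="DEBUG"/>
  </DynamicThresholdFilter>
  <Appenders>
    <Console name="Console">
      <PatternLayout pattern="%d %p %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- Root stays at DEBUG so the filter, not the logger level, decides -->
    <Root level="DEBUG">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>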
If you want something more general, then, for instance, set up a com.foo.request logger that is set to, say, INFO and a com.foo.request.debug logger that is set to some lower level, and pick the logger based on the request parameter, as in the sketch below.
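A minimal sketch of that two-logger idea; the logger names come from the answer above, everything else is illustrative:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class RequestLoggers {
    private static final Logger NORMAL = LogManager.getLogger("com.foo.request");
    private static final Logger VERBOSE = LogManager.getLogger("com.foo.request.debug");

    // com.foo.request is configured at INFO and com.foo.request.debug at a
    // lower level, so choosing the logger toggles verbosity per request
    static Logger loggerFor(boolean debugRequested) {
        return debugRequested ? VERBOSE : NORMAL;
    }
}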
