How to use log4j2 properties inside a MuleSoft expression component - java

We use logging heavily in most of our applications via the Logger component inside Mule flows. But when I use an Expression component to manipulate the payload for the destination system, I sometimes need to validate data, and for that I always end up writing System.out.println inside the Expression component. Is there any way to invoke or use log4j2 there, the way we can inside a Java component?
Here is the sample code I am looking for
// MuleSoft expression component: inspect a map field on the payload
additionalfields = payload.additionalfields;
if (org.apache.commons.collections.MapUtils.isEmpty(additionalfields))
{
    // System.out.println("we have no data " + additionalfields);
}

In a Groovy component you can use the implicit log variable, e.g. log.info("we have no data " + additionalfields),
or create a logger yourself:
import java.util.logging.Logger

Logger logger = Logger.getLogger("")
logger.info("we have no data " + additionalfields)
logger.fine("we have no data " + additionalfields)  // java.util.logging has no debug(); fine() is its debug-level equivalent
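Since the question asks about log4j2 specifically, here is a minimal sketch of using the log4j2 API directly inside the script, assuming the log4j2 API jars are on the classpath (recent Mule runtimes bundle log4j2); the logger name "expression-component" is just illustrative:
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// any category configured in your log4j2.xml / log4j2.properties will do
Logger logger = LogManager.getLogger("expression-component");
logger.info("we have no data {}", additionalfields);
logger.debug("we have no data {}", additionalfields);
The debug line only shows up if that level is enabled for the chosen category in your log4j2 configuration.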
Hope this helps.

Related

Is there a way to call an API only if its URL is in application.yml?

I'm a neophyte in the field.
I'm making a simple Spring Boot application that calls another API, edits the content and returns the data to the client. I want it to be possible to call a new API from the application only if its URL is in the configuration file (application.yml).
For example, if the application.yml is
custom:
  external-api:
    firstapi: http://api1.com/data/
    secondapi: https://api2.com/data
and if I try to create a new RequestMapping (http://api3.com/) in the code, it should not return anything, or should not be possible at all.
Is there a way?
You have to design a configuration class that reads the properties file in Spring Boot:
@Value("${custom.external-api.firstapi}")
private String firstApi;
Read more here
If you want to check whether a property is defined in your application.yml file, you can do it like this:
@Value("${custom.external-api.thirdapi:#{null}}")
private String thirdApi;
You will then be able to check whether thirdApi is not null:
if (thirdApi != null) {
    // call the third api
}
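Putting the two together, a minimal sketch of a service that only calls the external API when its URL is configured could look like this (the class name is illustrative and RestTemplate is just one possible client):
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
public class ExternalApiClient {

    // resolves to null when custom.external-api.thirdapi is absent from application.yml
    @Value("${custom.external-api.thirdapi:#{null}}")
    private String thirdApi;

    private final RestTemplate restTemplate = new RestTemplate();

    public String callThirdApiIfConfigured() {
        if (thirdApi == null) {
            return null; // URL not configured, so the API is never called
        }
        return restTemplate.getForObject(thirdApi, String.class);
    }
}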

Send log messages as JSON

I am using Spring Boot in a project and am currently exploring its logging behaviour. For that I'm using the Zipkin service.
I have exported my logs to a JSON file using a proper logback.xml:
{"#timestamp":"2018-07-12T17:53:44.613+05:30","#version":"1","message":"in meth3","logger_name":"com.example.demo.Controller.Controller","thread_name":"http-nio-8089-exec-3","level":"INFO","level_value":20000,"traceId":"62bcfafad3a37a09","spanId":"62bcfafad3a37a09","spanExportable":"true","X-Span-Export":"true","X-B3-SpanId":"62bcfafad3a37a09","X-B3-TraceId":"62bcfafad3a37a09","caller_class_name":"com.example.demo.Controller.Controller","caller_method_name":"meth3","caller_file_name":"Controller.java","caller_line_number":43,"appname":"pom.artifactId_IS_UNDEFINED","version":"pom.version_IS_UNDEFINED"}
Is there a way I could insert a JSON object into the message part of the log? Something like:
logger.info(<some_json_object>)
I have searched extensively for a way to do this, but to no avail. Is it even possible?
The SLF4J API only takes a String as the input to the info, debug, warn and error methods.
What you could do is create your own JsonLogger wrapper around a normal Logger, which you would declare at the top of your classes like:
private static final JsonLogger logger = new JsonLogger(LoggerFactory.getLogger(MyClass.class));
You can then use Jackson, Gson or your favourite object-to-JSON mapper inside your JsonLogger to do what you want. It can then offer the info, debug, warn and error methods like a normal logger.
You can also create your own JsonLoggerFactory which encapsulates this for you, so that the line to include in each class is more concise.
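A minimal sketch of such a wrapper, using Jackson; the JsonLogger class and its methods are hypothetical, not part of SLF4J:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.slf4j.Logger;

public class JsonLogger {

    private static final ObjectMapper MAPPER = new ObjectMapper();
    private final Logger delegate;

    public JsonLogger(Logger delegate) {
        this.delegate = delegate;
    }

    // serializes the object to a JSON string and logs it as the message
    public void info(Object payload) {
        delegate.info(toJson(payload));
    }

    public void debug(Object payload) {
        delegate.debug(toJson(payload));
    }

    private String toJson(Object payload) {
        try {
            return MAPPER.writeValueAsString(payload);
        } catch (Exception e) {
            return String.valueOf(payload); // fall back to toString() if serialization fails
        }
    }
}
Note that with a JSON encoder like the one producing the output above, the serialized object will typically still appear as an escaped string inside the message field, not as a nested JSON object.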

Add custom value to every log message

Let's say that I have a REST API with endpoint /user/{user_id}/foo.
Now, when it is called, I would like all logs produced while handling this request to contain the {user_id}. Is it possible to achieve that without passing {user_id} to every method?
I'm using SLF4J for logging, and my application is based on Spring Boot.
You could also use MDC for this, see here. It's essentially a map, you just put your contextual information in it (e.g. user id) and then you can use it in your log layout. Be aware that this only works with certain underlying frameworks like logback, where a sample layout pattern would look like this:
<Pattern>%X{user_id} %m%n</Pattern>
Check the logback manual for more details on this.
You can use Logback's Mapped Diagnostic Context to propagate the {user_id} to every log message.
There are two parts to this:
Push your {user_id} into MDC e.g. MDC.put("user_id", "Pawel");
Include the MDC entry in your log statements. You do this by specifying it in your logging pattern. So, if you store the user id in an MDC entry named "user_id", then you would set logging.pattern.level=user_id:%X{user_id} %5p to include the value of that entry in every log event.
More details in the docs
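A minimal sketch of pushing the {user_id} into MDC once per request with a servlet filter; the filter class and the path parsing are illustrative, and a HandlerInterceptor reading the resolved @PathVariable would work just as well:
import java.io.IOException;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

@Component
public class UserIdMdcFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
            FilterChain chain) throws ServletException, IOException {
        MDC.put("user_id", extractUserId(request.getRequestURI()));
        try {
            chain.doFilter(request, response);
        } finally {
            MDC.remove("user_id"); // threads are pooled, so always clean up
        }
    }

    // naive extraction of the path segment that follows /user/
    private String extractUserId(String uri) {
        int idx = uri.indexOf("/user/");
        if (idx < 0) {
            return "unknown";
        }
        String rest = uri.substring(idx + "/user/".length());
        int slash = rest.indexOf('/');
        return slash < 0 ? rest : rest.substring(0, slash);
    }
}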

How should I build my Messages in Spring Integration?

I have an application I coded which I am refactoring to make better use of Spring Integration. The application processes the contents of files.
The problem (as I see it) is that my current implementation passes Files instead of Messages, i.e. Spring Integration Messages.
To avoid rolling more of my own code, which I then have to maintain, I'm wondering whether there is a recommended structure for constructing Messages in Spring Integration, in particular some recommended combination of channels and something like MessageBuilder that I should use.
Process/Code (eventually)
I don't yet have the code to configure it but I would like to end up with the following components/processes:
1. Receive a file, remove the header and footer, take each line and convert it into a Message<String> (this, it seems, will actually be a Splitter), which I send on to...
2. A channel/endpoint sends the message to a router.
3. The router detects the format String in the payload and routes to the appropriate channel, similar to the Order Router here...
4. The selected channel then builds the appropriate type of Message, specifically typed messages. For example, I have the following builder to build a Message...
public class ShippedBoxMessageBuilder implements CustomMessageBuilder {

    @Override
    public Message buildMessage(String input) {
        ShippedBox shippedBox = (ShippedBox) ShippedBoxFactory.manufactureShippedFile(input);
        return MessageBuilder.withPayload(shippedBox).build();
    }
    ...
5. The Message is routed by type to the appropriate processing channel.
My intended solution does seem overcomplicated. However, I've purposefully separated two tasks: 1) breaking a file into many Message<String> lines and 2) converting Message<String> into Message<someType>. Because of that, I think I need an additional router/message builder for the second task.
Actually, there is MessageBuilder support in Spring Integration.
The general purpose of such frameworks is to help back-end developers decouple their domain code from the messaging infrastructure. In the end, to work with Spring Integration you need to follow the POJO and method-invocation principles.
You write your own services, transformers and domain models. Then you just use some out-of-the-box components (e.g. <int-file:inbound-channel-adapter>) and refer from there to your POJOs, but not vice versa.
I recommend you read the Spring Integration in Action book to get a better picture of the matter.
Can you explain the reason you want to deal with Spring Integration components directly?
UPDATE
1) Breaking a file into many lines of Message<String>
The <splitter> is for you. You should write a POJO which returns a List<String> - the lines from your file without the header and footer. Reading lines from a File isn't a task for Spring Integration, especially if a "line" is something logical, not a literal file line.
2) Converting Message<String> into Message<someType>
Once more: there is no reason to build a Message object yourself. It's enough to build the new payload in some transformer (again a POJO) and the framework will wrap it into a Message to send.
The Payload Type Router speaks for itself: it checks the payload type, not the Message type.
Of course, the payload can be a Message too, and so can any header.
Anyway, your builder snippet shows exactly the creation of a plain Spring Integration Message in the end. And as I said: it is enough to transform one payload into another and return it from some POJO, which you then use as a transformer reference.
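A minimal sketch of what this answer describes, i.e. plain POJOs that the configuration refers to, so the code never builds Message objects itself (the class and method names are illustrative; ShippedBox and ShippedBoxFactory come from the question's code):
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.List;

public class ShipmentFileService {

    // splitter method: returns the payload lines without header and footer;
    // Spring Integration wraps each element in its own Message<String>
    public List<String> splitLines(File file) throws IOException {
        List<String> lines = Files.readAllLines(file.toPath());
        return lines.subList(1, lines.size() - 1);
    }

    // transformer method: returns the new payload; the framework wraps it in a Message,
    // and a payload-type router can then dispatch on ShippedBox
    public ShippedBox toShippedBox(String line) {
        return (ShippedBox) ShippedBoxFactory.manufactureShippedFile(line);
    }
}
These methods would be referenced from the splitter and transformer elements of the configuration via their ref/method attributes.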

Logging in spring MVC

I'm currently working on a web application using Spring MVC, and I use the @ExceptionHandler annotation in every controller of the application.
So basically I have a method like this:
@ExceptionHandler(RuntimeException.class)
public String handleException(RuntimeException ex) {
    injectedService.notifyAndLogException(ex.getMessage());
    return ("error_page");
}
My idea is to log and send an email to an application administrator from the injected service.
So far I've tried to read some documentation about logging in Spring applications, and everything I've seen sets a static logger in each controller, like this:
private final Logger log = LoggerFactory.getLogger(Controller.class);

@ExceptionHandler(RuntimeException.class)
public String handleException(RuntimeException ex) {
    log.info("Logging error");
    injectedService.notifyException(ex.getMessage());
    return ("error_page");
}
I'd like to know: what is the point of using a logger in each controller instead of using it in one place only (the service)?
I'd like to know what is the point of using a logger in each controller instead of using it in one place only
If you use a single logger for the whole application, then every log message will be logged as coming from the same component. By using a logger per class or component, your log files will contain information about which component logged the message.
For example, when you do:
Logger log = LoggerFactory.getLogger(Controller.class);
This creates a logger with the name of the Controller class, which will generally be displayed in the log file, e.g.
2012-03-07:12:59:00 com.x.y.Controller Hello!
This is just a convention, but it's a good one that I advise you follow.
A logger in each of your class files lets you get 'debug' or 'info' level output when you are in production or unable to attach a debugger.
Since you can limit what is logged by package or even by class name, you can pinpoint errors or see what is happening under different load situations (concurrency problems, resource usage). If you use one generic logger, you may flood your log file.
With the logger in the class that received the exception, you may also be able to get at class variables that are not being passed into your exception handler.
I would also recommend that you do not do
injectedService.notifyAndLogException(ex.getMessage());
but instead pass the exception itself into your notify method. While stack traces can be notoriously verbose, the messages alone are usually not very helpful (a NullPointerException without a stack trace?). In your notify service you can set the subject to ex.getMessage() and put the entire stack trace in the body.
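A minimal sketch of that idea; passing the exception into notifyAndLogException is a hypothetical change to the service's signature:
@ExceptionHandler(RuntimeException.class)
public String handleException(RuntimeException ex) {
    log.error("Unhandled exception", ex);      // SLF4J logs the full stack trace when the Throwable is passed
    injectedService.notifyAndLogException(ex); // the service can use ex.getMessage() as the subject
                                               // and the stack trace as the mail body
    return ("error_page");
}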
Your controllers can extend an abstract class that declares a logger like this:
protected Logger logger = LoggerFactory.getLogger(getClass());
This logger can be used in every controller, and it will prefix the log messages with the concrete controller class name.
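For example, a minimal sketch (AbstractBaseController is an illustrative name):
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public abstract class AbstractBaseController {

    // getClass() resolves to the concrete subclass at runtime, so each
    // controller extending this class gets a logger named after itself
    protected final Logger logger = LoggerFactory.getLogger(getClass());
}
Each controller then extends AbstractBaseController and uses logger directly; its messages are logged under the concrete controller's class name.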
